Purpose
The purpose of this paper is to develop a conceptual framework of waste management using the total interpretive structural modeling (TISM) technique in the Indian organizational context.
Design/methodology/approach
The TISM technique has been used to develop a conceptual framework of waste management in the organizational context, where the waste management factors were identified and verified through content analysis.
Findings
The conceptual framework of waste management in the organization has been developed using TISM, which contributes to the development of important links and hierarchical relationships among the factors. In addition, the model identifies the driving and dependent factors of waste management.
Research limitations/implications
This study has implications for both organizations and policy makers. It provides the important factors for managing waste in an organization, which must be considered before planning waste management practices. For policy makers, it highlights the waste management paths and important linkages required for developing waste management policies in organizations.
Originality/value
This study contributes by providing a conceptual framework for waste management in the Indian organizational context, developed through a qualitative modeling technique. The framework also provides important paths for managing waste in the organization, which is a new effort in the Indian organizational context.
Cloud computing revolves around internet-based acquisition and release of resources from a data center. Being an internet-based dynamic computing paradigm, cloud computing may also suffer from overloading of requests. Load balancing is an important aspect that concerns the distribution of resources in such a manner that no machine is overloaded and resources are optimally utilized. However, this aspect of cloud computing has not yet received much attention, although load balancing is considered an important concern in allied internet-based computing environments such as distributed computing and parallel computing. Many algorithms have been proposed to solve the load balancing problem in those fields, but very few have been proposed for the cloud computing environment. Since cloud computing differs significantly from these other environments, separate load balancing algorithms need to be proposed to cater to its requirements. This work proposes an Autonomous Agent Based Load Balancing Algorithm (A2LB) which provides dynamic load balancing for the cloud environment. The proposed mechanism has been implemented and found to provide satisfactory results.
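The abstract does not detail A2LB's internals, so purely as an illustration of what dynamic load balancing means in this setting, here is a minimal least-loaded dispatcher in Python. The class name, capacity threshold, and per-request cost model are all assumptions for the sketch, not the authors' agent-based design.

```python
import heapq

class LeastLoadedBalancer:
    """Toy dynamic load balancer: route each request to the least-loaded VM.

    Illustrative only -- A2LB uses autonomous agents; the capacity
    threshold and cost model here are assumptions, not the paper's design.
    """

    def __init__(self, vm_ids, capacity=100):
        self.capacity = capacity
        # min-heap of (current_load, vm_id) so the least-loaded VM pops first
        self.heap = [(0, vm) for vm in vm_ids]
        heapq.heapify(self.heap)

    def assign(self, cost):
        """Assign a request of the given cost; return the chosen VM id,
        or None if every VM would exceed its capacity."""
        load, vm = heapq.heappop(self.heap)
        if load + cost > self.capacity:
            # even the least-loaded VM is saturated; a real system
            # would queue or scale out here
            heapq.heappush(self.heap, (load, vm))
            return None
        heapq.heappush(self.heap, (load + cost, vm))
        return vm
```

A usage sketch: with two VMs of capacity 15, two requests of cost 10 land on different machines, and a third is rejected because both would overload.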
We consider the problem of global optimization of an unknown non-convex smooth function with noisy zeroth-order feedback. We propose a local minimax framework to study the fundamental difficulty of optimizing smooth functions with adaptive function evaluations. We show that for functions with fast growth around their global minima, carefully designed optimization algorithms can identify a near-global minimizer with many fewer queries than worst-case global minimax theory predicts. For the special case of strongly convex and smooth functions, our implied convergence rates match the ones developed for zeroth-order convex optimization problems. On the other hand, we show that in the worst case no algorithm can converge faster than the minimax rate of estimating an unknown function in the ℓ∞-norm. Finally, we show that non-adaptive algorithms, though optimal in a global minimax sense, do not attain the optimal local minimax rate.
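To make the noisy zeroth-order query model concrete, the following sketch implements a naive non-adaptive baseline: it averages repeated noisy function evaluations on a uniform grid and returns the empirical argmin. The grid size, repetition count, and noise level are illustrative assumptions; the paper's adaptive algorithms are designed to beat precisely this kind of non-adaptive scheme in the local minimax sense.

```python
import random

def noisy_zeroth_order_minimize(f, lo, hi, grid=51, reps=200, noise=0.1, rng=None):
    """Naive non-adaptive zeroth-order minimizer on [lo, hi].

    Each query to the oracle returns f(x) plus Gaussian noise; we average
    `reps` queries per grid point and return the grid point with the
    smallest empirical mean. A baseline sketch of the query model only,
    not one of the paper's algorithms.
    """
    rng = rng or random.Random(0)
    xs = [lo + (hi - lo) * i / (grid - 1) for i in range(grid)]
    best_x, best_val = None, float("inf")
    for x in xs:
        # zeroth-order oracle: observe f(x) + N(0, noise^2), never gradients
        avg = sum(f(x) + rng.gauss(0, noise) for _ in range(reps)) / reps
        if avg < best_val:
            best_x, best_val = x, avg
    return best_x
```

For a function with fast growth around its minimum, such as a quadratic, even this crude baseline localizes the minimizer to within roughly the grid spacing once the per-point averages beat the noise.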
Knowledge of a network's topology and internal characteristics such as delay times or losses is crucial to maintaining seamless operation of network services. Network tomography is a useful approach to infer such knowledge from end-to-end measurements between nodes at the periphery of the network, as it does not require cooperation of routers and other internal nodes. Most current tomography algorithms are single-source methods, which use multicast probes or synchronized unicast packet trains to measure covariances between destinations from a single vantage point and recover a tree topology from these measurements. Multi-source tomography, on the other hand, uses pairwise hop counts or latencies and consequently overcomes the difficulties associated with obtaining measurements for single-source methods. However, topology recovery is complicated by the fact that the paths along which measurements are taken do not form a tree in the network. Motivated by recent work suggesting that these measurements can be well approximated by tree metrics, we present two algorithms that use selective pairwise distance measurements between peripheral nodes to construct a tree whose end-to-end distances approximate those in the network. Our first algorithm accommodates measurements perturbed by additive noise, while our second considers a novel noise model that captures missing measurements and the network's deviations from a tree topology. Both algorithms provably use O(p polylog p) pairwise measurements to construct a tree approximation on p end hosts. We present extensive simulated and real-world experiments to evaluate both of our algorithms.
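As standard background for reconstructing a tree from pairwise distances, here is a compact implementation of the classical neighbor-joining algorithm, which recovers a tree exactly when the input is an additive tree metric. This is textbook material, not the paper's two algorithms, which additionally select which pairs to measure and handle missing entries.

```python
def neighbor_joining(labels, D):
    """Classical neighbor joining on a symmetric distance dict D[(i, j)].

    Returns the list of merge events ((i, j), (branch_i, branch_j)).
    Recovers branch lengths exactly for additive (tree) metrics; a
    background sketch, not the paper's noise-tolerant algorithms.
    """
    nodes = list(labels)
    D = dict(D)

    def d(i, j):
        return 0.0 if i == j else D[(i, j)] if (i, j) in D else D[(j, i)]

    merges = []
    next_id = 0
    while len(nodes) > 2:
        n = len(nodes)
        r = {i: sum(d(i, k) for k in nodes) for i in nodes}
        # pick the pair minimizing the NJ criterion Q(i,j) = (n-2)d(i,j) - r_i - r_j
        i, j = min(((a, b) for idx, a in enumerate(nodes) for b in nodes[idx + 1:]),
                   key=lambda p: (n - 2) * d(*p) - r[p[0]] - r[p[1]])
        u = f"u{next_id}"
        next_id += 1
        li = 0.5 * d(i, j) + (r[i] - r[j]) / (2 * (n - 2))
        lj = d(i, j) - li
        # distances from the new internal node u to all remaining nodes
        for k in nodes:
            if k not in (i, j):
                D[(u, k)] = 0.5 * (d(i, k) + d(j, k) - d(i, j))
        merges.append(((i, j), (li, lj)))
        nodes = [k for k in nodes if k not in (i, j)] + [u]
    merges.append((tuple(nodes), (d(nodes[0], nodes[1]),)))
    return merges
```

For example, on the additive metric of a four-leaf tree with cherries {a, b} and {c, d}, neighbor joining merges a cherry first and recovers its branch lengths exactly.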
Summary
Blundell, Pistaferri, and Preston (American Economic Review, 2008, 98(5), 1887–1921) report an estimate of household consumption insurance with respect to permanent income shocks of 36%. In replicating findings for their model and data, we find that this estimate is distorted by a code error and is not robust to the weighting scheme for generalized method of moments (GMM) or to consideration of quasi-maximum likelihood estimation (QMLE), which produces a significantly higher estimate of consumption insurance at 55%. For sub‐groups by age and education, the differences between estimates across methods are even more pronounced, and QMLE provides new insights into heterogeneity across households compared to the original study. Monte Carlo experiments using non‐normal shocks suggest that consumption insurance estimates for the model are more accurate for QMLE than GMM, including when correcting for bias and especially given a smaller sample such as is only available when looking at sub‐groups.
Ontologies play a vital role in knowledge representation in artificial intelligence systems. With the emergence and acceptance of the semantic web and the services it offers to users, more and more ontologies have been developed by various stakeholders. Different ontologies need to be mapped for various systems to communicate with each other. Ontology mapping is an open research issue in web semantics. Exact mapping of ontologies is rarely achievable, so it is an optimization problem. This work presents an optimized ontology mapping mechanism that deploys a genetic algorithm.
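The abstract gives no details of the encoding or fitness function, so the following is only a toy sketch of evolutionary search over candidate mappings: chromosomes are one-to-one concept assignments, fitness is a crude lexical similarity, and selection is mutation-only (a simplification of a full genetic algorithm with crossover). Every name and parameter here is an assumption.

```python
import random

def similarity(a, b):
    """Crude lexical similarity between two concept names (Jaccard overlap
    of character sets); a stand-in for the richer semantic measures an
    actual ontology mapper would use."""
    sa, sb = set(a.lower()), set(b.lower())
    return len(sa & sb) / len(sa | sb)

def ga_ontology_mapping(src, tgt, pop=30, gens=100, mut=0.2, rng=None):
    """Toy evolutionary ontology mapper.

    Each chromosome is a permutation: a candidate one-to-one mapping
    src[i] -> tgt[perm[i]]. Fitness sums pairwise similarity. Selection
    keeps the fitter half; children are mutated copies of survivors.
    """
    rng = rng or random.Random(0)
    n = len(src)

    def fitness(perm):
        return sum(similarity(src[i], tgt[perm[i]]) for i in range(n))

    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop // 2]       # implicit elitism
        children = []
        while len(survivors) + len(children) < pop:
            child = list(rng.choice(survivors))
            if rng.random() < mut:
                # mutate by swapping two assignments (keeps it a permutation)
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        population = survivors + children
    best = max(population, key=fitness)
    return {src[i]: tgt[best[i]] for i in range(n)}
```

On a tiny example with obviously cognate concept names, the search settles on the intuitive alignment.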
Summary
We extend a widely used semi‐structural model to identify and estimate dynamic consumption elasticities with respect to transitory income shocks. Applying our model to household survey data, we find a structural break in marginal propensities to consume following the end of the housing market boom, with the average across households increasing significantly. There is important heterogeneity by different household balance sheet characteristics, and the increase in the average appears to be driven by higher short‐run consumption elasticities for homeowners with low liquid wealth. The change in consumption behavior is consistent with tighter borrowing constraints more than a shift in wealth distributions.
In domains like automation, particularly in advanced driver-assistance and collision avoidance systems for vehicles, the need for reliable sensing and predictive capabilities is paramount. While current sensor technologies allow for quick responses to safety-critical events, there is a growing emphasis on predictive capabilities to anticipate and prevent such incidents. This opens new avenues for the development of sophisticated RF sensors and algorithms capable of accurately predicting potential harms before they occur. Moreover, along with the benefits of enhanced sensing capabilities comes the imperative to safeguard user privacy. Various techniques, including encryption and differential privacy, are employed to ensure that RF-based services do not compromise user data, while innovative approaches such as discarding irrelevant data and utilizing privacy-preserving measurements like received signal strength and Doppler shift offer promising avenues for balancing functionality with privacy concerns. In this thesis, we integrate the power of radio-frequency sensing and the rapid progress of data-driven learning to bring forth ideas that can be applied to safety-critical applications, enhancing efficiency and user experience in an increasingly interconnected world. We present three products built using three types of radio devices that differ in their operating principles and thus in their capabilities, utilized in three unique scenarios: network-based cooperative sensing, standalone sensing, and sensing as a surveyor/monitor. Our first novel contribution is an infrastructure-free approach to collision prediction using ultra-wideband (UWB) signals and inertial sensing. It employs a cooperative strategy based on pairwise ranges and velocities to predict future collisions, utilizing an improved algorithm to estimate relative kinematics despite noisy measurements.
This method is complementary to existing technologies dependent on object properties, with UWB chosen for its precise measurements and independence from indoor object properties. We continue our endeavor by introducing a systematic shift in the type of sensor used, instead employing a standalone sensor, such as radar, for collision prediction in noisy, cluttered environments with dynamic motion. We utilize radar-Doppler data and a convolutional neural network (CNN) to predict collisions, adapting features from the environment to handle inaccuracies. Online learning and automated labeling techniques are employed to make the CNN adaptable, with experiments resulting in a labeled dataset for validation against other methods. Finally, in order to provide a method to implement these safety-critical applications, we investigate how sensors can 'police' or survey an area and build applications by extracting only the relevant data from RF-signal measurements through a new measurement called Doppler spread. Outdoor experiments in a densely populated area are conducted, generating a labeled database and examining Doppler spread's effectiveness for a privacy-preserving localization system based on fingerprinting.
Mycobacterium tuberculosis attenuates many defence responses of alveolar macrophages to create a niche at sites of infection in the human lung. Levels of heat shock proteins (HSPs) have been reported to increase manyfold in the serum of active TB patients compared with latently infected individuals. Here we investigated the regulation of key defence responses by HSPs during mycobacterial infection. We show that infection of macrophages with M. bovis BCG induces higher expression of HSP-27 and HSP-70. Inhibiting HSP-27 and HSP-70 prior to mycobacterial infection leads to a significant decrease in mycobacterial growth inside macrophages. Further, inhibiting HSPs resulted in a significant increase in intracellular oxidative burst levels. This was accompanied by an increase in the levels of the T cell activation molecules CD40 and IL-12 receptor and a concomitant decrease in the levels of the T cell inhibitory molecules PD-L1 and IL-10 receptor. Furthermore, inhibiting HSPs significantly increased the expression of key proteins in the autophagy pathway along with increased activation of the pro-inflammatory transcription factors NF-κB and p-CREB. Interestingly, we also show that both HSP-27 and HSP-70 are associated with the anti-apoptotic proteins Bcl-2 and Beclin-1. These results point towards a suppressive role for host HSP-27 and HSP-70 during mycobacterial infection.