Importance analysis deals with the investigation of the influence of individual system components on system operation. This investigation can be qualitative or quantitative. The qualitative analysis focuses on finding scenarios in which a degradation/improvement of a specific component or group of components results in a degradation/improvement of the whole system, while the quantitative one deals with numerical estimation of component importance. In this paper, we propose a new approach for importance analysis of Multi-State Systems (MSSs). Both the qualitative and the quantitative importance analysis are based on the identification of critical states, and the new approach can be used to calculate all types of critical states. It is based on the application of direct partial logic derivatives, which are one of the tools of logical differential calculus. The specifics of these derivatives for importance analysis of MSSs are considered in detail.
•A summary of well-known importance measures is provided.
•The Direct Partial Logical Derivative is developed as the IDPLD.
•A new method for calculating well-known and new importance measures via the IDPLD is proposed.
•The application of the proposed method is illustrated by a simple example.
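As a rough illustration of the underlying idea (the function names and the toy structure function below are ours, not from the paper), a direct partial logic derivative ∂φ(j→k)/∂x_i(a→b) can be evaluated by brute-force enumeration of state vectors, collecting those at which changing component i from state a to state b moves the system from state j to state k:

```python
from itertools import product

def dpld(phi, n_states, i, a, b, j, k):
    """Direct partial logic derivative d phi(j->k) / d x_i(a->b).

    Returns the state vectors (with x_i = a) at which changing component i
    from state a to state b changes the system state from j to k.
    """
    critical = []
    for x in product(*(range(s) for s in n_states)):
        if x[i] != a:
            continue
        y = list(x)
        y[i] = b
        if phi(x) == j and phi(tuple(y)) == k:
            critical.append(x)
    return critical

# toy 2-component multi-state series system: phi(x) = min(x1, x2),
# each component having 3 states {0, 1, 2}
phi = lambda x: min(x)
crit = dpld(phi, n_states=(3, 3), i=0, a=1, b=0, j=1, k=0)
# structural importance: fraction of the 3 vectors with x_0 = 1 that are critical
si = len(crit) / 3
```

Such an exhaustive sketch is exponential in the number of components; it only illustrates the definition, not an efficient computation.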
Monte Carlo methods represent the de facto standard for approximating complicated integrals involving multidimensional target distributions. In order to generate random realizations from the target distribution, Monte Carlo techniques use simpler proposal probability densities to draw candidate samples. The performance of any such method is strictly related to the specification of the proposal distribution, such that unfortunate choices easily wreak havoc on the resulting estimators. In this work, we introduce a layered (i.e., hierarchical) procedure to generate samples employed within a Monte Carlo scheme. This approach ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of catastrophic performance), although at the expense of a moderate increase in complexity. Furthermore, we provide a general unified importance sampling (IS) framework in which multiple proposal densities are employed, and we introduce several IS schemes by applying the so-called deterministic mixture approach. Finally, given these schemes, we also propose a novel class of adaptive importance samplers using a population of proposals, where the adaptation is driven by independent parallel or interacting Markov chain Monte Carlo (MCMC) chains. The resulting algorithms efficiently combine the benefits of both IS and MCMC methods.
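A minimal sketch of the deterministic mixture weighting idea, assuming a toy unnormalized Gaussian target and two Gaussian proposals of our choosing: each sample is weighted against the full mixture density rather than against the individual proposal that generated it.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy target: unnormalized standard normal (self-normalized IS tolerates this)
target = lambda x: np.exp(-0.5 * x**2)

# two Gaussian proposals with different means
mus, sig = np.array([-1.0, 2.0]), 1.0
norm_pdf = lambda x, m: np.exp(-0.5 * ((x - m) / sig)**2) / (sig * np.sqrt(2 * np.pi))

M = 5000  # samples drawn from each proposal
samples, weights = [], []
for m in mus:
    x = rng.normal(m, sig, M)
    # deterministic mixture weight: the denominator is the whole mixture
    # density, regardless of which proposal actually produced the sample
    mix = np.mean([norm_pdf(x, mm) for mm in mus], axis=0)
    samples.append(x)
    weights.append(target(x) / mix)

x = np.concatenate(samples)
w = np.concatenate(weights)
est_mean = np.sum(w * x) / np.sum(w)  # self-normalized estimate of E[X], near 0
```

Because the mixture denominator can never be small wherever some proposal has mass, the weights stay bounded even when one individual proposal is badly placed, which is the robustness property the layered/mixture construction exploits.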
Importance sampling (IS) methods are broadly used to approximate posterior distributions or their moments. In the standard IS approach, samples are drawn from a single proposal distribution and weighted adequately. However, since the performance of IS depends on the mismatch between the target and the proposal distributions, several proposal densities are often employed for the generation of samples. Under this multiple importance sampling (MIS) scenario, extensive literature has addressed the selection and adaptation of the proposal distributions, interpreting the sampling and weighting steps in different ways. In this paper, we establish a novel general framework for the sampling and weighting procedures when more than one proposal is available. The new framework encompasses the most relevant MIS schemes in the literature, and novel valid schemes appear naturally. All the MIS schemes are compared and ranked in terms of the variance of the associated estimators. Finally, we provide illustrative examples revealing that, even with a good choice of the proposal densities, a careful interpretation of the sampling and weighting procedures can make a significant difference in the performance of the method.
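The point that the weighting choice alone changes estimator variance can be seen numerically. In this toy setup (target, proposals, and seed are our own assumptions), the same samples are weighted two ways: the standard per-proposal weight π(x)/q_i(x), and the mixture weight π(x)/ψ(x) with ψ the average of all proposals. One proposal is deliberately misplaced, which inflates the standard weights but leaves the mixture weights bounded.

```python
import numpy as np

rng = np.random.default_rng(1)

pi = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)  # target N(0,1)
mus, sig = np.array([-0.5, 0.5, 3.0]), 1.0               # q at 3.0 is misplaced
q = lambda x, m: np.exp(-0.5 * ((x - m) / sig)**2) / (sig * np.sqrt(2 * np.pi))

M = 2000
w_std, w_mix = [], []
for m in mus:
    x = rng.normal(m, sig, M)
    w_std.append(pi(x) / q(x, m))                                    # standard weight
    w_mix.append(pi(x) / np.mean([q(x, mm) for mm in mus], axis=0))  # mixture weight

w_std, w_mix = np.concatenate(w_std), np.concatenate(w_mix)
# both weighting schemes are unbiased for the normalizing constant (equal to 1),
# but the empirical variance of the mixture weights is much smaller
```

This mirrors the paper's central message: with identical proposals and identical samples, the interpretation of the weighting step determines the variance.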
Linear regression is often used as a diagnostic tool to understand the relative contributions of operational variables to some key performance indicator or response variable. However, owing to the nature of plant operations, predictor variables tend to be correlated, often highly so, and this can lead to significant complications in assessing the importance of these variables. Shapley regression is seen as the only axiomatic approach to this problem but has almost exclusively been used with linear models to date. In this paper, the approach is extended to random forests, and the results are compared with some of the empirical variable importance measures widely used with these models, i.e., the permutation and Gini variable importance measures. Four case studies are considered, of which two are based on simulated data and two on real-world data from the mineral process industries. These case studies suggest that the random forest Shapley variable importance measure may be a more reliable indicator of the influence of predictor variables than the other measures considered. Moreover, the results obtained with the Gini variable importance measure were as reliable as or better than those obtained with the permutation measure of the random forest.
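To make the Shapley attribution idea concrete, here is a dependency-free sketch of classical Shapley regression with an OLS model (the paper extends this to random forests; our example data, with two highly correlated predictors, is synthetic). Each predictor's value is its average marginal contribution to R² over all subsets of the other predictors.

```python
from itertools import combinations
from math import factorial
import numpy as np

def r2(X, y, cols):
    """R-squared of an OLS fit of y on the given columns (with intercept)."""
    A = np.column_stack([np.ones(len(y))] + [X[:, j] for j in cols])
    yhat = A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1 - np.sum((y - yhat)**2) / np.sum((y - y.mean())**2)

def shapley_r2(X, y):
    """Exact Shapley decomposition of R^2 over predictors (exponential cost)."""
    p = X.shape[1]
    phi = np.zeros(p)
    for i in range(p):
        others = [j for j in range(p) if j != i]
        for k in range(p):
            for S in combinations(others, k):
                wgt = factorial(k) * factorial(p - k - 1) / factorial(p)
                phi[i] += wgt * (r2(X, y, list(S) + [i]) - r2(X, y, list(S)))
    return phi

rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=500)   # highly correlated with x1
x3 = rng.normal(size=500)
X = np.column_stack([x1, x2, x3])
y = x1 + x3 + 0.1 * rng.normal(size=500)
phi = shapley_r2(X, y)
# the attributions sum to the full-model R^2; correlated x1 and x2 share credit
```

Note how the correlated pair splits the credit for the signal carried by x1, which is precisely the behavior that empirical measures such as permutation importance handle less gracefully.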
Evaluating the importance of nodes in complex networks is of great significance for research on network survivability and robustness. This paper proposes an effective ranking method based on node degree and the importance of lines. It can well identify the importance of bridge nodes with lower computational complexity. Firstly, the properties of the nodes connected by a line are used to compute the importance of that line. Then, the contribution of each node to the importance of its lines is calculated. Finally, the degree of a node and its contribution to the importance of lines are combined to rank node importance. Five real networks are used as test data. The experimental results show that our method can effectively evaluate node importance in complex networks.
•A node importance ranking method (DIL) is proposed based on local information.
•The importance of lines is considered when evaluating the importance of nodes.
•DIL can well identify important nodes, especially bridge nodes.
•DIL can be used in large-scale networks with lower computational complexity.
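A simplified toy variant of the degree-plus-line-importance idea can be sketched as follows. This is not the paper's exact DIL formula; the edge-importance expression and the degree-proportional sharing rule below are our own simplifications for illustration.

```python
from collections import defaultdict

def dil_like_rank(edges):
    """Toy node ranking in the spirit of degree plus line-importance shares.

    Simplified variant (not the paper's exact formula): an edge's importance
    is (k_u - 1) * (k_v - 1); each endpoint receives a share proportional to
    its own degree, added on top of its degree.
    """
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    score = {n: float(k) for n, k in deg.items()}
    for u, v in edges:
        imp = (deg[u] - 1) * (deg[v] - 1)        # line importance
        tot = deg[u] + deg[v] - 2
        if tot > 0:
            score[u] += imp * (deg[u] - 1) / tot  # contribution shares
            score[v] += imp * (deg[v] - 1) / tot
    return sorted(score, key=score.get, reverse=True)

# a "barbell": two triangles joined by the bridge edge (2, 3)
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
ranking = dil_like_rank(edges)
# the two bridge endpoints, nodes 2 and 3, come out on top
```

Even this crude variant promotes the bridge endpoints above the other degree-2 nodes, which pure degree ranking cannot distinguish.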
•Stratified importance sampling and adaptive Kriging are combined for reliability analysis.
•The importance sampling density is constructed through an adaptive Kriging method.
•The variable for stratification is chosen based on the Kriging surrogate to decrease the cost.
•Numerical and practical examples show the efficiency of the proposed method.
In reliability engineering, estimating the failure probability of a system is one of the most challenging tasks. Since many applied engineering models are computationally expensive, it is difficult to estimate failure probabilities at an acceptable computational cost. In this paper, to reduce the computational cost, we combine a stratified importance sampling method with an adaptive Kriging strategy to estimate failure probabilities. Compared to plain importance sampling, stratified importance sampling needs fewer samples to obtain an estimate of the failure probability with the same coefficient of variation. In the proposed method, we improve the importance sampling density and determine the best input variable for stratification through a Kriging-based surrogate modeling technique (akin to Gaussian process regression). The Kriging surrogate is then adaptively improved to obtain an accurate estimate of the failure probability. The efficiency of the proposed method is demonstrated on several analytic examples and then applied to a carbon dioxide storage benchmark problem.
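The core importance sampling step for a failure probability can be sketched in a few lines. The limit-state function, the shift of the sampling density toward the failure region, and all constants below are our toy choices; the stratification and the Kriging construction of the density (the paper's actual contributions) are deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy limit-state function: failure when g(x) < 0, i.e. x1 + x2 > 3
g = lambda x: 3.0 - x[..., 0] - x[..., 1]

# nominal inputs: independent standard normals; IS density: normals shifted
# toward the failure region (a crude stand-in for a Kriging-built density)
mu_is = np.array([1.5, 1.5])
N = 20000
x = rng.normal(size=(N, 2)) + mu_is

log_f = -0.5 * np.sum(x**2, axis=1)             # nominal density (unnormalized)
log_q = -0.5 * np.sum((x - mu_is)**2, axis=1)   # IS density (same normalizer)
w = np.exp(log_f - log_q)
pf = np.mean(w * (g(x) < 0))
# exact value here is 1 - Phi(3 / sqrt(2)) ~ 0.017
```

Because the shifted density places most samples near the failure region, the estimator reaches a small coefficient of variation with far fewer limit-state evaluations than crude Monte Carlo would need for a probability of this magnitude.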
In this paper, we propose a novel node importance evaluation method based on the mutual dependence among nodes. A node's importance comprises its initial importance and the importance contributions from both adjacent and non-adjacent nodes, weighted by the dependence strength between them. Simulation analyses on an example network and the ARPA network show that our method identifies node importance well. Cascading failure experiments on the Netscience and E-mail networks then demonstrate that the networks are more vulnerable when the important nodes identified by our method are continuously removed, which further confirms the accuracy of our method.
•We propose a novel node importance evaluation method that considers multi-layer and uneven node importance contributions.
•Experiments demonstrate the feasibility and validity of our method.
•Cascading failure simulations show that network invulnerability degrades more severely when nodes are removed according to our method.
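The contribution-from-all-nodes idea can be illustrated with a stripped-down stand-in: initial importance is taken as degree, and each other node contributes its degree attenuated by hop distance. The decay factor and this specific contribution model are our simplifications, not the paper's dependence-strength formulation.

```python
from collections import deque

def importance_with_contributions(adj, decay=0.5):
    """Toy evaluation: a node's importance is its degree plus contributions
    from all other nodes, attenuated by hop distance (decay**d).

    Simplified stand-in; the paper's dependence-strength-weighted
    contributions are richer than this sketch.
    """
    def bfs(src):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    deg = {n: len(adj[n]) for n in adj}
    score = {}
    for n in adj:
        d = bfs(n)
        score[n] = deg[n] + sum(deg[m] * decay**d[m] for m in d if m != n)
    return score

# toy path graph 0-1-2-3-4: the central node 2 scores highest,
# even though nodes 1, 2, 3 all have the same degree
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
score = importance_with_contributions(adj)
```

The point of including non-adjacent contributions is visible already here: degree alone cannot separate nodes 1, 2, and 3, but the distance-weighted contributions can.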
•The paper proposes a novel dynamic graph recurrent convolutional neural network model, named Dynamic-GRCNN, to deeply capture spatio-temporal traffic flow features and predict urban passenger traffic flows more accurately.
•The paper presents incidence dynamic graph structures based on historical passenger traffic flows to model relationships between traffic stations. Unlike existing graph relationships derived from the topological structure of the transportation network, the incidence dynamic graph structures model traffic relationships directly from historical passenger flows.
•For real urban passenger traffic flows, the paper demonstrates that dynamic spatial-temporal incidence graphs are better suited to model external changes and influences.
•The paper compares Dynamic-GRCNN with state-of-the-art deep learning approaches on three benchmark datasets containing different types of passenger traffic flows. The results show that Dynamic-GRCNN significantly outperforms all baselines in both effectiveness and efficiency for urban passenger traffic flow prediction.
Accurate and real-time forecasting of traffic passenger flows at transportation hubs, such as subway/bus stations, is of great practical significance for urban traffic planning, control, guidance, etc. Recently, deep learning based methods have shown promise in learning spatial-temporal features from the high non-linearity and complexity of traffic flows. However, it remains very challenging to handle so many complex factors, including the topological structure of the urban transportation network and the laws of traffic flows with spatial and temporal dependencies. Considering both the static hybrid urban transportation network structure and the dynamic spatial-temporal relationships among stations in historical traffic passenger flows, a more effective and fine-grained spatial-temporal feature learning framework is necessary. In this paper, we propose a novel spatial-temporal incidence dynamic graph neural network framework for urban traffic passenger flow prediction. We first model dynamic traffic station relationships over time as spatial-temporal incidence dynamic graph structures based on historical traffic passenger flows. Then we design a novel dynamic graph recurrent convolutional neural network, namely Dynamic-GRCNN, to learn spatial-temporal feature representations of the urban transportation network's topological structure and transportation hubs. To fully utilize the historical passenger flows, we sample short-term, medium-term and long-term historical traffic data in training, which can capture the periodicity and trend of the passenger flows at different stations. We conduct extensive experiments on different types of traffic passenger flow datasets, including subway, taxi and bus flows in Beijing. The results show that the proposed Dynamic-GRCNN effectively captures comprehensive spatial-temporal correlations and significantly outperforms both traditional and deep learning based urban traffic passenger flow prediction methods.
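The short-/medium-/long-term sampling step is simple to sketch in isolation. The helper below is hypothetical (hourly data is assumed, and the window sizes are our choices, not the paper's): it gathers the most recent steps, the same hour on recent days, and the same hour in recent weeks.

```python
import numpy as np

def multi_scale_samples(flows, t, short=3, day=24, week=168):
    """Assemble short-, medium-, and long-term historical windows for time t.

    Hypothetical helper illustrating multi-horizon sampling: recent steps,
    the same hour on the previous 3 days, and the same hour in the previous
    2 weeks (hourly resolution and window sizes are our assumptions).
    """
    recent = flows[t - short:t]           # last few steps (trend)
    daily = flows[t - 3 * day:t:day]      # same hour, last 3 days (daily period)
    weekly = flows[t - 2 * week:t:week]   # same hour, last 2 weeks (weekly period)
    return np.concatenate([recent, daily, weekly])

flows = np.arange(400, dtype=float)  # stand-in hourly passenger counts
x = multi_scale_samples(flows, t=380)
```

Feeding all three horizons to the model is what lets it capture both the short-term trend and the daily/weekly periodicity mentioned in the abstract.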
Component importance measures are relevant to improve system design and to develop optimal replacement policies. Birnbaum's importance measure is one of the most relevant measures. If the components are (stochastically) independent, this measure can be defined using several equivalent expressions. However, in many practical situations, the independence assumption is unrealistic. It also turns out that in the case of dependent components, the different definitions of Birnbaum's measure lead to different concepts. In this paper, we extend Birnbaum's importance measure to the case of dependent components in a way that allows us to obtain relevant properties, including connections and comparisons with other measures proposed and studied recently. The dependence is modeled through copulas, and the new measure is based on the contribution of the component to the system reliability.
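For reference, the classical Birnbaum measure under independence, B_i = P(system works | component i works) − P(system works | component i fails), can be computed by enumeration (the paper's copula-based extension for dependent components is not sketched here; the 2-out-of-3 example is ours):

```python
from itertools import product

def birnbaum(struct, p):
    """Classical Birnbaum importance B_i under independent components:
    B_i = P(phi = 1 | x_i = 1) - P(phi = 1 | x_i = 0).
    """
    n = len(p)
    def reliability(fix):
        # probability the system works with component fix[0] pinned to fix[1]
        r = 0.0
        for x in product((0, 1), repeat=n):
            if x[fix[0]] != fix[1]:
                continue
            pr = 1.0
            for j, xj in enumerate(x):
                if j == fix[0]:
                    continue
                pr *= p[j] if xj else 1 - p[j]
            r += pr * struct(x)
        return r
    return [reliability((i, 1)) - reliability((i, 0)) for i in range(n)]

# 2-out-of-3 system: works when at least 2 components work
struct = lambda x: int(sum(x) >= 2)
B = birnbaum(struct, [0.9, 0.8, 0.7])
# for 2-out-of-3, B_i = p_j + p_k - 2 p_j p_k, so B = [0.38, 0.34, 0.26]
```

Under independence this coincides with the partial derivative of the system reliability with respect to p_i, which is exactly the equivalence that breaks down for dependent components.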
The growth of literature in the field of quality of service in the public transport (PT) sector shows increasing concern for a better understanding of the factors affecting service quality (SQ) in PT organizations and companies. A large variety of approaches to SQ have been developed in recent years owing to the complexity of the concept; the broad range of attributes required to evaluate SQ; and the imprecision, subjectivity, and heterogeneous nature of the data used to analyze it. Most of these approaches are based on customer satisfaction surveys. This paper seeks to summarize the evolution of research and current thinking as it relates to the different methodological approaches for SQ evaluation in the PT sector over the years and to provide a discussion of future directions.