Fault prognosis determines whether a failure is impending and estimates how soon an incident will occur; it is now recognized as a key feature of maintenance strategies. For a slowly time-varying, autocorrelated fault process, the fault degradation process can be revealed for fault prognosis. Based on this assumption, a fault degradation modeling and online fault prognosis strategy is developed in this paper. A stability factor (SF) is defined to evaluate the changing characteristics of the process status, and an SF-based non-steady faulty variable identification method is developed to find critical-to-fault-degradation variables. A fault-degradation-oriented Fisher discriminant analysis is proposed on the selected variables to model the fault evolution process. Uninformative fault effects that do not present degradation are excluded, so that the critical fault degradation information can be focused on. The proposed method is verified by three cases: a numerical case, the cut-made process in cigarette manufacturing, and the well-known Tennessee Eastman benchmark chemical process.
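The fault-degradation-oriented discriminant analysis above builds on classical Fisher discriminant analysis. A minimal two-class sketch of that underlying building block, on synthetic "early" vs. "degraded" stage data (this illustrates only the standard FDA direction, not the authors' SF-based variable selection):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical samples from two degradation stages (3 process variables)
X0 = rng.normal(0.0, 1.0, size=(200, 3))      # early stage
X1 = rng.normal(2.0, 1.0, size=(200, 3))      # degraded stage

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter matrix
Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
# Fisher direction: maximizes between-class over within-class scatter
w = np.linalg.solve(Sw, m1 - m0)
w /= np.linalg.norm(w)

# Projected scores separate the two stages along one discriminant axis
s0, s1 = X0 @ w, X1 @ w
print(s0.mean(), s1.mean())
```

Along this single axis, the degraded-stage scores sit clearly above the early-stage ones, which is the separation the fault-evolution model exploits.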
For industrial processes, there are always some specific faults that are not easily detected by the conventional PCA algorithm, since the monitoring models are defined based on the general distribution information of normal data, which may not highlight the abnormal changes. For these specific faults, if fault data are available and used for model development, more meaningful directions may be extracted for monitoring, which can improve fault detection sensitivity. In the present work, a fault-relevant principal component analysis (FPCA) algorithm is proposed for statistical modeling and process monitoring using both normal and fault data. The key is how to extract and supervise the fault-influential data distribution directions. By analyzing the relative changes from normal to fault with the available fault data, the new model structure further decomposes the original PCA systematic subspace and residual subspace into two parts each. The part that presents larger variation relative to the normal case under the disturbance of the fault is regarded as more informative for fault detection (called the fault-relevant part here). It is then separated from the fault-irrelevant part and highlighted for online monitoring, which is deemed more effective for fault detection. The proposed method provides a detailed insight into the decomposition of the original normal process information from the fault-relevant perspective. Its sensitivity to fault detection is illustrated by data from a numerical example and the Tennessee Eastman process.
•A fault-relevant principal component analysis (FPCA) algorithm is proposed.
•Fault effects are highlighted by analyzing relative changes from normal to fault.
•Fault-relevant monitoring directions are separated from the fault-irrelevant ones.
•Different types of variations are decomposed for online monitoring separately.
•Its sensitivity to fault detection is illustrated by numerical and simulator data.
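FPCA extends the conventional PCA monitoring scheme built on the Hotelling T² and squared prediction error (SPE) statistics. As background, a minimal sketch of that baseline scheme on synthetic data (the fault-relevant subspace decomposition itself is not reproduced here; all data and the sensor-bias fault are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Normal operating data: 2 latent factors drive variables 1-4;
# variable 5 is near-independent sensor noise
T = rng.normal(size=(500, 2))
P_true = rng.normal(size=(2, 4))
X = np.column_stack([T @ P_true + 0.1 * rng.normal(size=(500, 4)),
                     0.1 * rng.normal(size=500)])

# Build the PCA model from mean-centred normal data
mu = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2
P = Vt[:k].T                           # retained loadings
lam = (S[:k] ** 2) / (len(X) - 1)      # variances of the retained scores

def t2_spe(x):
    """Hotelling T^2 and squared prediction error (SPE) for one sample."""
    t = (x - mu) @ P
    resid = (x - mu) - t @ P.T
    return float(np.sum(t ** 2 / lam)), float(resid @ resid)

t2_n, spe_n = t2_spe(X[0])             # a normal sample
fault = X[0].copy()
fault[4] += 3.0                        # bias fault on the noise-only sensor
t2_f, spe_f = t2_spe(fault)
print(t2_n, spe_n, t2_f, spe_f)
```

The bias fault lands almost entirely in the residual subspace, so SPE flags it while T² barely moves; FPCA's point is to re-split both subspaces so that such fault-influential directions are monitored explicitly.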
Complex industrial processes may be formulated with hybrid correlations, meaning that linear and nonlinear relationships simultaneously exist among process variables, which brings great challenges for process monitoring. However, previous work did not consider hybrid correlations and treated all variables as a single subject, in which a single linear or nonlinear analysis method was employed based on prior process knowledge or some evaluation results, which may degrade model accuracy and monitoring performance. Therefore, for complex processes with hybrid correlations, this paper proposes a hierarchical modeling and monitoring method based on linearity evaluation and variable subset partition. First, linear variable subsets are separated from nonlinear subsets through an iterative variable correlation evaluation procedure. Second, hierarchical models are developed to capture linear patterns and nonlinear patterns at different levels. Third, a hierarchical monitoring strategy is proposed to monitor the linear features and nonlinear features separately. By separating and modeling different types of variable correlations, the proposed method can explore more accurate process characteristics and thus improve the fault detection ability. Numerical examples and industrial applications are presented to illustrate its efficiency.
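The first step, separating linear from nonlinear variable subsets, can be sketched with a simple single-pass linearity score: the R² of predicting each variable linearly from the others (the paper's actual procedure is iterative; this one-shot version, the 0.9 threshold, and the synthetic variables are all simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
z = rng.normal(size=n)
x1 = z + 0.05 * rng.normal(size=n)               # linear in z
x2 = 2 * z + 0.05 * rng.normal(size=n)           # linear in z
x3 = np.sin(3 * z) + 0.05 * rng.normal(size=n)   # nonlinear in z
X = np.column_stack([x1, x2, x3])

def linear_r2(X, j):
    """R^2 of predicting variable j linearly from the remaining variables."""
    y = X[:, j]
    A = np.column_stack([np.delete(X, j, axis=1), np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

scores = [linear_r2(X, j) for j in range(X.shape[1])]
linear_subset = [j for j, r2 in enumerate(scores) if r2 > 0.9]
nonlinear_subset = [j for j, r2 in enumerate(scores) if r2 <= 0.9]
print(linear_subset, nonlinear_subset)
```

Variables 1 and 2 are mutually well explained by a linear model and fall into the linear subset; the sinusoidal variable is not and would be handed to a nonlinear model in the hierarchy.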
Competition and the demand for consistent, high-quality products have spurred the development of quality prediction methods for industrial manufacturing processes. Multiplicity of phases is, in general, a common characteristic of many batch manufacturing processes. Considering that different phases may have different effects on quality, one of the key issues is how to partition the whole batch process into multiple phases. In the present work, an automatic quality-relevant step-wise sequential phase partition (QSSPP) algorithm is developed for phase-based regression modeling and quality prediction. It considers the time sequence of operation phases and can capture the time-varying quality prediction relationships. Using this algorithm, phases are separated in order from the quality-relevant perspective, revealing different quality prediction relationships. The phase-based regression system is set up for online quality prediction, and the online prediction results are quantitatively evaluated for each phase. The feasibility and performance of the proposed algorithm are illustrated by an important manufacturing process, injection molding.
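The core idea, cutting the batch where the quality prediction relationship changes, can be illustrated with a single change-point toy problem: choose the cut that minimizes the combined fitting error of two phase-wise linear models (QSSPP itself is step-wise and sequential over many phases; this one-cut version and the synthetic gain change are simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
# Quality y depends on x with a different gain in each operating phase
y = np.where(np.arange(n) < 120, 1.0 * x, 3.0 * x) + 0.05 * rng.normal(size=n)

def sse_linear(xs, ys):
    """Sum of squared errors of a linear fit y = a*x + b on one phase."""
    A = np.column_stack([xs, np.ones(len(xs))])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    r = ys - A @ coef
    return float(r @ r)

# Pick the cut minimizing the total SSE of the two phase models
cuts = range(20, n - 20)
best = min(cuts, key=lambda c: sse_linear(x[:c], y[:c]) + sse_linear(x[c:], y[c:]))
print(best)
```

The recovered cut sits at the true phase boundary (sample 120), because any misplacement forces one regression model to fit two different quality relationships at once.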
Long-term monitoring of the fetal heart is vital for early clinical diagnosis and timely treatment of the fetus. However, the signals collected from the abdomen of the pregnant woman are always corrupted by many interfering sources, most significantly the maternal electrocardiogram (ECG). Similar to blind source separation (BSS), fetal ECG extraction can be considered the separation of the fetal ECG from the recordings. This letter proposes a novel multi-layer polynomial network to extract the fetal ECG waveform, in which the nonlinear mixing process is approximated with multiple polynomials. The network starts with a first-order polynomial layer, and further reduction of the error between the output and the target signal is achieved by increasing the number of layers. The proposed approach turns the inverse transformation of traditional BSS into a model learning problem, which is then solved by a simple convex optimization algorithm. Experimental results show that the proposed method performs better than state-of-the-art BSS methods.
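The layered refinement idea, starting from a first-order fit and adding higher-order polynomial layers to reduce the error against a target, can be loosely sketched as follows (the signals, the tanh mixing, and the two-layer schedule are all invented stand-ins, not the letter's network or its convex solver):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
target = np.sin(np.linspace(0, 20, n))        # stand-in "fetal" source
interf = rng.normal(size=n)                   # stand-in "maternal" interference
obs = np.tanh(0.8 * target + 0.6 * interf)    # hypothetical nonlinear mixture

def poly_features(sigs, degree):
    """Stack polynomial features of the given signals up to `degree`."""
    cols = [np.ones(len(sigs[0]))]
    for s in sigs:
        for d in range(1, degree + 1):
            cols.append(s ** d)
    return np.column_stack(cols)

# Layer 1: first-order (linear) fit; Layer 2: refit with higher-order terms,
# mimicking "add layers until the target error stops improving"
est = np.zeros(n)
errors = []
for degree in (1, 3):
    A = poly_features([obs, est], degree)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    est = A @ coef
    errors.append(float(np.mean((est - target) ** 2)))
print(errors)
```

Each added layer can only enlarge the feature space of the previous one, so the target error is non-increasing across layers, which mirrors the letter's stopping criterion.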
This article studies an efficient nonlinear model-predictive control (NMPC) scheme for trajectory tracking control of a quadrotor unmanned aerial vehicle (UAV). By augmenting the desired trajectory into a reference dynamical system, we make the tracking task fit the standard NMPC framework. To alleviate the heavy computational burden caused by solving the corresponding NMPC optimization problem online, we develop an improved continuation/generalized minimal residual (iC/GMRES) algorithm. Compared with the standard C/GMRES method, the inequality constraint is relaxed by imposing a penalty term on the cost function. To guarantee closed-loop stability, we introduce a contraction constraint. Based on the proposed numerical algorithm and the stability constraint, we develop a novel efficient-NMPC algorithm that achieves acceptable control performance with reduced computational complexity. The numerical convergence of iC/GMRES solutions and the closed-loop stability of efficient-NMPC are theoretically analyzed in the presence of the input constraint. Finally, numerical simulations, software-in-the-loop (SIL) simulations, and a real-time experiment demonstrate the effectiveness of the proposed iC/GMRES algorithm and the efficient-NMPC scheme.
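The constraint relaxation step, replacing a hard input inequality with a penalty term on the cost, can be shown with a one-dimensional toy problem (the cost, the bound, and the penalty weight are invented; a grid search stands in for the continuation/GMRES solver):

```python
import numpy as np

u_max = 1.0        # hypothetical input bound |u| <= u_max
rho = 100.0        # penalty weight (tuning assumption)

def cost(u):
    track = (u - 2.0) ** 2                  # unconstrained optimum sits at u = 2
    viol = max(0.0, abs(u) - u_max)         # amount of constraint violation
    return track + rho * viol ** 2          # penalized (relaxed) cost

# Crude 1-D search in place of the GMRES-based continuation update
grid = np.linspace(-3, 3, 6001)
u_star = float(grid[np.argmin([cost(u) for u in grid])])
print(u_star)
```

The penalized minimizer lands just above the bound (about 1.01 here): the relaxation makes the problem smooth and solver-friendly at the price of a small, weight-controlled constraint violation, which is the trade-off the iC/GMRES formulation manages.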
The development of the Internet of Things, cloud computing, and artificial intelligence has given birth to industrial artificial intelligence (IAI) technology, which enables fine perception and in-depth understanding of the operating conditions of industrial processes and promotes the intelligent transformation of modern industrial production. At the same time, modern industry faces diversified market demand instead of ultra-large-scale demand, resulting in typical variable operating conditions; this enhances the nonstationary characteristics of modern industry and brings great challenges to the monitoring of industrial processes. In this regard, this paper analyzes the complex characteristics of nonstationary industrial operation, reveals their effects on operating-condition monitoring, and summarizes the difficulties faced by varying-condition monitoring. Furthermore, by reviewing the past 30 years of development of data-driven methods for industrial process monitoring, we sort out the evolution of nonstationary monitoring methods and analyze the features, advantages, and disadvantages of the methods at different stages. In addition, by summarizing the existing related research methods by category, we hope to provide a reference for monitoring methods for nonstationary processes. Finally, combined with the development trend of industrial artificial intelligence technologies, some promising research directions are given in the field of nonstationary process monitoring.
In this paper, the use of multiple variable spaces is proposed for monitoring modern industrial processes, where data for a large number of process variables may be collected from different sources to reveal different characteristics. The easiest way to model a process is to treat all variables in a single data space, but then the information inherent in different types of variables is mixed together and there is no local view of each variable space. An extended algorithm based on the concept of total projection to latent structures, which we call multispace T-PLS (MsT-PLS), is thus developed to treat variables in multiple data spaces. The multiple variable spaces separated from the measurement space are composed of different sets of process variables measured at the same time and responsible for the same response data. Using the proposed algorithm, the relationships among multiple variable spaces are studied under the supervision of quality characteristics. Thus, a comprehensive information decomposition is obtained in each variable space, which can be separated into four systematic subspaces, in response to the cross-space common and specific process variability, and one final residual subspace. The theoretical support for MsT-PLS is analyzed in detail and its statistical characteristics are compared with those of the single-space T-PLS (SsT-PLS) algorithm. A process monitoring strategy is developed based on the MsT-PLS subspace decomposition result and applied to the Tennessee Eastman process for illustration purposes.
•Enriched hydrogenotrophic methanogens in the inoculum accounted for the raised CH4.
•The highest methane yield was 411 mL/g-VS with a substrate degradation of 91.2%.
•The optimal carbohydrate/protein/cellulose ratio for CH4 generation was 50:45:5.
•The modified Gompertz model can predict the biomethanation of carbohydrate-rich waste.
•Anaerobic granular sludge inoculum was fit for carbohydrate-rich waste digestion.
This study first evaluated the microbial role of acclimated anaerobic granular sludge (AGS) and waste activated sludge (WAS), chosen as microbial and nutritional regulators, in improving the biomethanation of fruit and vegetable wastes (FVW). Results showed that the hydrogenotrophic methanogens, Firmicutes, and Spirochaeta enriched in the AGS were responsible for the enhanced methane yield. A synthetic waste representing the mixture of WAS and FVW was then used to investigate the influence of different substrate compositions on methane generation. The optimal mass ratio of carbohydrate/protein/cellulose was observed to be 50:45:5, with a corresponding methane yield of 411 mL/g-VS added. Methane kinetic studies suggested that the modified Gompertz model fitted carbohydrate-predominated substrates better than protein-predominated ones. Parameter results indicated that the maximum methane yield and production rate first increased and then decreased with decreasing carbohydrate and increasing protein percentages; the lag-phase time, however, increased continuously.
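The modified Gompertz model used in the kinetic studies relates cumulative methane yield to the ultimate yield P, the maximum production rate Rm, and the lag-phase time λ. A minimal fitting sketch on synthetic batch data generated around the reported 411 mL/g-VS optimum (the noise level, time horizon, and grid-search fitting are assumptions; a nonlinear least-squares solver would normally be used):

```python
import numpy as np

# Modified Gompertz model for cumulative methane yield M(t), mL/g-VS:
#   M(t) = P * exp(-exp(Rm * e / P * (lam - t) + 1))
def gompertz(t, P, Rm, lam):
    return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

t = np.linspace(0, 30, 61)                     # digestion time, days
rng = np.random.default_rng(5)
# Synthetic batch test around the reported optimum yield
y = gompertz(t, 411.0, 40.0, 2.0) + rng.normal(0, 2.0, t.size)

# Coarse grid search over (P, Rm, lam) in place of a nonlinear LS solver
Ps = np.linspace(380, 440, 31)
Rms = np.linspace(30, 50, 21)
lams = np.linspace(0, 5, 26)
best = min(((P, Rm, lam) for P in Ps for Rm in Rms for lam in lams),
           key=lambda p: float(np.sum((gompertz(t, *p) - y) ** 2)))
print(best)
```

The recovered parameters land near the generating values, illustrating how the maximum yield, production rate, and lag-phase time reported in the study are read off a fitted Gompertz curve.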
This paper is motivated by the filtering estimation problem for a class of nonlinear stochastic systems in which the measurements are randomly delayed by one sampling time. By presenting a Gaussian approximation of the one-step posterior predictive probability density functions (PDFs) of the state and the delayed measurement, a novel Gaussian approximation (GA) filter is derived, which operates recursively through analytical computation and Gaussian weighted integrals. The proposed GA filter gives a general and common framework since: (1) it is applicable to both linear and nonlinear systems; (2) by setting the delay probability to zero, it automatically reduces to the standard Gaussian filter without randomly delayed measurements; and (3) many variations of the proposed GA filter can be developed by utilizing different numerical techniques for computing the Gaussian weighted integrals, including the previously existing EKF and UKF methods, as well as the improved cubature Kalman filter (CKF) in our paper using the spherical-radial cubature rule. The performance of the new method is demonstrated with a simulation example of high-dimensional GPS/INS integrated navigation.
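The one-step random delay means the filter receives z_k = (1 - g_k) y_k + g_k y_{k-1}, with g_k a Bernoulli variable with the delay probability. A scalar sketch of that measurement model, with a plain Kalman filter run on the delayed stream (the random-walk system, noise levels, and delay probability are invented; the plain filter ignores the delay, which is exactly the mismatch the GA filter is designed to handle):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p_delay = 300, 0.3              # delay probability (the Bernoulli parameter)
# Scalar random-walk state with noisy measurements
x = np.cumsum(rng.normal(0, 0.1, n))
y = x + rng.normal(0, 0.5, n)

# One-step randomly delayed measurements: z_k = (1-g_k)*y_k + g_k*y_{k-1}
g = rng.random(n) < p_delay
z = y.copy()
z[1:][g[1:]] = y[:-1][g[1:]]

# Plain scalar Kalman filter applied to the delayed stream
xh, P, q, r = 0.0, 1.0, 0.01, 0.25
est = []
for zk in z:
    P += q                         # predict
    K = P / (P + r)                # Kalman gain
    xh += K * (zk - xh)            # update
    P *= (1 - K)
    est.append(xh)
rmse = float(np.sqrt(np.mean((np.array(est) - x) ** 2)))
print(rmse)
```

Even the delay-unaware filter beats the raw measurement noise here because the state moves slowly; the GA filter's contribution is to account for the delay probability explicitly in the predictive PDFs, which matters as dynamics and delay rates grow.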