Rotor systems are important parts of rotating machinery, and real-time health monitoring of rotor systems is essential for safe operation. Data-driven modeling based on sensor data is currently the focus of health monitoring, as it addresses the limitation that traditional physics-based modeling cannot be applied to complex mechanical equipment. Parameter uncertainty in data-driven modeling is inevitable, and analyzing and utilizing this uncertainty is critical to improving the adaptability of data-driven monitoring methods. This paper focuses on the uncertainty quantification of fault features under uncertainty in the data-driven model parameters. Using Monte Carlo simulation combined with Latin Hypercube Sampling, the mechanism by which system performance degradation influences the quantification of feature uncertainty is analyzed. Finally, a Jeffcott rotor test rig with a rub-impact device was built, and the feature uncertainty caused by parameter uncertainty in the NARX (Nonlinear AutoRegressive with eXogenous input) model is explained. The experimental results show that the reliability of the proposed uncertainty-based fault features is significantly better than that of traditional signal-based features, which markedly improves the adaptability of the data-driven method for health assessment of rotor systems.
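The sampling scheme named in this abstract can be sketched in a few lines: a minimal pure-Python Latin Hypercube Sampler feeding a Monte Carlo propagation of parameter uncertainty through a toy feature function. The feature function, parameter ranges, and sample count below are illustrative assumptions, not the paper's actual setup.

```python
import random
import statistics

def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
    """Latin Hypercube Sampling on the unit hypercube: each dimension is
    split into n_samples equal strata and exactly one point is drawn from
    each stratum, then the strata are shuffled independently per dimension."""
    columns = []
    for _ in range(n_dims):
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        columns.append(strata)
    # transpose so each row is one sample point
    return list(zip(*columns))

# Hypothetical "fault feature": a nonlinear function of two uncertain
# model parameters mapped from [0, 1] onto assumed ranges.
def feature(u1, u2):
    a = 0.8 + 0.4 * u1   # assumed uncertain coefficient in [0.8, 1.2]
    b = -0.1 + 0.2 * u2  # assumed uncertain offset in [-0.1, 0.1]
    return a ** 2 + b

points = latin_hypercube(1000, 2)
values = [feature(u1, u2) for u1, u2 in points]
mean = statistics.fmean(values)   # Monte Carlo estimate of the feature mean
std = statistics.stdev(values)    # spread induced by parameter uncertainty
```

Compared with plain Monte Carlo, the stratification guarantees that every marginal stratum is sampled exactly once, which typically reduces the variance of the estimated feature statistics for the same sample budget.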
Tool condition monitoring (TCM) plays a vital role in maintaining product quality and improving productivity in advanced manufacturing. However, complex machining environments often limit the monitoring accuracy of conventional monitoring systems. In the present study, a new diagnostic framework is proposed for TCM during machining using a novel regularization-based sensor data modelling and model frequency analysis. For the first time, the physical information of the underlying machining process is incorporated into the modelling procedure for the design of the associated regularization parameter. This ensures that significant underlying physics can be taken into account during the modelling so as to enhance the TCM performance. This idea is referred to as tool condition monitoring-oriented regularization (TCMoR). After a model has been identified from TCMoR-based sensor data modelling, the frequency domain properties of the model are extracted to reveal unique and physically meaningful features of the underlying machining process for the TCM purpose. The effectiveness of the proposed diagnostic framework is validated by extensive in-situ experimental studies under both variable and controlled tool-workpiece engagement conditions, demonstrating its advantages over conventional TCM methods and its potential applications in industry.
Deep learning techniques for fluid flow modelling have gained significant attention in recent years. Advanced deep learning techniques have achieved great progress in rapidly predicting fluid flows without prior knowledge of the underlying physical relationships. However, most existing research has focused mainly on either sequence learning or spatial learning, and rarely on both the spatial and temporal dynamics of fluid flows (Reichstein et al., 2019). In this work, an Artificial Intelligence (AI) fluid model based on a general deep convolutional generative adversarial network (DCGAN) has been developed for predicting spatio-temporal flow distributions. In deep convolutional networks, high-dimensional flows can be converted into low-dimensional “latent” representations, and the complex features of flow dynamics can be captured by the adversarial networks. The DCGAN fluid model provides reasonable predictive accuracy of flow fields while maintaining high computational efficiency. The performance of the DCGAN is illustrated for two test cases of a Hokkaido tsunami with different incoming waves along the coastline. It is demonstrated that the results from the DCGAN are comparable with those from the original high-fidelity model (Fluidity). The spatio-temporal flow features are well represented as the flow evolves; in particular, the wave phases and flow peaks can be captured accurately. In addition, the results show that the online CPU cost is reduced by five orders of magnitude compared with the original high-fidelity model simulations. These promising results show that the DCGAN can provide rapid and reliable spatio-temporal prediction for nonlinear fluid flows.
•A deep convolutional GAN (DCGAN) is developed for large data-driven fluid modelling.
•First use of DCGANs for predicting spatio-temporal nonlinear fluid flows.
•Predictive results from the DCGAN and the high-fidelity model are in good agreement.
•Using the DCGAN, the computational cost is reduced by five orders of magnitude.
•The DCGAN is a robust and efficient tool for predictive modelling of fluid flows.
•New convolutional model achieves state-of-the-art results on the ETH and TrajNet datasets.
•Random rotations and Gaussian noise are the best data augmentation techniques.
•Coordinates with the origin at the last observation point better represent the trajectory.
Predicting the future trajectories of pedestrians is a challenging problem with a range of applications, from crowd surveillance to autonomous driving. In the literature, methods for pedestrian trajectory prediction have evolved, transitioning from physics-based models to data-driven models based on recurrent neural networks. In this work, we propose a new approach to pedestrian trajectory prediction, with the introduction of a novel 2D convolutional model. This new model outperforms recurrent models, and it achieves state-of-the-art results on the ETH and TrajNet datasets. We also present an effective system to represent pedestrian positions and powerful data augmentation techniques, such as the addition of Gaussian noise and the use of random rotations, which can be applied to any model. As an additional exploratory analysis, we present experimental results on the inclusion of occupancy methods to model social information, which empirically show that these methods are ineffective in capturing social interaction.
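The position representation and augmentation techniques highlighted above can be sketched as follows; the sample trajectory, noise level, and random seed are illustrative assumptions, not values from the paper.

```python
import math
import random

def recenter(traj):
    """Express coordinates relative to the last observed point,
    i.e. the origin-at-last-observation representation."""
    x0, y0 = traj[-1]
    return [(x - x0, y - y0) for x, y in traj]

def augment(traj, rng=random.Random(0), noise_std=0.05):
    """Random-rotation plus Gaussian-noise augmentation of one trajectory."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    c, s = math.cos(theta), math.sin(theta)
    out = []
    for x, y in traj:
        xr, yr = c * x - s * y, s * x + c * y            # rotate about origin
        out.append((xr + rng.gauss(0.0, noise_std),
                    yr + rng.gauss(0.0, noise_std)))     # perturb with noise
    return out

# Assumed observed positions (metres); last observation becomes the origin.
obs = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.3)]
rel = recenter(obs)   # rel[-1] == (0.0, 0.0)
aug = augment(rel)    # one augmented copy of the recentred trajectory
```

Because rotation is an isometry about the new origin, the augmented trajectories keep their shape and scale while exposing the model to all headings, which is what makes this pair of augmentations model-agnostic.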
Machine learning algorithms (MLAs) such as artificial neural networks (ANNs), regression trees (RTs), random forest (RF) and support vector machines (SVMs) are powerful data-driven methods that are relatively less widely used in the mapping of mineral prospectivity, and thus have not been thoroughly evaluated against one another in this field.
The performances of a series of MLAs, namely artificial neural networks (ANNs), regression trees (RTs), random forest (RF) and support vector machines (SVMs), in mineral prospectivity modelling are compared based on the following criteria: i) accuracy in the delineation of prospective areas; ii) sensitivity to the estimation of hyper-parameters; iii) sensitivity to the size of the training data; and iv) interpretability of model parameters. The results of applying these algorithms to epithermal Au prospectivity mapping of the Rodalquilar district, Spain, indicate that RF outperformed the other MLAs (ANNs, RTs and SVMs). The RF algorithm showed higher stability and robustness with varying training parameters, and better success rates and ROC analysis results. On the other hand, all of the MLAs can be used when ore deposit evidence is scarce. Moreover, the model parameters of RF and RT can be interpreted to gain insights into the geological controls on mineralization.
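The ROC analysis used to rank the MLAs reduces to a simple rank statistic: the area under the ROC curve equals the probability that a randomly chosen positive (prospective) cell is scored above a randomly chosen negative one. A minimal sketch, with invented labels and scores for illustration:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U formulation: the fraction of
    positive/negative pairs ranked correctly, ties counted as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical prospectivity scores for 2 deposit cells and 2 barren cells:
auc = roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.2])  # perfect ranking -> 1.0
```

An AUC of 0.5 corresponds to random ranking, so comparing each MLA's AUC (and its spread over repeated training runs) captures both the "success rate" and the "stability" criteria used above.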
In the current review, a comprehensive view of multi-scale integrated computational modelling and data-driven methods in the additive manufacturing (AM) of metallic materials, in the framework of integrated computational materials engineering (ICME), is discussed. In the first part of the review, process simulation (P-S linkage), structure modelling (S-P linkage), property simulation (S-P linkage), and integrated modelling (PSP and PSPP linkages) are elaborated considering different physical phenomena (multi-physics) in AM at micro/meso/macro scales (multi-scale modelling). The second part provides an extensive discussion of a data-driven framework, which involves extracting existing data from databases and texts, data pre-processing, high-throughput screening, and, finally, database construction. A data-driven workflow that integrates statistical methods, including machine learning (ML), artificial intelligence (AI), and neural network (NN) models, has great potential for completing PSPP linkages. This review provides insights for both academic and industrial researchers working on the AM of metallic materials.
Linearising the dynamics of nonlinear mechanical systems is an important and open research area. A common approach is feedback linearisation, a nonlinear control method that transforms the input–output response of a nonlinear system into an equivalent linear one. The main problem with feedback linearisation is that it requires an accurate first-principles model of the system, which is typically hard to obtain. In this paper, we design an alternative control approach that exploits data-driven models to linearise the input–output response of nonlinear mechanical systems. Specifically, a model-based reference tracking architecture is developed for nonlinear feedback systems with output nonlinearities. The overall methodology shows a high degree of performance combined with significant robustness against imperfect modelling and extrapolation. These findings are demonstrated using a large set of synthetic experiments conducted on an asymmetric Duffing oscillator and using an experimental prototype of a high-precision motion system.
•Novel control method that linearises input–output dynamics of mechanical systems.
•Nonlinear data-driven state-space models are used; linear part is preserved.
•Nonlinear distortion analysis quantifies efficacy over frequency range of interest.
•Simulations and experiments show high performance and robustness properties.
Nonlinear stochastic modelling with Langevin regression
Callaham, J. L.; Loiseau, J.-C.; Rigas, G.
Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 06/2021, Volume 477, Issue 2250.
Journal article. Peer-reviewed. Open access.
Many physical systems characterized by nonlinear multiscale interactions can be modelled by treating unresolved degrees of freedom as random fluctuations. However, even when the microscopic governing equations and qualitative macroscopic behaviour are known, it is often difficult to derive a stochastic model that is consistent with observations. This is especially true for systems such as turbulence where the perturbations do not behave like Gaussian white noise, introducing non-Markovian behaviour to the dynamics. We address these challenges with a framework for identifying interpretable stochastic nonlinear dynamics from experimental data, using forward and adjoint Fokker-Planck equations to enforce statistical consistency. If the form of the Langevin equation is unknown, a simple sparsifying procedure can provide an appropriate functional form. We demonstrate that this method can learn stochastic models in two artificial examples: recovering a nonlinear Langevin equation forced by coloured noise and approximating the second-order dynamics of a particle in a double-well potential with the corresponding first-order bifurcation normal form. Finally, we apply Langevin regression to experimental measurements of a turbulent bluff body wake and show that the statistical behaviour of the centre of pressure can be described by the dynamics of the corresponding laminar flow driven by nonlinear state-dependent noise.
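The double-well example mentioned above can be illustrated with a short Euler-Maruyama simulation of a first-order Langevin equation. The drift, constant noise amplitude, and step sizes below are illustrative assumptions for generating such data; the paper's method works in the opposite direction, identifying the drift and diffusion terms from observations.

```python
import math
import random

def simulate_langevin(drift, sigma, x0=0.0, dt=1e-3, n_steps=200_000,
                      rng=random.Random(1)):
    """Euler-Maruyama integration of dx = drift(x) dt + sigma(x) dW."""
    x = x0
    path = [x]
    sqdt = math.sqrt(dt)  # dW over one step has std sqrt(dt)
    for _ in range(n_steps):
        x = x + drift(x) * dt + sigma(x) * sqdt * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Double-well (pitchfork) normal form: dx = (x - x^3) dt + sigma dW,
# with stable fixed points near x = +1 and x = -1.
path = simulate_langevin(lambda x: x - x ** 3, lambda x: 0.5)
```

A Langevin-regression-style fit would take a trajectory like `path`, propose a sparse polynomial drift, and tune its coefficients until the model's forward Fokker-Planck solution matches the empirical statistics of the data.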
•Popularly used ML-based AD models are ANN, SVM, RF, and XGBOOST.
•Predicted variables are biogas yield, process stability, and effluent characteristics.
•Global and local model-agnostic explainability approaches are reviewed.
•Potential applications are process parameter optimization, fault detection, and LCA.
•It is necessary to inform ML models with biokinetic equations to improve accuracy.
Anaerobic digestion (AD) is a promising technology for recovering value-added resources from organic waste, thus achieving sustainable waste management. The performance of AD is dictated by a variety of factors, including system design and operating conditions. This necessitates developing suitable modelling and optimization tools to quantify its off-design performance, where the application of machine learning (ML) and soft computing approaches has received increasing attention. Here, we succinctly review the latest progress in black-box ML approaches for AD modelling, with an emphasis on global and local model interpretability metrics (e.g., Shapley values, partial dependence analysis, permutation feature importance). Categorical applications of the ML and soft computing approaches, such as what-if scenario analysis, fault detection in AD systems, long-term operation prediction, and integration of ML with life cycle assessment, are discussed. Finally, the research gaps and scope for future work are summarized.
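One of the interpretability metrics named above, permutation feature importance, is fully model-agnostic and can be sketched in a few lines: the importance of a feature is the drop in a performance metric when that feature's column is shuffled. The surrogate model, features, and metric below are invented for illustration, not taken from any reviewed study.

```python
import random

def permutation_importance(predict, X, y, metric, rng=random.Random(0)):
    """Model-agnostic permutation importance: for each feature, the
    decrease in metric(y, predictions) after shuffling that column."""
    base = metric(y, [predict(row) for row in X])
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        rng.shuffle(col)                                   # break the feature-target link
        Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        perm = metric(y, [predict(row) for row in Xp])
        importances.append(base - perm)                    # positive = feature matters
    return importances

# Hypothetical surrogate: "biogas yield" depends only on feature 0.
predict = lambda row: 2.0 * row[0]
X = [[float(i), float(i % 3)] for i in range(50)]
y = [2.0 * row[0] for row in X]

def neg_mse(y_true, y_pred):  # higher is better, so drops are positive
    return -sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

imp = permutation_importance(predict, X, y, neg_mse)
# imp[0] is large and positive; imp[1] is zero, since feature 1 is unused.
```

Because it only needs a `predict` callable, the same routine applies unchanged to an ANN, SVM, RF, or XGBOOST model of an AD process, which is what makes it a convenient global explainability tool in this setting.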