Two novel origami-inspired metamaterials were designed, mechanically tested, and modelled. One was folded using a triangular-based crease pattern and the other using a rectangular-based crease pattern. The origami-inspired metamaterial sheets were fabricated from polylactic acid using fused deposition additive manufacturing. Several configurations, parameterized by fold angle, were mechanically tested under compression and impact loads. The specific elastic compression modulus of these novel designs, ranging from 594 MPa/kg to 926 MPa/kg, was higher than that of existing origami-inspired structures based on the popular Ron Resch design, which had specific elastic compression moduli between 15 MPa/kg and 365 MPa/kg. A finite element model further analysed the stress distribution of the core structures under compression loads. The impact testing results showed that the pattern of the tessellated cores affected the amount of impact force transferred through the samples, whereas the fold angle of the origami-inspired design had little effect on the results. The rectangular structure transferred approximately 50–75% of the force transferred by the triangular structure under impact loads.
•Two novel, origami-inspired, tessellated patterns were designed for use as lightweight cores in sandwich structures.
•Compression tests showed that increasing fold angle improved the metamaterials' resistance to compression loads.
•The elastic compression moduli of both new designs were higher than those of existing Miura-ori and Ron Resch designs.
•The ability to absorb impact force was dependent on the tessellated design pattern and independent of the origami fold angle.
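The mass-normalized stiffness metric used in the comparison above (specific elastic compression modulus, in MPa/kg) can be sketched as follows. The sample modulus and mass values here are purely hypothetical; only the reported MPa/kg ranges come from the abstract.

```python
# Specific elastic compression modulus: elastic modulus normalized by sample mass.
# The two hypothetical samples below are chosen only so that each falls inside the
# range reported in the abstract for its design family.

def specific_modulus(modulus_mpa, mass_kg):
    """Specific elastic compression modulus in MPa/kg."""
    return modulus_mpa / mass_kg

novel = specific_modulus(18.5, 0.025)      # hypothetical novel-design sample
ron_resch = specific_modulus(9.0, 0.030)   # hypothetical Ron Resch sample

print(f"Novel design: {novel:.0f} MPa/kg; Ron Resch: {ron_resch:.0f} MPa/kg")
```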
Spatial modelling of storm dust provenance is essential to mitigate its on-site and off-site effects in the arid and semi-arid environments of the world. Therefore, the main aim of this study was to apply eight data mining algorithms, including random forest (RF), support vector machine (SVM), Bayesian additive regression trees (BART), radial basis function (RBF), extreme gradient boosting (XGBoost), regression tree analysis (RTA), the Cubist model and boosted regression trees (BRT), together with an ensemble modelling (EM) approach, for generating spatial maps of dust provenance in the Khuzestan province, a main region with active dust-producing sources in southwestern Iran. This study is the first attempt at predicting storm dust provenance by applying individual data mining models and ensemble modelling. We identified and mapped, in a geographic information system (GIS), 12 potentially effective factors for dust emissions: two climate factors (wind speed, precipitation), five soil characteristics (texture, bulk density, electrical conductivity (EC), organic matter (OM), available water capacity (AWC)), the normalized difference vegetation index (NDVI), land use, geology, a digital elevation model (DEM) and land type; a mean decrease accuracy measure (MDAM) was used to determine the corresponding importance scores (IS). A multicollinearity test (including the variance inflation factor (VIF) and tolerance coefficient (TC)) was applied to assess relationships between the effective factors, and an existing map of dust provenance was randomly split into training (70%) and validation (30%) data. The individual data mining models were validated using the area under the curve (AUC). Based on the TC and VIF results, no collinearity was detected among the 12 effective factors for dust emissions.
The prediction accuracies of the eight data mining models and an EM assessed by the AUC were as follows: EM (with AUC = 99.8%) > XGBoost > RBF > Cubist > RF > BART > SVM > BRT > RTA (with AUC = 79.1%). Among all models, the EM was found to provide the highest accuracy for predicting storm dust provenance. Using the EM, areas classified as being low, moderate, high and very high susceptibility for storm dust provenance comprised 36, 13, 23 and 28% of the total mapped area, respectively. Based on MDAM results, the highest and lowest IS were obtained for the wind speed (IS = 23) and geology (IS = 6.5) factors, respectively. Overall, the modelling techniques used in this research are helpful for predicting storm dust provenance and thereby targeting mitigation. Therefore, we recommend applying data mining EM approaches to the spatial mapping of storm dust provenance worldwide.
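The model-comparison step described above can be sketched as follows. The labels and per-model susceptibility scores below are illustrative toy values, not the study's data, and the assumption that the EM is a simple average of the individual model scores is mine; only the use of AUC as the ranking metric comes from the abstract.

```python
# Rank several models by AUC on held-out validation labels, then combine them
# into a simple averaging ensemble (EM), as a toy analogue of the study's setup.

def auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability that a
    randomly chosen positive outranks a randomly chosen negative."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 1, 0]          # 1 = mapped dust source, 0 = not
model_scores = {                            # toy susceptibility scores per location
    "XGBoost": [0.9, 0.80, 0.60, 0.4, 0.3, 0.5, 0.7, 0.2],
    "RF":      [0.8, 0.60, 0.45, 0.5, 0.2, 0.4, 0.7, 0.3],
    "RTA":     [0.7, 0.35, 0.60, 0.6, 0.3, 0.5, 0.5, 0.4],
}
# Ensemble: mean of the individual model scores per location.
ensemble = [sum(col) / len(col) for col in zip(*model_scores.values())]

ranking = sorted(model_scores, key=lambda m: auc(labels, model_scores[m]),
                 reverse=True)
print("EM AUC:", auc(labels, ensemble), "| individual ranking:", ranking)
```

Averaging tends to cancel the individual models' uncorrelated errors, which is one reason an EM can match or beat its best member, as reported in the study.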
•Data mining algorithms and ensemble modelling used to map dust provenance spatially.
•Mean decrease accuracy measure used to determine importance scores for the controls.
•Ensemble model provided the highest accuracy for predicting storm dust provenance.
•Techniques used are helpful for targeting mitigation.
The digital twin (DT) is a relatively new concept that is finding increased acceptance in industry. A DT is generally considered as comprising a physical entity, its virtual replica, and two-way digital data communications in between. Its primary purpose is to leverage the process intelligence captured within digital models, or usually their faster-solving surrogates, towards generating increased value from the physical entities. The surrogate models are created using machine learning based on data obtained from the field, experiments and digital models, which may be physics-based or statistics-based. Anomaly detection and correction, and diagnostic closed-loop process control, are examples of how a process DT can be deployed. In the manufacturing industry, its use can achieve improvements in product quality and process productivity. Metal additive manufacturing (AM) stands to gain tremendously from the use of DTs. This is because the AM process is inherently chaotic, resulting in poor repeatability. However, a DT acting in a supervisory role can inject certainty into the process by actively keeping it within bounds through real-time control commands. Closed-loop feedforward control is achieved by observing the process through sensors that monitor critical parameters; if any deviate from their respective optimal ranges, suitable corrective actions are triggered. The type of corrective action (e.g. a change in laser power or a modification to the scanning speed) and its magnitude are determined by interrogating the surrogate models. Because of their artificial intelligence (AI)-endowed predictive capabilities, which allow them to foresee a future state of the physical twin (e.g. the AM process), DTs proactively take context-sensitive preventative steps, whereas traditional closed-loop feedback control is usually reactive.
Apart from assisting a build process in real-time, a DT can help with planning the build of a part by pinpointing the optimum processing window relevant to the desired outcome. Again, the surrogate models are consulted to obtain the required information. In this article, we explain how the application of DTs to the metal AM process can significantly widen its application space by making the process more repeatable (through quality assurance) and cheaper (by getting builds right the first time).
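The supervisory loop described above (sense a critical parameter, compare against its optimal range, consult a surrogate for the corrective action) can be sketched as below. The monitored quantity (melt-pool temperature), the optimal band, and the linear stand-in surrogate are all hypothetical illustrations, not the actual DT models discussed in the article.

```python
# Toy supervisory control step for a metal-AM digital twin: a sensed melt-pool
# temperature outside its optimal band triggers a laser-power correction whose
# magnitude comes from a (here trivially linear) surrogate model.

OPTIMAL_BAND = (1550.0, 1650.0)  # degC, hypothetical melt-pool temperature window

def surrogate_power_correction(temp_c):
    """Stand-in surrogate: maps temperature deviation to a laser-power delta (W).
    A real DT would interrogate an ML surrogate trained on process data."""
    mid = sum(OPTIMAL_BAND) / 2
    return -0.5 * (temp_c - mid)  # reduce power when too hot, raise when too cold

def supervisory_step(sensed_temp_c, current_power_w):
    lo, hi = OPTIMAL_BAND
    if lo <= sensed_temp_c <= hi:
        return current_power_w             # within range: no corrective action
    return current_power_w + surrogate_power_correction(sensed_temp_c)

print(supervisory_step(1700.0, 400.0))     # overheating -> power is reduced
```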
•35 performance and emission parameters were simultaneously predicted.
•Conditions at random points across the whole engine performance map were modelled.
•Obtained predictions provided MAPE below 8.5% with the exception of two parameters.
•Calculation time for 24 × 35 testing data was 0.109 s.
•Low/high temperature water system, oil system and exhaust gas systems provided MAPE under 4.5%.
Several marine incidents in recent years have been caused in part by propulsion issues. In this context, incipient propulsion faults may be identified from deviations between real values and the healthy-engine values provided by an accurate model. Engine modelling techniques have thus become a topic of interest in the last decade. On this basis, machine learning approaches such as Artificial Neural Networks (ANN) have proved to be accurate and fast in terms of calculation times. However, up to now most research has focused on predicting a few parameters for specific operation points. In order to analyse the generalization capability of an ANN when predicting multiple outputs in real engine conditions, 35 different performance and emission parameters were simultaneously predicted in this study with an ANN. To do so, different engine operation points were tested in a six-cylinder marine diesel engine, characterizing the whole engine performance map. Additionally, some points from random regions throughout the entire engine performance map were tested to later analyse ANN performance on them. After defining the optimum network structure and training and validating the Artificial Neural Network with 1000 data samples, the ANN was tested with data extracted from unseen random regions of the performance map. Mean Absolute Percentage Errors (MAPE) obtained for testing samples from random points of the engine performance map remained below 8.5% for all parameters, with the exception of CO and NO2 emissions predictions. For the low temperature and high temperature cooling systems, oil system and exhaust gas system, MAPE values obtained were below 4.3%. Calculation time for 24 testing samples containing 35 parameters was 0.109 s, which, along with the high accuracy level obtained, demonstrates that an ANN can predict multiple outputs throughout the whole engine performance map.
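The MAPE metric quoted above is standard and can be computed as follows for one output parameter; the measured and predicted values below are hypothetical, not the study's data.

```python
# Mean Absolute Percentage Error (MAPE) for one predicted engine parameter,
# averaged over the test samples, as used to score the multi-output ANN.

def mape(actual, predicted):
    """MAPE in percent over paired actual/predicted values (actuals nonzero)."""
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical measured vs ANN-predicted exhaust-gas temperatures (degC)
# at a few unseen operation points of the performance map.
measured  = [320.0, 410.0, 505.0, 445.0]
predicted = [318.0, 402.0, 515.0, 450.0]
print(f"MAPE: {mape(measured, predicted):.2f}%")
```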
Exascale applications: skin in the game. Alexander, Francis; Almgren, Ann; Bell, John; et al. Philosophical Transactions of the Royal Society of London. Series A: Mathematical, Physical, and Engineering Sciences, 03/2020, Volume 378, Issue 2166. Journal article. Peer reviewed. Open access.
As noted in Wikipedia, 'skin in the game' refers to having 'incurred risk by being involved in achieving a goal', where 'skin' is a synecdoche for the person involved, and 'game' is the metaphor for actions on the field of play under discussion. For exascale applications under development in the US Department of Energy Exascale Computing Project, nothing could be more apt, with the 'skin' being exascale applications and the 'game' being delivering comprehensive science-based computational applications that effectively exploit exascale high-performance computing technologies to provide breakthrough modelling and simulation and data science solutions. These solutions will yield high-confidence insights and answers to the most critical problems and challenges for the USA in scientific discovery, national security, energy assurance, economic competitiveness and advanced healthcare. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.
Agent-based modelling has the potential to provide insight into complex energy transition dynamics. Despite a recent emphasis of research on agent-based modelling and on energy transitions, an overview of how the methodology may be of value to understanding transition processes is still missing from the literature. This systematic review evaluates the potential of agent-based modelling for understanding energy transitions from a social-scientific perspective, based on a set of 62 articles. Six topic areas were identified, addressing different components of the energy system: Electricity Market, Consumption Dynamics/Consumer Behaviour, Policy and Planning, New Technologies/Innovation, Energy System, and Transitions. The distribution of articles across topic areas was indicative of a continuing interest in electricity market related enquiries, and an increasing number of studies in the realm of policy and planning. Based on the relevance of energy transition specific complexities to the choice of ABM as a methodology, four complexity categories (1–4) were identified. Indicating the degree of association between the complexity of energy transitions and ABM's ability to address it, the categorisation revealed that 35 of the 62 studies directly linked the choice of ABM to energy transition complexities (complexity category 1) or were set in the context of energy transitions (complexity category 2). The review further showed that the greatest potential contribution of ABM to energy transition studies lies in its practical application to decision-making in policy and planning. More interdisciplinary collaboration in model development is recommended to address the discrepancy between the relevance of social factors to modelling energy transitions and the ability of the social sciences to make effective use of ABM.
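The kind of micro-level mechanism ABM captures, which the review above evaluates, can be illustrated with a toy technology-diffusion model: households adopt a new energy technology once enough peers have adopted it. Every parameter below (population size, seed share, threshold, adoption probability) is illustrative and not drawn from any reviewed study.

```python
# Toy agent-based diffusion sketch: a simple threshold model of household
# adoption of a new energy technology. Macro-level uptake emerges from
# repeated micro-level agent decisions.
import random

random.seed(7)

N, PEER_THRESHOLD, STEPS, ADOPT_PROB = 100, 0.03, 20, 0.2
adopted = [i < 5 for i in range(N)]  # seed 5% early adopters

for _ in range(STEPS):
    share = sum(adopted) / N
    # Each non-adopter reconsiders: once the peer share clears the threshold,
    # it adopts with a fixed probability this step.
    adopted = [a or (share >= PEER_THRESHOLD and random.random() < ADOPT_PROB)
               for a in adopted]

print(f"Adoption share after {STEPS} steps: {sum(adopted) / N:.2f}")
```

Even this minimal setup shows the non-linear, path-dependent uptake curves that make ABM attractive for transition studies: below the threshold nothing happens, above it diffusion is self-reinforcing.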
Higher penetration of solar PV and wind generation in distribution networks may change conservation voltage reduction (CVR) capabilities. The uncertainties associated with renewable generation and system loads are neglected in deterministic CVR assessment. In this paper, a probabilistic framework for CVR assessment is presented to assess the impact of the uncertainties associated with renewable generation and system loads. A theoretical framework is developed by establishing a mathematical relationship between the probability distribution of renewable generation (solar PV and wind) and the probability distribution of static exponential load model parameters. The simulation results confirm that the penetration of non-Gaussian solar PV and wind generation leads to a non-Gaussian static exponential load model parameter distribution, which is validated by normality tests (quantile-quantile plot, skewness, and kurtosis). Consequently, the magnitude and probability distribution of the CVR capabilities of the network change with the penetration of renewable generation, where higher renewable penetration scenarios lead to higher CVR values and a non-Gaussian (asymmetric) distribution.
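The static exponential load model underlying the CVR assessment above, and the standard CVR-factor definition (percent demand reduction per percent voltage reduction), can be sketched as follows. The exponent and numbers are illustrative, not the paper's fitted parameter distributions.

```python
# Static exponential load model: P = P0 * (V/V0)**np, so reducing voltage
# reduces demand whenever the exponent np is positive.

def load_power(p0, v, v0=1.0, np_exp=1.2):
    """Load power (kW) at per-unit voltage v; np_exp is an illustrative exponent."""
    return p0 * (v / v0) ** np_exp

p_nominal = load_power(100.0, 1.00)   # kW at nominal voltage
p_reduced = load_power(100.0, 0.97)   # kW after a 3% voltage reduction

# CVR factor: percent power reduction per percent voltage reduction.
cvr_factor = ((p_nominal - p_reduced) / p_nominal * 100) / 3.0
print(f"CVR factor: {cvr_factor:.2f}")
```

In the probabilistic framework, the exponent itself becomes a random variable driven by the renewable generation distribution, so the CVR factor inherits a (possibly non-Gaussian) distribution rather than a single deterministic value.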
Bisphenol A (BPA) is one of the best studied industrial chemicals in terms of exposure, toxicity, and toxicokinetics. This renders it an ideal candidate for exploiting the recent advancements in physiologically based pharmacokinetic (PBPK) modelling to support risk assessment of BPA specifically, and of other consumer-relevant hazardous chemicals in general. Using exposure from thermal paper as a case scenario, this study employed the multi-phase multi-layer mechanistic dermal absorption (MPML MechDermA) model available in the Simcyp® Simulator to simulate the dermal toxicokinetics of BPA at local and systemic levels. Sensitivity analysis helped to identify physicochemical and physiological factors influencing the systemic exposure to BPA. The iterative modelling process was as follows: (i) development of compound files for BPA and its conjugates, (ii) setting up of a PBPK model for intravenous administration, (iii) extension for oral administration, and (iv) extension for exposure via skin (i.e., hand) contact. A toxicokinetic study involving hand contact with BPA-containing paper was used for model refinement. Cumulative urinary excretion of total BPA had to be employed for dose reconstruction. PBPK model performance was verified using the observed serum BPA concentrations. The predicted distribution across the skin compartments revealed a depot of BPA in the stratum corneum (SC). These findings shed light on the role of the SC as a temporary reservoir for lipophilic chemicals prior to systemic absorption, which inter alia is relevant for the interpretation of human biomonitoring data and for establishing the relationship between external and internal measures of exposure.
•Pharmacokinetic modelling of dermal absorption of bisphenol A (BPA) from thermal paper.
•Physiologically based kinetic modelling of dermal exposure to BPA via hand contact.
•Preferential partitioning of BPA from skin surface into the stratum corneum (SC).
•Delayed and long-lasting transfer of BPA from SC into systemic circulation.
•The thick SC of the palms acts as temporary reservoir.
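The SC "reservoir" behaviour described above can be illustrated with a minimal compartmental sketch: surface-deposited BPA is taken up quickly into the SC, then released slowly into the systemic circulation. The rate constants are illustrative placeholders, not the MPML MechDermA parameters.

```python
# Minimal three-compartment sketch of the stratum-corneum depot: fast uptake
# from the skin surface into the SC, slow transfer from SC to systemic
# circulation, and first-order systemic elimination. Explicit Euler integration.

k_in, k_out, k_elim = 0.8, 0.05, 0.3   # 1/h: surface->SC, SC->systemic, elimination
surface, sc, systemic = 1.0, 0.0, 0.0  # normalized amounts of BPA
dt, hours = 0.01, 48

for _ in range(int(hours / dt)):
    d_surface = -k_in * surface
    d_sc = k_in * surface - k_out * sc
    d_sys = k_out * sc - k_elim * systemic
    surface += d_surface * dt
    sc += d_sc * dt
    systemic += d_sys * dt

print(f"After {hours} h: SC depot = {sc:.3f}, systemic = {systemic:.3f}")
```

Because k_out is much smaller than k_in, a substantial fraction of the dose still sits in the SC two days after exposure, mirroring the delayed, long-lasting transfer into the circulation noted in the highlights.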