In this study, a novel approach to processing μCT images to create digital material twins is presented. A deep convolutional neural network (DCNN) was implemented and used to segment μCT images of two different types of reinforcement (2D glass and 3D carbon). The DCNN successfully segmented the images based on multi-scale features extracted using data-driven convolutional filters. The network was trained on scanned μCT images together with images extracted from computer-generated virtual models of the reinforcements. One of the convolutional layers of the trained network was then used to extract features for a conventional machine learning model: the extracted features and the raw gray-scale data were used to train a supervised k-nearest neighbor (k-NN) model for pixel-wise classification. The performance of both approaches was evaluated by comparing the results with manually segmented images. The trained deep neural network provided faster and more accurate predictions of the different reinforcement features than the conventional machine learning approach.
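As an illustration of the pixel-wise classification pipeline described above, the sketch below pairs per-pixel multi-scale features with a k-NN classifier. Since the trained DCNN is not available here, fixed Gaussian and Sobel filters stand in for the learned convolutional features; all data, filter choices, and parameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch of pixel-wise segmentation with per-pixel features + k-NN.
# The learned DCNN filters from the study are replaced here by fixed
# Gaussian/edge filters purely for illustration.
import numpy as np
from scipy import ndimage
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
image = rng.random((64, 64))          # stand-in for a gray-scale muCT slice
labels = (image > 0.5).astype(int)    # stand-in for a manual segmentation

def pixel_features(img):
    """Stack raw intensity with multi-scale filter responses, one row per pixel."""
    feats = [img,
             ndimage.gaussian_filter(img, 1),
             ndimage.gaussian_filter(img, 3),
             ndimage.sobel(img, axis=0),
             ndimage.sobel(img, axis=1)]
    return np.stack([f.ravel() for f in feats], axis=1)

X, y = pixel_features(image), labels.ravel()
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
predicted = knn.predict(X).reshape(image.shape)   # pixel-wise class map
```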
The two-dimensional population balance model (2D PBM) introduces the capability of modeling two distinct sets of growth kinetics, which offers advantages for modeling crystallization processes that generate needle-like particles, a commonly encountered active pharmaceutical ingredient (API) morphology. Although a one-dimensional population balance model (1D PBM) can be used effectively to model non-equant morphologies through the selection of appropriate shape factors, it cannot account for the morphology or aspect ratio changes that can occur during crystallization. In this work, the advantage of the 2D PBM for an industrial crystallization process that generates needle-shaped API is highlighted by comparing 1D PBM and 2D PBM results. The API studied exhibited extremely slow desupersaturation and did not reach the solubility concentration despite an ∼50 h seed bed age. While the 1D PBM was useful for optimizing the crystallization process to enhance desupersaturation behavior, it did not match the particle size quantiles and thus could not be used to probe the impact of crystallization process parameters on the particle aspect ratio (AR). The 2D PBM was necessary to model the particle size quantiles and was used to further optimize process conditions for minimizing the particle aspect ratio. Simulations with the 2D PBM indicated that, regardless of antisolvent addition rate or seed morphology, the final material would still have a high aspect ratio. This knowledge saved considerable time and effort that would otherwise have been invested in trying to minimize particle AR through changes in crystallization processing parameters alone, and thus highlights the utility of the 2D PBM to the pharmaceutical industry.
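For context, a generic bivariate population balance of the kind such 2D PBMs build on tracks the number density n(L1, L2, t) over two characteristic lengths, so that needle length and width can grow at different rates and the aspect ratio can evolve. This is a textbook form under standard assumptions (size-independent growth, nucleation at a single seed size), not the paper's exact model:

```latex
% Generic 2D population balance: growth along needle length L1 and
% width L2, with nucleation entering as a point source at (L_{1,0}, L_{2,0}).
\frac{\partial n(L_1, L_2, t)}{\partial t}
  + \frac{\partial \left[ G_1(S)\, n \right]}{\partial L_1}
  + \frac{\partial \left[ G_2(S)\, n \right]}{\partial L_2}
  = B(S)\,\delta(L_1 - L_{1,0})\,\delta(L_2 - L_{2,0}),
\qquad \mathrm{AR} = \frac{L_1}{L_2}
```

Here G1 and G2 are supersaturation-dependent growth rates along the two dimensions and B is the nucleation rate; with a single length coordinate and fixed shape factors, a 1D PBM cannot capture changes in AR.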
The competitiveness of hydrogen as a sustainable energy carrier depends greatly on its transportation and storage costs. Liquefying hydrogen offers advantages such as enhanced purity, versatility, and higher density, yet current industrial liquefaction processes face efficiency and cost challenges. Although various large-scale and efficient liquefaction concepts exist in the literature, they often overlook the economic and technical viability of such plants. Here, we address this issue by establishing a framework for modeling a large-scale hydrogen liquefaction concept and conducting both technical and economic assessments, with a specific focus on a 125 tonnes per day (TPD) high-pressure hydrogen Claude-cycle concept. The technical analysis involves preliminary designs of key process components, while the economic assessment uses the Aspen Process Economic Analyzer. Our findings indicate that at an electricity price of €0.1/kWh, the Claude-cycle liquefier concept yields a specific liquefaction cost (SLC) of €1.55/kgLH2. A sensitivity analysis shows that the electricity price has a significant influence on the economics. Further investigation of the compressor design shows that incorporating high-speed centrifugal compressors could reduce the SLC by 5.42% and potentially more. Scaling up to 250 and 500 TPD reveals further cost improvements, while cost projections indicate substantial declines as the technology matures. Ultimately, this paper presents novel cost-scaling and experience curves for hydrogen liquefaction technology, demonstrating the compelling economic viability of integrating large-scale hydrogen liquefaction into sustainable energy infrastructure.
•A process model of a large-scale Claude-cycle hydrogen liquefaction plant has been developed for both technical assessment and economic analysis.
•The technical evaluation primarily centers on the preliminary design of key process equipment, such as compressors, turbo-expanders, and heat exchangers.
•The techno-economic assessment is performed based on the process simulation and equipment preliminary design.
•Capital cost estimation is carried out using the Aspen Process Economic Analyzer.
•These cost estimations serve as the basis for predicting cost and experience curves of hydrogen liquefaction technology.
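To make the SLC figure from the study above concrete, the sketch below back-calculates a specific liquefaction cost from annualized capital, fixed O&M, and electricity costs. Only the 125 TPD capacity and the €0.10/kWh electricity price come from the abstract; every other number is a placeholder assumption:

```python
# Illustrative specific-liquefaction-cost (SLC) calculation for a hydrogen
# liquefier. Only the 125 TPD capacity and the EUR 0.10/kWh electricity
# price come from the abstract; every other figure below is an assumption.
capacity_tpd = 125                      # liquefaction capacity [t LH2/day]
availability = 0.95                     # assumed plant uptime fraction
annual_lh2_kg = capacity_tpd * 1000 * 365 * availability

capex_eur = 300e6                       # assumed installed capital cost
crf = 0.08                              # assumed capital recovery factor [1/yr]
fixed_opex_eur = 0.04 * capex_eur       # assumed O&M at 4% of CAPEX per year

sec_kwh_per_kg = 6.5                    # assumed specific energy consumption
elec_price = 0.10                       # electricity price [EUR/kWh]
elec_cost_eur = sec_kwh_per_kg * elec_price * annual_lh2_kg

slc = (capex_eur * crf + fixed_opex_eur + elec_cost_eur) / annual_lh2_kg
print(f"SLC ~ {slc:.2f} EUR/kg LH2")
```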
In this work, the surface-layer states of turned AISI 4140 QT were investigated by means of surface roughness and microhardness measurements. Different machining conditions are considered, namely cutting velocity, feed rate, tool wear, and tool corner radius, as well as the tempering state of the workpiece. The resulting data are analyzed with multiple algorithms in order to create analytical models for real-time process control. The modeling approaches applied are linear regression, stepwise regression, LASSO, and Elastic Net. Finally, the models are evaluated in terms of quality, complexity, and physical plausibility.
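A minimal sketch of this model comparison using scikit-learn, with synthetic data standing in for the measured surface-roughness values; stepwise regression, which scikit-learn lacks, is approximated here by greedy forward feature selection. All data and settings are illustrative:

```python
# Compare linear regression, LASSO, and Elastic Net on synthetic data,
# with forward selection as a crude stand-in for stepwise regression.
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV, ElasticNetCV
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# columns: cutting velocity, feed rate, tool wear, corner radius, tempering state
X = rng.random((120, 5))
y = 2.0 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(120)  # toy roughness

models = {
    "linear": LinearRegression(),
    "lasso": LassoCV(cv=5),
    "elastic_net": ElasticNetCV(cv=5),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name:12s} mean CV R^2 = {r2:.3f}")

# greedy forward selection feeding a linear model ("stepwise" stand-in)
sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2).fit(X, y)
print("selected features:", sfs.get_support())
```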
Industrial process data are usually contaminated with missing values and outliers, which can greatly degrade the explanatory power of traditional data-driven modeling methods. More attention should therefore be paid to robust data mining methods in order to develop stable and reliable modeling prototypes for decision-making. This paper gives a systematic review of various state-of-the-art data preprocessing techniques as well as robust principal component analysis methods for process understanding and monitoring applications. Comprehensive robust techniques are then discussed for various circumstances with diverse process characteristics. Finally, big data perspectives on potential challenges and opportunities are highlighted for future exploration in the community.
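As a concrete reference for the robust PCA methods reviewed, the sketch below implements a bare-bones principal component pursuit (PCP), which splits a data matrix into a low-rank part and a sparse outlier part. This is one standard RPCA formulation among those surveyed, simplified for illustration:

```python
# Minimal robust PCA via principal component pursuit (inexact ALM variant):
# decompose M into a low-rank part L plus a sparse outlier part S.
import numpy as np

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def rpca_pcp(M, mu=None, lam=None, n_iter=200):
    m, n = M.shape
    mu = mu or (m * n) / (4.0 * np.abs(M).sum())   # common default step size
    lam = lam or 1.0 / np.sqrt(max(m, n))          # standard sparsity weight
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        # singular-value thresholding step for the low-rank component
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(soft_threshold(sig, 1.0 / mu)) @ Vt
        # elementwise shrinkage isolates sparse outliers
        S = soft_threshold(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)                       # dual update
    return L, S

rng = np.random.default_rng(2)
clean = rng.random((50, 3)) @ rng.random((3, 20))   # rank-3 "process" data
corrupted = clean.copy()
corrupted[rng.random(clean.shape) < 0.05] += 10.0   # gross outliers
L, S = rpca_pcp(corrupted)
print("recovery error:", np.linalg.norm(L - clean) / np.linalg.norm(clean))
```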
Although the importance of edaphic factors and habitat structure for plant growth and survival is known, both are often neglected in favor of climatic drivers when investigating the spatial patterns of plant species and diversity. Yet, especially in mountain ecosystems with complex topography, missing edaphic and habitat components may preclude a sound understanding of biodiversity distribution. Here, we compare the relative importance of climate, soil, and land cover variables when predicting the distributions of 2,616 vascular plant species in the European Alps, representing approximately two-thirds of all European flora. Using presence-only data, we built point-process models (PPMs) to relate species observations to different combinations of covariates. We evaluated the PPMs through block cross-validation and assessed the independent contributions of climate, soil, and land cover covariates to predicting plant species distributions using an innovative predictive partitioning approach. We found climate to be the most influential driver of spatial patterns in plant species, with a relative influence of ~58.5% across all species and decreasing importance from low to high elevations. Soil (~20.1%) and land cover (~21.4%) were overall less influential than climate, but increased in importance along the elevation gradient. Furthermore, land cover showed strong local effects in lowlands, while the contribution of soil stabilized at mid-elevations. The decreasing influence of climate with elevation is explained by increasing endemism and by the fact that climate becomes more homogeneous as habitat diversity declines at higher altitudes. In contrast, soil predictors were found to follow the opposite trend. Additionally, at low elevations, human-mediated land cover effects appear to reduce the importance of climate predictors. We conclude that soil and land cover are, like climate, principal drivers of plant species distribution in the European Alps. While disentangling their effects remains a challenge, future studies can benefit markedly from including soil and land cover effects when predicting species distributions.
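For readers unfamiliar with fitting PPMs to presence-only data, the sketch below uses the classic Berman-Turner quadrature device, in which the point-process likelihood is approximated by a weighted Poisson regression over presence and background points. This is a generic PPM recipe, not the authors' code; all data and settings are placeholders:

```python
# Presence-only point-process model via a Berman-Turner-style weighted
# Poisson regression: presences plus background (quadrature) points.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(3)
n_pres, n_quad = 200, 2000
# covariates at presence and background points (e.g. climate, soil, land cover)
X = rng.standard_normal((n_pres + n_quad, 3))
area = 1.0e4                                    # total study area (arbitrary units)
w = np.full(n_pres + n_quad, area / n_quad)     # quadrature weights
y = np.concatenate([np.ones(n_pres), np.zeros(n_quad)]) / w  # point "rates"

ppm = PoissonRegressor(alpha=1e-4).fit(X, y, sample_weight=w)
intensity = np.exp(X @ ppm.coef_ + ppm.intercept_)  # fitted intensity surface
```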
•Autothermal operation of the global CCLG system is realized.
•Design of a heat exchange plan that obtains maximum process heat recovery.
•Exergy destruction of the autothermal CCLG system is mainly from the FR.
•Flue gas circulation causes high solids circulation and low thermodynamic efficiency.
Chemical looping gasification (CLG) is an innovative clean coal conversion technology with industrial application prospects. The technical reliability of coal CLG (CCLG) in an autothermal state remains to be investigated. To clarify the process characteristics of an autothermal CCLG process, a CCLG autothermal operation system was established based on a CLG industrial demonstration unit. Process modeling of the CCLG reactor system together with the economizer, air preheater, and other heat exchange units was performed to ensure autothermal operation of the global CCLG process. The optimal operating parameters of the system were established, the optimal heat exchange network was designed, and the sources and distribution of thermodynamic irreversibility were clarified. The results showed that the optimal coal feed flow rate, air feed temperature, and flue gas circulation temperature were 437.07 kg/h, 550 °C, and 450 °C, respectively. Fuel reactor (FR) flue gas circulation was not conducive to autothermal operation of the system and weakened the gasification performance. The optimal heat exchange plan allowed the system to operate without an external heat source. The thermodynamic irreversibility mainly originated from the redox process. The system heat efficiency was 81.63 %, the product exergy efficiency was 78.06 %, the total exergy efficiency was 41.10 %, and the total exergy destruction rate was 55.40 %. Additionally, the FR flue gas circulation caused high exergy loss, resulting in reduced system thermodynamic efficiency.
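To illustrate the kind of check behind "operation without an external heat source", the sketch below runs a pinch-style heat cascade over hypothetical hot and cold streams: if the cascade never goes negative, process heat recovery alone can close the balance. The stream data and minimum approach temperature are invented for illustration and are not values from the CCLG study:

```python
# Pinch-style problem-table heat cascade: checks whether heat recovery
# between hot and cold streams can avoid external heating (autothermal check).
dT_min = 10.0  # minimum approach temperature [K] (assumed)
# (T_supply, T_target, m*cp [kW/K]); all stream data are placeholders
hot_streams = [(900.0, 450.0, 2.0), (550.0, 120.0, 1.5)]
cold_streams = [(25.0, 550.0, 1.8), (100.0, 450.0, 1.0)]

# shifted temperature boundaries: hot down by dT_min/2, cold up by dT_min/2
bounds = sorted({t - dT_min / 2 for s in hot_streams for t in s[:2]} |
                {t + dT_min / 2 for s in cold_streams for t in s[:2]}, reverse=True)

cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for ts, tt, mcp in hot_streams:        # hot streams release heat
        overlap = max(0.0, min(ts - dT_min / 2, hi) - max(tt - dT_min / 2, lo))
        net += mcp * overlap
    for ts, tt, mcp in cold_streams:       # cold streams absorb heat
        overlap = max(0.0, min(tt + dT_min / 2, hi) - max(ts + dT_min / 2, lo))
        net -= mcp * overlap
    heat += net
    cascade.append(heat)

hot_utility = max(0.0, -min(cascade))      # external heating needed, if any
print("minimum hot utility [kW]:", hot_utility)  # 0 => autothermal feasible
```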
Process modeling plays a central role in the development of today’s process-aware information systems, both on the management level (e.g., providing input for requirements elicitation and fostering communication) and on the enactment level (providing a blueprint for process execution and enabling simulation). The literature comprises a variety of process modeling approaches proposing different modeling languages (i.e., imperative and declarative languages) and different types of process artifact support (i.e., process models, textual process descriptions, and guided simulations). However, the use of an individual modeling language or a single type of process artifact is usually not enough to provide a clear and concise understanding of the process. To overcome this limitation, a set of so-called “hybrid” approaches combining languages and artifacts have been proposed, but no common grounds have been set to define and categorize them. This work aims to provide a fundamental understanding of these hybrid approaches by defining a unified terminology, providing a conceptual framework, and proposing an overarching overview to identify and analyze them. Since no common terminology has been used in the literature, we combined existing concepts and ontologies to define a “Hybrid Business Process Representation” (HBPR). Afterwards, we conducted a Systematic Literature Review (SLR) to identify and investigate the characteristics of HBPRs combining imperative and declarative languages or artifacts. The SLR yielded 30 articles, which were analyzed. The results indicate the presence of two distinct research lines and show common motivations driving the emergence of HBPRs, a limited maturity of existing approaches, and diverse application domains. Moreover, the results are synthesized into a taxonomy classifying different types of representations. Finally, the outcome of the study is used to provide a research agenda delineating directions for future work.
•A novel conceptual framework for Hybrid Business Process Representations (HBPRs).
•A clear-cut distinction between hybrid languages and hybrid process artifacts.
•A systematic literature review of 30 articles covering two distinct research lines.
•A descriptive taxonomy classifying different types of HBPRs.
•A research agenda delineating directions for future work.
Spray drying is one of the most widely used manufacturing processes in the pharmaceutical industry. While there are voluminous experimental studies on the impact of various process and formulation parameters on the quality attributes of spray-dried powders, such as particle size, morphology, density, and crystallinity, there is scant information in the literature regarding process scale-up. Here, we first analyze salient features of scale-up attempts in the literature. The spray drying process is then analyzed in terms of the fundamental physical transformations involved, i.e., atomization, drying, and gas-solid separation. Each transformation is scrutinized from a scale-up perspective with non-dimensional parameters and multi-scale analysis, and comprehensively discussed in an engineering context. Successful scale-up entails matching the key response variables of each transformation across scales. These variables are identified as the droplet size distribution, outlet temperature, relative humidity, separator pressure loss coefficient, and collection efficiency. Instead of trial-and-error approaches, this review advocates the use of mechanistic models and scale-up rules for establishing design spaces for the process variables involved in each transformation of spray drying. While presenting a roadmap for process development and scale-up, the paper demonstrates how to bridge the current gap in spray drying scale-up via a rational understanding of the fundamental transformations.
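As one example of the non-dimensional bookkeeping advocated above, the sketch below computes the Weber and Ohnesorge numbers that govern liquid breakup in atomization; holding these groups comparable across scales is one common way to keep droplet size distributions similar. All fluid properties and nozzle values are illustrative:

```python
# Dimensionless groups for atomization scale-up: matching Weber and
# Ohnesorge numbers across scales keeps droplet breakup comparable.
import math

def atomization_groups(rho_l, sigma, mu_l, d_nozzle, u_rel):
    """Return (Weber, Ohnesorge) for a liquid stream in a gas flow."""
    we = rho_l * u_rel**2 * d_nozzle / sigma          # inertia vs. surface tension
    oh = mu_l / math.sqrt(rho_l * sigma * d_nozzle)   # viscous damping of breakup
    return we, oh

# lab scale vs. production scale: same fluid, larger nozzle, adjusted velocity
lab = atomization_groups(rho_l=1000.0, sigma=0.072, mu_l=1e-3,
                         d_nozzle=0.5e-3, u_rel=100.0)
plant = atomization_groups(rho_l=1000.0, sigma=0.072, mu_l=1e-3,
                           d_nozzle=2.0e-3, u_rel=50.0)
print(f"lab:   We={lab[0]:.0f}, Oh={lab[1]:.4f}")
print(f"plant: We={plant[0]:.0f}, Oh={plant[1]:.4f}")
```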
•PFOS removal by NF process was well modeled by machine learning algorithms.
•RF, GBM and AdaBoost procedures were robust models for the NF process.
•Permutation variable importance method quantified relative importance of parameters.
•Pressure, initial PFOS concentration and membrane type are the most important variables.
Per- and polyfluoroalkyl substances (PFAS) are hazardous chemicals that have been widely used in different industries and released into the environment through contaminated effluents. Nanofiltration (NF) is a promising process for removing PFAS from such effluents. This study aimed to model and analyze the performance of the NF membrane process in removing perfluorooctanesulfonic acid (PFOS) from contaminated effluents using machine learning (ML) algorithms. The output of seven ML algorithms was evaluated for robustness using the statistical indices of the coefficient of determination (R2) and the mean squared error (MSE). The results demonstrated that the random forest (RF), gradient boosting machine (GBM), and AdaBoost models were the most robust for the NF process. Accordingly, these models were optimized using a grid search. The optimized models were then analyzed in depth using permutation variable importance (PVI) to quantify the relative importance of the operating variables. The three ML procedures (RF, GBM, AdaBoost) showed high prediction strength for PFOS removal from contaminated effluents, with low MSE values (4.726, 2.450, 2.879) and high R2 values (0.930, 0.975, 0.968). In addition, PVI-RF ranked the variables in decreasing order of importance as pressure, initial PFOS concentration, membrane type, trivalent cation, pH, divalent cation, and monovalent cation.
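A minimal sketch of this workflow with scikit-learn, using synthetic data in place of the study's NF dataset: the model family (RF, GBM, AdaBoost), the MSE/R2 metrics, and the permutation variable importance step follow the abstract, but all data and results below are placeholders:

```python
# Train RF/GBM/AdaBoost regressors, report MSE and R2 on held-out data,
# and compute permutation variable importance (PVI) for each model.
import numpy as np
from sklearn.ensemble import (RandomForestRegressor,
                              GradientBoostingRegressor, AdaBoostRegressor)
from sklearn.inspection import permutation_importance
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# columns: pressure, initial PFOS conc., membrane type, cations, pH, ...
X = rng.random((300, 7))
y = 80 + 15 * X[:, 0] - 10 * X[:, 1] + 2 * rng.standard_normal(300)  # % removal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (RandomForestRegressor(random_state=0),
              GradientBoostingRegressor(random_state=0),
              AdaBoostRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(type(model).__name__,
          f"MSE={mean_squared_error(y_te, pred):.2f}",
          f"R2={r2_score(y_te, pred):.3f}")
    pvi = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
    print("  PVI:", np.round(pvi.importances_mean, 3))
```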