The enantioseparation of chiral molecules is a crucial and challenging task in experimental chemistry, often requiring extensive trial and error with different experimental settings. To overcome this challenge, here we show a research framework that employs machine learning techniques to predict the retention times of enantiomers and facilitate chromatographic enantioseparation. A documentary dataset of chiral molecular retention times in high-performance liquid chromatography (the CMRT dataset) is established to address the challenge of data acquisition. A quantile geometry-enhanced graph neural network is proposed to learn the relationship between molecular structure and retention time, and it shows satisfactory predictive ability for enantiomers. Domain knowledge of chromatography is incorporated into the machine learning model to achieve multi-column prediction, which paves the way for predicting chromatographic enantioseparation by calculating separation probabilities. The proposed framework performs well in retention time prediction and the facilitation of chromatographic enantioseparation, shedding light on the application of machine learning techniques to experimental settings and improving the efficiency of experimenters to speed up scientific discovery.
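The abstract above does not spell out the training objective of the quantile model; quantile regressors are commonly trained with the pinball loss. The following minimal sketch (our own illustration in plain NumPy, not the paper's code; the function name and toy numbers are ours) shows how that loss scores retention-time predictions at a single quantile level:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss: penalizes over- and under-prediction
    asymmetrically, so minimizing it fits the q-th conditional quantile."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

# Toy retention times (minutes) and median (q = 0.5) predictions
y_true = np.array([5.2, 7.9, 12.4])
y_pred = np.array([5.0, 8.3, 12.0])
loss = pinball_loss(y_true, y_pred, q=0.5)
```

Training such a loss at several quantile levels yields a predictive distribution rather than a point estimate, which is one way a model can expose the uncertainty needed to compute a separation probability.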
•The TgGAN is trained with data while being simultaneously constrained by theories.
•The TgGAN achieves better predictability, reliability, and generalizability than GAN.
•The TgGAN model is robust and reliable for deep learning of dynamic problems.
The generative adversarial network (GAN) has been shown to be useful in various applications, such as image recognition, text processing, and scientific computing, due to its strong ability to learn complex data distributions. However, the ability of the standard GAN to process dynamic data is limited. In this study, a theory-guided generative adversarial network (TgGAN) is proposed to solve dynamic partial differential equations (PDEs). Unlike in standard GANs, the training terms are no longer the true data and the generated data themselves, but rather their residuals. In addition, theories such as governing equations and other physical constraints are encoded into the loss function of the generator to ensure that the prediction not only honors the training data but also obeys these theories. The TgGAN is formulated for dynamic subsurface flow with heterogeneous model parameters, and the data at each time step are treated as a two-dimensional image. Several numerical cases are introduced to test the performance of the TgGAN. Predicting future responses, label-free learning, and learning from noisy data can all be realized easily by the TgGAN model, and the effects of the number of training data and of collocation points are also discussed. To improve the efficiency of the TgGAN, a transfer learning algorithm is also employed. Moreover, the sensitivity of the TgGAN to the hydraulic conductivity field is studied. Numerical results demonstrate that the TgGAN model is both robust and reliable for deep learning of dynamic PDEs.
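To make the theory-guided idea concrete, the sketch below (a generic illustration under our own assumptions, not the TgGAN implementation; all names are ours) builds a generator loss from a data misfit at observed grid points plus the finite-difference residual of a governing PDE, here a 1D diffusion equation, evaluated at collocation points:

```python
import numpy as np

def pde_residual(u, dx, dt, D):
    """Finite-difference residual of the 1D diffusion equation
    u_t - D * u_xx = 0 on a (time, space) grid; zero when u obeys the PDE."""
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt                        # forward in time
    u_xx = (u[:-1, 2:] - 2 * u[:-1, 1:-1] + u[:-1, :-2]) / dx**2   # central in space
    return u_t - D * u_xx

def generator_loss(u_gen, u_obs, mask, dx, dt, D, lam=1.0):
    """Data misfit at observed grid points plus a theory (PDE) penalty
    at collocation points, weighted by lam."""
    data_term = np.mean((u_gen[mask] - u_obs[mask]) ** 2)
    theory_term = np.mean(pde_residual(u_gen, dx, dt, D) ** 2)
    return data_term + lam * theory_term

# A steady linear profile u(x, t) = x solves the diffusion equation exactly,
# so both the residual and (with perfect data) the total loss vanish.
x = np.arange(11.0)
u = np.tile(x, (5, 1))
mask = np.ones_like(u, dtype=bool)
loss = generator_loss(u, u, mask, dx=1.0, dt=0.1, D=1.0)
```

A field that satisfies the governing equation incurs no theory penalty, which is how such a constraint can steer learning even where labels are absent.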
The probabilistic collocation method (PCM) has recently drawn wide attention for stochastic analysis. Its results may become inaccurate, however, when the relation between random parameters and model responses is strongly nonlinear. To tackle this problem, we proposed a location-based transformed PCM (xTPCM) and a displacement-based transformed PCM (dTPCM) in previous parts of this series. These two methods, which make use of the transform between response and space, nevertheless have certain limitations. In this study, we introduce a time-based transformed PCM (tTPCM) employing the transform between response and time. We conduct numerical experiments to investigate its performance in uncertainty quantification. The results show that the tTPCM greatly improves the accuracy of the traditional PCM in a cost-effective manner and is more general and convenient than the xTPCM/dTPCM.
Key Points:
UQ methods may give inaccurate results under strong nonlinearity and non‐Gaussianity
A new method is developed for cases of strong nonlinearity and non‐Gaussianity
The tTPCM yields accurate statistics and produces reasonable realizations
Protecting the whole small intestine from radiation-induced intestinal injury during the radiotherapy of abdominal or pelvic solid tumors remains an unmet clinical need. Amifostine is a promising selective radioprotector for normal tissues. However, its oral application in intestinal radioprotection remains challenging. Herein, we use the microalga Spirulina platensis as a microcarrier of Amifostine to construct an oral delivery system. The system shows comprehensive drug accumulation and effective radioprotection throughout the whole small intestine that is significantly superior to the free drug and its enteric capsule, preventing radiation-induced intestinal injury and prolonging survival without influencing tumor regression. It also benefits gut microbiota homeostasis and long-term safety. Based on a readily available natural microcarrier, this work presents a convenient oral delivery system that achieves effective radioprotection for the whole small intestine, providing a competitive strategy with great clinical translation potential.
Although the Brunauer-Emmett-Teller (BET) equation is a classic adsorption model for describing the adsorption of gases in adsorbents, it cannot be applied under supercritical conditions because the saturation vapor pressure (p0) in the equation is undefined when T > Tc. In this study, a modified BET equation is proposed that can be applied to investigate supercritical methane adsorption mechanisms in shale by using density instead of pressure. The observed (excess) high-pressure methane adsorption isotherms can always be well fitted by the modified BET model when the adsorbed-phase density (ρa) is not fixed. The fitted results show that the number of adsorption layers (n) ranges from 1.79 to 2.42, with an average value of 2.12, indicating an approximately double-layer adsorption mechanism. Moreover, we compare this novel model with the commonly used Langmuir and DR models and find that all three models fit the excess adsorption isotherms equally well. A critical advantage of the new model, however, is that it can calculate the number of adsorption layers (n), which the other models cannot; it is this advantage that makes it possible to analyze the shale gas adsorption mechanism experimentally. Moreover, the average number of adsorption layers (θ) is much smaller than n, indicating that many adsorption sites in the adsorption space remain empty and that the density of the second layer must be lower than that of the first, consistent with molecular simulation results.
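As a sketch of the idea (our own notation and a textbook functional form; the authors' exact modified equation may differ), the classical finite-n-layer BET isotherm can be evaluated with the density ratio x = ρ/ρa substituted for the undefined pressure ratio p/p0:

```python
def bet_n_layer(x, c, n):
    """Finite-n-layer BET coverage theta = v / v_m.  In the modified form
    described above, x = rho_gas / rho_adsorbed replaces x = p / p0."""
    num = c * x * (1 - (n + 1) * x**n + n * x**(n + 1))
    den = (1 - x) * (1 + (c - 1) * x - c * x**(n + 1))
    return num / den

# Sanity check: with n = 1 the expression collapses to the Langmuir
# isotherm c*x / (1 + c*x), here 3 / (1 + 3) = 0.75.
theta = bet_n_layer(0.3, 10.0, 1)
```

The n = 1 reduction is a useful sanity test for any finite-layer BET implementation, since allowing only one layer must recover the Langmuir isotherm.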
•A modified BET equation is proposed to investigate methane adsorption mechanisms in shale.
•The experimental high-pressure adsorption isotherms can be well fitted by the new model.
•Supercritical methane is found to be approximately double-layer adsorbed in shale.
Random reconstruction of three-dimensional (3D) digital rocks from two-dimensional (2D) slices is crucial for elucidating the microstructure of rocks and its effects on pore-scale flow in numerical modeling, since numerous samples are usually required to handle intrinsic uncertainties. Despite remarkable advances achieved by traditional process-based methods, statistical approaches, and recently prominent deep-learning-based models, few works have focused on producing several kinds of rocks with one trained model while allowing the reconstructed samples to approximately satisfy given properties, such as porosity. To fill this gap, we propose a new deep learning framework, named RockGPT, composed of a VQ-VAE and a conditional GPT, to synthesize 3D samples from a single 2D slice from the perspective of video generation. The VQ-VAE compresses the high-dimensional input video, i.e., the sequence of continuous rock slices, into discrete latent codes and reconstructs them. To obtain diverse reconstructions, the discrete latent codes are modeled with the conditional GPT in an autoregressive manner, incorporating conditional information from a given slice, the rock type, and the porosity. We conduct two experiments on five kinds of rocks, and the results demonstrate that RockGPT can produce different kinds of rocks with a single model, with the porosities of the reconstructed samples distributed around the specified targets within a narrow range. In a broader sense, by leveraging the proposed conditioning scheme, RockGPT constitutes an effective way to build a general model that produces multiple kinds of rocks simultaneously while satisfying user-defined properties.
•Fully coupled thermo-hydro-mechanical model for hydraulic fracturing.
•Fracture propagation in mixed tensile and shear failures for 3D modeling.
•Thermally induced secondary fractures for the enhancement of hydraulic stimulation.
•Anisotropic model for rock deformation, fluid-heat flow, and thermal expansion.
Commercial development of enhanced geothermal systems in low-permeability rocks relies on fracturing treatments to create complex-fracture networks and an appropriate circulation strategy to maintain high flow rates at sufficiently high temperatures. However, it remains challenging to model complex-fracture propagation and heat energy extraction as a whole. This paper develops a fully coupled thermo-hydro-mechanical model to simulate reservoir stimulation and heat production in naturally fractured geothermal reservoirs. The proposed model is validated against a widely used model, TOUGH2, concerning heat sweep in a vertical fracture. This model is then applied to study multi-staged fracturing and geothermal extraction related to a doublet of horizontal wells. The hydro-geomechanical properties are chosen from the Soultz geothermal reservoir at a depth of approximately 3600 m. Numerical results demonstrate that: (1) mixed tensile and shear fracturing can constitute an important stimulation mechanism for naturally fractured geothermal reservoirs; (2) well interlinked, zigzag artificial fractures between injection and production wells readily lead to channeling flow; (3) keeping a segment of horizontal wells open and placing them further apart are beneficial to the formation of sufficiently diffuse flow pathways; (4) increasing well spacing tends to improve thermal performance; however, for the case of a one-stage opening, the improvement of heat sweep efficiency is not significant; and (5) an alternating circulation scheme could achieve superior thermal performance. This study establishes an effective modeling workflow for the design and optimization of naturally fractured geothermal reservoirs, and provides an integrated modeling framework for evaluating recoverable energy potential from geothermal reservoirs.
•We propose an analytical method for upscaling hydraulic conductivity using perturbation expansion techniques and Fourier analysis.
•The proposed method efficiently reproduces the results of the numerical method in a finite difference scheme with periodic boundary conditions.
•The proposed method is validated for various cases considering anisotropy, heterogeneity, and geometry in general.
Modern geological modeling techniques represent anisotropic heterogeneous formations with high-resolution grids, which can be computationally prohibitive. This motivates the upscaling process, which scales up properties defined on a fine-scale system to equivalent properties defined on a coarse-scale system. In general, analytical methods are very efficient but limited by their assumptions and approximations, whereas numerical methods are more robust albeit more time-consuming.
In this work, we developed an analytical method to approximate numerical solutions in a finite difference scheme with periodic boundary conditions for two-dimensional problems. Using perturbation expansion techniques and Fourier analysis, the method generates explicit formulas for the tensorial equivalent conductivity that account for the heterogeneity and anisotropy of two-dimensional space, as well as the geometry of gridblocks. It is applicable to various cases with different covariance/variogram models and a wide range of log-conductivity variances, correlation lengths, rotation angles, anisotropy ratios of fine-grid conductivity, anisotropy ratios of fine-grid size, and numbers of fine gridblocks in a coarse gridblock. The analytical method matches the numerical method well in estimating the conductivity tensor, hydraulic head, and discharge velocity. The coefficients in the analytical method need to be computed only once for any given statistics, which makes the proposed method much more efficient than the numerical method.
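For intuition about what an equivalent conductivity represents (this is the textbook layered-medium result, not the authors' perturbation/Fourier formulas), the closed-form values for a perfectly layered medium can be computed directly:

```python
import numpy as np

def equivalent_conductivity_layered(k):
    """Closed-form equivalent conductivities for a perfectly layered medium
    with equal-thickness layers: the arithmetic mean applies to flow parallel
    to the layers, the harmonic mean to flow perpendicular to them."""
    k = np.asarray(k, dtype=float)
    k_parallel = k.mean()                      # arithmetic mean
    k_perpendicular = len(k) / np.sum(1.0 / k) # harmonic mean
    return k_parallel, k_perpendicular

# Two layers with conductivities 1 and 4
k_par, k_perp = equivalent_conductivity_layered([1.0, 4.0])
```

Any upscaling scheme, analytical or numerical, should return tensor components lying between these harmonic and arithmetic bounds for a given block.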
Catchment-scale hydrological models are critical decision support tools for water resources management and environmental remediation. However, the reliability of hydrological models is inevitably affected by limited measurements and imperfect models. Data assimilation techniques combine complementary information from measurements and models to enhance model reliability and reduce predictive uncertainties. As a sequential data assimilation technique, the ensemble Kalman filter (EnKF) has been extensively studied in the earth sciences for assimilating in-situ measurements and remote sensing data. Although the EnKF has been demonstrated in land surface data assimilation, few systematic studies have investigated its performance in distributed modeling with high-dimensional states and parameters. In this paper, we present an assessment of the EnKF with state augmentation for combined state-parameter estimation based on a physically based hydrological model, the Soil and Water Assessment Tool (SWAT). Through synthetic simulation experiments, the capability of the EnKF is demonstrated by assimilating runoff and other measurements, and its sensitivity is analyzed with respect to the error specification, the initial realization, and the ensemble size. We find that the EnKF provides an efficient approach for obtaining a set of acceptable model parameters and satisfactory estimates of runoff, soil water content, and evapotranspiration. The EnKF performance can be further improved by augmenting the state with complementary data, such as soil water content and evapotranspiration from remote sensing retrievals. Sensitivity studies demonstrate the importance of consistent error specification and the potential of using a small ensemble size in the data assimilation system.
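The analysis step of an EnKF with state augmentation can be sketched generically as follows (a minimal stochastic-EnKF illustration, not SWAT-specific; all variable names and the toy setup are ours). The parameter row is updated purely through its ensemble correlation with the observed state:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_err_std, rng):
    """One stochastic EnKF analysis step.  `ensemble` holds the augmented
    state-parameter vectors as columns; `obs_op` maps one member to its
    predicted observations; observations are perturbed per member."""
    n_ens = ensemble.shape[1]
    pred = np.column_stack([obs_op(ensemble[:, j]) for j in range(n_ens)])
    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    Y = pred - pred.mean(axis=1, keepdims=True)           # predicted-obs anomalies
    Pxy = X @ Y.T / (n_ens - 1)
    Pyy = Y @ Y.T / (n_ens - 1) + (obs_err_std**2) * np.eye(len(obs))
    K = Pxy @ np.linalg.inv(Pyy)                          # Kalman gain
    perturbed = obs[:, None] + rng.normal(0.0, obs_err_std, size=(len(obs), n_ens))
    return ensemble + K @ (perturbed - pred)

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, size=(2, 500))  # row 0: state, row 1: parameter
obs = np.array([1.0])                        # only the state is observed
posterior = enkf_update(prior, obs, lambda v: v[:1], obs_err_std=0.1, rng=rng)
```

Because the gain is built from ensemble covariances of the augmented vector, the same update simultaneously corrects states and parameters, which is the state-augmentation idea the abstract assesses.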