• A new model based on LSTM is developed for predicting water table depth.
• Only a very simple data pre-processing method is required in our proposed model.
• The dropout strategy is adopted to significantly reduce over-fitting.
• Our model shows superiority over the classic FFNN and the Double-LSTM models.
Predicting water table depth over the long term in agricultural areas presents great challenges because these areas have complex and heterogeneous hydrogeological characteristics, boundary conditions, and human activities, and nonlinear interactions occur among these factors. Therefore, a new time series model based on Long Short-Term Memory (LSTM) was developed in this study as an alternative to computationally expensive physical models. The proposed model is composed of an LSTM layer with a fully connected layer on top of it, with dropout applied in the LSTM layer. The model was applied and evaluated in five sub-areas of Hetao Irrigation District in arid northwestern China using 14 years of data (2000–2013). It uses monthly water diversion, evaporation, precipitation, temperature, and time as inputs to predict water table depth. A simple but effective standardization method was employed to pre-process the data so that all inputs were on the same scale. The 14 years of data were separated into a training set (2000–2011) and a validation set (2012–2013). The proposed model achieved higher R2 scores (0.789–0.952) in water table depth prediction than a traditional feed-forward neural network (FFNN), which reached only relatively low R2 scores (0.004–0.495), demonstrating that the proposed model preserves and learns previous information well. Furthermore, the validity of the dropout method and of the model architecture is discussed. The experimental results show that the dropout method prevents overfitting significantly. In addition, comparison with the Double-LSTM model (R2 scores from 0.170 to 0.864) further shows that the proposed architecture is reasonable and contributes to a strong learning ability on time series data. Thus, the proposed model can serve as an alternative approach for predicting water table depth, especially in areas where hydrogeological data are difficult to obtain.
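As a concrete illustration, the following is a minimal sketch of an architecture of the kind described (a single LSTM layer with dropout, followed by a fully connected output layer). It is written in PyTorch for illustration only; the hidden size, dropout rate, and other details are assumptions, not the authors' settings.

```python
import torch
import torch.nn as nn

class WaterTableLSTM(nn.Module):
    """One LSTM layer with dropout, topped by a fully connected layer,
    mirroring the architecture described in the abstract. The hidden
    size and dropout rate are illustrative assumptions."""
    def __init__(self, n_features=5, hidden=64, dropout=0.5):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.drop = nn.Dropout(dropout)        # dropout after the LSTM layer
        self.fc = nn.Linear(hidden, 1)         # outputs water table depth

    def forward(self, x):
        # x: (batch, months, 5) - diversion, evaporation, precipitation,
        # temperature, time; standardized to zero mean and unit variance
        # with training-set statistics, as in the paper's pre-processing.
        out, _ = self.lstm(x)
        return self.fc(self.drop(out[:, -1]))  # predict from last time step
```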
► Reviewed the TPS-RPM algorithm in terms of its outlier handling.
► Provided mathematical proof of a double-sided outlier handling approach for the TPS-RPM algorithm.
► Extended the TPS-RPM algorithm to image registration by including an intensity matching term.
► Demonstrated the importance of additional information in robust outlier handling for the TPS-RPM algorithm.
This paper reviews the TPS-RPM algorithm (Chui and Rangarajan, 2003) for robustly registering two sets of points and demonstrates, from a theoretical point of view, its inherently limited performance when outliers are present in both point sets simultaneously. A double-sided outlier handling approach is proposed to overcome this limitation, with a rigorous mathematical proof as the underlying theoretical support. This double-sided outlier handling approach is proved to be equivalent to the original formulation of the point matching problem. For a practical application, we also extend the TPS-RPM algorithm to non-rigid image registration by registering two sets of sparse features extracted from images. The intensity information of the extracted features is incorporated into feature matching in order to reduce the impact of outliers. Our experiments demonstrate the effectiveness of the double-sided outlier handling approach and of the intensity information in assisting outlier detection.
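The core object in RPM-style matching is a fuzzy correspondence matrix augmented with an outlier row and column, so that points in either set can be left unmatched. The Python sketch below illustrates that double-sided construction with alternating row/column normalization; the fixed temperature, the uniform outlier affinity, and the omission of the TPS warp update are simplifications of the full algorithm.

```python
import numpy as np

def soft_correspondence(X, Y, T=1.0, n_iters=30):
    """Fuzzy correspondence matrix with an extra outlier row AND column
    (double-sided handling): any point in X or Y may match 'nothing'.
    Alternating normalization is applied to the inlier rows/columns only."""
    n, k = len(X), len(Y)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    M = np.ones((n + 1, k + 1))          # last row/column absorb outliers
    M[:n, :k] = np.exp(-d2 / T)          # inlier affinities
    for _ in range(n_iters):             # Sinkhorn-style normalization
        M[:n, :] /= M[:n, :].sum(axis=1, keepdims=True)
        M[:, :k] /= M[:, :k].sum(axis=0, keepdims=True)
    return M[:n, :k]                     # soft assignments between X and Y
```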
This study evaluated three iterative variants of the ensemble Kalman filter (EnKF): Confirming EnKF, Restart EnKF, and a modified Restart EnKF, developed to resolve the inconsistency problem (i.e., updated model parameters and state variables do not follow the Richards equation) in vadose zone data assimilation due to model nonlinearity. While Confirming and Restart EnKF were adapted from the literature, the modified Restart EnKF was developed in this study to reduce computational costs by re-running only the mean simulation, not all the ensemble realizations, from time t = 0. A total of 11 cases were designed to investigate the performance of EnKF, Confirming EnKF, Restart EnKF, and modified Restart EnKF with different types and spatial configurations of observations (pressure head and water content) and different values of observation error variance, initial guess of ensemble mean and variance, ensemble size, and damping factor. The numerical study showed that Confirming EnKF produced considerable inconsistency for the nonlinear unsaturated flow problem, which differs from the apparent consensus that Confirming EnKF can resolve the inconsistency problem. In contrast, Restart EnKF and its modification can resolve the inconsistency problem, and both outperformed EnKF and Confirming EnKF in the various cases considered in this study. It was also found that combining different types of observations can achieve better assimilation results, which is useful for monitoring network design.
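All of these variants share the standard EnKF analysis step and differ in what is re-simulated after the update (Restart EnKF reruns the whole ensemble from t = 0, the modified variant only the mean). A minimal numpy sketch of the generic analysis step with perturbed observations is shown below; it is the textbook form, not the authors' specific implementation.

```python
import numpy as np

def enkf_update(X, y, H, R):
    """Standard EnKF analysis step. X is the forecast ensemble
    (n_state x n_ens), y the observation vector, H the linear
    observation operator, R the observation error covariance.
    Perturbed observations are drawn for each ensemble member."""
    n_ens = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = Xp @ Xp.T / (n_ens - 1)                     # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    Y = y[:, None] + np.random.multivariate_normal(
        np.zeros(len(y)), R, n_ens).T               # perturbed observations
    return X + K @ (Y - H @ X)                      # updated ensemble
```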
The purpose of this study was to determine the significance of interscanner variability in CT image radiomics studies.
We compared the radiomics features calculated for non-small cell lung cancer (NSCLC) tumors from 20 patients with those calculated for 17 scans of a specially designed radiomics phantom. The phantom comprised 10 cartridges, each filled with a different material to produce a wide range of radiomics feature values. The scans were acquired using General Electric, Philips, Siemens, and Toshiba scanners from 4 medical centers using their routine thoracic imaging protocols. The radiomics features studied included the mean and standard deviation of the CT numbers as well as textures derived from the neighborhood gray-tone difference matrix. To quantify the significance of the interscanner variability, we introduced the metric "feature noise". To look for patterns in the scans, we performed hierarchical clustering for each cartridge.
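A sketch of the per-cartridge clustering step is given below, with each scanner represented by its vector of feature values for that cartridge. Ward linkage and per-feature standardization are assumptions; the abstract does not state the linkage or preprocessing used.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_scans(features, n_clusters=4):
    """features: one row per scan (scanner), one column per radiomics
    feature, for a single phantom cartridge. Returns a cluster label
    per scan."""
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    Z = linkage(z, method="ward")            # agglomerative clustering
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```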
The mean CT numbers for the 17 CT scans of the phantom cartridges spanned from -864 to 652 Hounsfield units compared with a span of -186 to 35 Hounsfield units for the CT scans of the NSCLC tumors, showing that the phantom's dynamic range includes that of the tumors. The interscanner variability of the feature values depended on both the cartridge material and the feature, and the variability was large relative to the interpatient variability in the NSCLC tumors for some features. The feature interscanner noise was greatest for busyness and least for texture strength. Hierarchical clustering produced different clusters of the phantom scans for each cartridge, although there was some consistent clustering by scanner manufacturer.
The variability in the values of radiomics features calculated on CT images from different CT scanners can be comparable to the variability in these features found in CT images of NSCLC tumors. These interscanner differences should be considered, and their effects should be minimized in future radiomics studies.
Due to rapid advances in radiation therapy (RT), especially image guidance and treatment adaptation, fast and accurate segmentation of medical images is a very important part of the treatment. Manual delineation of target volumes and organs at risk is still the standard routine for most clinics, even though it is time consuming and prone to intra- and interobserver variations. Automated segmentation methods seek to reduce the delineation workload and unify organ boundary definitions. In this paper, the authors review the current autosegmentation methods particularly relevant for applications in RT. The authors outline the methods' strengths and limitations and propose strategies that could lead to wider acceptance of autosegmentation in routine clinical practice. The authors conclude that, currently, autosegmentation technology in RT planning is an efficient tool that provides clinicians with a good starting point for review and adjustment. Modern hardware platforms, including GPUs, allow most autosegmentation tasks to be completed within a few minutes. In the near future, improvements in CT-based autosegmentation tools will be achieved through standardization of imaging and contouring protocols. In the longer term, the authors expect wider use of multimodality approaches and a better understanding of the correlation of imaging with biology and pathology.
Radiomics is the use of quantitative imaging features extracted from medical images to characterize tumor pathology or heterogeneity. Features measured at pretreatment have successfully predicted patient outcomes in numerous cancer sites. This project was designed to determine whether radiomics features measured from non-small cell lung cancer (NSCLC) change during therapy and whether those features (delta-radiomics features) can improve prognostic models. Features were calculated from pretreatment and weekly intra-treatment computed tomography images for 107 patients with stage III NSCLC. Pretreatment images were used to determine feature-specific image preprocessing. Linear mixed-effects models were used to identify features that changed significantly with dose-fraction. Multivariate models were built for overall survival, distant metastases, and local recurrence using only clinical factors; clinical factors and pretreatment radiomics features; and clinical factors, pretreatment radiomics features, and delta-radiomics features. All of the radiomics features changed significantly during radiation therapy. For overall survival and distant metastases, pretreatment compactness improved the c-index. For local recurrence, pretreatment imaging features were not prognostic, while texture-strength measured at the end of treatment significantly stratified high- and low-risk patients. These results suggest radiomics features change due to radiation therapy and their values at the end of treatment may be indicators of tumor response.
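The per-feature significance test described, a linear mixed-effects model of feature value against dose-fraction with patients as the grouping variable, can be sketched with statsmodels as below. The column names and the random-intercept-only structure are assumptions for illustration.

```python
import statsmodels.formula.api as smf

def feature_trend_pvalue(df, feature):
    """Linear mixed-effects model of one radiomics feature against
    dose-fraction, with a random intercept per patient. Column names
    ('patient', 'fraction') and the random-intercept-only structure
    are assumptions for illustration."""
    model = smf.mixedlm(f"{feature} ~ fraction", df, groups=df["patient"])
    result = model.fit()
    return result.pvalues["fraction"]   # significance of the time trend
```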
• A method to assess the potential migration capacity of salt in frozen soil is proposed.
• Chemical reactions of multi-component ions in frozen soil are simulated by FREZCHEM.
• Convection and dispersion effects of each ion on salt transport are quantified.
• Cl, Na and SO4 are less affected by precipitation-dissolution and are more mobile.
• The dispersion effect becomes large at the end of the freeze period, driving salt downward.
Current research on estimating soil salt movement during the freeze and thaw period is mainly based on total salt concentration, ignoring the phase change of multi-component salts. In this study, the potential transport capability of soil water and salt ions in the frozen layer was investigated based on chemical characteristics and solute convection–diffusion theory. Field experiments were implemented to measure changes in total soil water, soil salt, and its ion components. The soil water and salt content in the liquid and solid phases were calculated with the FREZCHEM model, and the movement of liquid soil water and soluble salt was then simulated with the HYDRUS-1D model. The potential migration capability of soil water and salt components was then estimated by comparing the distribution profiles obtained by HYDRUS-1D and FREZCHEM. The results show that more than 74.5% of the Cl, Na, and SO4 was in the liquid phase; these ions were less affected by precipitation–dissolution reactions and had strong migration ability. The liquid soil water, soluble salt, and salt ions in the frozen layer tended to decrease at the beginning of the freeze period and increase significantly at the end. The migration direction and quantity of the various salt ions differed due to their concentration gradients and diffusion coefficients, giving Na⁺, Cl⁻, and SO₄²⁻ larger potential convection and dispersion quantities than Ca²⁺, Mg²⁺, and HCO₃⁻. This study provides a new perspective on soil salt movement in frozen soil in agricultural areas with a shallow groundwater table.
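The potential convection and dispersion quantities referred to above follow the textbook convection-dispersion flux decomposition. A minimal sketch, assuming a Darcy flux q, volumetric water content theta, and dispersion coefficient D (the authors' exact formulation is not given in the abstract):

```python
import numpy as np

def potential_fluxes(q, theta, D, C, dz):
    """Convective flux J_conv = q*C and dispersive flux
    J_disp = -theta*D*dC/dz for an ion concentration profile C(z).
    This is the textbook decomposition, used here only to illustrate
    how each ion's potential migration could be quantified."""
    dCdz = np.gradient(C, dz)           # concentration gradient with depth
    return q * C, -theta * D * dCdz     # (convective, dispersive) fluxes
```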
• The reason for numerical divergence when simulating infiltration into dry soil is revealed.
• The factors influencing the maximum allowable time step size are analyzed.
• A more robust and cost-effective modified iteration algorithm is proposed.
Numerical models based on Richards' equation are often employed to simulate soil water dynamics. Among them, Picard iteration models that use the pressure head as the primary variable are widely adopted due to their simplicity and their capability to handle partially saturated flow conditions. However, such models are known to be prone to convergence failure under some unfavorable flow conditions, especially when simulating infiltration into initially dry soils. Here we analyze the reasons that give rise to this numerical difficulty. Moreover, several modifications to the mass-conservative Picard iteration method are proposed so that the difficulty is avoided under these unfavorable flow conditions. The proposed modifications do not degrade the simulated results, while they lead to more robust convergence performance and more cost-effective simulations.
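For context, the mass-conservative Picard scheme that such papers build on is typically the mixed-form iteration of Celia et al. (1990), one step of which is sketched below. The discretization details, boundary treatment, and dense solve are illustrative assumptions, not the authors' modified algorithm.

```python
import numpy as np

def picard_step(h, h_old, theta_fn, C_fn, K_fn, dz, dt, tol=1e-6, max_it=50):
    """One time step of a mass-conservative (mixed-form) Picard iteration
    for the 1-D Richards equation, in the spirit of Celia et al. (1990).
    theta_fn, C_fn, K_fn map pressure head to water content, specific
    moisture capacity, and hydraulic conductivity."""
    n = len(h)
    for _ in range(max_it):
        K = K_fn(h)
        Kp = 0.5 * (K[1:] + K[:-1])              # interface conductivities
        C = C_fn(h)
        A = np.zeros((n, n))
        b = np.zeros(n)
        for i in range(1, n - 1):
            A[i, i - 1] = -Kp[i - 1] / dz**2
            A[i, i + 1] = -Kp[i] / dz**2
            A[i, i] = C[i] / dt + (Kp[i - 1] + Kp[i]) / dz**2
            # residual: flux divergence (incl. gravity) minus storage change
            div = (Kp[i] * (h[i + 1] - h[i])
                   - Kp[i - 1] * (h[i] - h[i - 1])) / dz**2 \
                  + (Kp[i] - Kp[i - 1]) / dz
            b[i] = div - (theta_fn(h[i]) - theta_fn(h_old[i])) / dt
        A[0, 0] = A[-1, -1] = 1.0                # Dirichlet (fixed-head) ends
        dh = np.linalg.solve(A, b)               # head increment
        h = h + dh
        if np.max(np.abs(dh)) < tol:             # converged when increment small
            break
    return h
```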
• A mass balance model for agricultural salt accumulation and leaching is developed.
• The model has looser restrictions on the discretization of space and time.
• The calculated soil salt content is about 20% better when considering the immobile region.
Soil salinization has become a widespread problem that seriously affects the sustainable development of agriculture. To deal with salt accumulation and leaching in practical agricultural systems, a new soil water and solute transport model adopting a mass balance scheme is developed in this study. The soil water movement module is based on a previously published soil water mass balance model named UBMOD. The mobile–immobile assumption is adopted to develop the soil solute transport module, which can describe the pronounced anomalous transport observed in practice. Advection, chemical reactions of the solute, solute transfer between the mobile and immobile regions, and dispersion are considered in the mobile region, while only chemical reactions and solute transfer between the two regions are considered in the immobile region. Three cases were designed to evaluate the performance of the model by comparing the simulation results with experimental data, HYDRUS-1D simulations, and field observations. The results demonstrate that the developed model can solve homogeneous and heterogeneous soil water movement and solute transport effectively; it has looser restrictions on the discretization of space and time than numerical solutions of Richards' equation and maintains mass balance well, which makes it more suitable for practical conditions. Moreover, a real-world application to salt accumulation and leaching in Hetao Irrigation District, Inner Mongolia, China, shows that the RMSE of the simulated soil salt content is about 20% lower when the immobile region is considered than when it is ignored. Therefore, it is necessary to take significant anomalous transport into account when studying salt accumulation and leaching in practical agricultural systems.
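The mobile-immobile closure mentioned above is conventionally a first-order mass transfer between the two regions. A minimal sketch of one explicit exchange step, under assumed parameters (transfer coefficient alpha, water contents theta_m and theta_im); UBMOD's actual numerical treatment may differ.

```python
import numpy as np

def mobile_immobile_exchange(Cm, Cim, alpha, theta_m, theta_im, dt):
    """First-order mass transfer between the mobile and immobile regions,
    the standard closure for mobile-immobile solute models:
        theta_im * dCim/dt = alpha * (Cm - Cim).
    An explicit update is shown for clarity; alpha and this scheme are
    illustrative and not necessarily UBMOD's implementation."""
    dS = alpha * (Cm - Cim) * dt                 # mass exchanged per volume
    return Cm - dS / theta_m, Cim + dS / theta_im
```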
Purpose:
Radiomics, which is the high‐throughput extraction and analysis of quantitative image features, has been shown to have considerable potential to quantify the tumor phenotype. However, at ...present, a lack of software infrastructure has impeded the development of radiomics and its applications. Therefore, the authors developed the imaging biomarker explorer (ibex), an open infrastructure software platform that flexibly supports common radiomics workflow tasks such as multimodality image data import and review, development of feature extraction algorithms, model validation, and consistent data sharing among multiple institutions.
Methods:
The ibex software package was developed using the matlab and c/c++ programming languages. The software architecture deploys the modern model-view-controller, unit testing, and function handle programming concepts to isolate each quantitative imaging analysis task, to validate whether the relevant data and algorithms are fit for use, and to plug in new modules. On one hand, ibex is self-contained and ready to use: it implements common data importers, common image filters, and common feature extraction algorithms. On the other hand, ibex provides an integrated development environment on top of matlab and c/c++, so users are not limited to its built-in functions. In the ibex developer studio, users can plug in, debug, and test new algorithms, extending ibex's functionality. ibex also supports quality assurance for data and feature algorithms: image data, regions of interest, and feature algorithm-related data can be reviewed, validated, and/or modified. More importantly, two key elements of collaborative workflows, the consistency of data sharing and the reproducibility of calculation results, are embedded in the ibex workflow: image data, feature algorithms, and model validation, including newly developed ones from different users, can be easily and consistently shared so that results can be more readily reproduced between institutions.
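ibex itself is written in matlab and c/c++; the Python sketch below merely illustrates the function-handle/plug-in pattern the abstract describes, in which feature algorithms are registered as callables so new modules can be added without modifying the core. All names here are hypothetical.

```python
import numpy as np

# Hypothetical registry illustrating the plug-in pattern: feature
# algorithms are registered as callables, so new modules can be added
# without modifying the core (ibex itself does this with matlab
# function handles, not Python).
FEATURE_REGISTRY = {}

def feature(name):
    def register(fn):
        FEATURE_REGISTRY[name] = fn
        return fn
    return register

@feature("mean_hu")
def mean_hu(roi):
    # roi: array of CT numbers inside a region of interest
    return float(np.mean(roi))

@feature("std_hu")
def std_hu(roi):
    return float(np.std(roi))

def extract_all(roi):
    """Run every registered feature algorithm on one ROI."""
    return {name: fn(roi) for name, fn in FEATURE_REGISTRY.items()}
```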
Results:
Researchers with a variety of technical skill levels, including radiation oncologists, physicists, and computer scientists, have found the ibex software to be intuitive, powerful, and easy to use. ibex can be run on any computer with the windows operating system and 1 GB of RAM. The authors fully validated the implementation of all importers, preprocessing algorithms, and feature extraction algorithms. Windows version 1.0 beta of stand-alone ibex and ibex's source code can be downloaded.
Conclusions:
The authors successfully implemented ibex, an open infrastructure software platform that streamlines common radiomics workflow tasks. Its transparency, flexibility, and portability can greatly accelerate the pace of radiomics research and pave the way toward successful clinical translation.