The ATSAS software suite encompasses a number of programs for the processing, visualization, analysis and modelling of small-angle scattering data, with a focus on the data measured from biological macromolecules. Here, new developments in the ATSAS 3.0 package are described. They include IMSIM, for simulating isotropic 2D scattering patterns; IMOP, to perform operations on 2D images and masks; DATRESAMPLE, a method for variance estimation of structural invariants through parametric resampling; DATFT, which computes the pair distance distribution function by a direct Fourier transform of the scattering data; PDDFFIT, to compute the scattering data from a pair distance distribution function, allowing comparison with the experimental data; a new module in DATMW for Bayesian consensus-based concentration-independent molecular weight estimation; DATMIF, an ab initio shape analysis method that optimizes the search model directly against the scattering data; DAMEMB, an application to set up the initial search volume for multiphase modelling of membrane proteins; ELLLIP, to perform quasi-atomistic modelling of liposomes with elliptical shapes; NMATOR, which models conformational changes in nucleic acid structures through normal mode analysis in torsion angle space; DAMMIX, which reconstructs the shape of an unknown intermediate in an evolving system; and LIPMIX and BILMIX, for modelling multilamellar and asymmetric lipid vesicles, respectively. In addition, technical updates were deployed to facilitate maintainability of the package, which include porting the PRIMUS graphical interface to Qt5, updating SASpy – a PyMOL plugin to run a subset of ATSAS tools – to be both Python 2 and 3 compatible, and adding utilities to facilitate mmCIF compatibility in future ATSAS releases. All these features are implemented in ATSAS 3.0, freely available for academic users at https://www.embl-hamburg.de/biosaxs/software.html.
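The direct Fourier transform that DATFT performs relates the scattering intensity I(q) and the pair distance distribution p(r). A minimal numerical sketch of that standard relation is below; the sphere test profile, grids and trapezoidal integration are illustrative assumptions, not ATSAS code:

```python
import numpy as np

def trap(y, x):
    # Trapezoidal rule along the last axis for a uniform grid x.
    dx = x[1] - x[0]
    return dx * (y.sum(axis=-1) - 0.5 * (y[..., 0] + y[..., -1]))

# Pair distance distribution of a homogeneous sphere of radius R, used only
# as a known test profile (an illustrative assumption).
R = 30.0                                   # sphere radius in angstroms (assumed)
r = np.linspace(1e-3, 2 * R, 400)          # real-space grid up to Dmax = 2R
x = r / (2 * R)
p = r**2 * (1.0 - 1.5 * x + 0.5 * x**3)    # p(r) for a homogeneous sphere

# Forward transform: I(q) = 4*pi * integral_0^Dmax p(r) sin(qr)/(qr) dr
q = np.linspace(1e-4, 0.6, 600)            # momentum transfer grid (1/angstrom)
I = 4.0 * np.pi * trap(p[None, :] * np.sinc(q[:, None] * r[None, :] / np.pi), r)

# Direct inverse transform, the relation DATFT evaluates:
# p(r) = r/(2*pi^2) * integral_0^qmax q I(q) sin(qr) dq
p_back = r / (2.0 * np.pi**2) * trap(
    q[None, :] * I[None, :] * np.sin(q[None, :] * r[:, None]), q)
```

Because the inverse integral is truncated at qmax, `p_back` recovers `p` up to mild termination ripple; on real data DATFT additionally handles noise and error propagation, which this sketch omits.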
ATSAS is a comprehensive software suite for the processing, visualization, analysis and modelling of small‐angle scattering data. This article describes developments in the ATSAS 3.0 release, including new programs for data simulation and for the structural modelling of lipids, nucleic acids and polydisperse systems.
Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. Our study region was the Western Cape of South Africa. We applied nine of the most widely used modelling techniques to model potential distributions under current and predicted future climate for four species (including two subspecies) of Proteaceae. Each model was built using an identical set of five input variables and distribution data for 3996 sampled sites. We compare model predictions by testing agreement between observed and simulated distributions for the present day (using the area under the receiver operating characteristic curve (AUC) and kappa statistics) and by assessing consistency in predictions of range size changes under future climate (using cluster analysis). Our analyses show significant differences between predictions from different models, with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and assumptions made by each algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. We highlight an important source of uncertainty in assessments of the impacts of climate change on biodiversity and emphasize that model predictions in policy-guiding applications should be interpreted with a full appreciation of uncertainty.
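The AUC and kappa comparison used above can be sketched with a pairwise AUC and Cohen's kappa; the presence/absence records and the two model score vectors below are invented for illustration and are unrelated to the Proteaceae dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic presence/absence records plus scores from two hypothetical models.
y = rng.integers(0, 2, 500)                     # observed presence (1) / absence (0)
scores_a = np.clip(y * 0.6 + rng.normal(0.20, 0.25, 500), 0.0, 1.0)
scores_b = np.clip(y * 0.3 + rng.normal(0.35, 0.30, 500), 0.0, 1.0)

def auc(y_true, s):
    """Area under the ROC curve via pairwise comparison (Mann-Whitney form)."""
    pos, neg = s[y_true == 1], s[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

def kappa(y_true, y_pred):
    """Cohen's kappa for binary presence/absence maps."""
    p_obs = (y_true == y_pred).mean()
    p_exp = ((y_true == 1).mean() * (y_pred == 1).mean()
             + (y_true == 0).mean() * (y_pred == 0).mean())
    return (p_obs - p_exp) / (1.0 - p_exp)

for name, s in (("model A", scores_a), ("model B", scores_b)):
    print(name, "AUC:", round(auc(y, s), 3),
          "kappa:", round(kappa(y, (s >= 0.5).astype(int)), 3))
```

Kappa requires a threshold (0.5 here, an assumption), which is one reason the study evaluates both statistics rather than either alone.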
AIM: The Hutchinsonian hypervolume is the conceptual foundation for many lines of ecological and evolutionary inquiry, including functional morphology, comparative biology, community ecology and ...niche theory. However, extant methods to sample from hypervolumes or measure their geometry perform poorly on high‐dimensional or holey datasets. INNOVATION: We first highlight the conceptual and computational issues that have prevented a more direct approach to measuring hypervolumes. Next, we present a new multivariate kernel density estimation method that resolves many of these problems in an arbitrary number of dimensions. MAIN CONCLUSIONS: We show that our method (implemented as the ‘hypervolume’ R package) can match several extant methods for hypervolume geometry and species distribution modelling. Tools to quantify high‐dimensional ecological hypervolumes will enable a wide range of fundamental descriptive, inferential and comparative questions to be addressed.
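The kernel density idea behind the 'hypervolume' R package can be sketched as a Gaussian KDE plus Monte Carlo integration of the region above a quantile threshold. The trait data, fixed bandwidth and sample counts below are illustrative assumptions, not the package's algorithm verbatim:

```python
import numpy as np

rng = np.random.default_rng(1)

# 100 "observations" of a species in a 3-dimensional trait space (invented data).
obs = rng.normal(0.0, 1.0, (100, 3))
h = 0.5                                     # Gaussian kernel bandwidth (assumed)

def kde(points, x):
    """Gaussian kernel density estimate at the query points x."""
    d = points.shape[1]
    norm = (2.0 * np.pi * h**2) ** (-d / 2) / len(points)
    sq = ((x[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return norm * np.exp(-sq / (2.0 * h**2)).sum(1)

# Monte Carlo estimate of the hypervolume: sample a bounding box uniformly and
# measure the fraction of samples where the KDE exceeds a quantile threshold
# chosen so that about 95% of the observations fall inside.
lo, hi = obs.min(0) - 3.0 * h, obs.max(0) + 3.0 * h
box_vol = float(np.prod(hi - lo))
samples = rng.uniform(lo, hi, (10000, 3))
threshold = np.quantile(kde(obs, obs), 0.05)
inside = kde(obs, samples) >= threshold
hypervolume = box_vol * inside.mean()
```

The same scheme extends to any number of dimensions, though the number of uniform samples needed grows quickly; the package addresses this with importance sampling rather than a plain bounding box.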
Summary
The field of ecological niche modelling or species distribution modelling has seen enormous activity and attention in recent years, in the light of exciting biological inferences that can be drawn from correlational models of species' environmental requirements (i.e. ecological niches) and inferences of potential geographic distributions. Among the many methods used in the field, one or two are in practice assumed to be ‘best’ and are used commonly, often without explicit testing.
We explore herein implications of the ‘no free lunch’ theorem, which suggests that no single optimization approach will prove to be best under all circumstances: we developed diverse virtual species with known niche and dispersal properties to test a suite of niche modelling algorithms designed to estimate potential areas of distribution.
The result was that (i) indeed, no single ‘best’ algorithm was found and (ii) different algorithms performed in very different manners depending on the particularities of the virtual species.
The conclusion is that niche or distribution modelling studies should begin by testing a suite of algorithms for predictive ability under the particular circumstances of the study and choose an algorithm for a particular challenge based on the results of those tests. Studies that do not take this step may use algorithms that are not optimal for that particular challenge.
The driving range of electric vehicles is a complex issue. In simulation, this range is determined by coupling a battery model and a traction model of the vehicle. But most of the battery models used for such studies do not take into account the interaction between the temperature of the battery (impacting its electrical parameters) and the traction model of a studied EV. In this paper, a battery electro-thermal model (with all electrical parameters dependent upon the temperature) is dynamically coupled with the traction model of an electric vehicle. Simulation results are provided to show the impact of the temperature on the driving range. The initial battery model (without temperature dependence) leads to an overestimation of the driving range of 3.3% at -5 °C and an underestimation of 2.5% at 40 °C.
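A coupled electro-thermal/traction loop of the kind described above can be sketched as follows; every parameter value (capacity, resistance law, thermal mass, power demand) is an illustrative assumption, not taken from the paper:

```python
import numpy as np

def simulate_range(T_amb_C, temp_dependent=True):
    """Couple a lumped electro-thermal battery model to a constant-power
    traction load and integrate until the pack is depleted.

    All parameter values below are illustrative assumptions.
    """
    cap_Ah, V_nom = 100.0, 360.0   # pack capacity (Ah) and nominal voltage (V)
    P_drive = 15e3                 # average traction power demand (W)
    v_kmh = 60.0                   # constant cruising speed (km/h)
    m_cp = 5e5                     # pack thermal mass (J/K)
    hA = 30.0                      # heat-loss coefficient to ambient (W/K)
    T_amb = T = T_amb_C + 273.15
    soc, dist, dt = 1.0, 0.0, 1.0
    while soc > 0.05:
        # Internal resistance rises at low temperature (Arrhenius-like assumption).
        R = 0.05 * np.exp(800.0 * (1.0 / T - 1.0 / 298.15)) if temp_dependent else 0.05
        I = P_drive / V_nom        # terminal current (approximate)
        P_loss = I**2 * R          # ohmic heating inside the pack
        soc -= (P_drive + P_loss) * dt / (V_nom * cap_Ah * 3600.0)
        T += (P_loss - hA * (T - T_amb)) * dt / m_cp   # lumped thermal balance
        dist += v_kmh / 3600.0 * dt
    return dist                    # driving range in km
```

Running `simulate_range(-5.0)` against `simulate_range(-5.0, temp_dependent=False)` reproduces the qualitative effect the paper quantifies: ignoring the temperature dependence overestimates the cold-weather range.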
During the past decade, the application of agricultural production systems modelling has rapidly expanded while there has been less emphasis on model improvement. Cropping systems modelling has become agricultural modelling, incorporating new capabilities enabling analyses in the domains of greenhouse gas emissions, soil carbon changes, ecosystem services, environmental performance, food security, pests and disease losses, livestock and pasture production, and climate change mitigation and adaptation. New science has been added to the models to support this broadening application domain, and new consortia of modellers have been formed that span the multiple disciplines.
There has not, however, been a significant and sustained focus on the software platforms that would increase the efficiency of agricultural production systems research, nor on the interaction between the software industry and the agricultural modelling community. This paper describes the changing agricultural modelling landscape since 2002, largely from a software perspective, and makes a case for a focussed effort on the software implementations of the major models.
•The agricultural modelling community has broadened its scientific focus over the last decade.
•The software implementations of the leading agricultural models have not changed significantly in the last decade.
•A focussed effort on agricultural modelling software and process is needed.
This open access book is based on selected presentations from Topic Study Group 21: Mathematical Applications and Modelling in the Teaching and Learning of Mathematics at the 13th International Congress on Mathematical Education (ICME 13), held in Hamburg, Germany on July 24–31, 2016. It contributes to the theory, research and teaching practice concerning this key topic by taking into account the importance of relations between mathematics and the real world. Further, the book addresses the “balancing act” between developing students’ modelling skills on the one hand, and using modelling to help them learn mathematics on the other, which arises from the integration of modelling into classrooms. The contributions, prepared by authors from 9 countries, reflect the spectrum of international debates on the topic, and the examples presented span schooling from years 1 to 12, teacher education, and teaching modelling at the tertiary level. In addition, the book highlights professional learning and development for in-service teachers, particularly in systems where the introduction of modelling into curricula means reassessing how mathematics is taught. Given its scope, the book will appeal to researchers and teacher educators in mathematics education, as well as pre-service teachers and school and university educators.
•We integrated mechanistic wildfire simulation into a forest landscape model.
•We simulated 50 years of forest succession, management, and wildfire.
•We found no tipping points in area burned under contemporary fire.
•The landscape exhibited a steady decline in potential fire intensity and severity.
•The modelling framework is a useful tool to examine forest policy issues.
We developed and applied a wildfire simulation package in the Envision agent-based landscape modelling system. The wildfire package combines statistical modelling of fire occurrence with a high-resolution, mechanistic wildfire spread model that can capture fine-scale effects of fire feedbacks and fuel management, and replicate restoration strategies at scales that are meaningful to forest managers. We applied the model to a landscape covering 1.2 million ha of fire-prone area in central Oregon, USA, where wildland fires are increasingly impacting conservation, amenity values and developed areas. We conducted simulations to examine the effect of human versus natural ignitions on future fire regimes under current restoration programs, and whether contemporary fire regimes observed in the past 20 years are likely to change as a result of fire feedbacks and management activities. The ignition prediction model revealed non-linear effects of location and time of year, and distinct spatiotemporal patterns for human versus natural ignitions. Fire rotation interval among replicate simulations varied from 78 to 170 years and changed little over the 50-yr simulation, suggesting a stable but highly variable and uncertain future fire regime. Interestingly, the potential for fire-on-fire feedbacks was higher for human versus natural ignitions due to human ignition hotspots within the study area. We compare the methods and findings with other forest landscape simulation model (FLSM) studies and discuss future applications of FLSMs to address emerging wildfire management and policy issues in fire-frequent forests in the western US.
This review paper is focused on the various modelling techniques for solar dryer systems. Modelling techniques are important for developing solar drying systems, increasing drying efficiency, and analysing and predicting the performance of different kinds of solar drying system. They are also used to predict crop moisture content, temperature, drying rate, crop quality and crop colour. Computational fluid dynamics (CFD) can be applied to analyse and investigate the air flow and temperature distribution in the drying system. An adaptive-network-based fuzzy inference system (ANFIS) can be used to predict the behaviour of the solar drying system. Artificial neural networks (ANNs) are used to calculate the mass of the dried crops on an hourly basis, and fuzzy-logic tools are widely used to simulate drying systems and can predict results with minimal error. Mathematical modelling techniques are used for testing the drying behaviour of crops in the laboratory; they act as an effective tool between scientists and investigators, helping them avoid spending large amounts of time, energy and money on experimental work. Before fabrication, modelling techniques are very helpful for simulating different types of solar drying system. Thus, analysis based on modelling techniques not only saves time but also reduces the capital investment in solar drying systems. The advantages and future scope of modelling techniques are also discussed.
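Thin-layer drying behaviour of the kind the review describes is often captured with the Page model MR = exp(-k t^n). A minimal fitting sketch on synthetic data follows; the measurements, constants and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic hourly moisture-ratio measurements for one drying run, generated
# from a Page model with k = 0.4, n = 1.1 plus multiplicative noise.
t = np.arange(1.0, 11.0)                       # drying time, hours
mr = np.exp(-0.4 * t**1.1) * np.exp(rng.normal(0.0, 0.02, t.size))

# Page thin-layer model MR = exp(-k t^n); linearise as
#   ln(-ln MR) = ln k + n ln t
# and fit k and n by ordinary least squares.
yv = np.log(-np.log(mr))
A = np.vstack([np.ones_like(t), np.log(t)]).T
(ln_k, n), *_ = np.linalg.lstsq(A, yv, rcond=None)
k = np.exp(ln_k)
```

The log-linearisation keeps the fit to plain least squares; a nonlinear fit on MR directly would weight the early, high-moisture points differently.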
Direct steam generation (DSG) in parabolic-trough collectors (PTC) is one of the most attractive technologies in concentrated solar power plants. Its appeal stems from its ability to reduce the operational and maintenance costs compared with other heat transfer fluids. Modelling and simulation (M&S) tools, together with the development of experimental real-scale set-ups, have played a key role in the advancement of this solar technology. The aim of this review is to summarize and analyse the thermohydraulic, thermal and optical models implemented in M&S tools for DSG in PTC, in order to identify the contribution that these models could make towards the improvement of the technology in the future. Thermohydraulic models have, in most cases, been developed under the three-equation homogeneous equilibrium model (HEM) approach, applied successfully to the recirculation operation mode. The more complete six-equation two-fluid model (TFM) approach has also been applied, to a lesser extent, to modelling the once-through solar field operation mode, taking water/steam two-phase flow patterns into account. Although these advancements have contributed to the design and operation of the first commercial solar steam power plant with PTC for electricity generation, some technological gaps still have to be overcome to consolidate the technology. In the recirculation solar field operation mode, HEM has proved adequate for modelling the DSG process in PTC integrated with thermal energy storage systems and into solar hybrid power plants. For the once-through operation mode, distributed-parameter thermohydraulic models, especially under the TFM approach and involving a detailed flow pattern map, have proved to be suitable tools for resolving the uncertainties related to the two-phase flow, especially at the endpoint of the evaporation section.
•Technological advances of direct steam generation in parabolic-trough collectors together with current challenges are hereby described.
•Thermohydraulic modelling approaches and the mathematical models implemented in modelling and simulation tools for direct steam generation in parabolic-trough collectors are discussed in detail.
•Thermal and optical modelling approaches in the receiver tube of parabolic-trough collectors, as well as the ones applied for direct steam generation, are examined.
•The potential contribution of modelling and simulation tools towards the improvement of the technology is identified both in recirculation and once-through operation modes.
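For reference, the three-equation homogeneous equilibrium model (HEM) mentioned above treats the two-phase mixture as a single fluid; its standard one-dimensional textbook form is sketched below. The notation is assumed here, not taken from the review:

```latex
% Standard 1-D HEM mixture balances (textbook form; notation assumed):
% mixture density \rho, velocity u, pressure p, specific enthalpy h,
% friction pressure gradient (\partial p/\partial z)_f, inclination \theta,
% linear heat input q' over cross-section A.
\begin{align}
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial z} &= 0 \\
\frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^2)}{\partial z}
  &= -\frac{\partial p}{\partial z}
     - \left(\frac{\partial p}{\partial z}\right)_{\!f}
     - \rho g \sin\theta \\
\frac{\partial}{\partial t}\!\left[\rho\!\left(h + \tfrac{u^2}{2}\right) - p\right]
  + \frac{\partial}{\partial z}\!\left[\rho u\!\left(h + \tfrac{u^2}{2}\right)\right]
  &= \frac{q'}{A} - \rho u g \sin\theta
\end{align}
```

The six-equation TFM writes separate mass, momentum and energy balances for the liquid and vapour phases with interfacial exchange terms, which is why it can resolve flow-pattern detail that the HEM averages away.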