Spatial modelling techniques are increasingly used in species distribution modelling. However, the implemented techniques differ in their modelling performance, and consensus methods are needed to reduce the uncertainty of predictions. In this study, we tested the predictive accuracies of five consensus methods, namely Weighted Average (WA), Mean(All), Median(All), Median(PCA), and Best, for 28 threatened plant species. The study area was north-eastern Finland, Europe. The spatial distributions of the plant species were forecast using eight state-of-the-art single-modelling techniques, providing an ensemble of predictions. The probability values of occurrence were then combined using the five consensus algorithms. The predictive accuracies of the single-model and consensus methods were assessed by computing the area under the curve (AUC) of the receiver-operating characteristic plot. The mean AUC values varied between 0.697 (classification tree analysis) and 0.813 (random forest) for the single models, and from 0.757 to 0.850 for the consensus methods. The WA and Mean(All) consensus methods provided significantly more robust predictions than all the single models and the other consensus methods. Consensus methods based on averaging algorithms may significantly increase the accuracy of species distribution forecasts, and thus show considerable promise for conservation biology and biogeographical applications.
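The two best-performing consensus rules above, WA and Mean(All), can be sketched in a few lines. This is an illustrative reconstruction, not the study's code: the toy ensemble, site labels, and the choice of per-model AUC as the WA weight are assumptions for the example.

```python
# Sketch of ensemble consensus for species distribution predictions:
# WA (AUC-weighted average) and Mean(All). All numbers are illustrative.

def auc(probs, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    pos = [p for p, y in zip(probs, labels) if y == 1]
    neg = [p for p, y in zip(probs, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def consensus_mean(ensemble):
    """Mean(All): unweighted average of per-model probabilities."""
    return [sum(col) / len(col) for col in zip(*ensemble)]

def consensus_wa(ensemble, labels):
    """WA: average weighted by each model's own predictive skill (AUC)."""
    weights = [auc(p, labels) for p in ensemble]
    total = sum(weights)
    return [sum(w * p for w, p in zip(weights, col)) / total
            for col in zip(*ensemble)]

# Toy ensemble: three "models" predicting occurrence at five sites.
labels = [1, 0, 1, 0, 1]
ensemble = [
    [0.9, 0.2, 0.7, 0.4, 0.8],
    [0.6, 0.3, 0.8, 0.5, 0.7],
    [0.7, 0.6, 0.5, 0.3, 0.9],
]
print(auc(consensus_mean(ensemble), labels))
print(auc(consensus_wa(ensemble, labels), labels))
```

Averaging tends to cancel the idiosyncratic errors of individual models, which is the intuition behind the robustness of WA and Mean(All) reported above.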
An increasing number of power-electronic-based distributed generation systems and loads generate not only characteristic harmonics but also unexpected harmonics. Several methods, such as impedance-based analysis, which are derived from the conventional average model, have been introduced to study this harmonic interaction. However, linear-time-invariant model analysis struggles with these phenomena because of the time-varying properties of power-electronic-based systems. This paper investigates a grid-connected converter by using a harmonic state-space (HSS) small-signal model, which is based on linear time-periodic (LTP) theory. The proposed model captures the switching behavior of the converter, making it possible to analyze how harmonics are transferred into both the ac-side and dc-side circuits. Furthermore, a harmonic matrix of the grid-connected converter is developed to analyze the harmonic interaction in the steady state. The frequency-domain results of the HSS model are compared with time-domain simulation results to verify the theoretical analysis. Experimental results are finally discussed to validate the proposed model and study.
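For readers unfamiliar with the HSS formalism, the underlying linear time-periodic structure can be sketched as follows. This uses generic textbook notation, not the symbols of this particular paper:

```latex
% LTP state-space model with T-periodic matrices, \omega_0 = 2\pi/T:
\dot{x}(t) = A(t)\,x(t) + B(t)\,u(t), \qquad
A(t) = \sum_{n=-\infty}^{\infty} A_n e^{jn\omega_0 t}.
% Expanding x(t) and u(t) in harmonic components X_n, U_n and collecting
% terms gives the harmonic state-space (HSS) form
s\,\mathbf{X} = (\mathcal{A} - \mathcal{N})\,\mathbf{X} + \mathcal{B}\,\mathbf{U},
\qquad \mathcal{N} = \operatorname{diag}(\dots, -j\omega_0,\; 0,\; j\omega_0, \dots),
% where \mathcal{A} and \mathcal{B} are (block-)Toeplitz matrices built
% from the Fourier coefficients A_n, B_n.
```

Because each off-diagonal band of the Toeplitz matrices couples harmonic m of the input to harmonic n of the state, the "harmonic matrix" mentioned above directly exposes the cross-frequency coupling that a linear-time-invariant model cannot represent.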
In response to growing concerns surrounding the relationship between climate change and escalating flood risk, there is an increasing urgency to develop precise and rapid flood prediction models. Although high-resolution flood simulations have made notable advancements, they remain computationally expensive, underscoring the need for efficient machine learning surrogate models. Because empirical observations are sparse and data collection is expensive, there is a growing need for models that perform effectively in ‘small-data’ contexts, a characteristic typical of many scientific problems. This research combines the latest developments in surrogate modelling and physics-informed machine learning to propose a novel Physics-Informed Neural Network-based surrogate model for hydrodynamic simulators governed by the Shallow Water Equations. The proposed method incorporates physics-based prior information into the neural network structure by encoding the conservation of mass into the model, without relying on calculating continuous derivatives in the loss function. The method is demonstrated for a high-resolution inland flood simulation model and a large-scale regional tidal model, where it outperforms existing state-of-the-art data-driven approaches by up to 25%. This research demonstrates the benefits and robustness of physics-informed approaches in surrogate modelling for flood and hydroclimatic modelling problems.
•Novel flexible PINNs model without requiring direct calculation of derivatives is developed.
•Improvement over the data-driven CNN model without increased computational overheads.
•Physics-informed regularisation extends existing data-driven surrogate modelling.
•Novel methodology improves on the functional API used in previous PINNs models.
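The idea of encoding mass conservation structurally, rather than as a derivative-based loss penalty, can be illustrated with a toy sketch. This is not the paper's architecture: the flux predictor, grid, and parameter names below are illustrative stand-ins, with a crude hand-written rule where a neural network would normally sit.

```python
# Sketch of "hard" mass conservation in a surrogate model: the learned
# component predicts inter-cell fluxes, and the water depths are then
# reconstructed through a discrete mass balance, so conservation holds
# by construction whatever the predictor outputs.

def mass_conserving_update(h, fluxes, dt, dx):
    """Finite-volume continuity dh/dt + dq/dx = 0 on a 1-D grid.
    h: cell water depths; fluxes: interface discharges (len(h) + 1)."""
    return [h_i - dt / dx * (fluxes[i + 1] - fluxes[i])
            for i, h_i in enumerate(h)]

def predict_fluxes(h):
    """Stand-in for the learned part of the surrogate: a crude flux down
    the depth gradient, where a neural network would normally go."""
    q = [0.0]  # closed left boundary
    for left, right in zip(h, h[1:]):
        q.append(0.5 * (left - right))
    q.append(0.0)  # closed right boundary
    return q

h = [2.0, 1.0, 0.5, 0.5]
for _ in range(10):
    h = mass_conserving_update(h, predict_fluxes(h), dt=0.1, dx=1.0)

# With closed boundaries the fluxes telescope, so total volume is
# preserved exactly regardless of the flux predictor's errors.
print(sum(h))
```

The benefit highlighted in the abstract follows from this structure: no continuous derivatives appear in the loss, yet the conservation law cannot be violated.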
AIM: The Hutchinsonian hypervolume is the conceptual foundation for many lines of ecological and evolutionary inquiry, including functional morphology, comparative biology, community ecology and niche theory. However, extant methods to sample from hypervolumes or measure their geometry perform poorly on high‐dimensional or holey datasets. INNOVATION: We first highlight the conceptual and computational issues that have prevented a more direct approach to measuring hypervolumes. Next, we present a new multivariate kernel density estimation method that resolves many of these problems in an arbitrary number of dimensions. MAIN CONCLUSIONS: We show that our method (implemented as the ‘hypervolume’ R package) can match several extant methods for hypervolume geometry and species distribution modelling. Tools to quantify high‐dimensional ecological hypervolumes will enable a wide range of fundamental descriptive, inferential and comparative questions to be addressed.
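The kernel-density idea can be sketched with a Monte Carlo toy: approximate the niche as the union of fixed-bandwidth kernels around the observed trait points and estimate its volume by random sampling. This is a simplified illustration only; the actual ‘hypervolume’ R package uses a more refined adaptive algorithm, and the bandwidth and sample counts below are arbitrary.

```python
# Monte Carlo estimate of a kernel-union "hypervolume" in d dimensions,
# using a box kernel (Chebyshev distance) for simplicity.
import random

def hypervolume_mc(points, bandwidth, n_samples=20000, seed=0):
    rng = random.Random(seed)
    dim = len(points[0])
    # Bounding box of the kernel union, per dimension.
    lo = [min(p[d] for p in points) - bandwidth for d in range(dim)]
    hi = [max(p[d] for p in points) + bandwidth for d in range(dim)]
    box_volume = 1.0
    for d in range(dim):
        box_volume *= hi[d] - lo[d]
    hits = 0
    for _ in range(n_samples):
        x = [rng.uniform(lo[d], hi[d]) for d in range(dim)]
        # Inside the hypervolume if within bandwidth of any observation.
        if any(all(abs(x[d] - p[d]) <= bandwidth for d in range(dim))
               for p in points):
            hits += 1
    return box_volume * hits / n_samples

# Two well-separated points in 3-D trait space: the estimate should be
# close to two disjoint kernel boxes of volume (2 * bandwidth)**3 each.
pts = [(0.0, 0.0, 0.0), (10.0, 10.0, 10.0)]
print(hypervolume_mc(pts, bandwidth=1.0))
```

Note how naturally this handles "holey" or disjoint sets, which is exactly where convex-hull-style measures break down: the Monte Carlo union makes no convexity assumption at all.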
Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. The study area was the Western Cape of South Africa. We applied nine of the most widely used modelling techniques to model potential distributions under current and predicted future climate for four species (including two subspecies) of Proteaceae. Each model was built using an identical set of five input variables and distribution data for 3996 sampled sites. We compare model predictions by testing agreement between observed and simulated distributions for the present day (using the area under the receiver operating characteristic curve (AUC) and kappa statistics) and by assessing consistency in predictions of range size changes under future climate (using cluster analysis). Our analyses show significant differences between predictions from different models, with predicted changes in range size by 2030 differing in both magnitude and direction (e.g. from 92% loss to 322% gain). We explain these differences with reference to two characteristics of the modelling techniques: data input requirements (presence/absence vs. presence-only approaches) and the assumptions made by each algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. We highlight an important source of uncertainty in assessments of the impacts of climate change on biodiversity and emphasize that, in policy-guiding applications, model predictions should be interpreted with a full appreciation of uncertainty.
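Of the two agreement statistics used above, kappa is the less familiar: it measures agreement between the observed and binarised simulated distributions, corrected for chance. The sketch below is illustrative only; the 0.5 presence threshold and the toy maps are assumptions, not the study's data.

```python
# Cohen's kappa for agreement between two binary distribution maps.

def cohens_kappa(observed, predicted):
    """(observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(observed)
    po = sum(o == p for o, p in zip(observed, predicted)) / n
    p_obs1 = sum(observed) / n
    p_pred1 = sum(predicted) / n
    # Chance agreement: both say presence, plus both say absence.
    pe = p_obs1 * p_pred1 + (1 - p_obs1) * (1 - p_pred1)
    return (po - pe) / (1 - pe)

# Toy example: observed presences vs. simulated probabilities at 8 sites.
observed = [1, 1, 0, 0, 1, 0, 1, 0]
simulated = [0.9, 0.7, 0.2, 0.4, 0.8, 0.6, 0.55, 0.1]
predicted = [int(p >= 0.5) for p in simulated]
print(cohens_kappa(observed, predicted))  # → 0.75
```

Unlike AUC, kappa depends on the chosen presence threshold, which is one reason studies such as this one report both statistics.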
Summary
The field of ecological niche modelling or species distribution modelling has seen enormous activity and attention in recent years, in the light of exciting biological inferences that can be drawn from correlational models of species' environmental requirements (i.e. ecological niches) and inferences of potential geographic distributions. Among the many methods used in the field, one or two are in practice assumed to be ‘best’ and are used commonly, often without explicit testing.
We explore herein implications of the ‘no free lunch’ theorem, which suggests that no single optimization approach will prove to be best under all circumstances: we developed diverse virtual species with known niche and dispersal properties to test a suite of niche modelling algorithms designed to estimate potential areas of distribution.
The result was that (i) indeed, no single ‘best’ algorithm was found and (ii) different algorithms performed very differently depending on the particularities of the virtual species.
The conclusion is that niche or distribution modelling studies should begin by testing a suite of algorithms for predictive ability under the particular circumstances of the study and choose an algorithm for a particular challenge based on the results of those tests. Studies that do not take this step may use algorithms that are not optimal for that particular challenge.
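The recommended workflow can be sketched as follows: score a suite of candidate algorithms on held-out data for the study's own species and region, then select the best performer. The toy "algorithms" below are stand-ins for real niche-modelling methods, and the synthetic dataset is purely illustrative.

```python
# Pick the best of several candidate algorithms by hold-out evaluation.
import random

def holdout_accuracy(fit, data, train_frac=0.7, n_repeats=20, seed=1):
    """Mean accuracy over repeated random train/test splits."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_repeats):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(train_frac * len(shuffled))
        train, test = shuffled[:cut], shuffled[cut:]
        model = fit(train)
        scores.append(sum(model(x) == y for x, y in test) / len(test))
    return sum(scores) / len(scores)

# Toy candidates: each takes training pairs and returns a classifier.
def fit_mean_threshold(train):
    mu = sum(x for x, _ in train) / len(train)
    return lambda x: int(x > mu)

def fit_majority(train):
    majority = int(sum(y for _, y in train) * 2 >= len(train))
    return lambda x: majority

# Synthetic "species": present where the environmental gradient exceeds 5.
data = [(x / 10, int(x / 10 > 5.0)) for x in range(100)]
candidates = {"threshold": fit_mean_threshold, "majority": fit_majority}
best = max(candidates, key=lambda name: holdout_accuracy(candidates[name], data))
print(best)
```

On a dataset with the opposite structure the ranking could flip, which is the no-free-lunch point: the test, not a fixed habit, should pick the algorithm.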
The driving range of electric vehicles is a complex issue. In simulation, this range is determined by coupling a battery model and a traction model of the vehicle. But most of the battery models used for such studies do not take into account the interaction between the temperature of the battery (impacting its electrical parameters) and the traction model of a studied EV. In this paper, a battery electro-thermal model (with all electrical parameters dependent upon the temperature) is dynamically coupled with the traction model of an electric vehicle. Simulation results are provided to show the impact of the temperature on the driving range. The initial battery model (without temperature dependence) leads to an overestimation of the driving range of 3.3% at -5 °C and an underestimation of 2.5% at 40 °C.
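The coupling described above can be sketched with a minimal Euler-stepped simulation: internal resistance depends on battery temperature, Joule heating feeds back into that temperature, and the range is where the usable energy runs out. Every parameter value and the linear resistance-temperature fit below are illustrative assumptions, not the paper's model.

```python
# Toy electro-thermal/traction coupling for EV range estimation.

def simulate_range(ambient_c, capacity_wh=30000.0, power_w=12000.0,
                   speed_kmh=80.0, dt_s=1.0):
    temp = ambient_c              # battery temperature, °C
    energy = capacity_wh * 3600.0  # usable energy, J
    distance = 0.0                 # km
    voltage = 360.0                # pack voltage, V (held constant here)
    thermal_mass = 40000.0         # J/K
    h_loss = 25.0                  # W/K, heat loss to ambient
    while energy > 0:
        # Illustrative fit: resistance rises as the pack gets colder.
        r_int = 0.08 * (1.0 + 0.03 * (25.0 - temp))
        current = power_w / voltage
        joule_w = current ** 2 * r_int
        # Electro-thermal coupling: Joule heating vs. loss to ambient.
        temp += (joule_w - h_loss * (temp - ambient_c)) * dt_s / thermal_mass
        # Drain traction power plus internal losses.
        energy -= (power_w + joule_w) * dt_s
        distance += speed_kmh / 3600.0 * dt_s
    return distance

cold, warm = simulate_range(-5.0), simulate_range(40.0)
print(cold, warm)
```

Even this crude sketch reproduces the qualitative effect reported above: higher internal resistance in the cold wastes more energy as heat, shortening the range, and a temperature-independent model would miss the asymmetry between the two ambients.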
Direct steam generation (DSG) in parabolic-trough collectors (PTC) is one of the most attractive technologies in concentrated solar power plants. Its appeal stems from its ability to reduce operational and maintenance costs compared with other heat transfer fluids. Modelling and simulation (M&S) tools, together with the development of experimental real-scale set-ups, have played a key role in the advancement of this solar technology. The aim of this review is to summarize and analyse the thermohydraulic, thermal and optical models implemented in M&S tools for DSG in PTC, in order to identify the contribution that these models could make towards improving the technology in the future. Thermohydraulic models have, in most cases, been developed under the three-equation homogeneous equilibrium model (HEM) approach, applied successfully to the recirculation mode. The more complete six-equation two-fluid model (TFM) approach has also been applied, to a lesser extent, to modelling the once-through solar field operation mode, considering water/steam two-phase flow patterns. Although these advancements have contributed to the design and operation of the first commercial solar steam power plant with PTC for electricity generation, some technological gaps must still be overcome to consolidate the technology. In the recirculation solar field operation mode, HEM has proven adequate to model the DSG process in PTC integrated with thermal energy storage systems and into solar hybrid power plants. For the once-through operation mode, distributed-parameter thermohydraulic models, especially under the TFM approach and involving a detailed flow pattern map, have proven to be suitable tools for resolving the uncertainties related to the two-phase flow, especially at the endpoint of the evaporation section.
•Technological advances of direct steam generation in parabolic-trough collectors together with current challenges are hereby described.
•Thermohydraulic modelling approaches and the mathematical models implemented in modelling and simulation tools for direct steam generation in parabolic-trough collectors are discussed in detail.
•Thermal and optical modelling approaches in the receiver tube of parabolic-trough collectors, as well as the ones applied for direct steam generation, are examined.
•The potential contribution of modelling and simulation tools towards the improvement of the technology is identified both in recirculation and once-through operation modes.
Flood hazard mapping methods: A review. Mudashiru, Rofiat Bunmi; Sabtu, Nuridah; Abustan, Ismail ...
Journal of Hydrology (Amsterdam), Vol. 603, December 2021. Journal article, peer-reviewed.
•State-of-the-art review of related publications from 2000 to 2021 for the three discussed methods in FHM.
•The review gives a comprehensive overview of the methodologies in flood hazard mapping.
•Explains the strengths and limitations of the physically-based, physical, and empirical modelling methods in flood hazard modelling.
•Presents case studies of the methods, and the uncertainties, recent trends, and future directions associated with them.
Flood hazard mapping (FHM) has undergone significant development in terms of approach and in the capacity of its results to meet policymakers' need for accurate prediction and identification of flood-prone or affected regions. FHM is a vital tool in flood hazard and risk management analysis. Previous review studies have focused on flood inundation modelling methods; the present study provides a thorough and current review of the physically-based, empirical, and physical modelling methods in FHM. It gives insight into the strengths, limitations, case studies, and uncertainties associated with each method, and further discusses approaches to handling the uncertainties related to each method and its recent developments. The review is aimed at providing researchers and decision-makers with an extensive understanding of the methods and of recent improvements in FHM, thereby helping flood management agencies, decision-makers, design engineers, early-warning-system agencies, and responders to make accurate decisions on flood-related problems, to employ best management practices in flood management, and to adapt climate decision-making towards building resilient infrastructure.
During the past decade, the application of agricultural production systems modelling has rapidly expanded while there has been less emphasis on model improvement. Cropping systems modelling has become agricultural modelling, incorporating new capabilities enabling analyses in the domains of greenhouse gas emissions, soil carbon changes, ecosystem services, environmental performance, food security, pests and disease losses, livestock and pasture production, and climate change mitigation and adaptation. New science has been added to the models to support this broadening application domain, and new consortia of modellers have been formed that span the multiple disciplines.
There has not, however, been a significant and sustained focus on software platforms to increase efficiency in agricultural production systems research, nor on the interaction between the software industry and the agricultural modelling community. This paper describes the changing agricultural modelling landscape since 2002, largely from a software perspective, and makes a case for a focussed effort on the software implementations of the major models.
•The agricultural modelling community has broadened its scientific focus over the last decade.
•The software implementations of the leading agricultural models have not changed significantly in the last decade.
•A focussed effort on agricultural modelling software and process is needed.