When evaluating the reliability of an ensemble prediction system, it is common to compare the root-mean-square error of the ensemble mean to the average ensemble spread. While this is indeed good practice, two different and inconsistent methodologies have been used over the last few years in the meteorology and hydrology literature to compute the average ensemble spread. In some cases, the square root of the average ensemble variance is used; in other cases, the average of the ensemble standard deviations is computed instead. The second option is incorrect. To avoid the perpetuation of practices that are not supported by probability theory, the correct equation for computing the average ensemble spread is derived and the impact of using the wrong equation is illustrated.
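The distinction can be sketched numerically. In the minimal Python example below (synthetic data, unrelated to any specific forecasting system), per-case ensemble variances are aggregated both ways; by Jensen's inequality, averaging the standard deviations always underestimates the square root of the average variance whenever spread varies from case to case:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ensemble: 1000 forecast cases, 20 members each.
# The member spread varies from case to case, as in a real EPS.
ensemble = rng.normal(0.0, rng.uniform(0.5, 2.0, size=(1000, 1)),
                      size=(1000, 20))

# Per-case ensemble variance (spread of members about the ensemble mean)
var_per_case = ensemble.var(axis=1, ddof=1)

# Correct: square root of the average ensemble variance
spread_correct = np.sqrt(var_per_case.mean())

# Incorrect: average of the per-case standard deviations
spread_wrong = np.sqrt(var_per_case).mean()

# Jensen's inequality: E[sqrt(V)] <= sqrt(E[V]), so the second
# methodology systematically underestimates the ensemble spread.
assert spread_wrong < spread_correct
```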
One of the main problems of neural networks is the lack of consensus on how best to implement them. This work targets the question of transfer function selection, a vital part of neural network design since it provides the nonlinear mapping potential. Three nonlinear transfer functions bounded by −1 and 1 are selected for testing, based on a literature review: the Elliott sigmoid, the bipolar sigmoid, and the tangent sigmoid. They are used to design multilayer perceptron neural networks for multistep-ahead streamflow forecasting over five diverse watersheds and lead times from 1 to 5 days. All multilayer perceptrons showed good performance on the four selected criteria, which confirms that the selected multilayer perceptron implementation procedure was adequate, namely the data set length, the Kohonen network clustering method used to create the training and testing sets, and the Levenberg-Marquardt back-propagation training procedure with Bayesian regularization. Specifically, the results endorsed the tangent sigmoid as the most pertinent transfer function for streamflow forecasting, over the bipolar (logistic) and Elliott sigmoids; the latter, however, requires less computing time and as such may be a valuable option for operational hydrology. Results averaged over five lead times also agreed with the universal approximation theorem in that a linear transfer function is suitable for the output layer: a nonlinear transfer function in the output layer failed to improve performance values.
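For reference, the three bounded transfer functions compared in this study can be written down directly. The formulas below follow their commonly published definitions (an assumption on my part, not quotations from the study); all three map the real line onto (−1, 1):

```python
import numpy as np

def tangent_sigmoid(x):
    # tanh, the "tansig" of many MLP toolboxes
    return np.tanh(x)

def bipolar_sigmoid(x):
    # logistic sigmoid rescaled from (0, 1) to (-1, 1)
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

def elliott_sigmoid(x):
    # rational approximation: no exp() call, hence the lower
    # computing cost noted for operational use
    return x / (1.0 + np.abs(x))

# All three are bounded by -1 and 1 over a wide input range
x = np.linspace(-5.0, 5.0, 11)
for f in (tangent_sigmoid, bipolar_sigmoid, elliott_sigmoid):
    y = f(x)
    assert np.all(y > -1.0) and np.all(y < 1.0)
```

The Elliott sigmoid's speed advantage comes from avoiding the exponential: it needs only one division and one absolute value per evaluation.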
In most of Environment and Climate Change Canada's (ECCC) current operational systems, inland water physical processes are simulated using a simple water scheme. Water surface temperatures and ice cover fractions are updated daily using analyses. However, ECCC recognizes the need for interactive lakes in its weather and environmental prediction systems, such as those used to forecast surface conditions and floods. As a first step toward this goal, the current study evaluates the impact of the Canadian Small Lake Model (CSLM) in an offline context on surface water temperature, ice phenology and near‐surface atmospheric conditions. The use of CSLM increases lake surface temperatures and reduces their RMSE during ice‐free months, which has a direct impact on the 2‐m air temperature by reducing the cold bias observed in the simulation without CSLM, particularly over larger lakes. CSLM improves ice cover in subgrid lakes, while having a neutral impact on intermediate lakes. On large lakes, CSLM tends to degrade the ice cover simulation in the southernmost lakes, while improving ice cover in the northernmost lakes. The increased lake ice cover in CSLM, particularly over subgrid lakes and in the northern latitudes, has a strong impact on humidity fluxes at the surface during wintertime, with a near‐interruption of evapotranspiration over lakes. In summertime, increased water temperature with CSLM leads to a 38% increase in evapotranspiration. With these results, it is expected that the synergy of CSLM and lake‐related observations will improve the simulation and initialization of lake conditions in ECCC's systems.
Plain Language Summary
Compared to most surrounding land, lakes are a greater source of humidity. Their slower response to atmospheric temperature changes induces a warming of the lower atmosphere in winter and a cooling in summer. Lakes can thus have a strong impact on the local climate. Their inclusion in Earth System Models allows simulated lakes to exchange interactively with the atmosphere and the rest of the water cycle, which is particularly important in Canada since it is home to 62% of all lakes around the world. At Environment and Climate Change Canada (ECCC), the Canadian Small Lake Model (CSLM) has recently been developed to respond to this need. The current study evaluates the impact of lakes simulated by CSLM on lake surface (ice cover, water temperature) and near‐surface atmospheric conditions (temperature, humidity) compared to the current setup in ECCC's systems, in which lakes are not interactive. CSLM outperforms non‐interactive lakes, especially on small lakes: it improves summertime lake water surface temperature and wintertime ice coverage, which leads to an improvement of near‐surface temperatures. However, since CSLM was developed specifically to simulate small lakes, it struggles for very large lakes. This limitation could be resolved through the use of observations.
Key Points
Environment and Climate Change Canada recognizes the need for interactive lakes in its weather and environmental prediction systems
The Canadian Small Lake Model (CSLM) outperforms the current setup on most lakes, particularly on subgrid lakes, in offline mode
It is expected that the synergy of CSLM and lake‐related observations will improve the simulation and initialization of lake conditions
The conventional battery for genotoxicity testing is not well suited to assessing the large number of chemicals needing evaluation. Traditional tests lack throughput, provide little mechanistic information, and have poor specificity in predicting genotoxicity. New Approach Methodologies (NAMs) aim to accelerate the pace of hazard assessment and reduce reliance on tests that are time-consuming and resource-intensive. As such, high-throughput transcriptomic and flow cytometry-based assays have been developed for modernized genotoxicity assessment. These include the TGx-DDI transcriptomic biomarker (a 64-gene expression signature that identifies DNA damage-inducing (DDI) substances), the MicroFlow assay (a flow cytometry-based micronucleus (MN) test), and the MultiFlow assay (a multiplexed flow cytometry-based reporter assay that yields mode of action (MoA) information). The objective of this study was to investigate the utility of the TGx-DDI transcriptomic biomarker, multiplexed with the MicroFlow and MultiFlow assays, as an integrated NAM-based testing strategy for screening data-poor compounds prioritized by Health Canada's New Substances Assessment and Control Bureau. Human lymphoblastoid TK6 cells were exposed to 3 control and 10 data-poor substances using a 6-point concentration range. Gene expression profiling was conducted using the targeted TempO-Seq™ assay, and the TGx-DDI classifier was applied to the dataset. Classifications were compared with those based on the MicroFlow and MultiFlow assays. Benchmark Concentration (BMC) modeling was used for potency ranking. The integrated hazard calls indicate that five of the data-poor compounds were genotoxic, causing DNA damage via a clastogenic MoA, and one via a pan-genotoxic MoA. Two compounds were likely irrelevant positives in the MN test; two are considered possibly genotoxic, causing DNA damage via an ambiguous MoA. BMC modeling revealed nearly identical potency rankings for each assay. This ranking was maintained when all endpoint BMCs were converted into a single score using the Toxicological Prioritization (ToxPi) approach. Overall, this study contributes to the establishment of a modernized approach for effective genotoxicity assessment and chemical prioritization for further regulatory scrutiny. We conclude that the integration of TGx-DDI, MicroFlow, and MultiFlow endpoints is an effective NAM-based strategy for genotoxicity assessment of data-poor compounds.
Environment Canada has been developing a community environmental modelling system (Modélisation Environnementale Communautaire – MEC), which is designed to facilitate coupling between models focusing on different components of the earth system. The ultimate objective of MEC is to use the coupled models to produce operational forecasts. MESH (MEC – Surface and Hydrology), a configuration of MEC currently under development, is specialized for coupled land-surface and hydrological models. To determine the specific requirements for MESH, its different components were implemented on the Laurentian Great Lakes watershed, situated on the Canada-US border. This experiment showed that MESH can help us better understand the behaviour of different land-surface models, test different schemes for producing ensemble streamflow forecasts, and provide a means of sharing the data, the models and the results with collaborators and end-users. This modelling framework is at the heart of a testbed proposal for the Hydrologic Ensemble Prediction Experiment (HEPEX) which should allow us to make use of the North American Ensemble Forecasting System (NAEFS) to improve streamflow forecasts of the Great Lakes tributaries, and demonstrate how MESH can contribute to a Community Hydrologic Prediction System (CHPS).
Shifting level models have been suggested in the literature since the late 1970s for stochastic simulation of streamflow data. Parameter estimation for these models has generally been based on the method of moments. While this estimation approach has been useful for simulation studies, some limitations are apparent. One is the difficulty of evaluating the uncertainty of the model parameters; another is that the proposed model is not amenable to forecasting because the underlying mean of the process, which changes with time, is not estimated. In this paper, we reformulate the original shifting level model to conform to the so-called hidden Markov models (HMMs). These models are increasingly used in applied statistics, and techniques such as Markov chain Monte Carlo, in particular Gibbs sampling, are well suited for estimating the parameters of HMMs. We use Gibbs sampling in a Bayesian framework for parameter estimation and show the applicability of the reformulated shifting level model for the detection of abrupt regime changes and the forecasting of annual streamflow series. The procedure is illustrated using annual flows of the Senegal River in Africa.
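The structure of such a model can be illustrated with a short simulation. The sketch below (illustrative parameter values of my own choosing, not the Senegal River estimates) draws a hidden two-state Markov chain of mean levels and adds noise, producing a series with the abrupt regime shifts that the reformulated model aims to detect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden two-state Markov chain: each state carries its own mean level.
means = np.array([80.0, 120.0])      # low- and high-flow regimes
sigma = 10.0                          # within-regime noise
P = np.array([[0.95, 0.05],           # transition matrix: regimes persist
              [0.10, 0.90]])

n = 200
states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# Observed annual flows: regime mean plus noise (the "shifting level").
# Estimation then inverts this: recover the hidden states and the
# regime means from the flows alone, e.g. via Gibbs sampling.
flows = means[states] + rng.normal(0.0, sigma, size=n)
```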
We present a new family of hidden Markov models and apply them to the segmentation of hydrological and environmental time series. The proposed hidden Markov models have a discrete state space, and their structure is inspired by the shifting means models introduced by Chernoff and Zacks and by Salas and Boes. An estimation method inspired by the EM algorithm is proposed, and we show that it can accurately identify multiple change-points in a time series. We also show that the solution obtained using this algorithm can serve as a starting point for a Markov chain Monte Carlo Bayesian estimation method, thus reducing the computing time needed for the Markov chain to converge to a stationary distribution.
Hydrological forecasting consists of assessing future streamflow. Current deterministic forecasts do not give any information concerning the uncertainty, which can be limiting in a decision-making process. Ensemble forecasts are expected to fill this gap. In July 2007, the Meteorological Service of Canada improved its ensemble prediction system, which has been operational since 1998. It uses the GEM model to generate a 20-member ensemble on a 100 km grid at mid-latitudes. This improved system is used here for the first time for hydrological ensemble predictions. Five watersheds in Quebec (Canada) are studied: Chaudière, Châteauguay, Du Nord, Kénogami and Du Lièvre. An interesting 17-day rainfall event in October 2007 has been selected. Forecasts are produced at a 3 h time step for a 3-day forecast horizon. The deterministic forecast is also available and is compared with the ensemble ones. In order to correct the bias of the ensemble, an updating procedure was applied to the output data. Results showed that ensemble forecasts are more skilful than the deterministic ones, as measured by the Continuous Ranked Probability Score (CRPS), especially for 72 h forecasts. However, the hydrological ensemble forecasts are underdispersed, a situation that improves with increasing length of the prediction horizon. We conjecture that this is due in part to the fact that uncertainty in the initial conditions of the hydrological model is not taken into account.
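The CRPS used above has a convenient empirical form for a finite ensemble. The sketch below uses the standard kernel representation, CRPS = E|X − y| − ½E|X − X′| (the generic formula, not necessarily the exact implementation used in the study); for a single-member (deterministic) forecast it reduces to the absolute error, which is what makes the score comparable across deterministic and ensemble forecasts:

```python
import numpy as np

def ensemble_crps(members, obs):
    """Empirical CRPS of one ensemble forecast against one observation.

    Kernel form: CRPS = E|X - obs| - 0.5 * E|X - X'|,
    with expectations taken over the ensemble members.
    """
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

# Deterministic special case: one member, CRPS = |forecast - obs|
assert ensemble_crps([3.0], 5.0) == 2.0
```

Lower CRPS is better; averaging it over many forecast cases gives the skill measure referred to in the abstract.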
From 19 to 22 June 2013, intense rainfall and concurrent snowmelt led to devastating floods in the Canadian Rockies, foothills and downstream areas of southern Alberta and southeastern British Columbia, Canada. Such an event is typical of late-spring floods in cold-region mountain headwaters, combining intense precipitation with rapid melting of late-lying snowpack, and represents a challenge for hydrological forecasting systems. This study investigated the factors governing the ability to predict such an event. Three sources of uncertainty, other than the hydrological model processes and parameters, were considered: (i) the resolution of the atmospheric forcings, (ii) the snow and soil moisture initial conditions (ICs) and (iii) the representation of the soil texture. The Global Environmental Multiscale hydrological modeling platform (GEM-Hydro), running at a 1 km grid spacing, was used to simulate hydrometeorological conditions in the main headwater basins of southern Alberta during this event. The GEM atmospheric model and the Canadian Precipitation Analysis (CaPA) system were combined to generate atmospheric forcing at 10, 2.5 and 1 km over southern Alberta. Gridded estimates of snow water equivalent (SWE) from the Snow Data Assimilation System (SNODAS) were used to replace the model SWE at peak snow accumulation and generate alternative snow and soil moisture ICs before the event. Overall, 12 simulations of the flooding event were carried out, using two global soil texture datasets. Results show that the resolution of the atmospheric forcing primarily affected the flood volume and peak flow in all river basins, owing to the more accurate estimation of the intensity and total amount of precipitation during the flooding event provided by the CaPA analysis at convection-permitting scales (2.5 and 1 km). Basin-averaged snowmelt also changed with the resolution, due to changes in near-surface wind and the resulting turbulent fluxes contributing to snowmelt. Snow ICs were the main source of uncertainty for half of the headwater basins. Finally, the soil texture had less impact and only affected peak flow magnitude and timing for some stations. These results highlight the need to combine atmospheric forcing at convection-permitting scales with high-quality snow ICs to provide accurate streamflow predictions during late-spring floods in cold-region mountain river basins. The predictive improvement brought by the inclusion of high-elevation weather stations in the precipitation analysis, together with the need for accurate mountain snow information, suggests the necessity of integrated observation and prediction systems for forecasting extreme events in mountain river basins.