In 2018, Yale Medicine (YM), an academic multispecialty practice, and Yale New Haven Health System (YNHH) partnered with the Academy of Communication in Healthcare to develop a one-day interprofessional workshop to introduce relationship-centered communication skills to all of their nurses and physicians. Relationship-centered communication skills include showing positive regard, listening actively, and expressing empathy, and have been demonstrated to improve patient outcomes. A professionally diverse group of 12 nurses and physicians, committed to improving patient experiences, was purposefully selected and trained to teach the workshop. Individual interviews with trainers 3 months post-training revealed themes reflecting the intrapersonal, interpersonal, and organizational impact of participation in the Train-the-Trainer program. At the intrapersonal level, training contributed to personal growth, skillfulness, and confidence. At the interpersonal level, it expanded and strengthened professional networks. As an organizational catalyst, training transformed the work experience of nurse and physician trainers, thereby supporting YM/YNHH’s vision to provide interprofessional relationship-centered care. Results suggest that trainer training had benefits beyond learning to deliver the workshop, including improving the quality of trainers’ personal and professional relationships and enhancing organizational efficiency and interprofessionalism.
Model intercomparison studies are carried out to test and compare the simulated outputs of various model setups over the same study domain. The Great Lakes region is such a domain of high public interest: it is not only a challenging region to model, with its transboundary location, strong lake effects, and areas of strong human impact, but also one of the most densely populated areas in the USA and Canada. This study brought together a wide range of researchers setting up their models of choice in a highly standardized experimental setup using the same geophysical datasets, forcings, common routing product, and locations of performance evaluation across the 1 × 10⁶ km² study domain. The study comprises 13 models covering a wide range of model types, from machine-learning-based, basin-wise, and subbasin-based to gridded models, which are either locally or globally calibrated or calibrated for each of six predefined regions of the watershed. Unlike most hydrologically focused model intercomparisons, this study not only compares models regarding their capability to simulate streamflow (Q) but also evaluates the quality of simulated actual evapotranspiration (AET), surface soil moisture (SSM), and snow water equivalent (SWE). The latter three outputs are compared against gridded reference datasets. The comparisons are performed in two ways: either by aggregating model outputs and the reference to the basin level or by regridding all model outputs to the reference grid and comparing the model simulations at each grid cell. The main results of this study are as follows:
The comparison of models regarding streamflow reveals the superior quality of the machine-learning-based model in all experiments; even for the most challenging spatiotemporal validation, the machine learning (ML) model outperforms all physically based models. While the locally calibrated models lead to good performance in calibration and temporal validation (even outperforming several regionally calibrated models), they lose performance when transferred to locations on which they have not been calibrated. This is likely to improve with more advanced strategies for transferring these models in space. The regionally calibrated models, while losing less performance in spatial and spatiotemporal validation than the locally calibrated models, exhibit low performance in highly regulated and urban areas and in agricultural regions of the USA. Comparisons of additional model outputs (AET, SSM, and SWE) against gridded reference datasets show that aggregating model outputs and the reference dataset to the basin scale can lead to different conclusions than a comparison at the native grid scale. The latter is deemed preferable, especially for variables with large spatial variability such as SWE. A multi-objective analysis of model performance across all variables (Q, AET, SSM, and SWE) reveals overall well-performing locally calibrated models (i.e., HYMOD2-lumped) and regionally calibrated models (i.e., MESH-SVS-Raven and GEM-Hydro-Watroute), for varying reasons. The machine-learning-based model was not included here, as it is not set up to simulate AET, SSM, and SWE. All basin-aggregated model outputs and observations for the model variables evaluated in this study are available on an interactive website that enables users to visualize the results and download the data and model outputs.
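As a toy illustration of why basin-scale aggregation can lead to different conclusions than a native-grid comparison, consider the following sketch (the grid size, the synthetic fields, and the spatially mirrored "model" are all invented for illustration; they are not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: 20 time steps of a variable (e.g., SWE) on a
# 4x4 grid covering one basin. The "model" contains exactly the same
# values as the reference but spatially mirrored, so it reproduces the
# basin mean perfectly while getting every grid cell wrong.
n_t, ny, nx = 20, 4, 4
ref = rng.gamma(shape=2.0, scale=50.0, size=(n_t, ny, nx))
model = ref[:, ::-1, ::-1]  # same values, mirrored in space

def nse(sim, obs):
    """Nash-Sutcliffe efficiency of two 1-D time series."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Grid-scale evaluation: score each grid cell, then summarize.
grid_scores = [nse(model[:, j, i], ref[:, j, i])
               for j in range(ny) for i in range(nx)]

# Basin-scale evaluation: aggregate both fields first, then score.
basin_score = nse(model.mean(axis=(1, 2)), ref.mean(axis=(1, 2)))

print(f"median grid-cell NSE: {np.median(grid_scores):.2f}")
print(f"basin-aggregated NSE: {basin_score:.2f}")  # 1.00 by construction
```

Aggregation cancels the spatial pattern error entirely, which is why the grid-scale comparison is deemed preferable for spatially variable fields such as SWE.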
Near real-time quantitative precipitation estimates are required for many applications, including weather forecasting, flood forecasting, crop management, forest fire prevention, hydropower production, and dam safety. Since April 2011, such a product has been available from Environment and Climate Change Canada for a domain covering all of North America. This product, known as the Regional Deterministic Precipitation Analysis, is generated using the Canadian Precipitation Analysis (CaPA) system. Although it was designed for near real-time use, an archive of pre-operational and operational products going back to 2002 is now available and has been used in numerous studies. This paper presents a review of the various scientific publications that have reported either using or evaluating CaPA products. We find that the product is used with success both for scientific studies and operational applications and compares well with other precipitation datasets. We summarize the strengths and weaknesses of the system as reported in the literature. We also provide users with information on how the system works, how it has changed over time, and how the archived and near real-time analyses can be accessed and used. Finally, we briefly report on recent and upcoming improvements to the product based, in part, on the results of this literature review.
Several data assimilation (DA) approaches exist to generate consistent and continuous precipitation fields valuable for hydrometeorological applications and land data assimilation. Usually, DA is based on either static or dynamic approaches. Static methods rely on deterministic forecasts to estimate background error covariance matrices, whereas dynamic approaches use ensemble forecasts. Combining the two methods is known as hybrid DA, and it has proven beneficial for different applications as it combines the advantages of both approaches. The present study explores hybrid DA for the 6 h Canadian Precipitation Analysis (CaPA). Based on optimal interpolation (OI), CaPA blends forecasts with observations from surface stations and ground-based radar datasets to provide precipitation fields over the North American domain. The application of hybrid DA to CaPA consisted of finding the optimal linear combination between (i) an OI based on the Regional Deterministic Prediction System (RDPS) and (ii) an ensemble Kalman filter (EnKF) based on the 20-member Regional Ensemble Prediction System (REPS). The results confirmed the known effectiveness of the hybrid approach when low-density observation networks are assimilated. Indeed, the experiments conducted for the summer without radar datasets and for the winter (characterized by very few observations in CaPA) showed that attributing a relatively high weight to the EnKF (50 % and 70 % for summer and winter, respectively) resulted in better analysis skill and a reduction in false alarms compared with the OI method. A deterioration in the moderate- to high-intensity precipitation bias was, however, observed during summer. Reducing the weight attributed to the EnKF to 30 % alleviated the bias deterioration while improving skill compared with the OI-based CaPA.
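The hybrid step described above, a weighted linear combination of a static and an ensemble-derived background-error covariance inside a standard analysis update, can be sketched as follows (the domain size, length scale, weight, and observation setup are purely illustrative, not the operational CaPA configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D domain with 5 grid points.
n = 5

# Static background-error covariance (distance-based, as an OI scheme
# might assume); the unit length scale here is purely illustrative.
x = np.arange(n, dtype=float)
B_static = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)

# Flow-dependent covariance estimated from a 20-member ensemble
# (here just random draws standing in for ensemble forecasts).
members = rng.normal(size=(20, n))
B_ens = np.cov(members, rowvar=False)

# Hybrid covariance: linear combination, with w the weight given to
# the ensemble part (the study above tested weights of 30-70 %).
w = 0.5
B_hybrid = (1.0 - w) * B_static + w * B_ens

# The hybrid matrix is then used in the usual analysis update:
#   x_a = x_b + B H^T (H B H^T + R)^(-1) (y - H x_b)
H = np.eye(n)[:2]           # observe the first two grid points
R = 0.1 * np.eye(2)         # observation-error covariance
x_b = np.zeros(n)           # background state
y = np.array([1.0, 0.5])    # observations

K = B_hybrid @ H.T @ np.linalg.inv(H @ B_hybrid @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print(x_a)
```

Because both covariance terms are positive definite, the hybrid analysis always reduces the misfit to the observations relative to the background.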
In mountains, the precipitation phase varies greatly in space and time and affects the evolution of the snow cover. Snowpack models usually rely on precipitation-phase partitioning methods (PPMs) that use near-surface variables. These PPMs ignore conditions above the surface, limiting their ability to predict the precipitation phase at the surface. In this study, the impact on snowpack simulations of atmospheric-based PPMs incorporating upper-atmospheric information is tested using the snowpack scheme Crocus. Crocus is run at 2.5 km grid spacing over the mountains of southwestern Canada and the northwestern United States and is driven by meteorological fields from an atmospheric model at the same resolution. Two atmospheric-based PPMs were derived from the atmospheric model: the output of a detailed microphysics scheme and a post-processing algorithm determining the snow level and the associated precipitation phase. Two ground-based PPMs were also included as lower and upper benchmarks: a single air temperature threshold at 0°C and a PPM using the wet-bulb temperature. Compared to the upper benchmark, the snow-level-based PPM improved the estimation of snowfall occurrence by 5% and the simulation of snow water equivalent (SWE) by 9% during the snowmelt season. In contrast, due to missing processes, the microphysics scheme decreased performance in phase estimates and SWE simulations compared to the upper benchmark. These results highlight the need for detailed evaluation of the precipitation phase from atmospheric models and the benefit for mountain snow hydrology of the post-processed snow level. The limitations of driving snowpack models at the slope scale are also discussed.
Plain Language Summary
The partitioning of precipitation between rainfall and snowfall is a crucial component of the evolution of the snowpack in mountains. Most snowpack models use the air temperature and humidity near the surface to derive the precipitation phase. However, the phase at the surface is strongly influenced by processes, such as the melting and refreezing of falling hydrometeors, that occur above the surface. Atmospheric models simulate these processes and the corresponding phase at the surface, but snowpack models rarely use this information. In this study, we considered two estimates of the precipitation phase from an atmospheric model and tested them with a physically based snow model over the mountains of southwestern Canada and the northwestern United States. The results were compared with traditional approaches that use the air temperature and humidity near the surface to derive the precipitation phase. Our results showed that the precipitation phase associated with the snow level obtained from the atmospheric model improved snowfall estimates and snowpack predictions compared to the traditional approaches. In contrast, the cloud/precipitation scheme of the atmospheric model decreased performance in phase estimates and snow simulations due to missing physical processes. Our study highlights that snowpack predictions in the mountains can be improved with valuable information from atmospheric models.
Key Points
Estimates of precipitation phase from an atmospheric model were used to drive snow simulations with a detailed snowpack model
Snowfall prediction and snowpack modeling are improved by using the snow level from post‐processing of the atmospheric model
Direct precipitation phase from the microphysics scheme does not improve snow simulations compared to simpler rain‐snow partitioning schemes
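The two ground-based benchmark PPMs can be sketched as simple partitioning functions. The wet-bulb approximation below uses Stull's empirical formula, and both 0°C thresholds are illustrative rather than the tuned values of the study:

```python
import numpy as np

def wet_bulb_stull(t_air_c, rh_pct):
    """Approximate wet-bulb temperature (degC) from air temperature
    (degC) and relative humidity (%) via Stull's empirical formula."""
    t, rh = t_air_c, rh_pct
    return (t * np.arctan(0.151977 * np.sqrt(rh + 8.313659))
            + np.arctan(t + rh) - np.arctan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * np.arctan(0.023101 * rh)
            - 4.686035)

def phase_air_temp(t_air_c, threshold=0.0):
    """Lower benchmark: snow if air temperature <= threshold (degC)."""
    return np.where(t_air_c <= threshold, "snow", "rain")

def phase_wet_bulb(t_air_c, rh_pct, threshold=0.0):
    """Upper benchmark: snow if wet-bulb temperature <= threshold."""
    return np.where(wet_bulb_stull(t_air_c, rh_pct) <= threshold,
                    "snow", "rain")

# At +1 degC and low humidity, evaporative cooling of falling
# hydrometeors can keep precipitation as snow: the two PPMs disagree.
print(phase_air_temp(np.array([1.0])))                    # ['rain']
print(phase_wet_bulb(np.array([1.0]), np.array([40.0])))  # ['snow']
```

Neither benchmark sees conditions above the surface, which is exactly the limitation that the post-processed snow level from the atmospheric model addresses.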
•Tm3+ doped Ga–As–S chalcogenide glasses based on high-purity As2S3 were produced.
•The concentration ratio of Ga and Tm3+ in Ga–As–S glasses was optimized to achieve the best luminescence.
•Tm3+ doped optical fiber was drawn using Ga–As–S as a new host matrix.
•Three emission bands of Tm3+ ions were observed in the Ga-based chalcogenide glass fiber.
Tm3+ doped Ga–As–S chalcogenide glass samples were produced using pure As2S3 glass as the starting material. Their photoluminescence properties were characterized, and strong emission bands were observed at 1.2 μm (3H5→3H6), 1.4 μm (3H4→3F4), and 1.8 μm (3F4→3H6) under excitation wavelengths of 698 nm and 800 nm. The thulium and gallium concentrations were optimized to achieve the highest photoluminescence efficiency. From the optimal composition, a Tm3+ doped Ga–As–S fiber was drawn and its optical properties were studied.
A snow model forced by temperature and precipitation is used to simulate the spatial distribution of snow water equivalent (SWE) over a 600 000 km² portion of the province of Quebec, Canada. We propose to improve model simulations by assimilating SWE data from sporadic manual snow surveys with a particle filter. A temporally and spatially correlated perturbation of the meteorological forcing is used to generate the set of particles, with the magnitude of the perturbations fixed objectively. First, the particle filter and direct insertion were both applied at 88 sites for which measured SWE consisted of roughly five values per year over a period of 17 years. The temporal correlation of the perturbations improves the accuracy and ensemble dispersion of the particle filter, while the spatial correlation leads to spatial coherence in the particle weights. The spatial estimates of SWE obtained with the particle filter are compared with those obtained through optimal interpolation of the snow survey data, which is the current operational practice in Quebec. Cross-validation results, as well as validation against an independent dataset, show that the proposed particle filter improves the spatial distribution of snow water equivalent compared with optimal interpolation.
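A minimal sketch of the particle-filter update at a single site, assuming a Gaussian observation-error model and multinomial resampling (all numbers are invented; the actual system generates particles by perturbing meteorological forcing with spatiotemporal correlation):

```python
import numpy as np

rng = np.random.default_rng(7)

n_particles = 100

# Hypothetical particle ensemble of SWE (mm) at one site, standing in
# for snow-model runs under perturbed precipitation and temperature.
particles = rng.normal(loc=150.0, scale=30.0, size=n_particles)
weights = np.full(n_particles, 1.0 / n_particles)

# A manual snow-survey observation and its error standard deviation.
obs_swe, obs_sigma = 180.0, 15.0

# Particle-filter update: reweight by the Gaussian likelihood of the
# observation given each particle, then normalize.
likelihood = np.exp(-0.5 * ((obs_swe - particles) / obs_sigma) ** 2)
weights *= likelihood
weights /= weights.sum()

# The weighted posterior mean moves from the prior toward the observation.
posterior_mean = np.sum(weights * particles)

# Resample when the effective sample size degenerates.
n_eff = 1.0 / np.sum(weights ** 2)
if n_eff < n_particles / 2:
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    particles = particles[idx]
    weights = np.full(n_particles, 1.0 / n_particles)

print(f"posterior mean SWE: {posterior_mean:.1f} mm")
```

The spatial coherence mentioned above comes from using spatially correlated forcing perturbations, so that particles (and hence weights) vary smoothly between neighboring sites.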
Environment and Climate Change Canada has initiated the production of a 1980–2018, 10 km, North American precipitation and surface reanalysis. ERA-Interim is used to initialize the Global Deterministic Reforecast System (GDRS) at a 39 km resolution. Its output is then dynamically downscaled to 10 km by the Regional Deterministic Reforecast System (RDRS). Coupled with the RDRS, the Canadian Land Data Assimilation System (CaLDAS) and Precipitation Analysis (CaPA) are used to produce surface and precipitation analyses. All systems used are close to operational model versions and configurations. In this study, a 7-year sample of the reanalysis (2011–2017) is evaluated. Verification results show that the skill of the RDRS is stable over time and equivalent to that of the current operational system. The impact of the coupling between RDRS and CaLDAS is explored using an early version of the reanalysis system which was run at 15 km resolution for the period 2010–2014, with and without the use of CaLDAS. Significant improvements are observed with CaLDAS in the lower troposphere and surface layer, especially for the 850 hPa dew point and absolute temperatures in summer. Precipitation is further improved through an offline precipitation analysis which allows the assimilation of additional observations of 24 h precipitation totals. The final dataset should be of particular interest for hydrological applications focusing on transboundary and northern watersheds, where existing products often show discontinuities at the border and assimilate very few – if any – precipitation observations.
This work explores the potential of the distributed GEM-Hydro runoff modeling platform, developed at Environment and Climate Change Canada (ECCC) over the last decade. More precisely, the aim is to develop a robust implementation methodology for performing reliable streamflow simulations with a distributed model over large and partly ungauged basins in an efficient manner. The latest version of GEM-Hydro combines the SVS (Soil, Vegetation and Snow) land-surface scheme and the WATROUTE routing scheme. SVS has never been evaluated from a hydrological point of view, which is done here for all major rivers flowing into Lake Ontario. Two established hydrological models are compared with GEM-Hydro, namely MESH and WATFLOOD, which share the same routing scheme (WATROUTE) but rely on different land-surface schemes. All models are calibrated using the same meteorological forcings, objective function, calibration algorithm, and basin delineation. GEM-Hydro is shown to be competitive with MESH and WATFLOOD: NSE√ (the Nash–Sutcliffe criterion computed on the square root of the flows) is, for example, equal to 0.83 for MESH and GEM-Hydro in validation on the Moira River basin, and to 0.68 for WATFLOOD. A computationally efficient strategy is proposed to calibrate SVS: a simple unit hydrograph is used for routing instead of WATROUTE. Global and local calibration strategies are compared in order to estimate runoff for ungauged portions of the Lake Ontario basin. Overall, streamflow predictions obtained using a global calibration strategy, in which a single parameter set is identified for the whole Lake Ontario basin, show accuracy comparable to predictions based on local calibration: the average NSE√ in validation over seven subbasins is 0.73 for local calibration and 0.61 for global calibration.
Hence, global calibration provides spatially consistent parameter values and robust performance at gauged locations, and reduces the complexity and computational burden of the calibration procedure. This work contributes to the Great Lakes Runoff Inter-comparison Project for Lake Ontario (GRIP-O), which aims at improving Lake Ontario basin runoff simulations by comparing different models using the same input forcings. The main outcome of this study is a new, generalizable methodology for implementing a computationally expensive distributed hydrologic model in an efficient and reliable manner over a large area with ungauged portions, using global calibration and a unit hydrograph in place of the routing component.
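The NSE√ metric used above can be written in a few lines. The daily flows below are invented to show why the square-root transform is used: it reduces the penalty for a missed flood peak relative to the standard NSE, balancing high- and low-flow errors:

```python
import numpy as np

def nse(sim, obs):
    """Standard Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def nse_sqrt(sim, obs):
    """Nash-Sutcliffe efficiency computed on square-root-transformed
    flows (NSEsqrt), which down-weights peak flows."""
    s, o = np.sqrt(sim), np.sqrt(obs)
    return 1.0 - np.sum((s - o) ** 2) / np.sum((o - o.mean()) ** 2)

# Hypothetical daily flows (m3/s): the simulation misses one flood peak.
obs = np.array([10.0, 12.0, 11.0, 80.0, 30.0, 15.0, 12.0])
sim = np.array([10.0, 12.0, 11.0, 40.0, 28.0, 15.0, 12.0])

print(f"NSE     = {nse(sim, obs):.2f}")
print(f"NSEsqrt = {nse_sqrt(sim, obs):.2f}")  # higher: peak error shrinks
```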
Between January 2013 and December 2014, water levels on Lake Superior and Lake Michigan‐Huron, the two largest lakes on Earth by surface area, rose at the highest rate ever recorded for a 2-year period beginning in January and ending in December of the following year. This historic event coincided with below‐average air temperatures and extensive winter ice cover across the Great Lakes. It also brought an end to a 15-year period of persistently below‐average water levels on Lakes Superior and Michigan‐Huron that included several months of record‐low water levels. To differentiate the hydrological drivers behind the recent water level rise, we developed a Bayesian Markov chain Monte Carlo (MCMC) routine for inferring historical estimates of the major components of each lake's water budget. Our results indicate that, in 2013, the water level rise on Lake Superior was driven by increased spring runoff and over‐lake precipitation. In 2014, reduced over‐lake evaporation played a more significant role in Lake Superior's water level rise. The water level rise on Lake Michigan‐Huron in 2013 was also due to above‐average spring runoff and persistent over‐lake precipitation, while in 2014, it was due to a rare combination of below‐average evaporation, above‐average runoff and precipitation, and very high inflow rates from Lake Superior through the St. Marys River. In future research, we expect to apply our new framework across the other Laurentian Great Lakes, as well as to Earth's other large freshwater basins.
Key Points
Between January 2013 and December 2014, the two largest lakes on Earth rose at a record‐setting rate
We developed a Bayesian MCMC routine for inferring estimates of the water budget for this period
The cold 2013–2014 winter contributed to reduced evaporation rates and rising water levels
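A water-budget inference of the kind described, recovering one component from noisy lake-level changes with a Metropolis sampler, can be sketched on synthetic data (every value below is invented; the actual routine infers multiple budget components jointly):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly water balance (all terms in mm over the lake):
#   dL = P + R - E + net_flow + noise
p, r, net_flow = 80.0, 60.0, -20.0   # precipitation, runoff, channel net
true_e = 70.0                        # "unknown" evaporation to recover
obs_dl = p + r - true_e + net_flow + rng.normal(0.0, 5.0, size=24)

def log_posterior(e):
    """Flat prior on [0, 300] mm; Gaussian likelihood (sigma = 5 mm)."""
    if not 0.0 <= e <= 300.0:
        return -np.inf
    resid = obs_dl - (p + r - e + net_flow)
    return -0.5 * np.sum((resid / 5.0) ** 2)

# Minimal Metropolis sampler for the single unknown component.
samples, e_cur, lp_cur = [], 100.0, log_posterior(100.0)
for _ in range(5000):
    e_prop = e_cur + rng.normal(0.0, 2.0)
    lp_prop = log_posterior(e_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        e_cur, lp_cur = e_prop, lp_prop
    samples.append(e_cur)

post = np.array(samples[1000:])  # discard burn-in
print(f"posterior evaporation: {post.mean():.1f} +/- {post.std():.1f} mm")
```

The posterior concentrates near the true evaporation, with an uncertainty that reflects both the observation noise and the record length.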