VALUE is an open European collaboration to intercompare downscaling approaches for climate change research, focusing on different validation aspects (marginal, temporal, extremes, spatial, process‐based, etc.). Here we describe the participating methods and first results from the first experiment, using “perfect” reanalysis (and reanalysis‐driven regional climate model (RCM)) predictors to assess the intrinsic performance of the methods for downscaling precipitation and temperatures over a set of 86 stations representative of the main climatic regions in Europe. This study constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date, covering the three common downscaling approaches (perfect prognosis, model output statistics—including bias correction—and weather generators) with a total of over 50 downscaling methods representative of the most common techniques.
Overall, most of the downscaling methods greatly reduce the raw (reanalysis or RCM) model biases, and no approach or technique appears superior in general, because method‐to‐method variability is large. The factors that most influence the results are the seasonal calibration of the methods (e.g., using a moving window) and their stochastic nature. The particular predictors used also play an important role where the comparison was possible, both for the validation results and for the strength of the predictor–predictand link, which indicates the local variability explained. However, the present study cannot give a conclusive assessment of the skill of the methods to simulate regional future climates; further experiments will soon be performed in the framework of the EURO‐CORDEX initiative (into which VALUE activities have now merged).
Finally, research transparency and reproducibility have been major concerns, and substantive steps have been taken toward both. In particular, the data needed to run the experiments are provided at http://www.value‐cost.eu/data, and data and validation results are available from the VALUE validation portal for further investigation: http://www.value‐cost.eu/validationportal.
The largest and most comprehensive intercomparison of statistical downscaling methods to date is presented, with a total of over 50 downscaling methods representative of the most common approaches and techniques. Overall, most of the downscaling methods greatly reduce raw model biases, and no approach is superior in general, owing to the large method‐to‐method variability. The main factors influencing biases in the mean and variance are the seasonal calibration of the methods and their stochastic nature.
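The bias-correction flavour of model output statistics mentioned above can be illustrated with a minimal empirical quantile-mapping sketch. The data and the `quantile_map` helper below are synthetic and illustrative only, not one of the VALUE methods:

```python
import numpy as np

def quantile_map(model_hist, obs, model_target):
    """Empirical quantile mapping: replace each target value with the
    observed quantile matching its rank in the historical model CDF."""
    # Percentile of each target value within the historical model distribution
    ranks = np.searchsorted(np.sort(model_hist), model_target) / len(model_hist)
    ranks = np.clip(ranks, 0.0, 1.0)
    # Map those percentiles onto the observed distribution
    return np.quantile(obs, ranks)

rng = np.random.default_rng(0)
obs = rng.normal(10.0, 3.0, 5000)                    # pseudo-observations
model = obs * 0.5 + 4.0 + rng.normal(0, 0.5, 5000)   # biased, damped "model"
corrected = quantile_map(model, obs, model)
print(np.mean(model) - np.mean(obs))       # raw mean bias
print(np.mean(corrected) - np.mean(obs))   # mean bias after correction (near zero)
```

Because the mapping matches the full distribution rather than just the mean, it also repairs the damped variance of the toy model, which mirrors why the abstract singles out biases in both the mean and the variance.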
In environmental research the importance of interfaces between the traditional knowledge fields in natural and social sciences is increasingly recognized. In coupled component modelling, the process of developing interface designs can support the communicative, social and cognitive integration between representatives of different knowledge fields. The task of integration is thereby not merely an additive procedure but has to be considered an important part of the research process. In our application, the development of a coupled component model facilitated an integrative assessment of the impact of climate change on snow conditions and skiing tourism in a typical Austrian ski resort. We elaborate the integration on two abstraction levels: a theoretical one and an applied one related to the case study. Beyond model output, the results presented here relate to the inter- and transdisciplinary development of the coupled component model and its interface design. We show how scientists from various disciplines and representatives from diverse societal fields jointly design interface tools. We identify joint model development – taking into consideration the different dimensions of integration – and recursive modelling as keys for successful inter- and transdisciplinary integration. Such integrative interface science can provide new insights which go beyond the sum of what can be learned from its disciplinary components.
• Work of a task force for communicative integration, social integration and cognitive integration.
• Examples of interfaces between the disciplinary submodels: variables, indicators and thresholds.
• Joint model development and recursive modelling as keys for successful inter- and transdisciplinary integration.
Large historical and future ensemble simulations from the Max-Planck Institute and the Canadian Earth System Models and from CMIP5 have been analysed to investigate the uncertainty due to internal variability in multi-decadal temperature and precipitation trends over Europe. Internal variability dominates the uncertainties in temperature and precipitation trends in all seasons at 30-year time scales. Locally, seasonal 30-year temperature trends deviate by up to ±3 °C from the ensemble mean trend. Thus, across the whole of Europe, local seasonal temperature changes until year 2050 ranging from below −1 °C to more than 4 °C are possible according to the model results. Up to 30% of all ensemble members show negative temperature trends until year 2050 in winter, and up to 10% in summer. Uncertainties of 30-year precipitation trends due to internal variability exceed the trends almost everywhere in Europe. Only in a few European regions do more than 75% of the members agree on the sign of the change until year 2050. In southern Sweden, minimum and maximum winter (summer) temperature trends in the next 30 years differ by up to 7 °C (5 °C) between individual members of the large model ensembles. Large positive temperature trends are linked to positive (negative) precipitation trends in winter (summer) in southern Sweden. This variability is attributed to the variability in large-scale atmospheric circulation trends, mainly due to internal atmospheric variability. We find only weak linkages between the variability of temperature trends and the dominant decadal to multi-decadal climate modes. This indicates that there is limited potential to predict the multi-decadal variability in climate trends. The main findings of our study are robust across the large ensembles from the different models used, but at the local scale the results also depend on the choice of model.
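The member-to-member spread of 30-year trends described above can be sketched with a toy ensemble: every member shares the same assumed forced warming, and differs only by independent AR(1) noise standing in for internal variability. All numbers (forcing rate, noise parameters, ensemble size) are illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(30)            # one 30-year trend window
forced = 0.03 * years            # assumed forced warming: 0.3 °C per decade
n_members = 100

trends = []
for _ in range(n_members):
    # AR(1) noise as a stand-in for internal (e.g., circulation) variability
    noise = np.zeros(30)
    for t in range(1, 30):
        noise[t] = 0.6 * noise[t - 1] + rng.normal(0, 0.5)
    series = forced + noise
    slope = np.polyfit(years, series, 1)[0]
    trends.append(slope * 10)    # convert to °C per decade

trends = np.array(trends)
# The min-to-max spread shows how internal variability alone can mask,
# or even reverse, a 30-year trend in a single realization
print(f"forced: 0.30, ensemble mean: {trends.mean():.2f}, "
      f"min: {trends.min():.2f}, max: {trends.max():.2f} °C/decade")
```

With persistent noise of realistic amplitude, individual members routinely produce trends far from the forced signal, which is the mechanism behind the negative winter trends reported for up to 30% of members.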
This study investigates present and future European heat wave magnitudes, represented by the Heat Wave Magnitude Index-daily (HWMId), for regional climate models (RCMs) and the driving global climate models (GCMs) over Europe. A subset of the large EURO-CORDEX ensemble is employed to study sources of uncertainty related to the choice of GCMs, RCMs, and their combinations. We initially compare the evaluation runs of the RCMs driven by ERA-Interim reanalysis to E-OBS (observation-based estimates), finding that the RCMs capture most of the observed spatial and temporal features of HWMId. With their higher resolution compared to GCMs, RCMs can reveal spatial features of HWMId associated with small-scale processes (e.g., orographic effects); moreover, RCMs represent large-scale features of HWMId satisfactorily (e.g., by reproducing the general pattern revealed by E-OBS, with high values in western coastal regions and low values in the eastern part). Our results indicate a clear added value of the RCMs compared to the driving GCMs. Forced with the emission scenario RCP8.5, all the GCM and RCM simulations consistently project a rise in HWMId at an exponential rate. However, the climate change signals projected by the GCMs are generally attenuated when downscaled by the RCMs, and the spatial pattern is also altered. The uncertainty in a simulated future change of heat wave magnitudes under global warming can be attributed almost equally to differences in model physics (as represented by different RCMs) and to the driving data associated with different GCMs. Regarding the uncertainty associated with RCM choice, a major factor is the different representation of orographic effects. No consistent spatial pattern in the ensemble spread associated with different GCMs is observed between the RCMs, suggesting that GCM uncertainties are transformed by the RCMs in a complex manner due to the nonlinear nature of model dynamics and physics.
In summary, our results support the use of dynamical downscaling for deriving regional climate realizations of heat wave magnitudes.
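For orientation, the HWMId idea can be sketched in simplified form: find runs of hot days and sum their daily magnitudes, normalised by the interquartile range of reference-period annual maxima. The published index uses calendar-day percentile thresholds and a 30-year reference period; here those are collapsed into fixed, illustrative values (`thresh`, `p25`, `p75`):

```python
import numpy as np

def hwmid(tmax, thresh, p25, p75, min_len=3):
    """Simplified Heat Wave Magnitude Index-daily: for each run of at least
    `min_len` days with tmax above `thresh`, sum the daily magnitudes
    (tmax - p25) / (p75 - p25) for days exceeding p25; return the largest
    such sum. p25/p75 stand in for the 25th/75th percentiles of
    reference-period annual maximum temperature."""
    best, run = 0.0, []
    for t in np.append(tmax, -np.inf):     # -inf sentinel flushes the last run
        if t > thresh:
            run.append(t)
        else:
            if len(run) >= min_len:        # a qualifying heat wave
                mags = [(x - p25) / (p75 - p25) for x in run if x > p25]
                best = max(best, sum(mags))
            run = []
    return best

# Toy daily-maximum series: a 5-day heat wave peaking at 36 °C
tmax = np.array([24, 25, 31, 33, 36, 34, 32, 26, 25])
print(hwmid(tmax, thresh=30.0, p25=28.0, p75=32.0))   # 6.5 for this toy series
```

The normalisation makes magnitudes comparable across stations with different climates, which is what lets a single index be mapped across all of Europe.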
OBJECTIVE To test the hypothesis that once-daily oral administration of atenolol would attenuate the heart rate response to isoproterenol for 24 hours. ANIMALS 20 healthy dogs. PROCEDURES A double-blind, randomized, placebo-controlled crossover study was conducted. Dogs were assigned to receive atenolol (1 mg/kg, PO, q 24 h) or a placebo for 5 to 7 days. After a washout period of 7 days, dogs then received the other treatment. Heart rate at rest (HRrest) and heart rate induced by administration of isoproterenol (HRiso) as a constant rate infusion (0.2 μg/kg/min for 5 to 7 minutes) were obtained by use of ECG 0, 0.25, 3, 6, 12, 18, and 24 hours after administration of the final dose of atenolol or the placebo. A mixed-model ANOVA was used to evaluate effects of treatment, time after drug or placebo administration, treatment-by-time interaction, period, and sequence on HRrest and HRiso. RESULTS Effects of sequence or period were not detected. There was a significant effect of treatment and the treatment-by-time interaction on HRiso. Atenolol significantly attenuated HRiso for 24 hours but did so maximally at 3 hours (least squares mean ± SE, 146 ± 5 beats/min and 208 ± 5 beats/min for atenolol and placebo, respectively). The effect at 24 hours was small (193 ± 5 beats/min and 206 ± 5 beats/min for atenolol and placebo, respectively). Atenolol had a small but significant effect on HRrest. CONCLUSIONS AND CLINICAL RELEVANCE This study of healthy dogs receiving atenolol supported a recommendation for a dosing interval < 24 hours.
This study evaluated an existing SNOMED-CT model for structured recording of heart murmur findings and compared it to a concept-dependent attributes model using content from SNOMED-CT.
The authors developed a model for recording heart murmur findings as an alternative to SNOMED-CT's use of Interprets and Has interpretation. A micro-nomenclature was then created to support each model using the subset and extension mechanisms described for SNOMED-CT. Each micro-nomenclature included a partonomy of cardiac cycle timing values. A mechanism for handling ranges of values was also devised. One hundred clinical heart murmurs were recorded using purpose-built recording software based on both models.
Each micro-nomenclature was extended through the addition of the same list of concepts. SNOMED role grouping was required in both models. All 100 clinical murmurs were described using each model. The only major differences between the two models were the number of relationship rows required for storage and the hierarchical assignments of concepts within the micro-nomenclatures.
The authors were able to capture 100 clinical heart murmurs with both models. Requirements for implementing the two models were virtually identical. In fact, data stored using these models could be easily interconverted. There is no apparent penalty for implementing either approach.
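The storage difference between the two recording styles can be sketched with hypothetical relationship rows. All concept and attribute names below are placeholders for illustration, not real SNOMED-CT identifiers:

```python
# Style 1: Interprets / Has interpretation, paired within one role group.
# Each characteristic of the murmur needs two rows: what is being
# interpreted and what the interpretation is.
murmur_interprets = [
    # (finding, attribute, value, role_group)
    ("HeartMurmur", "Interprets", "CardiacCycleTiming", 1),
    ("HeartMurmur", "HasInterpretation", "Systolic", 1),
]

# Style 2: a concept-dependent attribute carries the same information
# in a single row, because the attribute itself names the characteristic.
murmur_concept_dependent = [
    ("HeartMurmur", "MurmurTiming", "Systolic", 1),
]

def rows_needed(relationship_rows):
    """Storage cost of a recorded finding, in relationship rows."""
    return len(relationship_rows)

print(rows_needed(murmur_interprets), rows_needed(murmur_concept_dependent))
```

Since both styles carry the same (finding, characteristic, value) information, converting between them is a mechanical row transformation, which matches the abstract's observation that data stored under the two models could be easily interconverted.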
Objective-To determine whether pharmacokinetic analysis of data derived from a single IV dose of iohexol could be used to predict creatinine clearance, and to evaluate simplified methods for predicting serum clearance of iohexol with data derived from 2 or 3 blood samples in clinically normal foals. Animals-10 healthy foals. Procedure-Serum disposition of iohexol and exogenous creatinine clearance were determined simultaneously in each foal (5 males and 5 females). A 3-compartment model of iohexol serum disposition was selected via standard methods. Iohexol clearance calculated from the model was compared with creatinine clearance. Separate limited-sample models were created with various combinations of sample times from the terminal slope of the plasma concentration versus time profile for iohexol. Correction factors were determined for the limited-sample models, and iohexol clearance calculated via each method was compared with exogenous creatinine clearance by use of method comparison techniques. Results-Mean exogenous creatinine clearance was 2.17 mL/min/kg. The disposition of iohexol was best described by a 3-compartment open model. Mean clearance for iohexol was 2.15 mL/min/kg and was not significantly different from mean creatinine clearance. A method for predicting serum iohexol clearance based on a 2-sample protocol (3- and 4-hour samples) was developed. Conclusions and Clinical Relevance-Iohexol clearance can be used to predict exogenous creatinine clearance and can be determined from 2 blood samples taken after IV injection of iohexol. Appropriate correction factors for adult horses and horses with abnormal glomerular filtration rate remain to be determined.
Objective-To determine whether a limited sampling time method based on serum iohexol clearance (Cl(iohexol)) would yield estimates of glomerular filtration rate (GFR) in clinically normal horses similar to those obtained from plasma creatinine clearance (Cl(creatinine)). Animals-10 clinically normal adult horses. Procedures-A bolus of iohexol (150 mg/kg) was administered IV, and serum samples were obtained 5, 20, 40, 60, 120, 240, and 360 minutes after injection. Urinary clearance of exogenous creatinine was measured during three 20-minute periods. The GFR determined by use of serum Cl(iohexol) and plasma Cl(creatinine) was compared by use of limits of agreement plots. Results-Values obtained for plasma Cl(creatinine) ranged from 1.68 to 2.69 mL/min/kg (mean, 2.11 mL/min/kg). Mean serum Cl(iohexol) was 2.38 mL/min/kg (range, 1.95 to 3.33 mL/min/kg). Limits of agreement plots indicated good agreement between the methods. Conclusions and Clinical Relevance-Use of serum Cl(iohexol) yielded estimates of GFR in clinically normal adult horses similar to those obtained from plasma Cl(creatinine). This study was the first step in evaluating the use of serum Cl(iohexol) for estimating GFR in adult horses.
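The limited-sampling idea common to both studies can be sketched as a mono-exponential fit to two terminal-phase samples: clearance is dose divided by the extrapolated area under the concentration-time curve. The concentrations below are hypothetical, and the empirically determined correction factors the studies describe are represented only by a placeholder argument:

```python
import math

def two_sample_clearance(dose, c1, t1, c2, t2, correction=1.0):
    """Estimate clearance from two terminal-phase samples, assuming
    mono-exponential decay C(t) = A * exp(-k * t). A one-compartment fit
    misses the early distribution phase, which is why the studies applied
    an empirical correction factor; 1.0 here is just a placeholder."""
    k = math.log(c1 / c2) / (t2 - t1)   # terminal elimination rate (1/h)
    A = c1 * math.exp(k * t1)           # back-extrapolated intercept
    auc = A / k                         # area under the curve, 0 to infinity
    return correction * dose / auc      # clearance, (dose units)/(conc units)/h

# Hypothetical numbers: 150 mg/kg dose, samples drawn 3 h and 4 h post-injection
cl = two_sample_clearance(dose=150.0, c1=0.30, t1=3.0, c2=0.21, t2=4.0)
print(round(cl, 2))   # mL/kg/h for mg/kg dose and mg/mL concentrations
```

Ignoring the distribution phase overestimates AUC relative to the true multi-compartment profile, so the real correction factor inflates the raw estimate; determining it requires calibration against a reference method such as exogenous creatinine clearance.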
Two long-lasting high-pressure systems in summer 2018 led to persistent heatwaves over Scandinavia and other parts of Europe and an extended summer period, with devastating impacts on agriculture, infrastructure, and human life. We use five climate model ensembles and the unique 263-year-long Stockholm temperature time series, along with a composite 150-year-long time series for the whole of Sweden, to set the latest heatwave in the summer of 2018 into historical perspective. With 263 years of data, we can characterize the pre-industrial period well and see a clear upward trend in temperature, as well as upward trends in five heatwave indicators. With five climate model ensembles providing 20 580 simulated summers representing the latest 70 years, we analyse the likelihood of such a heat event and how unusual the 2018 Swedish summer actually was.
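The ensemble-likelihood question in the last sentence amounts to an empirical exceedance frequency: with enough simulated summers, the probability of a summer at least as hot as the observed one is simply the fraction of members that reach it. The distribution and threshold below are synthetic placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
# 20 580 simulated summer-mean temperatures (°C), a stand-in ensemble
sim_summer_means = rng.normal(16.0, 1.0, 20580)
observed_2018 = 19.0   # placeholder for the observed 2018 summer value

# Fraction of simulated summers at least as warm as the observed one
exceed = np.mean(sim_summer_means >= observed_2018)
print(f"exceedance probability: {exceed:.4f}")
# When exceed > 0, an equivalent return period is 1 / exceed summers
```

The advantage of a large multi-model ensemble over a single 263-year record is exactly this: rare events sit in the far tail, where empirical frequencies from a few hundred observed summers are too noisy to be useful.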