The implementation of boundary conditions is a key aspect of climate simulations. We describe here how the Coupled Model Intercomparison Project Phase 6 (CMIP6) forcing data sets have been processed and implemented in Version 6 of the Institut Pierre‐Simon Laplace (IPSL) climate model (IPSL‐CM6A‐LR) as used for CMIP6. Details peculiar to some of the Model Intercomparison Projects are also described. IPSL‐CM6A‐LR is run without interactive chemistry; thus, tropospheric and stratospheric aerosols, as well as ozone, have to be prescribed. We improved the aerosol interpolation procedure and highlight a new methodology to adjust the ozone vertical profile in a way that is consistent with the model dynamical state at the time step level. The corresponding instantaneous and effective radiative forcings have been estimated and are presented where possible.
Plain Language Summary
The Coupled Model Intercomparison Project Phase 6 is an international project to compare the results of climate model simulations performed according to a common protocol. Such simulations require boundary conditions (called “climate forcings”), which are fed to the models in order to represent, for example, long‐lived greenhouse gases, ozone, atmospheric aerosols, or land surface properties. The same forcing data sets are used by the different modeling groups who carry out the Coupled Model Intercomparison Project Phase 6 simulations; however, their implementation may differ as it depends on the model structure. This article gives details of how these forcing data were implemented in the IPSL‐CM6A‐LR model. Some of the forcing data are common to all types of simulations, whereas others depend on the runs considered. Radiative forcings, as estimated in the model, are presented for some of the forcing mechanisms.
Key Points
We present how the CMIP6 forcing data were implemented in the IPSL‐CM6A‐LR climate model for the realization of the CMIP6 set of climate simulations
An improved conservative interpolation procedure for emissions, used to compute tropospheric aerosols, is detailed and illustrated
We present a new methodology to adjust the prescribed ozone vertical profile to match the model atmospheric dynamical state around the tropopause
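The conservative interpolation highlighted in the key points must preserve the total emitted mass when regridding emissions. The paper's actual procedure is not reproduced in the abstract; as a minimal sketch of the underlying principle only, the code below remaps a piecewise-constant 1D field between grids via cell overlaps. The function name `conservative_remap_1d` and all numbers are illustrative choices, not taken from the paper.

```python
import numpy as np

def conservative_remap_1d(src_edges, src_vals, dst_edges):
    """Remap cell-averaged values from source cells to destination cells,
    conserving the integral (sum of value * cell width)."""
    dst_vals = np.zeros(len(dst_edges) - 1)
    for j in range(len(dst_edges) - 1):
        lo, hi = dst_edges[j], dst_edges[j + 1]
        acc = 0.0
        for i in range(len(src_edges) - 1):
            # Width of the overlap between source cell i and target cell j
            overlap = min(hi, src_edges[i + 1]) - max(lo, src_edges[i])
            if overlap > 0:
                acc += src_vals[i] * overlap
        dst_vals[j] = acc / (hi - lo)
    return dst_vals

# Coarse emissions field (e.g. kg m^-2 s^-1 per latitude band, synthetic)
src_edges = np.array([0.0, 10.0, 20.0, 30.0])
src_vals = np.array([1.0, 3.0, 2.0])
dst_edges = np.linspace(0.0, 30.0, 7)  # finer target grid
dst_vals = conservative_remap_1d(src_edges, src_vals, dst_edges)

# The integral (total emitted mass) is preserved by construction
src_total = np.sum(src_vals * np.diff(src_edges))
dst_total = np.sum(dst_vals * np.diff(dst_edges))
print(abs(src_total - dst_total) < 1e-12)  # True
```

A 2D lat-lon version follows the same pattern with cell-area overlaps instead of interval overlaps.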
This paper describes the extension of the previous CMIP5-based high-resolution climate projections with additional ones based on the more recent climate projections from the CMIP6 experiment. The downscaling method and data processing are the same, but the reference dataset is now the ERA5-Land reanalysis (instead of ERA5 previously), which increases the resolution of the new downscaled projections from 0.25° x 0.25° to 0.1° x 0.1°. The extension comprises 5 climate models and includes 2 surface variables at daily resolution: air temperature and precipitation. Three greenhouse gas emission scenarios are available: a Shared Socioeconomic Pathway with mitigation policy (SSP1-2.6), an intermediate one (SSP2-4.5), and one without mitigation (SSP5-8.5).
A high-resolution climate projections dataset is obtained by statistically downscaling climate projections from the CMIP5 experiment using the ERA5 reanalysis from the Copernicus Climate Change Service. This global dataset has a spatial resolution of 0.25° x 0.25°, comprises 21 climate models, and includes 5 daily surface variables: air temperature (mean, minimum, and maximum), precipitation, and mean near-surface wind speed. Two greenhouse gas emission scenarios are available: one with mitigation policy (RCP4.5) and one without mitigation (RCP8.5). The downscaling method is a Quantile Mapping (QM) method called the Cumulative Distribution Function transform (CDF-t) method, which was first used for wind values and is now referenced in dozens of peer-reviewed publications. The data processing includes quality control of metadata according to the climate modeling community standards and value checking for outlier detection.
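CDF-t is a refinement of empirical quantile mapping that also accounts for how the model's distribution shifts between periods. The sketch below shows only the plain quantile-mapping building block, not the full CDF-t method; the function name `quantile_map` and the synthetic data are illustrative assumptions.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: map each model value to the observed
    value at the same quantile of the historical model distribution.
    (CDF-t extends this by transforming the CDF itself between periods.)"""
    # Quantile of each value within the historical model distribution
    q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    # Read off the observed distribution at those quantiles
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(0)
obs = rng.normal(15.0, 3.0, 5000)   # pseudo-observations (e.g. °C)
model = obs + 2.0                   # model with a systematic +2 °C bias
corrected = quantile_map(model, obs, model)

print("bias before:", round(float(np.mean(model) - np.mean(obs)), 2),
      "after:", round(float(np.mean(corrected) - np.mean(obs)), 2))
```

After mapping, the corrected series follows the observed distribution, which is the sense in which the method removes model bias.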
•A dataset for the energy sector is created to facilitate the use of climate data.
•Sub-ensemble selection is used to reduce the data size without losing information.
•All variables are bias-corrected for more effective use in impact studies.
•Bias-corrected model simulations indicate increased coherence regarding future projections.
•Data are freely accessible through the Earth System Grid Federation (ESGF) nodes.
Climate information is necessary for the energy sector. However, the use of climate projections has so far remained limited for a number of reasons, such as the lack of consistency among climate projections, inadequate temporal and spatial resolution, climate model biases, the lack of guidance for users, and the size of the data sets. In this work, we develop and assess a consistent ensemble of high temporal and spatial resolution climate projections that addresses these problems. First, a methodology for sub-ensemble selection is developed and proposed. Our ensemble dataset includes eleven 12 km-resolution EURO-CORDEX simulations of temperature, precipitation, wind speed and surface solar radiation on 3-hourly and daily time scales. These variables are bias-corrected for more effective use in impact studies. The assessment of bias-corrected model simulations against observational data indicates reduced biases and increased coherence in projected changes among models compared to the raw climate projections. We provide a well-documented dataset for energy practitioners and decision-makers to facilitate the access and use of energy-relevant high-quality climate information in operation and planning. The new dataset is freely available via the Earth System Grid Federation (ESGF) platform.
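The abstract does not detail the sub-ensemble selection methodology. As one hypothetical way such a selection could work, the sketch below uses greedy farthest-point sampling so that a few retained members span the spread of the full ensemble's climate-change signals; the function name `select_subensemble` and the random data are illustrative assumptions, not the paper's method.

```python
import numpy as np

def select_subensemble(signals, k):
    """Greedy farthest-point selection: pick k ensemble members whose
    climate-change signals best span the spread of the full ensemble.
    `signals` is (n_members, n_features), e.g. per-member regional
    temperature and precipitation changes."""
    # Start from the member closest to the ensemble mean
    chosen = [int(np.argmin(np.linalg.norm(signals - signals.mean(0), axis=1)))]
    while len(chosen) < k:
        # Distance of every member to its nearest already-chosen member
        d = np.min(
            [np.linalg.norm(signals - signals[c], axis=1) for c in chosen],
            axis=0,
        )
        chosen.append(int(np.argmax(d)))  # add the least-represented member
    return chosen

rng = np.random.default_rng(1)
signals = rng.normal(size=(11, 4))  # 11 members, 4 summary statistics each
subset = select_subensemble(signals, 5)
print(len(set(subset)))  # 5 distinct members
```

Any selection scheme of this kind trades data volume against coverage of the ensemble spread, which is the stated goal of reducing size "without losing information".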
The Tuning Strategy of IPSL‐CM6A‐LR
Mignot, Juliette; Hourdin, Frédéric; Deshayes, Julie ...
Journal of Advances in Modeling Earth Systems, 20/May, Volume 13, Issue 5
Journal Article
Peer reviewed
Open access
The assessment of current and future risks for natural and human systems associated with climate change largely relies on numerical simulations performed with state‐of‐the‐art climate models. Various steps are involved in the development of such models, from the development of individual components of the climate system up to the calibration of the free parameters of the fully coupled model. Here, we describe the final tuning phase for the IPSL‐CM6A‐LR climate model. This phase alone lasted more than 3 years and relied on several pillars: (i) tuning against present‐day conditions, given a small adjustment of the ocean surface albedo to compensate for the current oceanic heat uptake; (ii) the release of successive versions after adjustments of the individual components, implying a systematic and recurrent adjustment of the atmospheric energetics; and (iii) the use of a few metrics based on large‐scale variables, such as near‐global mean temperature and summer Arctic sea‐ice extent, as targets for the tuning. Successes, lessons and prospects of this tuning strategy are discussed.
Plain Language Summary
Evaluating current and future risks for natural and human systems associated with climate change is largely based on numerical simulations performed with models of the climate system, which includes the atmosphere, the land, the ocean, the cryosphere, and the oceanic and terrestrial biosphere. Various steps are involved in the development of such models. First, models for individual components are developed and tested. Second, many aspects are represented with parameterizations that summarize the effect of a missing process, such as those happening on scales that are smaller than the model grid sizes. The parameterizations in turn involve many parameters, sometimes poorly estimated from observations, that have to be calibrated. Here, we describe the final tuning phase of the IPSL‐CM6A‐LR climate model, which includes several novel aspects: first, the choice to calibrate the model against present‐day observations, which implies taking into account the transient nature of the observed climate; second, the systematic and recurrent adjustment of the atmospheric radiative budget; third, the use of a few large scale observable variables as targets. Successes, lessons and prospects of this tuning strategy are discussed.
Key Points
The tuning process of IPSL‐CM6A‐LR under present‐day control conditions is described
The associated continuous atmospheric energetics adjustment is presented
Successes, lessons and prospects of the IPSL‐CM6A‐LR tuning strategy are discussed
This paper analyzes the ensemble of regional climate model (RCM) projections for Europe completed within the EURO‐CORDEX project. Projections are available for the two greenhouse gas concentration scenarios RCP2.6 (22 members) and RCP8.5 (55 members) at 0.11° resolution from 11 RCMs driven by eight global climate models (GCMs). The RCM ensemble results are compared with the driving CMIP5 global models but also with a subset of available last generation CMIP6 projections. Maximum warming is projected by all ensembles in Northern Europe in winter, along with a maximum precipitation increase there; in summer, maximum warming occurs in the Mediterranean and Southern European regions associated with a maximum precipitation decrease. The CMIP6 ensemble shows the largest signals, both for temperature and precipitation, along with the largest inter‐model spread. There is a high model consensus across the ensembles on an increase of extreme precipitation and drought frequency in the Mediterranean region. Extreme temperature indices show an increase of heat extremes and a decrease of cold extremes, with CMIP6 showing the highest values and EURO‐CORDEX the finest spatial details. This data set of unprecedented size and quality will provide the basis for impact assessment and climate service activities for the European region.
Key Points
This paper presents the first regional climate model ensemble of this size to investigate and understand the climate change response over the whole of Europe
The paper confirms previous findings for mean and extreme climate change but also shows the added value of the high‐resolution regional ensemble
The paper assesses the regional and global model consensus in the projections and also presents the uncertainty of the signal
The extent to which climate conditions influenced the spatial distribution of hominin populations in the past is highly debated. General circulation models (GCMs) and archaeological data have been used to address this issue. Most GCMs are not currently capable of simulating past surface climate conditions with sufficiently detailed spatial resolution to distinguish areas of potential hominin habitat, however. In this paper, we propose a statistical downscaling method (SDM) for increasing the resolution of climate model outputs in a computationally efficient way. Our method uses a generalised additive model (GAM), calibrated over present-day climatology data, to statistically downscale temperature and precipitation time series from the outputs of a GCM simulating the climate of the Last Glacial Maximum (19 000–23 000 BP) over western Europe. Once the SDM is calibrated, we first interpolate the coarse-scale GCM outputs to the final resolution and then use the GAM to compute surface air temperature and precipitation levels using these interpolated GCM outputs and fine-resolution geographical variables such as topography and distance from an ocean. The GAM acts as a transfer function, capturing non-linear relationships between variables at different spatial scales and correcting for the GCM biases. We tested three different techniques for the first interpolation of GCM output: bilinear, bicubic and kriging. The resulting SDMs were evaluated by comparing downscaled temperature and precipitation at local sites with paleoclimate reconstructions based on paleoclimate archives (archaeozoological and palynological data), and the impact of the interpolation technique on patterns of variability was explored. The SDM based on kriging interpolation, providing the best accuracy, was then validated on present-day data outside of the calibration period. Our results show that the downscaled temperature and precipitation values are in good agreement with paleoclimate reconstructions at local sites, and that our method for producing fine-grained paleoclimate simulations is therefore suitable for conducting paleo-anthropological research. It is nonetheless important to calibrate the GAM on a range of data encompassing the data to be downscaled; otherwise, the SDM is likely to overcorrect the coarse-grain data. In addition, the bilinear and bicubic interpolation techniques were shown to distort either the temporal variability or the values of the response variables, while the kriging method offered the best compromise. Since climate variability is an aspect of the environment to which human populations may have responded in the past, the choice of interpolation technique is an important consideration.
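The first step of the method above, interpolating coarse GCM output to the target resolution, can be sketched for the bilinear and bicubic cases with SciPy's `RegularGridInterpolator` (the `"cubic"` method requires SciPy ≥ 1.9); kriging would need an additional geostatistics package and is omitted here. The grid extents and the synthetic temperature field are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse GCM temperature field on a 2-degree grid (synthetic, for illustration)
lat_c = np.arange(40.0, 52.1, 2.0)
lon_c = np.arange(-10.0, 10.1, 2.0)
LAT, LON = np.meshgrid(lat_c, lon_c, indexing="ij")
field = 15.0 - 0.6 * (LAT - 40.0) + 0.1 * np.sin(np.radians(LON) * 5)

# Fine 0.25-degree target grid over the same domain
lat_f = np.arange(40.0, 52.01, 0.25)
lon_f = np.arange(-10.0, 10.01, 0.25)
pts = np.array(np.meshgrid(lat_f, lon_f, indexing="ij")).reshape(2, -1).T

interp = {}
for method in ("linear", "cubic"):  # bilinear vs bicubic
    f = RegularGridInterpolator((lat_c, lon_c), field, method=method)
    interp[method] = f(pts).reshape(len(lat_f), len(lon_f))

# Bicubic output is smoother but can overshoot the coarse-cell values;
# in the paper's pipeline, a GAM then corrects these interpolated fields
# using fine-scale predictors (topography, distance from an ocean).
print(interp["linear"].shape)
```

This illustrates why the choice of interpolant matters: the two methods agree at coarse grid nodes but differ between them, which propagates into the downscaled variability.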
The use of regional climate model (RCM)‐based projections for providing regional climate information in research and climate service contexts is currently expanding very fast. This has been possible thanks to a considerable effort to develop comprehensive ensembles of RCM projections, especially for Europe, within the EURO‐CORDEX community (Jacob et al., 2014, 2020). As of the end of 2019, EURO‐CORDEX has produced a set of 55 historical and scenario projections (RCP8.5) using 8 driving global climate models (GCMs) and 11 RCMs. This article presents the ensemble, including its design. The analysis aims to better characterize the quality of the RCMs by evaluating these simulations over a number of classical climate variables and extreme and impact‐oriented indices for the period 1981–2010. For the main variables, the model simulations generally agree with observations and reanalyses. However, several systematic biases are found as well, with shared responsibilities among RCMs and GCMs: simulations are overall too cold, too wet, and too windy compared to available observations or reanalyses. Some simulations show strong systematic biases in temperature, others in precipitation or dynamical variables, but none of the models/simulations can be defined as the best or the worst on all criteria. The article aims at supporting a proper use of these simulations within a climate services context.
Plain Language Summary
This study analyses the ability of an unprecedentedly large ensemble of 55 regional climate simulations to properly simulate the climatology of several variables, extremes, and impact‐oriented indices over the European continent. This analysis should guide the use of regional climate projections in climate services development.
Key Points
Biases of an unprecedentedly large ensemble of 55 European climate simulations using 8 global climate models and 11 regional climate models are assessed
Climate variables, extremes, and impact‐oriented indices are assessed, indicating whether such ensemble can—or cannot—be used in climate service applications
Simulations are generally too wet, too cold and too windy, and the share of contributions to the bias from GCMs and RCMs is found to differ for each variable or index
The distribution of data contributed to the Coupled Model Intercomparison Project Phase 6 (CMIP6) is via the Earth System Grid Federation (ESGF). The ESGF is a network of internationally distributed sites that together work as a federated data archive. Data records from climate modelling institutes are published to the ESGF and then shared around the world. It is anticipated that CMIP6 will produce approximately 20 PB of data to be published and distributed via the ESGF. In addition to this large volume of data, a number of value-added CMIP6 services are required to interact with the ESGF; for example, the citation and errata services both interact with the ESGF but are not a core part of its infrastructure. With a number of interacting services and a large volume of data anticipated for CMIP6, the CMIP Data Node Operations Team (CDNOT) was formed. The CDNOT coordinated and implemented a series of CMIP6 preparation data challenges to test all the interacting components in the ESGF CMIP6 software ecosystem. This ensured that when CMIP6 data were released they could be reliably distributed.
Migraine is one of the most frequent and disabling neurological diseases. There is no curative treatment for migraine; drug treatments are not systematically effective and can cause adverse effects. Despite their limited place in the recommendations of learned societies, non-pharmacological treatments are attracting growing interest from the public and from practitioners, particularly for chronic diseases. While the benefit of cognitive therapies for migraine sufferers is well established, more recent work on physical activity and acupuncture highlights their benefit in patient care strategies.