The sensitivity of atmospheric river (AR)‐induced precipitation to climate change is primarily driven by increases in atmospheric water vapor with warming. However, the rate at which AR‐based precipitation intensifies with warming, and whether this rate differs from that of non‐AR events, remain uncertain. This work uses multiple statistical models to estimate regional, extreme precipitation‐temperature scaling rates in California for AR and non‐AR events. Scaling rates are determined using cold‐season daily and hourly precipitation, along with multiple temperature variables to assess robustness of the results. We find that regional scaling rates for ARs are consistently larger than those for non‐ARs, especially for hourly event maxima (posterior median scaling rates of 5.7% and 2.4% per °C for ARs and non‐ARs, respectively). ARs remain near‐saturated (i.e., high relative humidity) and exhibit more lift and a stronger increase in specific humidity aloft with warming as compared to non‐ARs, helping to explain the difference in precipitation‐temperature scaling rates.
Plain Language Summary
Atmospheric rivers are long and narrow conveyor belts of water vapor in the sky that transport moisture in the lower atmosphere. They often produce extreme precipitation that can cause flooding along the west coast of the United States. The rate of intensification of atmospheric river‐based precipitation with warming is a fundamental scientific question with important implications for adapting civil infrastructure to climate change. This study aims to quantify the relationship between atmospheric river‐based extreme precipitation and temperature, and to determine how it differs from other precipitation events. To do so, we utilize multiple statistical methods to quantify cold‐season precipitation‐temperature scaling rates at weather stations across California. In our analysis, we consider different temperature variables, timescales of data, and ways of accumulating precipitation during events to determine the robustness of our results. Overall, we find that extreme precipitation increases faster with warming during atmospheric rivers compared to other types of events. This difference is linked to the fact that atmospheric rivers exhibit a more direct increase in moisture content and lift with warming. These results suggest that extreme precipitation during atmospheric rivers will increase faster than other events as temperatures rise in the future, which could accelerate damage linked to future flooding events.
Key Points
Precipitation‐temperature (P‐T) scaling rates during atmospheric rivers (ARs) are larger than other precipitation events in California
P‐T scaling rates for AR‐based events approach the Clausius‐Clapeyron rate (7% per °C) when using dew point temperature
The differences in P‐T scaling rates for ARs versus non‐ARs are linked to the sensitivity of moisture content and lifting mechanism to warming
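One common way to estimate precipitation‐temperature scaling rates of the kind reported above is temperature binning with a high‐percentile regression. The sketch below is a minimal illustration on synthetic data (all values, sample sizes, and the binning choices are hypothetical), not the study's actual Bayesian estimation procedure: events are binned by temperature, a high precipitation percentile is taken per bin, and the log‐percentile is regressed on temperature to yield a rate in % per °C, comparable against the ~7% per °C Clausius‐Clapeyron rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic event sample (hypothetical): event temperature (degC) and
# precipitation constructed to scale at ~7% per degC (Clausius-Clapeyron-like).
n = 20000
temp = rng.uniform(0.0, 20.0, n)
precip = rng.gamma(shape=2.0, scale=5.0, size=n) * np.exp(0.07 * temp)

def binned_scaling_rate(temp, precip, q=99, n_bins=10):
    """Estimate a precipitation-temperature scaling rate (% per degC):
    bin events on temperature, take the q-th precipitation percentile per
    bin, and regress the log-percentile on bin-mean temperature."""
    edges = np.quantile(temp, np.linspace(0.0, 1.0, n_bins + 1))
    t_mid, log_p = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (temp >= lo) & (temp < hi)
        if mask.sum() < 50:
            continue
        t_mid.append(temp[mask].mean())
        log_p.append(np.log(np.percentile(precip[mask], q)))
    slope = np.polyfit(t_mid, log_p, 1)[0]  # d ln(P) / dT
    return 100.0 * (np.exp(slope) - 1.0)    # percent per degC

rate = binned_scaling_rate(temp, precip)
```

Because the synthetic events are built with a 7% per °C dependence, the recovered rate should fall near that value; with real station data, separate AR and non‐AR subsets would each be run through the same estimator.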
Climate change introduces substantial uncertainty to water resources planning and raises the key question: when, or under what conditions, should adaptation occur? A number of recent studies aim to identify policies mapping future observations to actions—in other words, framing climate adaptation as an optimal control problem. This paper uses the control paradigm to review and classify recent dynamic planning studies according to their approaches to uncertainty characterization, policy structure, and solution methods. We propose a set of research gaps and opportunities in this area centered on the challenge of characterizing uncertainty, which prevents the unambiguous application of control methods to this problem. These include exogenous uncertainty in forcing, model structure, and parameters propagated through a chain of climate and hydrologic models; endogenous uncertainty in human‐environmental system dynamics across multiple scales; and sampling uncertainty due to the finite length of historical observations and future projections. Recognizing these challenges, several opportunities exist to improve the use of control methods for climate adaptation, namely, how problem context and understanding of climate processes might assist with uncertainty quantification and experimental design, out‐of‐sample validation and robustness of optimized adaptation policies, and monitoring and data assimilation, including trend detection, Bayesian inference, and indicator variable selection. We conclude with a summary of recommendations for dynamic water resources planning under climate change through the lens of optimal control.
Key Points
This paper reviews dynamic planning studies for water resources systems under climate change framed as optimal control problems
Multiple sources of uncertainty, including climate and human system dynamics, prevent the identification of an optimal adaptation policy
Research opportunities remain to link dynamic planning with climate process insights and to identify indicator variables for monitoring
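The observation‐to‐action framing described above can be illustrated with a toy trigger policy. The sketch below is deliberately simplified and entirely synthetic (the supply trend, trigger value, and capacity boost are hypothetical, and it is not a method drawn from any reviewed study): a system monitors a trailing mean of supply and expands capacity once an indicator crosses a threshold, then total shortage is compared against a static, never‐adapt baseline.

```python
import numpy as np

rng = np.random.default_rng(7)

years = np.arange(80)
demand = 100.0
# Hypothetical supply series with a declining trend plus interannual noise.
supply = 130.0 - 0.5 * years + 10.0 * rng.normal(size=years.size)

def total_shortage(supply, demand, trigger=None, boost=20.0, window=10):
    """Total shortage under a static system (trigger=None), or under a
    simple observation-to-action policy: add `boost` units of capacity
    once the trailing-mean supply drops below `trigger`."""
    added = 0.0
    total = 0.0
    for t in range(supply.size):
        if trigger is not None and t >= window and added == 0.0:
            if supply[t - window:t].mean() < trigger:
                added = boost
        total += max(0.0, demand - (supply[t] + added))
    return total

static = total_shortage(supply, demand)
adaptive = total_shortage(supply, demand, trigger=110.0)
```

A dynamic planning study would optimize the trigger and action (and characterize the uncertainties listed above) rather than fix them by hand, but the policy structure—a mapping from monitored observations to an adaptation action—is the same.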
We develop multioutput neural network models to predict flow‐duration curves (FDCs) in 9,203 ungaged locations in the Southeastern United States for six decades between 1950 and 2009. The model architecture contains multiple response variables in the output layer that correspond to individual quantiles along the FDC. During training, predictions are made for each quantile, and a combined loss function is used for back propagation and parameter updating. The loss function accounts for the covariance between the quantiles and generates physically consistent outputs (i.e., monotonically increasing quantiles with increasing nonexceedance probabilities). We use neural network dropout to generate posterior‐predictive distributions for FDCs and test model performance under cross validation. Finally, we demonstrate how local surrogate models, via the Local Interpretable Model‐agnostic Explanations method, can be used to infer the relation between basin characteristics and the predicted FDCs. Results suggest that multioutput neural network models can learn the monotonic relations between adjacent quantiles on an FDC; they result in better predictions than single‐output neural network models that predict each quantile independently, and basin characteristics are most useful for predicting smaller quantiles, whereas bias terms from neighboring quantiles are most informative for predicting higher quantiles.
Key Points
Multioutput neural networks (MNNs) generate monotonically increasing flow‐duration curves
Monte Carlo dropout captures uncertainty for estimating streamflow quantiles
Local surrogate models approximate how the MNN is using basin characteristics for each observation
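One common way to guarantee the monotonic, physically consistent quantile ordering described above is a cumulative‐increment output head: the network's first output is a free base value and each subsequent output adds a nonnegative increment. The NumPy sketch below illustrates only that transform on random "network outputs"; it is an assumption for illustration, not the paper's exact loss or architecture.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)) >= 0.
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def monotone_quantile_head(raw):
    """Map an unconstrained output vector (one value per FDC quantile) to
    nondecreasing quantile estimates: the first entry is a free base value
    and each later entry adds a nonnegative (softplus) increment."""
    base = raw[..., :1]
    increments = softplus(raw[..., 1:])
    return np.concatenate([base, base + np.cumsum(increments, axis=-1)],
                          axis=-1)

rng = np.random.default_rng(1)
raw = rng.normal(size=(5, 8))       # 5 basins x 8 quantiles, unconstrained
fdc = monotone_quantile_head(raw)   # quantiles now increase left to right
```

In a trained model the same transform would sit after the final dense layer, so monotonicity holds by construction for every prediction, including Monte Carlo dropout samples.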
Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate‐related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.
Key Points
Intermodel correlations reduce the information content of model ensembles
The variance of climate change pdfs grows if correlations are accounted for
Risk to local systems is underestimated if model correlations are ignored
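The variance effect described above can be worked out exactly for an idealized equicorrelated ensemble. The sketch below (ensemble size and correlation value are hypothetical) shows that the variance of the ensemble mean is larger once correlation is honored, equivalent to a smaller effective ensemble size.

```python
import numpy as np

# Idealized ensemble of m models whose projection errors are equicorrelated
# with correlation rho -- a crude stand-in for shared model genealogy.
m, rho, sigma = 20, 0.5, 1.0
cov = sigma**2 * (rho * np.ones((m, m)) + (1.0 - rho) * np.eye(m))

var_indep = sigma**2 / m                         # independence assumed
var_corr = np.ones(m) @ cov @ np.ones(m) / m**2  # correlation honored
m_eff = sigma**2 / var_corr                      # effective ensemble size
# Closed form: var_corr = sigma^2 * (1 + (m - 1) * rho) / m
```

With m = 20 and rho = 0.5 the effective ensemble size is under two "independent" models, which is why pdfs built under the independence assumption are too narrow and understate the likelihood of large changes.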
This study examines whether deep learning models can produce reliable future projections of streamflow under warming. We train a regional long short‐term memory network (LSTM) to daily streamflow in 15 watersheds in California and develop three process models (HYMOD, SAC‐SMA, and VIC) as benchmarks. We force all models with scenarios of warming and assess their hydrologic response, including shifts in the hydrograph and total runoff ratio. All process models show a shift to more winter runoff, reduced summer runoff, and a decline in the runoff ratio due to increased evapotranspiration. The LSTM predicts similar hydrograph shifts but in some watersheds predicts an unrealistic increase in the runoff ratio. We then test two alternative versions of the LSTM in which process model outputs are used as either additional training targets (i.e., multi‐output LSTM) or input features. Results indicate that the multi‐output LSTM does not correct the unrealistic streamflow projections under warming. The hybrid LSTM using estimates of evapotranspiration from SAC‐SMA as an additional input feature produces more realistic streamflow projections, but this does not hold for VIC or HYMOD. This suggests that the hybrid method depends on the fidelity of the process model. Finally, we test climate change responses under an LSTM trained to over 500 watersheds across the United States and find more realistic streamflow projections under warming. Ultimately, this work suggests that hybrid modeling may support the use of LSTMs for hydrologic projections under climate change, but so may training LSTMs to a large, diverse set of watersheds.
Plain Language Summary
Recent research has shown that deep learning models can outperform process models in hydrologic prediction and forecasting, but it is unclear whether they can be used to project streamflow response under climate change. The concern is that deep learning models will be unable to reliably extrapolate beyond the range of historical climate, whereas process models can leverage physics to make such projections. To address this question, this study trained a deep learning hydrologic model (a long short‐term memory network, or LSTM) using data from 15 watersheds in California and also trained three process‐based hydrologic models to the same watersheds for comparison. We also developed two other versions of the LSTM that use the output from the process models during training. We forced all models with the same scenarios of warming and compared their hydrologic response. The results suggested that the LSTM trained using process model data as input can improve the realism of streamflow projections under warming, but this is not guaranteed. We also conducted a similar experiment with an LSTM trained with data from over 500 watersheds and found more realistic hydrologic responses, suggesting deep learning models may provide more reliable projections when trained with more diverse data.
Key Points
A long short‐term memory network trained to 15 watersheds can produce misleading increases in annual runoff under significant warming
When also trained with outputs from process models, the regional network can produce more reliable runoff projections, but not always
A network trained to over 500 basins mostly produces realistic runoff projections with warming, but also runoff increases in glacial areas
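The two diagnostics used above to judge realism, hydrograph seasonality and the runoff ratio, are simple to compute. The sketch below applies them to synthetic monthly series (hypothetical stand-ins for model output, not data from the study): a snowmelt‐dominated baseline and a warmer scenario with earlier runoff and greater evapotranspiration losses.

```python
import numpy as np

def runoff_ratio(precip, runoff):
    """Long-term runoff ratio: total runoff / total precipitation."""
    return runoff.sum() / precip.sum()

def winter_fraction(runoff, months):
    """Fraction of total runoff occurring in December-February."""
    winter = np.isin(months, [12, 1, 2])
    return runoff[winter].sum() / runoff.sum()

# Synthetic monthly series over 10 years (all values hypothetical).
months = np.tile(np.arange(1, 13), 10)
precip = np.full(months.size, 100.0)
baseline = np.where(np.isin(months, [4, 5, 6]), 80.0, 30.0)  # spring melt peak
warming = np.where(np.isin(months, [12, 1, 2]), 60.0, 25.0)  # winter runoff, more ET
```

A physically plausible warming response, as produced by the process models, combines a larger winter fraction with a smaller runoff ratio; an LSTM projecting a rising runoff ratio under warming fails the second check.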
The conventional approach to the frequency analysis of extreme precipitation is complicated by non-stationarity resulting from climate variability and change. This study utilized a non-stationary frequency analysis to better understand the time-varying behavior of short-duration (1-, 6-, 12-, and 24-h) precipitation extremes at 65 weather stations scattered across South Korea. Trends in precipitation extremes were diagnosed with respect to both annual maximum precipitation (AMP) and peaks-over-threshold (POT) extremes. Non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models, with model parameters made a linear function of time, were applied to AMP and POT extremes, respectively. Trends detected using the Mann–Kendall test revealed that the stations showing an increasing trend in AMP extremes were concentrated in the mountainous areas (the northeast and southwest regions) of South Korea. Trend tests on POT extremes provided fairly different results, with a significantly reduced number of stations showing an increasing trend and with some stations showing a decreasing trend. For most stations showing a statistically significant trend, non-stationary GEV and GPD models significantly outperformed their stationary counterparts, particularly for precipitation extremes with shorter durations. Due to a significantly increasing trend in the POT frequency found at a considerable number of stations (about 10 stations for each rainfall duration), the performance of modeling POT extremes was further improved with a non-homogeneous Poisson model. The large differences in design storm estimates between stationary and non-stationary models (estimates from stationary models were significantly lower than those from non-stationary models) demonstrated the challenges of relying on the stationarity assumption when planning the design and management of water facilities.
This study also highlighted the need for caution when quantifying design storms from POT and AMP extremes by showing a large discrepancy between the estimates from the two approaches.
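A minimal version of the non-stationary comparison above can be sketched with a Gumbel model (the GEV shape → 0 limit), a fixed scale parameter, and coarse grid search; a real analysis would fit all GEV parameters by maximum likelihood and repeat for GPD/POT. All data below are synthetic, with a linear trend deliberately built into the location parameter.

```python
import numpy as np

def gumbel_nll(x, mu, sigma):
    """Negative log-likelihood of a Gumbel distribution (the GEV shape -> 0
    limit), used as a simplified stand-in for the full GEV model."""
    z = (x - mu) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))

rng = np.random.default_rng(3)
years = np.arange(50)
# Synthetic annual maxima with a linear trend in the location parameter.
amp = rng.gumbel(loc=40.0 + 0.4 * years, scale=8.0)

sigma = 8.0  # scale fixed for this sketch; a real fit would estimate it

# Stationary model: constant location mu, coarse grid search.
nll_stat = min(gumbel_nll(amp, mu, sigma)
               for mu in np.linspace(amp.min(), amp.max(), 200))

# Non-stationary model: mu(t) = mu0 + mu1 * t, coarse 2-D grid search.
nll_ns = min(gumbel_nll(amp, mu0 + mu1 * years, sigma)
             for mu0 in np.linspace(20.0, 60.0, 81)
             for mu1 in np.linspace(-1.0, 1.0, 81))

# Compare with AIC, penalizing the extra fitted location parameter.
aic_stat = 2 * 1 + 2 * nll_stat
aic_ns = 2 * 2 + 2 * nll_ns
```

Because the series has a genuine trend, the non-stationary model's extra parameter pays for itself and its AIC is lower, mirroring the result at trending stations in the study.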
A multivariate, multisite daily weather generator is presented for use in decision‐centric vulnerability assessments under climate change. The tool is envisioned to be useful for a wide range of socioeconomic and biophysical systems sensitive to different aspects of climate variability and change. The proposed stochastic model has several components, including (1) a wavelet decomposition coupled to an autoregressive model to account for structured, low‐frequency climate oscillations, (2) a Markov chain and k‐nearest‐neighbor (KNN) resampling scheme to simulate spatially distributed, multivariate weather variables over a region, and (3) a quantile mapping procedure to enforce long‐term distributional shifts in weather variables that result from prescribed climate changes. The Markov chain is used to better represent wet and dry spell statistics, while the KNN bootstrap resampler preserves the covariance structure between the weather variables and across space. The wavelet‐based autoregressive model is applied to annual climate over the region and used to modulate the Markov chain and KNN resampling, embedding appropriate low‐frequency structure within the daily weather generation process. Parameters can be altered in any of the components of the proposed model to enable the generation of realistic time series of climate variables that exhibit changes to both lower‐order and higher‐order statistics at long‐term (interannual), mid‐term (seasonal), and short‐term (daily) timescales. The tool can be coupled with impact models in a bottom‐up risk assessment to efficiently and exhaustively explore the potential climate changes under which a system is most vulnerable. An application of the weather generator is presented for the Connecticut River basin to demonstrate the tool's ability to generate a wide range of possible climate sequences over an extensive spatial domain.
Key Points
GCMs explore only a limited range of possible future climates
Stochastic models can be used to exhaustively explore climate change space
Robust adaptation strategies require testing over a large range of possible futures
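The Markov-chain occurrence component described above can be sketched as follows; the transition probabilities are hypothetical, and the KNN amount-resampling step is only indicated in a comment. The stationary wet-day frequency of the two-state chain provides a simple check on the simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical first-order Markov chain for daily precipitation occurrence.
p_ww = 0.65  # P(wet tomorrow | wet today)
p_wd = 0.25  # P(wet tomorrow | dry today)

def simulate_occurrence(n_days):
    """Simulate a wet/dry sequence; wet-day amounts and other weather
    variables would then be drawn by KNN resampling of observed days with
    the same state (not shown here)."""
    state = np.zeros(n_days, dtype=int)
    for t in range(1, n_days):
        p = p_ww if state[t - 1] == 1 else p_wd
        state[t] = rng.random() < p
    return state

occ = simulate_occurrence(100_000)
wet_frac = occ.mean()
pi_wet = p_wd / (1.0 - p_ww + p_wd)  # stationary wet-day frequency
```

Because p_ww > p_wd, wet days cluster into spells, which is exactly the wet/dry spell persistence the Markov chain is introduced to capture; shifting these probabilities is one lever for imposing climate changes in a bottom-up assessment.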
Sustaining food security under climate conditions expected for the 21st century will require that existing crop production systems simultaneously increase both productivity and resiliency to warmer and more variable climate conditions. In this study, we analyzed observational rainfed maize (Zea mays L.) yield data from major maize production areas of the US Corn Belt. These data included detailed information on crop management and genetics not typically available in observational studies, allowing us to better understand maize yield response to climate under variable management. Spatial variability in management variables across the study domain is coincident with spatial climate gradients. Regularized global and geographically weighted regression models were used to explore maize yield response to climate, management, genetics, and their interactions, while accounting for collinearity among them associated with corresponding scales of spatial variability. In contrast with recent analyses suggesting increased susceptibility to drought stress under higher plant populations in maize production, our analyses indicated that under moisture limitation, higher yields were achieved when high planting rates were coupled with delayed planting date. Maize genetic families that performed best with adequate moisture saw greater yield penalties under moisture limited conditions, while positive response to increased radiation was consistent among family lines. The magnitude of yield response to climate, management, and their interactions was also variable across the study domain, suggesting that information on crop management in spatial yield data can be used to better tailor local management practices to changes in yield potential resulting from agronomic advancements and changing local climate.
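A regularized regression with climate-by-management interaction terms, of the general kind described above, can be sketched in NumPy. The predictors, coefficients, and penalty below are synthetic and purely illustrative; this is not the study's fitted model, and the real analysis additionally used geographically weighted regression to let coefficients vary in space.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000

# Hypothetical standardized predictors.
moisture = rng.normal(size=n)  # moisture deficit
prate = rng.normal(size=n)     # planting rate
pdate = rng.normal(size=n)     # planting date

# Synthetic yield with a climate-by-management interaction (moisture x rate).
y = (10.0 - 2.0 * moisture + 0.5 * prate
     + 0.8 * moisture * prate + 0.3 * rng.normal(size=n))

# Design matrix with main effects, interactions, and an intercept column.
X = np.column_stack([moisture, prate, pdate,
                     moisture * prate, moisture * pdate,
                     np.ones(n)])
lam = 1.0
# Ridge (L2-regularized) least squares: beta = (X'X + lam I)^-1 X'y.
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```

The ridge penalty stabilizes the estimates when predictors are collinear, which matters here because management variables co-vary with climate gradients across space.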
•A novel weather regime-based stochastic weather generator simulates daily precipitation and temperature in California.
•The model reproduces a wide range of climate statistics and extremes with high fidelity at various spatiotemporal scales.
•The model’s strong performance statewide supports climate impact assessments on water systems across California’s watersheds.
This study is the first of a two-part series presenting a novel weather regime-based stochastic weather generator to support bottom-up climate vulnerability assessments of water systems in California. In Part 1 of this series, we present the details of model development and validation. The model is based on the identification and simulation of weather regimes, or large-scale patterns of atmospheric flow, which are then used to condition the simulation of local, daily weather at a 6 km resolution across the state. We conduct a thorough validation of a baseline, 1000-year model simulation to evaluate its ability to accurately simulate daily precipitation and minimum and maximum temperature at various spatial scales (grid cell, river basin) and temporal scales (daily, event-based, monthly, annual, inter-annual to decadal). Results show that the model effectively reproduces a large suite of climate statistics at these scales across the entire state, including moments, spells, dry and wet extremes, and extreme hot and cold periods. Moreover, the model successfully maintains spatial correlations and inter-variable relationships, enabling the use of model simulations in hydrologic and water resources analyses that span multiple watersheds across California. The weather generator can simulate physically plausible extreme events (e.g., multi-day extreme precipitation and severe drought) that extend beyond the worst case conditions observed historically, independent of climate change. Thus, the baseline simulation can be used to understand the impacts of natural climate variability on both flood and drought risk in regional water systems. Scenarios of climate change are discussed in Part 2.
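The regime-based simulation idea above can be sketched as a first-order Markov chain over regime labels, with local daily weather then drawn conditional on the active regime. The three-regime transition matrix below is hypothetical, and real regimes would first be identified from large-scale circulation fields (e.g., by clustering), which is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

def transition_matrix(labels, k):
    """Estimate a first-order Markov transition matrix from a daily
    sequence of weather-regime labels in {0, ..., k-1}."""
    counts = np.zeros((k, k))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1.0
    return counts / counts.sum(axis=1, keepdims=True)

def simulate_regimes(P, n_days, start=0):
    """Simulate a regime sequence; gridded daily precipitation and
    temperature would then be sampled conditional on the active regime
    (not shown here)."""
    seq = np.empty(n_days, dtype=int)
    seq[0] = start
    for t in range(1, n_days):
        seq[t] = rng.choice(P.shape[0], p=P[seq[t - 1]])
    return seq

# Hypothetical three-regime chain with persistent regimes.
true_P = np.array([[0.8, 0.1, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.1, 0.1, 0.8]])
labels = simulate_regimes(true_P, 20_000)
P_hat = transition_matrix(labels, 3)
```

Regime persistence (large diagonal entries) is what lets a generator of this kind produce multi-day extreme precipitation sequences and prolonged dry spells beyond the historical worst case.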