The simulation speed of two‐dimensional hydrodynamic flood models is a limiting factor when catchments are large, a considerable number of simulations is required (e.g., exploratory modeling, Monte‐Carlo flood simulations, or predicting probabilistic flood maps), or when there is a need for real‐time flood emergency management. Rapid Flood Models (RFMs) that rely only on topographic depressions and the water balance equation have been successfully implemented to predict maximum urban flood inundation depths within seconds to a few minutes. However, the preprocessing step (identification of depressions and their attributes) and the postprocessing step (marking up possible flow paths of flood water between flooded depressions) of RFMs are time consuming. In this study, we developed a new fast flood inundation model based on the cellular automata (CA) approach. The new model does not require the preprocessing and postprocessing steps of RFMs and therefore achieves higher simulation speeds. The performance of our new model, referred to as Cellular Automata fast flood evaluation (CA‐ffé), was compared to two well‐known hydrodynamic flood models (HEC‐RAS and TUFLOW) in 20 simulation experiments conducted in five different urban subcatchments. CA‐ffé predicted maximum inundation depths with reasonable accuracy in a matter of seconds to a few minutes for a single rainfall event simulation. The CA‐ffé model performed exceptionally well in areas with low‐lying depressions. However, in areas where floodwaters had higher momentum and velocity, the model often failed to reproduce the inundation depths calculated by HEC‐RAS or TUFLOW. A further key drawback of CA‐ffé is its inability to represent the temporal evolution of flooding and flow velocities.
Nevertheless, its ability to provide spatial flood extents and depths in a fraction of the time compared to its hydrodynamic counterparts is a significant advancement toward exploratory approaches for water systems planning, model‐based predictive control, and real‐time flood management.
Key Points
A rapid urban flood inundation model was developed using a novel cellular automata approach and tested against detailed hydrodynamic models
Our model successfully predicted maximum inundation depth caused by excessive rain and stormwater surcharges within seconds to a few minutes
Selecting appropriate ranges for the model's parameters is crucial for model performance
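The depression‐filling idea behind such cellular automata schemes can be sketched in a few lines. This is an illustrative toy, not the published CA‐ffé update rule: each sweep moves part of the water‐surface head difference from a wet cell to its lowest von Neumann neighbour until the surface settles, conserving volume by construction.

```python
import numpy as np

def ca_flood(dem, depth, n_iter=200, frac=0.5, tol=1e-9):
    """Toy CA water-spreading sketch (not the CA-ffe scheme): repeatedly
    move a fraction `frac` of the head difference from each wet cell to
    its lowest 4-neighbour until no cell can transfer water."""
    depth = depth.astype(float).copy()
    rows, cols = dem.shape
    for _ in range(n_iter):
        moved = False
        for i in range(rows):
            for j in range(cols):
                if depth[i, j] <= tol:
                    continue
                wse_c = dem[i, j] + depth[i, j]  # water surface elevation
                best, best_wse = None, wse_c
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        wse_n = dem[ni, nj] + depth[ni, nj]
                        if wse_n < best_wse:
                            best, best_wse = (ni, nj), wse_n
                if best is not None and wse_c - best_wse > tol:
                    dq = min(depth[i, j], frac * (wse_c - best_wse))
                    depth[i, j] -= dq   # water leaves the higher cell...
                    depth[best] += dq   # ...and pools in the lower one
                    moved = True
        if not moved:
            break
    return depth
```

On a small grid with a central pit, the water drains into the depression while total volume is conserved, which is the behaviour that lets such models skip the depression‐identification preprocessing of RFMs.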
Flood‐frequency curves, critical for water infrastructure design, are typically developed based on a stationary climate assumption. However, climate change is expected to violate this assumption. Here, we propose a new, climate‐informed methodology for estimating flood‐frequency curves under non‐stationary future climate conditions. The methodology develops an asynchronous, semiparametric local‐likelihood regression (ASLLR) model that relates moments of the annual maximum flood to climate variables using the generalized linear model. We estimate the first two marginal moments (MM) – the mean and variance – of the underlying log‐Pearson Type‐3 distribution from the ASLLR with monthly rainfall and temperature as predictors. The proposed methodology, ASLLR‐MM, is applied to 40 U.S. Geological Survey streamgages covering 18 water resources regions across the conterminous United States. A correction based on the aridity index was applied to the estimated variance, after which the ASLLR‐MM approach was evaluated with both historical (1951–2005) and projected (2006–2035, under RCP4.5 and RCP8.5) monthly precipitation and temperature from eight Global Circulation Models (GCMs) consisting of 39 ensemble members. The estimated flood‐frequency quantiles resulting from the ASLLR‐MM and GCM members compare well with the flood‐frequency quantiles estimated using the historical period of observed climate and flood information for humid basins, whereas the uncertainty in model estimates is higher in arid basins. Considering additional atmospheric and land‐surface conditions and a multi‐level model structure that includes other basins in a region could further improve model performance in arid basins.
Plain Language Summary
Reliable projection of future flood risk enables us to assess infrastructure risk and to develop contingency measures to reduce potential flood losses. We link the flooding process of a basin with relevant climatic variables to estimate flood quantiles. Given future projections of those climatic variables from global climate models, we propose a statistical approach to estimate flood risk. Estimated flood quantiles for both the observed and future periods show lower uncertainty in humid basins in the east, whereas arid basins show considerable uncertainty for both periods.
Key Points
A climate‐informed modeling framework is developed to estimate near‐term (10–30 years) flood risk over the conterminous United States
The developed framework estimates observed flood quantiles well using Global Circulation Models historical simulations
Estimated flood quantiles for both the observed and future periods show lower uncertainty in humid basins than in arid basins
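The moment‐regression idea can be illustrated with a deliberately simplified stand‐in: ordinary least squares in place of the semiparametric local‐likelihood fit, and a two‐parameter lognormal in place of the log‐Pearson Type‐3, so that a conditional flood quantile follows from the regressed mean and residual variance of the log annual maximum. All names here are illustrative, not the ASLLR‐MM implementation.

```python
import numpy as np
from statistics import NormalDist

def fit_moment_regression(X, log_q):
    """OLS stand-in for the ASLLR step: regress the mean of the log
    annual-max flood on climate covariates X (e.g. monthly rainfall and
    temperature anomalies), and take the residual variance as the second
    marginal moment."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, log_q, rcond=None)
    resid = log_q - A @ beta
    sigma2 = resid.var(ddof=A.shape[1])
    return beta, sigma2

def flood_quantile(x_new, beta, sigma2, p=0.99):
    """Conditional flood quantile assuming log-flood ~ Normal(mu(x), sigma2),
    a lognormal simplification of the paper's log-Pearson Type-3."""
    mu = beta[0] + float(np.dot(beta[1:], x_new))
    return float(np.exp(mu + NormalDist().inv_cdf(p) * np.sqrt(sigma2)))
```

Feeding GCM‐projected covariates into `flood_quantile` is what makes the resulting flood‐frequency curve climate‐informed rather than stationary.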
American engineers have done astounding things to bend the Mississippi River to their will: forcing one of its tributaries to flow uphill, transforming over a thousand miles of roiling currents into a placid staircase of water, and wresting the lower half of the river apart from its floodplain. American law has aided and abetted these feats. But despite our best efforts, so-called natural disasters continue to strike the Mississippi basin, as raging floodwaters decimate waterfront communities and abandoned towns literally crumble into the Gulf of Mexico. In some places, only the tombstones remain, leaning at odd angles as the underlying soil erodes away. A Century of Unnatural Disaster reveals that it is seductively deceptive - but horribly misleading - to call such catastrophes natural. Authors Christine A. Klein and Sandra B. Zellmer present a sympathetic account of the human dreams, pride, and foibles that got us to this point, weaving together engaging historical narratives and accessible law stories drawn from actual courtroom dramas. The authors deftly uncover the larger story of how the law reflects and even amplifies our ambivalent attitude toward nature - simultaneously revering wild rivers and places for what they are, while working feverishly to change them into something else. Despite their sobering revelations, the authors' final message is one of hope. Although the acknowledgement of human responsibility for unnatural disasters can lead to blame, guilt, and liability, it can also prod us to confront the consequences of our actions, leading to a liberating sense of possibility and to the knowledge necessary to avoid future disasters. Christine A. Klein is the Chesterfield Smith Professor of Law at the University of Florida Levin College of Law and is co-author of Natural Resources Law: A Place-Based Book of Problems and Cases (Aspen Publishers). Sandra B. Zellmer holds the Robert B. Daugherty Chair at the University of Nebraska College of Law and is co-author of Natural Resources Law (West).
In frequency analysis of annual maximum flood series (AMFS), incorporating information on the underlying “ordinary” daily streamflow events is a mechanism‐based way to improve the accuracy of flood risk estimation and support appropriate management. In previous studies related to flood nonstationarity, the classical norming constants method (C‐NCM) has been used to derive statistical parameters of the annual maximum flood distribution from the daily streamflow distribution. However, C‐NCM does not flexibly consider the feasible range of the scale parameter of annual maximum flood distributions, which can make it unable to provide sufficient and reliable models to fit AMFS. This paper investigates the potential of two alternative norming constants methods in fitting annual maximum floods: Hall's norming constants method (H‐NCM) and Fisher and Tippett's norming constants method (FT‐NCM). A comparative study of the three methods was carried out using streamflow series of 77 stations in the Yangtze and Yellow River basins, China. The results show that H‐NCM outperforms both C‐NCM and FT‐NCM at stations with a relatively low skewness coefficient of the AMFS. H‐NCM is therefore recommended for practical applications that use daily streamflow rather than small samples of flood events. Furthermore, when the skewness coefficient exceeds Cs,Gumbel ≈ 1.14, the overall modelling performance of all three methods deteriorates significantly. These findings provide a reference for the applicability of norming constants methods in flood frequency analysis.
The norming constants method deduces the distribution of the flood extreme value series from the distribution of the annual daily flow series. It breaks through the limitation of traditional flood frequency analysis, in which effective flood series information is lost, and expands the sample information of the flood series.
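The starting point all norming constants methods refine can be shown concretely: under an independence assumption, the CDF of the maximum of n daily flows with common CDF F is F(x)^n, and for an exponential daily tail the norming constants a_n = θ and b_n = θ·ln(n) pull this toward the Gumbel limit. The sketch below is this textbook extreme‐value result, not the C‐NCM/H‐NCM/FT‐NCM variants themselves.

```python
from math import exp, log

def exp_daily_cdf(x, theta=1.0):
    # exponential daily-flow CDF, a stand-in for the fitted daily distribution
    return 1.0 - exp(-x / theta) if x > 0 else 0.0

def annual_max_cdf(x, n=365, theta=1.0):
    # independence assumption: P(max of n daily values <= x) = F(x)^n
    return exp_daily_cdf(x, theta) ** n

def gumbel_limit(z):
    # limiting Gumbel CDF reached with norming constants a_n = theta,
    # b_n = theta * ln(n) for an exponential daily tail
    return exp(-exp(-z))
```

Evaluating `annual_max_cdf` at a_n·z + b_n for n = 365 already sits within about a percent of the Gumbel limit, which is why the choice and flexibility of the norming constants (the point the paper examines) matters more than the limiting form itself.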
Flood occurrence is increasing due to escalating urbanization and extreme climate change; hence, studies on this issue and on methods of flood monitoring and mapping are also increasing, with the aim of reducing the severe impacts of flood disasters. Advances in technologies such as light detection and ranging (LiDAR) systems have facilitated and improved flood applications. In a LiDAR system, a laser emits light that travels to the ground and reflects off objects such as buildings and trees. The reflected light energy returns to the sensor, and the time interval is recorded. Since conventional methods cannot produce high-resolution digital elevation model (DEM) data, which results in low accuracy of flood simulation results, LiDAR data are extensively used as an alternative. This review examines the potential and the applications of LiDAR-derived DEMs in flood studies. It also provides insight into the operating principles of different LiDAR systems, their components, and the advantages and disadvantages of each system. The paper discusses several topics relevant to flood studies from a LiDAR-derived DEM perspective. Furthermore, the challenges and future perspectives regarding LiDAR-derived DEM data for flood mapping and assessment are also reviewed. This study demonstrates that LiDAR-derived data are useful in flood risk management, especially in the future assessment of flood-related problems.
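The time‐of‐flight principle mentioned above converts the recorded interval directly into elevation: the pulse travels to the target and back, so the one‐way range is c·t/2. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(round_trip_time_s):
    """Range from a time-of-flight LiDAR return: the pulse covers the
    sensor-to-target distance twice, so divide the travel time by two."""
    return C * round_trip_time_s / 2.0
```

A 2 microsecond round trip thus corresponds to a target roughly 300 m below the sensor, and differencing such ranges across a swath is what yields the high‐resolution DEM.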
There is a chronic disconnection among purely probabilistic flood frequency analysis of flood hazards, flood risks, and hydrological flood mechanisms, which hampers our ability to assess future flood impacts. We present a vulnerability‐based approach to estimating riverine flood risk that accommodates a more direct linkage between decision‐relevant metrics of risk and the dominant mechanisms that cause riverine flooding. We adapt the conventional peaks‐over‐threshold (POT) framework to be used with extreme precipitation from different climate processes and rainfall‐runoff‐based model output. We quantify the probability that at least one adverse hydrologic threshold, potentially defined by stakeholders, will be exceeded within the next N years. This approach allows us to consider flood risk as the summation of risk from separate atmospheric mechanisms, and supports a more direct mapping between hazards and societal outcomes. We perform this analysis within a bottom‐up framework to consider the relevance and consequences of information, with varying levels of credibility, on changes to atmospheric patterns driving extreme precipitation events. We demonstrate our proposed approach using a case study for Fall Creek in Ithaca, NY, USA, where we estimate the risk of stakeholder‐defined flood metrics from three dominant mechanisms: summer convection, tropical cyclones, and spring rain and snowmelt. Using downscaled climate projections, we determine how flood risk associated with a subset of mechanisms may change in the future, and the resultant shift to annual flood risk. The flood risk approach we propose can provide powerful new insights into future flood threats.
Plain Language Summary
As the climate changes, we expect weather patterns to shift. We often discuss how strong and how frequent future storms might become. Scientists can estimate these changes with global models, called GCMs, though we have some difficulty explaining that some of our climate predictions are reliable (like future air temperature) while others are less reliable (like future rain intensity). Within a given region, we might know that floods are generally caused by certain storm types (e.g., tropical cyclones, frontal systems). Based on climate projections, we often have reason to believe that these storms might be changing, but we do not have a simple way of letting these predictions tell us about future flooding risk. We propose a method of estimating riverine flooding risk by looking at the dominant regional storm types that most commonly cause floods. We identify different types of storms and consider how often they occur and how intense they are at present. We use our approach to determine what the GCMs can say about the future climate with confidence, and how that might alter flooding risk. Our results suggest that in the Northeast U.S., changes to future risk are linked to the most uncertain aspects of future climate predictions.
Key Points
We present a modified peaks‐over‐threshold approach to riverine flood hazard incorporating relevant atmospheric characteristics
We employ a vulnerability‐based approach that facilitates a direct linkage between decision‐relevant risk metrics and flood mechanisms
Using downscaled climate projections, we demonstrate how information of varied reliability may be incorporated into a flood risk estimate
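The headline quantity, the probability that at least one adverse threshold is exceeded within the next N years, composes naturally across mechanisms. The sketch below assumes independent mechanisms with constant annual exceedance probabilities, a simplification of the paper's POT framework:

```python
def n_year_risk(annual_probs, n_years):
    """Probability of at least one threshold exceedance in n_years, given
    one annual exceedance probability per flood mechanism (e.g. summer
    convection, tropical cyclones, rain-on-snow), assumed independent and
    stationary -- an illustrative simplification."""
    p_none_one_year = 1.0
    for p in annual_probs:
        p_none_one_year *= (1.0 - p)          # no exceedance by any mechanism
    return 1.0 - p_none_one_year ** n_years    # complement over n years
```

Because the no‐exceedance probabilities multiply, a projected change in a single mechanism (say, tropical cyclone frequency) propagates directly into the combined N‐year risk without refitting the other mechanisms.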
The Peace‐Athabasca Delta in Alberta, Canada has numerous perched basins that are primarily recharged after large ice jams cause floods (an ecological benefit). Previous studies have estimated that such large floods are likely to decrease in frequency under various climate projections. However, there is a sizable uncertainty range in these predicted flood probabilities, in part due to the short 60‐year systematic record that contained few large ice jam floods. An additional 50 years of historical data are available from various sources, with expert‐interpreted flood categories; however, these categorizations are uncertain in magnitude and occurrence. We developed a Bayesian framework that considers magnitude and occurrence uncertainties within a logistic regression model that predicts the annual probability of a large flood. The Bayesian regression estimates the joint distribution of parameters describing the effects of climatic factors and parameters that describe the probability that historical flood magnitudes were recorded as large (or not) when a truly large (or not) flood occurred. We compare four models for hindcasting and projecting large ice jam flood probabilities in future climates. The models consider: (a) historical data uncertainty, (b) no historical data uncertainty, (c) only the systematic record, and (d) the systematic record with a different model. Neglecting historical data uncertainty provides inaccurate estimates, while using only the systematic record provides wider prediction intervals than considering the full record with uncertain historical data. Thus, we demonstrate that including uncertain historical information can effectively extend the record length and make flood frequency analyses more accurate and precise.
Key Points
We use a Bayesian logistic regression framework to estimate ice jam flood frequency while considering uncertainty in the historical record
We compare annual flood probabilities from a model trained with a systematic record to a model trained with additional historical data
Prediction intervals for projected climates are narrower when uncertain historical data are used
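The core of such a model is a likelihood that mixes the true (logistic) flood probability with label‐error probabilities: the chance a truly large flood was recorded as large, and the chance a not‐large flood was. The function below is an illustrative log‐likelihood in that spirit, not the paper's Bayesian formulation; all parameter names are assumptions.

```python
import numpy as np

def log_lik(params, x, y):
    """Log-likelihood sketch for logistic regression with uncertain
    historical labels. pi(x) is the true large-flood probability given a
    climate covariate x; p_tp / p_fp are the probabilities that a truly
    large / not-large flood was *recorded* as large."""
    b0, b1, p_tp, p_fp = params
    pi = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))   # true event probability
    p_rec = pi * p_tp + (1.0 - pi) * p_fp        # prob. the label says "large"
    p_rec = np.clip(p_rec, 1e-12, 1.0 - 1e-12)   # numerical guard
    return float(np.sum(y * np.log(p_rec) + (1 - y) * np.log(1 - p_rec)))
```

With p_tp = 1 and p_fp = 0 this collapses to ordinary logistic regression, so the systematic record and the uncertain historical record can share one likelihood, which is what lets the extra 50 years tighten the prediction intervals.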
Ensemble Forecast Operations (EFO) is a risk‐based approach to reservoir flood control operations that incorporates ensemble streamflow predictions (ESPs) made by the California‐Nevada River Forecast Center. Reservoir operations for each member of an ESP are individually modeled to forecast system conditions and calculate the risk of reaching critical operational thresholds. Reservoir release decisions are simulated to manage forecasted risk with respect to established risk tolerance levels. EFO was developed for Lake Mendocino, a 111,000 acre‐foot reservoir near Ukiah, California, to evaluate its viability to improve reservoir storage reliability without increasing downstream flood risk. Lake Mendocino is a dual use reservoir, owned and operated for flood control by the United States Army Corps of Engineers and operated for water supply by Sonoma Water. EFO was simulated using a 26‐year (1985–2010) ESP hindcast generated by the California‐Nevada River Forecast Center, which provides 61‐member ensembles of 15‐day flow forecasts. EFO simulations yield generally higher storage levels during the flood management season while maintaining needed flood storage capacity by strategically prereleasing water in advance of forecasted storms. Model results demonstrate a 33% increase in median storage at the end of the flood management season (10 May) over existing operations without marked changes in flood frequency for locations downstream from Lake Mendocino. EFO may be a viable alternative for managing flood control operations at Lake Mendocino that provides multiple benefits (water supply, flood mitigation, and ecosystems) and provides a management framework that could be adapted and applied to other flood control reservoirs.
Key Points
Ensemble Forecast Operations for Lake Mendocino is a new probabilistic decision support system for reservoir flood control operations
Ensemble Forecast Operations uses ensemble streamflow predictions to manage the forecasted risk of exceeding critical storage levels
Evaluation of Ensemble Forecast Operations demonstrates improved reservoir storage reliability for water supply and ecosystems
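The risk calculation at the heart of EFO‐style operations is simple to state: simulate storage for every ensemble member, take the fraction of members exceeding a critical threshold at each lead time, and compare that fraction against a risk‐tolerance curve. The sketch below is a generic illustration of that idea, not Sonoma Water's operational model; the decision rule is an assumed simplification.

```python
import numpy as np

def forecasted_risk(ensemble_storage, threshold):
    """Fraction of ensemble members whose simulated storage exceeds a
    critical threshold at each lead time. `ensemble_storage` has shape
    (members, lead_times), e.g. 61 members x 15 days in the hindcast."""
    return (np.asarray(ensemble_storage) > threshold).mean(axis=0)

def release_needed(risk, tolerance):
    """Simplified decision rule: trigger a prerelease whenever forecasted
    risk exceeds the tolerance at any lead time."""
    return bool(np.any(np.asarray(risk) > np.asarray(tolerance)))
```

Because risk is evaluated per lead time, water is prereleased only when enough members forecast trouble, which is how the approach holds higher storage in quiet periods without eroding the flood reservation.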
A Simple Model of Flood Peak Attenuation
Paiva, Rodrigo C. D.; Lima, Stefany G.
Water Resources Research, February 2024, Volume 60, Issue 2
Journal Article, Peer reviewed, Open access
A simple analytical model was developed for evaluating the attenuation of flood wave peak discharge. The physically‐based model represents the flood wave along its trajectory, based on the diffusive model. Relative peak discharge decreases along the downstream distance according to a power function. The distance is scaled by the attenuation factor related to river hydrodynamics (flow rating, hydraulic diffusivity, celerity, and floodplain storage) and the input hydrograph (initial peak discharge, hydrograph volume, and its relative curvature). It also informs the attenuation length, a practical indicator of the river distance over which discharge decreases by a given factor. Sensitivity analyses indicate that initial peak discharge, volume, floodplain storage, and slope are the governing factors of attenuation. The model's validity and accuracy were demonstrated by reproducing data from (a) numerical solutions of the Saint‐Venant equations covering a wide range of conditions, (b) 29 observations from 11 historical dam‐breaks, (c) 15 observations of natural floods in seven rivers, and (d) a detailed hydrodynamic model. The model errors were generally lower than 10% and not larger than the typical uncertainty of flood observations. The accuracy is higher than simplified empirical models and analogous to a detailed hydrodynamic model that is representative of current practice. The proposed flood attenuation model can be easily applied using a few common parameters and a simple equation in a basic spreadsheet. It is suitable for practical applications such as first assessments of natural and dam‐break floods, engineering design, and analyses of large river networks supported by remote sensing data.
Plain Language Summary
Floods are the most common and damaging natural disaster. Predicting how flood waves weaken while traveling along rivers is key to clarifying the risks of natural and dam‐break floods and supports engineering design, reservoir operation, and environmental analysis. We developed a simple and innovative physical model of flood wave attenuation. The model was accurate when tested against observations from historical dam‐break and natural floods and against sophisticated computer simulations covering a wide range of river types and flow conditions. Flood waves weaken more when their peak is large, their volume is low, and in low‐slope rivers with large floodplains. This simple and meaningful equation can be easily applied in practice and can help with massive mapping of floods over large regions.
Key Points
A simple physically‐based analytical model of flood wave peak attenuation is developed
The model is validated using numerical solutions of the Saint‐Venant equations and observations of historical dam breaks and natural floods
Flood wave attenuation is governed mostly by initial peak discharge, volume, floodplain storage, and river slope
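The structure described above, relative peak discharge decaying as a power function of downstream distance scaled by an attenuation factor, can be sketched generically. The exact functional form and the exponent below are illustrative placeholders, not the expression derived in the paper; the attenuation length then follows by inverting the decay for a chosen reduction factor.

```python
def relative_peak(x, x_star, b):
    """Generic power-function decay of relative peak discharge Qp(x)/Qp0
    with downstream distance x, scaled by an attenuation factor x_star.
    The form (1 + x/x_star)**(-b) and exponent b are illustrative only."""
    return (1.0 + x / x_star) ** (-b)

def attenuation_length(x_star, b, factor=2.0):
    """Distance over which the peak drops by `factor`, obtained from
    relative_peak(L, x_star, b) = 1/factor."""
    return x_star * (factor ** (1.0 / b) - 1.0)
```

A larger attenuation factor x_star (e.g. steep, channelized rivers with little floodplain storage) stretches the decay, matching the sensitivity results that slope and floodplain storage govern attenuation.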
The occurrence of heavy rainfall in the south-eastern hilly region of Bangladesh makes this area highly susceptible to recurrent flash flooding. As the region is the commercial capital of Bangladesh, these flash floods pose a significant threat to the national economy. Predicting this type of flooding is a complex task which requires a detailed understanding of the river basin characteristics. This study evaluated the susceptibility of the region to flash floods emanating from within the Karnaphuli and Sangu river basins. Twenty-two morphometric parameters were used. The occurrence and impact of flash floods within these basins are mainly associated with the volume of runoff, runoff velocity, and the surface infiltration capacity of the various watersheds. Analysis showed that major parts of the basin were susceptible to flash flooding events of a ‘moderate’-to-‘very high’ level of severity. The degree of susceptibility of ten of the watersheds was rated as ‘high’, and one was ‘very high’. The flash flood susceptibility map drawn from the analysis was used at the sub-district level to identify populated areas at risk. More than 80% of the total area of the 16 sub-districts was determined to have a ‘high’-to-‘very-high’-level flood susceptibility. The analysis noted that around 3.4 million people reside in flash flood-prone areas, indicating the potential for loss of life and property. The study identified significant flash flood potential zones within a region of national importance, and exposure of the population to these events. Detailed analysis and display of flash flood susceptibility data at the sub-district level can enable the relevant organizations to improve watershed management practices and, as a consequence, alleviate future flood risk.
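Morphometric susceptibility studies of this kind commonly combine many parameters through a compound‐factor ranking: each watershed is ranked per parameter (1 = most flood‐prone), the ranks are averaged, and the averages are binned into severity classes. The sketch below illustrates that generic procedure; the class breaks and ranking convention are assumptions, not those of this study.

```python
def compound_susceptibility(param_ranks):
    """Compound-factor sketch for morphometric flash-flood susceptibility:
    param_ranks maps a watershed id to its per-parameter ranks (1 = most
    flood-prone). Lower mean rank means higher susceptibility. Class
    breaks below are illustrative only."""
    cf = {w: sum(r) / len(r) for w, r in param_ranks.items()}

    def label(v):
        if v < 2.0:
            return "very high"
        if v < 3.0:
            return "high"
        if v < 4.0:
            return "moderate"
        return "low"

    return {w: (v, label(v)) for w, v in cf.items()}
```

Overlaying the resulting classes on sub‐district boundaries and population data is then what yields exposure estimates like the 3.4 million figure reported here.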