The increasing attention of stakeholders to extreme winds impacting the built environment is driving the adoption of probabilistic risk assessment methods, which aim at the stochastic modelling of three main risk components: hazard, vulnerability (or fragility), and exposure. Borrowing from seismic risk assessment, the hazard is typically expressed in terms of the exceedance rate of an intensity measure of the natural event, usually related to wind speed, and the risk metric is the expected annual loss or the exceedance rate of loss. On these premises, the extreme wind risk assessment software ERMESS has been developed for the risk assessment of portfolios of buildings. It integrates recent global- and regional-scale hazard maps for extreme wind events, that is, cyclones and tornadoes, and a database of more than five thousand building- and component-level wind vulnerability and fragility functions from the literature. A procedure to develop building-level fragility models, based on existing component-level fragility functions, was also developed and embedded in ERMESS. Finally, the exposure (i.e., consequence) models are based on information provided by the insurance industry. The paper illustrates the software by means of proof-of-concept applications that show how ERMESS can be effective in wind risk assessment.
According to the state of the art of structural engineering, the actions for the design or assessment of bridges should derive from a probabilistic (i.e., frequentist) characterization of the loads. Data from weigh-in-motion (WIM) systems can inform stochastic models for traffic loads. However, WIM is not widespread, and data of this kind are scarce in the literature and often not recent. For structural safety reasons, the 52 km long A3 highway in Italy, connecting the cities of Naples and Salerno, has been equipped with a WIM system that has been operational since the beginning of 2021. The system measures each vehicle transiting over the WIM devices, enabling the prevention of overloads on the many bridges featured in the transportation infrastructure. At the time of this writing, the WIM system has seen one year of uninterrupted operation, collecting more than thirty-six million datapoints. This short paper presents and discusses these WIM measurements, deriving the empirical distributions of traffic loads and making the original data available for further research and applications.
Force‐based seismic design involves the reduction of elastic spectra by introducing a behavior factor, q. This approach is widespread in engineering practice; however, recent studies have shown that structures consistently designed at different sites will not share the same level of seismic risk, which can be defined as the annual rate of the structure failing to meet a seismic performance objective, despite seismic actions having the same exceedance return period at all sites. This paper investigates whether the definition of site‐specific q factors can lead to uniform risk across sites characterized by varying levels of seismic hazard, based on the pushover curves of bare frame reinforced concrete buildings. These pushover curves are used to establish the backbones of equivalent single degree of freedom systems with varying lateral resistance. These systems are fictitiously placed at several Italian sites and their seismic failure risk is computed by integrating their fragility, assessed by means of incremental dynamic analysis, with each site's hazard curve. By assuming an arbitrary risk threshold, the same for all sites, the lateral strength leading to said threshold is determined and the corresponding behavior factor is back-calculated. As expected, risk‐targeted q factors tend to increase with decreasing seismic hazard and are highly sensitive to the shape of the hazard curve beyond the design return period. Coupled with the fact that at low-hazard sites lateral strength is determined by detailing for gravity‐load design and minimum code requirements, rather than by seismic design actions, the results suggest that q factor‐based design is unsuitable for ensuring territorially uniform seismic safety, yet it may be suitable for setting an upper bound to the annual failure probability.
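The core computation described above, integrating a fragility curve with a site hazard curve to obtain an annual failure rate and then back-calculating the strength that meets a risk target, can be sketched as follows. All numbers (the power-law hazard curve, the dispersion, the risk target) are illustrative placeholders, not values from the study:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import norm

def failure_rate(hazard, frag_median, frag_beta, im_grid):
    """Annual failure rate: integrate the fragility with the hazard curve,
    lambda_f = integral of P(F | im) * |d lambda(im) / d im| d im."""
    p_f = norm.cdf(np.log(im_grid / frag_median) / frag_beta)  # lognormal fragility
    dlam = -np.gradient(hazard(im_grid), im_grid)              # |hazard-curve slope|
    return trapezoid(p_f * dlam, im_grid)

# Illustrative power-law hazard curve: lambda(im) = k0 * im^(-k).
hazard = lambda im: 1e-4 * im ** -2.5
im = np.linspace(0.01, 5.0, 2000)

# Back-calculate the fragility median (a proxy for lateral strength)
# meeting an arbitrary risk target, via bisection on the median.
target = 2e-4
lo, hi = 0.05, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if failure_rate(hazard, mid, 0.4, im) > target:
        lo = mid   # rate too high: a stronger system (higher median) is needed
    else:
        hi = mid
print(f"median IM capacity meeting the target: {mid:.3f}")
```

With the strength so determined, a behavior factor follows by comparison with the elastic design demand, as the abstract outlines.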
Analytical methods in performance-based earthquake engineering (PBEE) typically employ suites of ground motion records to run dynamic analysis of a structure's computer model. The sample size of ground motions used in this context affects the estimation uncertainty that underlies the seismic risk metrics thus obtained. This article presents R2R-EU (record-to-record estimation uncertainty), a PBEE software tool that numerically implements various schemes for estimating structure-specific seismic fragility and for quantifying the estimation uncertainty behind seismic risk estimates, emanating from record-to-record variability in structural response. The software accepts as input the results of structural dynamic analysis under a set of accelerograms, together with seismic hazard curves. Estimation uncertainty is quantified by providing statistics, such as mean and variance, of the estimators of the failure rate and the fragility parameters (where applicable) and possibly their distribution. The user can choose the analysis method from resampling and/or simulation schemes belonging to the bootstrap family, the delta method, and other solutions from probability and statistics.
•Software tool for performance-based earthquake engineering research and applications.
•Structure-specific seismic fragility function fitting to the results of dynamic analysis.
•Incremental dynamic analysis, multi-stripe analysis and cloud analysis are supported.
•Risk metrics in the form of annual failure rates are obtained by integrating fragility with hazard.
•Estimation uncertainty behind fragility and risk estimates is inferred by various methods, including the bootstrap, the delta method and closed-form solutions.
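The kind of estimation-uncertainty quantification the last highlight refers to can be illustrated with a minimal nonparametric bootstrap sketch. The IM-capacity sample is synthetic and the moment fit is a generic choice, not necessarily R2R-EU's actual interface or estimator:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic IM capacities from a hypothetical IDA of n records
# (lognormal "truth"; illustrative values only).
n = 40
im_capacity = rng.lognormal(mean=np.log(0.8), sigma=0.35, size=n)

def fit_fragility(sample):
    """Moment fit of a lognormal fragility: median and log-std dispersion."""
    logs = np.log(sample)
    return np.exp(logs.mean()), logs.std(ddof=1)

# Nonparametric bootstrap: resample the records with replacement and refit,
# so the spread of the refitted parameters measures estimation uncertainty.
B = 2000
medians, betas = np.empty(B), np.empty(B)
for b in range(B):
    boot = rng.choice(im_capacity, size=n, replace=True)
    medians[b], betas[b] = fit_fragility(boot)

print(f"median: {medians.mean():.3f} +/- {medians.std(ddof=1):.3f}")
print(f"beta:   {betas.mean():.3f} +/- {betas.std(ddof=1):.3f}")
```

The bootstrap standard deviations are record-to-record estimation uncertainty: they shrink as the record set grows.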
Abstract
Risk‐targeted spectra for seismic design have been proposed in the earthquake engineering literature to harmonize seismic reliability for different structures designed at different sites. Such a proposal has been motivated by the fact that designing for uniform seismic hazard across building sites has been shown to generally lead to non‐uniform seismic risk. This note uses a case study implementation of the risk‐targeted design actions philosophy for seven Italian sites to showcase two noteworthy practical issues with this otherwise appealing approach. First, at lower‐to‐moderate‐hazard sites, which may be the majority in a country, design against gravity loads and detailing according to minimum code requirements can result in higher‐than‐anticipated overstrength, not commensurate with the adopted level of seismic design loads, thus deviating from the target reliability. This leaves real margins for the homogenization of reliability only at higher‐hazard sites. Second, risk‐targeted spectrum approaches typically require some a‐priori assumptions for structural fragility at low‐performance objectives, corresponding to high damage. These assumptions, among others, carry an implicit adoption of shaking intensity measures with which to express fragility, whose lack of sufficiency and efficiency may in turn reduce the level of homogenization of risk that can be achieved.
Earthquakes are clustered in time and space; therefore, structures may be subjected to multiple consecutive instances of potentially damaging shaking, with insufficient in‐between time for repair operations to take place. Methodologies to evaluate the risk dynamics in this situation require vulnerability models that are able to capture the transitions between damage states, from the intact conditions to failure, due to multiple damaging earthquakes, that is, state‐dependent fragility curves. One of the state‐of‐the‐art methods for the assessment of structure‐specific state‐dependent fragility curves relies on a variant of incremental dynamic analysis (IDA), often termed back‐to‐back or B2B‐IDA. The computational costs typically involved in B2B‐IDA motivate attempts to simplify the evaluation of state‐dependent fragility curves. This paper presents a simplified method for multi‐story moment‐resisting frame structures, based on pushover analysis in conjunction with a predictive model for the main features of a damaged structural system, such as residual deformations and loss of stiffness and/or strength. The predictive model enables the probabilistic definition of the post‐earthquake pushover curve of a damaged structural system, given the displacement demand imposed by a preceding damaging shock. The state‐dependent fragility curves are then evaluated via IDA of single‐degree‐of‐freedom oscillators based on these pushover curves. Illustrative applications validate the ability of the proposed methodology to provide state‐dependent fragilities at reduced computational cost compared to the back‐to‐back IDA method.
Summary
Response‐history nonlinear dynamic analysis is an analytical tool that often sees use in risk‐oriented earthquake engineering applications. In the context of performance‐based earthquake engineering, dynamic analysis serves to obtain a probabilistic description of seismic structural vulnerability. This typically involves subjecting a nonlinear numerical computer model to a set of ground motions that represent a sample of possible realizations of base acceleration at the site of interest. The analysis results are then used to calibrate a stochastic model that describes structural response as a function of shaking intensity. The sample size of the ground‐motion record set is nowadays usually governed by computational‐demand constraints, yet it directly affects the uncertainty in the estimation of seismic response. The present study uses analytical and numerical means to investigate the record sample size, n, required to achieve quantifiable levels of mean relative estimation error on seismic risk metrics. Regression‐based cloud analysis in the context of Cornell's reliability method and incremental dynamic analysis using various intensity measures were employed to derive a relation of the form Δ/n, where Δ is a parameter that depends on both the dispersion of structural responses and the shape of the hazard curve at the site. For the cases examined, n can be kept in the 40 to 100 range to achieve a 10% mean relative error. The study can help guide engineers towards an informed a‐priori assessment of the number of records needed to achieve a desired value for the coefficient of variation of the estimator of structural seismic risk.
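Taking the stated relation at face value (mean relative error of the risk estimator ≈ Δ/n), a back-of-the-envelope sketch of how a target error maps to a required record sample size; the Δ values below are placeholders consistent with the 40-to-100 range quoted above, not values computed in the study:

```python
import math

def required_n(delta, target_mre):
    """Smallest integer n such that delta / n <= target_mre,
    under the assumed mean-relative-error relation MRE = delta / n."""
    return math.ceil(delta / target_mre)

# Hypothetical Delta values (site/structure dependent) for a 10% target error.
for delta in (4.0, 7.0, 10.0):
    print(f"Delta = {delta:>4}: n >= {required_n(delta, 0.10)}")
```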
SPO2FRAG (Static PushOver to FRAGility) is introduced, a MATLAB®-coded software tool for estimating structure-specific seismic fragility curves of buildings, using the results of static pushover analysis. The SPO2FRAG tool (available online at http://wpage.unina.it/iuniervo/doc_en/SPO2FRAG.htm) eschews the need for computationally demanding dynamic analyses by simulating the results of incremental dynamic analysis via the SPO2IDA algorithm and an equivalent single-degree-of-freedom approximation of the structure. Subsequently, fragility functions may be calculated for multiple limit states, using the intensity-measure-based analytical approach. The damage thresholds may also be random variables, and uncertainty in the estimation of the fragility parameters may be explicitly accounted for. The research background underlying the various modules comprising SPO2FRAG is presented, together with an operational description of how the various functions are integrated within the software's graphical user interface. Two illustrative SPO2FRAG applications are also offered, using a steel and a reinforced concrete moment-resisting frame. Finally, the software's output is compared with the results of incremental dynamic analysis as validation of SPO2FRAG's effectiveness.
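The intensity-measure-based fragility calculation mentioned above can be illustrated in miniature: given a sample of IM capacities at a limit state (here synthetic, standing in for SPO2IDA-simulated IDA results), a lognormal fragility is fitted from the log-sample moments and evaluated. This is a generic sketch of the approach, not SPO2FRAG's internal implementation:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic IM capacities at a limit state, standing in for
# SPO2IDA-simulated IDA results (illustrative values only).
im_capacity = rng.lognormal(mean=np.log(0.6), sigma=0.3, size=44)

# IM-based fragility: P(failure | im) = P(IM capacity <= im),
# lognormal with parameters fitted via the log-sample moments.
eta = np.exp(np.log(im_capacity).mean())   # median IM capacity
beta = np.log(im_capacity).std(ddof=1)     # dispersion (log standard deviation)

def fragility(im):
    return norm.cdf(np.log(im / eta) / beta)

print(f"eta = {eta:.3f}, beta = {beta:.3f}")
print(f"P(failure | im = eta) = {fragility(eta):.2f}")  # 0.50 by construction
```

Repeating the fit per limit state, possibly with randomized damage thresholds, yields the multi-limit-state fragility sets the abstract describes.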