The Global Earthquake Model aims to combine the main features of state-of-the-art science, global collaboration and buy-in, and transparency and openness in an initiative to calculate and communicate earthquake risk worldwide. One of the first steps towards this objective has been the open-source development and release of software for seismic hazard and risk assessment called the OpenQuake engine. This software comprises a set of calculators capable of computing human or economic losses for a collection of assets, caused either by a given scenario event or by considering the probability of all possible events that might occur within a region in a certain time span. This paper provides insight into the current status of the development of this tool and presents a comprehensive description of each calculator, with example results.
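As a rough illustration of the scenario-based calculation described above, the sketch below applies a hypothetical vulnerability function (a lognormal CDF of PGA with made-up parameters) to a two-asset portfolio and one ground-motion field; it is not OpenQuake's actual implementation.

```python
import math

def lognormal_cdf(x, median, beta):
    """CDF of a lognormal distribution parameterised by its median and
    logarithmic standard deviation beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

def scenario_loss(assets, gmf, vulnerability):
    """Expected loss for a collection of assets under one ground-motion
    field (gmf maps site id -> PGA in g)."""
    total = 0.0
    for asset in assets:
        pga = gmf[asset["site"]]
        # Mean damage ratio from the (hypothetical) vulnerability function.
        mdr = vulnerability(pga)
        total += asset["value"] * mdr
    return total

# Hypothetical two-asset portfolio and a single scenario event.
assets = [
    {"site": "A", "value": 1_000_000},
    {"site": "B", "value": 500_000},
]
gmf = {"A": 0.30, "B": 0.15}
# Hypothetical vulnerability: mean damage ratio as a lognormal CDF of PGA.
vuln = lambda pga: lognormal_cdf(pga, median=0.6, beta=0.7)
print(round(scenario_loss(assets, gmf, vuln)))
```

The probabilistic calculators generalise this by repeating such a computation over a stochastic event set and aggregating the losses by rate of occurrence.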
South America, and in particular the Andean countries, is exposed to high levels of seismic hazard which, combined with the high concentration of population and property, has led to an alarming potential for human and economic losses. Although several fragility models have been developed in recent decades for South America, and occasionally used in probabilistic risk analysis, these models were developed using distinct methodologies and assumptions, which renders any direct comparison of results across countries questionable and thus makes application at a regional level unreliable. This publication aims to derive a uniform fragility model for the most representative building classes in the Andean region, suitable for large-scale risk analysis. To this end, sets of single-degree-of-freedom oscillators were created and subjected to a series of ground motion records using nonlinear time-history analyses, and the resulting damage distributions were used to derive sets of fragility functions.
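The final step, deriving fragility functions from the damage outcomes of the time-history analyses, can be sketched as follows. The synthetic outcomes and the grid-search maximum-likelihood fit are illustrative assumptions, not the study's actual procedure.

```python
import math
import random

def lognormal_cdf(x, median, beta):
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

random.seed(1)
# Synthetic "analysis outcomes": for each record's PGA, whether the
# oscillator exceeded a given damage state, drawn from a known underlying
# fragility so the fit can be checked against it.
true_median, true_beta = 0.4, 0.5
pgas = [0.05 + 0.01 * i for i in range(100)]  # 0.05 .. 1.04 g
exceeded = [1 if random.random() < lognormal_cdf(p, true_median, true_beta) else 0
            for p in pgas]

def neg_log_lik(median, beta):
    """Negative log-likelihood of the binary outcomes under a lognormal
    fragility with the given parameters."""
    ll = 0.0
    for p, y in zip(pgas, exceeded):
        q = min(max(lognormal_cdf(p, median, beta), 1e-9), 1 - 1e-9)
        ll += math.log(q) if y else math.log(1.0 - q)
    return -ll

# Coarse grid search over the median (g) and the dispersion beta.
candidates = [(m / 100, b / 100) for m in range(20, 81, 2) for b in range(20, 101, 5)]
median_hat, beta_hat = min(candidates, key=lambda mb: neg_log_lik(*mb))
print("fitted median, beta:", median_hat, beta_hat)
```

In practice the fit would be repeated per building class and per damage state, with one fragility function per combination.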
In this article, we present a new data collection that combines information about earthquake damage with seismic shaking. Starting from the Da.D.O. database, which provides information on the damage to individual buildings subjected to past earthquake sequences in Italy, we have generated ShakeMaps for all the events with magnitude greater than 5.0 that contributed to these sequences. The sequences under examination are those of Irpinia 1980, Umbria-Marche 1997, Pollino 1998, Molise 2002, L'Aquila 2009 and Emilia 2012. In this way, we were able to combine, for a total of 117,695 buildings, the engineering parameters included in Da.D.O. (revised and reprocessed for this application) with ground shaking data for six different variables (namely, intensity on the MCS scale, PGA, PGV, and SA at 0.3 s, 1.0 s and 3.0 s). The potential applications of this data collection are numerous, ranging from the recalibration of fragility curves to the training of machine-learning models for quantifying earthquake damage. The data collection will be made available within Da.D.O., a platform of the Italian Department of Civil Protection developed by EUCENTRE.
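The combination of building-level damage records with ShakeMap values can be pictured as a simple join keyed on event and location; the field names and values below are hypothetical and do not reflect the actual Da.D.O. schema.

```python
# Hypothetical building damage records (Da.D.O.-like, schema invented here).
buildings = [
    {"id": 1, "event": "irpinia_1980", "site": (40.9, 15.2), "damage": "D3"},
    {"id": 2, "event": "emilia_2012", "site": (44.9, 11.2), "damage": "D1"},
]
# Hypothetical ShakeMap lookup: (event, site) -> shaking parameters.
shakemap = {
    ("irpinia_1980", (40.9, 15.2)): {"PGA": 0.31, "PGV": 24.0, "MCS": 8.5},
    ("emilia_2012", (44.9, 11.2)): {"PGA": 0.18, "PGV": 12.0, "MCS": 6.5},
}
# Join each building with the shaking it experienced.
merged = [{**b, **shakemap[(b["event"], b["site"])]} for b in buildings]
print(merged[0]["damage"], merged[0]["PGA"])
```

A real pipeline would interpolate each building's coordinates onto the ShakeMap grid rather than use an exact key match.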
The seismic vulnerability of single-story adobe dwellings located in Cusco, Peru, is studied using a mechanics-based procedure that considers the analysis of in-plane and out-of-plane failure mechanisms of walls. The capacity of each dwelling is expressed as a function of its displacement capacity and period of vibration and is evaluated for different damage limit states. The seismic demand has been obtained from several displacement response spectral shapes. From the comparison of capacity with demand, probabilities of limit-state exceedance have been obtained for different PGA values. The results indicate that fragility curves expressed in terms of PGA are strongly influenced by the response spectrum shape; however, this is not the case for fragility curves derived in terms of limit-state spectral displacement. Finally, fragility curves for dwellings located in Pisco, Peru, were computed and the probabilities of limit-state exceedance were compared with the data obtained from the 2007 Peruvian earthquake.
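The comparison of capacity with demand can be sketched as follows, assuming a lognormal displacement capacity and a demand proportional to PGA; all parameter values are hypothetical.

```python
import math

def exceedance_prob(demand, cap_median, cap_beta):
    """P(displacement capacity < demand), with lognormal capacity."""
    return 0.5 * (1.0 + math.erf(math.log(demand / cap_median) /
                                 (cap_beta * math.sqrt(2))))

def demand_from_pga(pga, coeff=0.05):
    """Hypothetical spectral displacement demand (m) proportional to PGA (g);
    the coefficient stands in for the period and spectral shape."""
    return coeff * pga

cap_median, cap_beta = 0.012, 0.4  # hypothetical limit-state capacity (m)
pga_values = (0.1, 0.2, 0.3, 0.4, 0.5)
fragility = {pga: exceedance_prob(demand_from_pga(pga), cap_median, cap_beta)
             for pga in pga_values}
print({p: round(v, 3) for p, v in fragility.items()})
```

Changing the coefficient in `demand_from_pga` (i.e., the spectral shape) shifts the whole PGA-based fragility curve, which is the sensitivity the abstract reports.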
Improving the textural quality of whole new potatoes with respect to firmness was investigated through the application of low-temperature blanching at 60, 65, 70, 75, 80, 90 and 100 °C for times up to 1 h. The rate of firmness degradation upon blanching, as measured using a 7 mm diameter probe attached to an Instron (4464) Universal Testing Machine, was significantly lower at 60–75 °C than at 80–100 °C over the investigated blanching times (P < 0.05). The activity of pectin methyl esterase (PME) was determined for whole new potatoes, with an optimum activity of 2.92 μmol/min/g at 65 °C for 15 min. The enzyme was rapidly inactivated after 15 min at 75 °C and after 5 min at both 80 and 90 °C. Processing was by immersion in a thermostatically controlled water bath at 90 or 100 °C for times up to 25 min, with and without blanching at 65 °C or 75 °C for 15 min. Shear force, as an indicator of firmness, was measured by shearing through the whole potato with a single blade (1 mm thickness) at a cross-head speed of 50 mm/min. Firmness was significantly higher (P < 0.05) for processed potatoes blanched at 65 °C than for those cooked at 95 or 100 °C without blanching. Low-temperature blanching offers the potential for improving the texture of processed whole new potatoes.
In Europe, the design of new structures according to modern regulations requires a uniform hazard spectrum for a given return period (e.g., 475 years). The underlying assumption is that the resulting collapse probability is uniform for all structures, regardless of their structural properties or location. However, the uncertainty in the collapse capacity and the differences between hazard curves at different sites lead to an unequal level of risk. This discrepancy is undesirable, given that some inhabitants will live in dwellings with lower seismic safety than others living in structures designed according to the same regulation. The estimation of risk-targeted hazard maps allows the definition of a design ground motion leading to a uniform level of risk. Using hundreds of fragility models developed for European buildings and hazard results from the SHARE project, we calculate risk-targeted hazard maps for a pre-established annual collapse probability.
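The risk-targeted calculation, i.e., finding the design ground motion whose implied annual collapse probability equals a target, can be sketched as follows. The power-law hazard curve, the fragility parameters and the target value are all illustrative assumptions.

```python
import math

def lognorm_cdf(x, median, beta):
    """Lognormal CDF parameterised by median and dispersion beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

def hazard(a, k0=1e-4, k=2.5):
    """Hypothetical power-law hazard curve: annual rate of exceeding PGA a (g)."""
    return k0 * a ** (-k)

def annual_collapse_prob(design_pga, beta=0.6, margin=2.0, n=2000):
    """Convolve a lognormal collapse fragility (median anchored at `margin`
    times the design PGA) with the hazard curve by numerical integration."""
    median = margin * design_pga
    lo, hi = 1e-3, 5.0
    grid = [lo * (hi / lo) ** (i / n) for i in range(n + 1)]
    lam = 0.0
    for a0, a1 in zip(grid, grid[1:]):
        d_rate = hazard(a0) - hazard(a1)  # occurrence rate within this PGA bin
        lam += d_rate * lognorm_cdf((a0 + a1) / 2.0, median, beta)
    return lam

# Bisection for the design PGA giving a target annual collapse probability
# of 2e-4 (roughly 1% in 50 years).
target, lo, hi = 2e-4, 0.01, 2.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if annual_collapse_prob(mid) > target:
        lo = mid  # collapse rate too high: raise the design PGA
    else:
        hi = mid
print(f"risk-targeted design PGA ~ {mid:.3f} g")
```

Repeating this inversion at every grid point of a hazard model, with site-specific hazard curves and building-stock fragilities, yields a risk-targeted hazard map.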
This paper presents a methodology for the appropriate treatment of variability in the process of building vulnerability assessment. Material, geometric and mechanical properties of the assessed building typologies are simulated through a Monte-Carlo sampling procedure in which the statistical distributions of these parameters are taken into account. Record selection is performed in accordance with conditional hazard-consistent distributions of a comprehensive set of intensity measures, and issues of sufficiency, efficiency, predictability and scaling robustness are addressed within the presented framework. Several intensity measures (IMs) are combined in the evaluation of building fragility and vulnerability, whereby fragility functions are established as the multivariate joint probability distribution of being in a sequential set of damage states. The vulnerability functions thus determined provide not only a mean damage ratio per level of seismic intensity, but full probabilistic distributions of damage ratio that reflect the ground-motion variability expected at the site of interest, as determined by the hazard-consistent conditional distribution of a set of sufficient intensity measures.
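The Monte-Carlo sampling step can be sketched as follows; the property distributions and the capacity proxy are purely illustrative assumptions, not the paper's actual models.

```python
import math
import random
import statistics

random.seed(42)

def sample_capacity():
    """One Monte-Carlo realisation of (hypothetical) building properties and
    a crude capacity proxy built from them; the distributions and the proxy
    formula are illustrative only."""
    fc = random.lognormvariate(math.log(25.0), 0.15)  # concrete strength, MPa
    t = random.gauss(0.24, 0.02)                      # wall thickness, m
    h = random.gauss(3.0, 0.1)                        # storey height, m
    return fc * t / h

samples = [sample_capacity() for _ in range(10_000)]
print(statistics.mean(samples), statistics.stdev(samples))
```

Each sampled realisation would then be analysed against hazard-consistent records, so that the spread of the capacity feeds directly into the fragility and vulnerability distributions.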
An analytical methodology is presented for estimating the "local personal risk", i.e., the annual probability of fatality for a hypothetical person continuously present in or near a building. The method combines the probability of partial and complete collapse mechanisms (fragility models) with the probability of death given those collapse mechanisms (consequence models) for a building stock exposed to ground shaking from induced seismicity.
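Under these definitions, the local personal risk is a hazard-weighted sum over ground-motion levels of mechanism probabilities times fatality probabilities. The sketch below uses hypothetical hazard rates, fragility and consequence parameters, and treats the two mechanisms as additive for simplicity.

```python
import math

def lognorm_cdf(x, median, beta):
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

# Hypothetical fragility parameters for two collapse mechanisms, and the
# probability of death given each mechanism (consequence model).
mechanisms = {
    "partial_collapse":  {"median": 0.30, "beta": 0.5, "p_death": 0.05},
    "complete_collapse": {"median": 0.60, "beta": 0.5, "p_death": 0.50},
}

# Hypothetical induced-seismicity hazard curve, discretised: annual rate of
# exceeding each PGA level (g).
pga_levels = [0.05, 0.1, 0.2, 0.4, 0.8]
exceed_rates = [1e-1, 3e-2, 5e-3, 5e-4, 2e-5]

# Occurrence rate per PGA bin = difference of successive exceedance rates;
# mechanism contributions are simply summed here (a simplification).
local_personal_risk = 0.0
for i, pga in enumerate(pga_levels):
    next_rate = exceed_rates[i + 1] if i + 1 < len(pga_levels) else 0.0
    rate = exceed_rates[i] - next_rate
    for m in mechanisms.values():
        p_mech = lognorm_cdf(pga, m["median"], m["beta"])
        local_personal_risk += rate * p_mech * m["p_death"]
print(f"LPR ~ {local_personal_risk:.2e} per year")
```

The resulting annual fatality probability can then be compared against a policy threshold (e.g., an individual-risk limit) for each building in the exposed stock.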