Measurements of present‐day surface deformation are essential for the assessment of long‐term seismic hazard. The European Space Agency's Sentinel‐1 satellites enable global, high‐resolution observation of crustal motion from Interferometric Synthetic Aperture Radar (InSAR). We have developed automated InSAR processing systems that exploit the first ~5 years of Sentinel‐1 data to measure surface motions for the ~800,000 km² Anatolian region. Our new 3‐D velocity and strain rate fields illuminate deformation patterns dominated by westward motion of Anatolia relative to Eurasia, localized strain accumulation along the North and East Anatolian Faults, and rapid vertical signals associated with anthropogenic activities and, to a lesser extent, extension across the grabens of western Anatolia. We show that automatically processed Sentinel‐1 InSAR data can characterize details of the velocity and strain rate fields with high resolution and accuracy over large regions. These results are important for assessing the relationship between strain accumulation and release in earthquakes.
Plain Language Summary
Satellite‐based measurements of small rates of motion of the Earth's surface made at high spatial resolutions and over large areas are important for many geophysical applications including improving earthquake hazard models. We take advantage of recent advances in geodetic techniques in order to measure surface velocities and tectonic strain accumulation across the Anatolia region, including the highly seismogenic and often deadly North Anatolian Fault. We show that by combining Sentinel‐1 Interferometric Synthetic Aperture Radar (InSAR) data with Global Navigation Satellite System (GNSS) measurements we can enhance our view of surface deformation associated with active tectonics, the earthquake cycle, and anthropogenic processes.
Key Points
We produce high‐resolution horizontal and vertical velocity and strain rate fields for Anatolia from Sentinel‐1 InSAR and GNSS observations
Velocity gradients indicate shear strain accumulation along the North and East Anatolian Faults and extension across western Anatolia
InSAR data are critical for capturing high‐resolution details of the velocity and strain rate fields
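The step from a velocity field to a strain rate field mentioned in the key points can be sketched numerically. The following is a minimal illustration, not the study's processing chain; the velocity grids, spacing, and values are hypothetical, and the strain rate tensor is simply the symmetric part of the horizontal velocity gradient:

```python
import numpy as np

# Hypothetical east (ve) and north (vn) velocity grids in m/yr on a
# regular grid with 1 km spacing (values illustrative only).
spacing_m = 1000.0
x = np.arange(51) * spacing_m
ve = np.tile(1e-7 * x, (51, 1))   # east velocity increasing eastward
vn = np.zeros_like(ve)            # no north velocity in this toy case

# Velocity gradients (per year); np.gradient returns d/drow, d/dcol.
dve_dy, dve_dx = np.gradient(ve, spacing_m)
dvn_dy, dvn_dx = np.gradient(vn, spacing_m)

# 2-D strain rate tensor components (symmetric part of the gradient).
exx = dve_dx
eyy = dvn_dy
exy = 0.5 * (dve_dy + dvn_dx)

# Maximum shear strain rate, a common map quantity in such studies.
max_shear = np.sqrt(0.25 * (exx - eyy) ** 2 + exy ** 2)
print(float(max_shear.mean()))
```

For this uniform shear gradient the result is spatially constant, which is the degenerate case; real InSAR-derived fields show the localized gradients across faults described in the abstract.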
Adaptation in new environments depends on the amount of genetic variation available for evolution, and the efficacy by which natural selection discriminates among this variation. However, whether some ecological factors reveal more genetic variation, or impose stronger selection pressures than others, is typically not known. Here, we apply enzyme kinetic theory to show that rising global temperatures are predicted to intensify natural selection throughout the genome by increasing the effects of DNA sequence variation on protein stability. We test this prediction by (i) estimating temperature-dependent fitness effects of induced mutations in seed beetles adapted to ancestral or elevated temperature, and (ii) calculating 100 paired selection estimates on mutations in benign versus stressful environments from unicellular and multicellular organisms. Environmental stress did not increase mean selection on de novo mutation, suggesting that the cost of adaptation does not generally increase in new ecological settings to which the organism is maladapted. However, elevated temperature increased the mean strength of selection on genome-wide polymorphism, signified by increases in both mutation load and mutational variance in fitness. These results have important implications for genetic diversity gradients and the rate and repeatability of evolution under climate change.
•First measure of anaerobic failure energy of lithium ion batteries.
•Novel and simple bomb calorimeter method developed and demonstrated.
•Four different cathode chemistries examined.
•Full range of charged capacity used as independent variable.
•Failure energy identified as primary safety hazard.
The energy released by failure of rechargeable 18-mm diameter by 65-mm long cylindrical (18650) lithium ion cells/batteries was measured in a bomb calorimeter for 4 different commercial cathode chemistries over the full range of charge using a method developed for this purpose. Thermal runaway was induced by electrical resistance (Joule) heating of the cell in the nitrogen-filled pressure vessel (bomb) to preclude combustion. The total energy released by cell failure, ΔHf, was assumed to comprise the stored electrical energy E (cell potential×charge) and the chemical energy of mixing, reaction and thermal decomposition of the cell components, ΔUrxn. The contributions of E and ΔUrxn to ΔHf were determined, and the mass of volatile, combustible thermal decomposition products was measured, in an effort to characterize the fire safety hazard of rechargeable lithium ion cells.
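The energy balance described, ΔHf = E + ΔUrxn with E = cell potential × charge, can be sketched with a back-of-envelope calculation. All numbers below are hypothetical illustrations, not measured values from the study:

```python
# Hypothetical 18650 cell at full charge (values illustrative only).
cell_potential_v = 3.7   # nominal cell potential (V)
charge_ah = 2.6          # charge (Ah)

# Stored electrical energy E = potential x charge, converted to joules.
E_j = cell_potential_v * charge_ah * 3600.0   # 1 Ah = 3600 C

# Total failure energy measured in the bomb calorimeter (hypothetical).
dH_f_j = 60_000.0

# Chemical energy of mixing, reaction, and decomposition, inferred
# by difference: dU_rxn = dH_f - E.
dU_rxn_j = dH_f_j - E_j
print(round(E_j), round(dU_rxn_j))
```

The point of the partition is that only E scales directly with state of charge; ΔUrxn is obtained by subtracting E from the calorimetric total.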
Space-borne Synthetic Aperture Radar (SAR) Interferometry (InSAR) is now a key geophysical tool for surface deformation studies. The European Commission’s Sentinel-1 Constellation began acquiring data systematically in late 2014. The data, which are free and open access, have global coverage at moderate resolution with a 6 or 12-day revisit, enabling researchers to investigate large-scale surface deformation systematically through time. However, full exploitation of the potential of Sentinel-1 requires specific processing approaches as well as the efficient use of modern computing and data storage facilities. Here we present Looking Into Continents from Space with Synthetic Aperture Radar (LiCSAR), an operational system built for large-scale interferometric processing of Sentinel-1 data. LiCSAR is designed to automatically produce geocoded wrapped and unwrapped interferograms and coherence estimates, for large regions, at 0.001° resolution (WGS-84 coordinate system). The products are continuously updated at a frequency depending on prioritised regions (monthly, weekly or live update strategy). The products are open and freely accessible and downloadable through an online portal. We describe the algorithms, processing, and storage solutions implemented in LiCSAR, and show several case studies that use LiCSAR products to measure tectonic and volcanic deformation. We aim to accelerate the uptake of InSAR data by researchers as well as non-expert users by mass producing interferograms and derived products.
ABSTRACT
Efficient automated detection of flux-transient, re-occurring flux-variable, and moving objects is increasingly important for large-scale astronomical surveys. We present braai, a convolutional-neural-network, deep-learning real/bogus classifier designed to separate genuine astrophysical events and objects from false positive, or bogus, detections in the data of the Zwicky Transient Facility (ZTF), a new robotic time-domain survey currently in operation at the Palomar Observatory in California, USA. Braai demonstrates state-of-the-art performance as quantified by its low false negative and false positive rates. We describe the open-source software tools used internally at Caltech to archive and access ZTF’s alerts and light curves (kowalski), and to label the data (zwickyverse). We also report the initial results of the classifier deployment on the Edge Tensor Processing Units that show comparable performance in terms of accuracy, but in a much more (cost-) efficient manner, which has significant implications for current and future surveys.
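The false positive and false negative rates used to quantify real/bogus performance can be computed as follows. This is a generic metrics sketch, not braai's code; the labels and predictions are toy data:

```python
def false_rates(y_true, y_pred):
    """False positive and false negative rates for a binary
    real (1) / bogus (0) classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fpr = fp / (fp + tn) if fp + tn else 0.0   # bogus flagged as real
    fnr = fn / (fn + tp) if fn + tp else 0.0   # real events missed
    return fpr, fnr

# Toy example: 1 = real astrophysical event, 0 = bogus detection.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 0, 1, 0, 1]
print(false_rates(y_true, y_pred))
```

Keeping both rates low simultaneously is the hard part: lowering the decision threshold trades false negatives for false positives.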
Earthquakes are caused by the release of tectonic strain accumulated between events. Recent advances in satellite geodesy mean we can now measure this interseismic strain accumulation with a high degree of accuracy. But it remains unclear how to interpret short-term geodetic observations, measured over decades, when estimating the seismic hazard of faults accumulating strain over centuries. Here, we show that strain accumulation rates calculated from geodetic measurements around a major transform fault are constant for its entire 250-year interseismic period, except in the ~10 years following an earthquake. The shear strain rate history requires a weak fault zone embedded within a strong lower crust with viscosity greater than ~10²⁰ Pa s. The results support the notion that short-term geodetic observations can directly contribute to long-term seismic hazard assessment and suggest that lower-crustal viscosities derived from postseismic studies are not representative of the lower crust at all spatial and temporal scales.
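A viscosity bound of this kind maps onto a characteristic relaxation timescale through the Maxwell time τ = η/μ. The sketch below is a back-of-envelope illustration, not a calculation from the study; the shear modulus is an assumed typical crustal value, and the viscosity is taken as 10²⁰ Pa s for illustration:

```python
# Maxwell relaxation time tau = eta / mu for a viscoelastic lower crust.
eta = 1e20   # viscosity (Pa s), illustrative order of magnitude
mu = 3e10    # shear modulus (Pa), assumed typical crustal value

tau_s = eta / mu                        # relaxation time in seconds
tau_yr = tau_s / (365.25 * 24 * 3600)   # convert to years
print(round(tau_yr))
```

A century-scale Maxwell time for the bulk lower crust is much longer than a ~10-year postseismic transient, which is why a localized weak zone is invoked to explain the rapid early deformation.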
The Zwicky Transient Facility: Surveys and Scheduler. Bellm, Eric C.; Kulkarni, Shrinivas R.; Barlow, Tom; et al.
Publications of the Astronomical Society of the Pacific, 06/2019, Volume 131, Issue 1000.
Journal article, peer reviewed, open access.
We present a novel algorithm for scheduling the observations of time-domain imaging surveys. Our integer linear programming approach optimizes an observing plan for an entire night by assigning targets to temporal blocks, enabling strict control of the number of exposures obtained per field and minimizing filter changes. A subsequent optimization step minimizes slew times between each observation. Our optimization metric self-consistently weights contributions from time-varying airmass, seeing, and sky brightness to maximize the transient discovery rate. We describe the implementation of this algorithm on the surveys of the Zwicky Transient Facility and present its on-sky performance.
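The target-to-block assignment idea can be illustrated with a brute-force toy version. The survey itself uses an integer linear programming solver with many more constraints; the field names and scoring values below are hypothetical stand-ins for the airmass/seeing/sky-brightness metric:

```python
from itertools import permutations

# Toy scheduling: assign each target to one temporal block so that
# the total observing score is maximized.
targets = ["field_A", "field_B", "field_C"]
blocks = [0, 1, 2]

# score[target][block]: value of observing that target in that block
# (stand-in for the time-varying discovery-rate metric).
score = {
    "field_A": [0.9, 0.5, 0.1],
    "field_B": [0.4, 0.8, 0.3],
    "field_C": [0.2, 0.6, 0.7],
}

# Exhaustive search over one-to-one assignments; an ILP solver handles
# the real problem's size, exposure counts, and filter-change costs.
best = max(permutations(blocks),
           key=lambda perm: sum(score[t][b] for t, b in zip(targets, perm)))
plan = dict(zip(targets, best))
print(plan)
```

Exhaustive search is only viable for a handful of fields; the point of the ILP formulation is to make the same maximization tractable for a full night of hundreds of fields.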
A molecular-level fire growth parameter. Lyon, Richard E.; Safronava, Natallia; Crowley, Sean; et al.
Polymer Degradation and Stability, April 2021, Volume 186.
Journal article, peer reviewed.
•A simple burning model is used to connect the continuum-level fire response of polymers to molecular-level thermal decomposition kinetics.
•A micro (10⁻⁶ kg) scale fire growth capacity (FGC) is identified that includes ignitability and burning rate, and successfully ranks the performance of 32 polymers in 3 bench-scale fire tests.
•The measurement of FGC using a standard (ASTM D7309-19, Method A) microscale combustion calorimeter is described.
The relationship between the atomic composition of polymers and the fuel-generating reactions in fires has been studied at various levels using molecular dynamics simulations of polymer thermal decomposition, molar group contributions to combustion properties, and finite element models of burning that require properties obtained from thermal analyses. In this study, a conceptual model is used to link the molecular-level processes of flaming combustion measured in thermal analysis to the fire response of a polymer at the continuum level. A scaling parameter emerges from this analysis, called the fire growth capacity (FGC), that includes ignitability and burning rate. The FGC was measured in a microscale combustion calorimeter for 32 polymers having a wide range of atomic composition, thermal stability, and char-forming tendency. It was found that FGC successfully ranks the expected fire performance of these polymers when subjected to a small flame and radiant heat.
Current time domain facilities are finding several hundreds of transient astronomical events a year. The discovery rate is expected to increase in the future as new surveys such as the Zwicky Transient Facility (ZTF) and the Large Synoptic Survey Telescope (LSST) come online. Presently, the rate at which transients are classified is approximately one order of magnitude lower than the discovery rate, leading to an increasing "follow-up drought". Existing telescopes with moderate aperture can help address this deficit when equipped with spectrographs optimized for spectral classification. Here, we provide an overview of the design, operations and first results of the Spectral Energy Distribution Machine (SEDM), operating on the Palomar 60-inch telescope (P60). The instrument is optimized for classification and high observing efficiency. It combines a low-resolution (R ∼ 100) integral field unit (IFU) spectrograph with the "Rainbow Camera" (RC), a multi-band field-acquisition camera that also serves as a multi-band (ugri) photometer. The SEDM was commissioned during the operation of the intermediate Palomar Transient Factory (iPTF) and has already lived up to its promise. The success of the SEDM demonstrates the value of spectrographs optimized for spectral classification.
The ability to rapidly access optical satellite imagery is now an intrinsic component of managing the disaster response that follows a major earthquake. These images provide synoptic data on the impacts, extent, and intensity of damage, which is essential for mitigating further losses by feeding into the response coordination. However, whilst the efficiency of the response can be hampered when cloud cover limits image availability, spatio-temporal variations in cloud cover have never been considered as part of the design of effective disaster mapping. Here we show how annual variations in cloud cover may affect our capacity to respond rapidly throughout the year and consequently contribute to overall earthquake risk. We find that on a global scale when accounting for cloud, the worst time of year for an earthquake disaster is between June and August. During these months, 40% of the global population at risk from earthquakes are obscured from optical satellite view for >3 consecutive days. Southeastern Asia is particularly strongly affected, accounting for the majority of the population at risk from earthquakes that could be obscured by cloud in every month. Our results demonstrate the importance of the timing of earthquakes in terms of our capacity to respond effectively, highlighting the need for more intelligent design of disaster response that is not overly reliant on optical satellite imagery.
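The ">3 consecutive days obscured" criterion reduces to finding the longest run in a boolean daily cloud-cover series. A minimal sketch, with an illustrative toy cloud mask rather than real data:

```python
def max_consecutive(cloudy):
    """Longest run of consecutive cloud-obscured days in a boolean series."""
    longest = run = 0
    for day_obscured in cloudy:
        run = run + 1 if day_obscured else 0
        longest = max(longest, run)
    return longest

# Toy daily cloud mask for one location (True = optically obscured).
cloudy = [True, True, False, True, True, True, True, False, True]

# The >3-consecutive-day criterion from the analysis above.
obscured = max_consecutive(cloudy) > 3
print(max_consecutive(cloudy), obscured)
```

Applied per grid cell and per month, a run-length test of this form yields the population-at-risk statistics the abstract describes.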