SUMMARY
The epidemic type aftershock sequence (ETAS) model provides a good description of the post-seismic spatio-temporal clustering of seismicity and is also able to capture some features of the increase of seismic activity caused by foreshocks. Recent results, however, have shown that the number of foreshocks observed in instrumental catalogues is significantly larger than the one predicted by the ETAS model. Here we show that it is possible to keep an epidemic description of post-seismic activity and, at the same time, to incorporate pre-seismic temporal clustering related to foreshocks. Also taking into account the short-term incompleteness of instrumental catalogues, we present a model which achieves a very good description of southern California seismicity on both the aftershock and the foreshock side. Our results indicate that the existence of a preparatory phase anticipating main shocks is the most plausible explanation for the occurrence of foreshocks.
SUMMARY
The evaluation of the b value of the Gutenberg–Richter (GR) law, for a sample of n earthquakes, presents a systematic positive bias δb proportional to 1/n. In this study, we show how to incorporate in δb the bias introduced by deviations from the GR law. More precisely, we show that δb is proportional to the square of the variability coefficient CV, defined as the ratio between the standard deviation of the magnitude distribution and its mean value. When the magnitude distribution follows the GR law, CV = 1, which allows us to introduce a new graphical procedure, based on the dependence of b on n, that identifies the incompleteness magnitude mc as the threshold magnitude leading to CV = 1. The method is tested on synthetic catalogues and applied to estimate mc in Southern California, Japan and New Zealand.
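The CV criterion above can be sketched numerically. The snippet below is a minimal illustration on a synthetic catalogue, assuming (as the GR law implies) that magnitudes above mc, measured relative to the threshold, are exponentially distributed, so that CV of (m − mth) tends to 1 once mth ≥ mc; the incompleteness model for small events is a crude, hypothetical thinning, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic catalogue: GR-distributed magnitudes above a true completeness
# magnitude mc_true = 2.0 with b = 1.0, contaminated below mc_true by a
# detection-thinned population (illustrative assumption only).
b_true, mc_true = 1.0, 2.0
beta = b_true * np.log(10)
complete = mc_true + rng.exponential(1 / beta, 20000)
small = rng.exponential(1 / beta, 20000)
# crude incompleteness: keep a small event with probability ~ m / mc_true
small = small[rng.random(small.size) < small / mc_true]
mags = np.concatenate([complete, small[small < mc_true]])

def cv(mags, mth):
    """Variability coefficient of (m - mth) for events with m >= mth.

    For a pure GR (exponential) distribution above mth, CV = 1; an
    incomplete sample below mc pushes CV away from 1."""
    x = mags[mags >= mth] - mth
    return x.std() / x.mean()

# Scan threshold magnitudes: CV approaches 1 once mth >= mc.
for mth in np.arange(0.0, 3.01, 0.5):
    print(f"mth = {mth:.1f}  CV = {cv(mags, mth):.2f}")
```

The exponential distribution is memoryless, so above mc the CV is insensitive to the exact threshold choice; the departure of CV from 1 at low mth is what flags incompleteness.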
LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and its interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.
The majority of strong earthquakes take place a few hours after a mainshock, motivating interest in real-time post-seismic forecasting, which is, however, very inefficient because of the incompleteness of available catalogs. Here we present a novel method that uses, as its only information, the ground velocity recorded during the first 30 min after the mainshock and does not require signals to be transferred to and processed by operational units. The method considers the logarithm of the mainshock ground velocity, its peak value, defined as the perceived magnitude, and the subsequent temporal decay. We conduct a forecast test on the nine M ≥ 6 mainshocks that have occurred since 2013 in the Aegean area. We are able to forecast the number of aftershocks recorded during the first 3 days after each mainshock with an error smaller than 18% in all cases but one, where the error is 36%.
The b‐value in earthquake magnitude‐frequency distribution quantifies the relative frequency of large versus small earthquakes. Monitoring its evolution could provide fundamental insights into temporal variations of stress on different fault patches. However, genuine b‐value changes are often difficult to distinguish from artificial ones induced by temporal variations of the detection threshold. A highly innovative and effective solution to this issue has recently been proposed by van der Elst (2021, https://doi.org/10.1029/2020jb021027) by means of the b‐positive estimator, which is based on analyzing only the positive differences in magnitude between successive earthquakes. Here, we demonstrate the robustness of the estimator, which remains largely unaffected by detection issues due to the properties of conditional probability. We illustrate that this robustness can be further improved by considering positive differences in magnitude, not only between successive earthquakes but also between different pairs of earthquakes. This generalized approach, defined as the “b‐more‐positive estimator,” enhances efficiency by providing a precise estimate of the b‐value while including a larger number of earthquakes from an incomplete catalog. However, our analysis reveals that the accuracy of the b estimators diminishes when earthquakes below the completeness threshold are included in the catalog. This leads to the paradoxical observation that greater efficiency is achieved when the catalog is more incomplete. To address this, we introduce the “b‐more‐incomplete estimator,” where the b‐more‐positive estimator is applied only after artificially filtering the instrumental catalog to make it more incomplete. Our findings show the superior efficiency of the b‐more‐incomplete method.
Plain Language Summary
Earthquake magnitudes can vary widely, and the b‐value is a common metric used to measure the frequency of earthquakes with large versus small magnitudes. In addition, the b‐value could serve as an indicator of the stress state of different fault patches, making it a valuable tool in earthquake research. However, since small earthquakes are often obscured by previous larger ones, determining whether changes in the b‐value are genuine or simply caused by detection problems can be challenging. To address this issue, a new approach called the b‐positive estimator has been recently developed. The method only considers positive changes in magnitude between successive earthquakes. In this study, we confirm that the b‐positive estimator is a powerful and effective technique to estimate the b‐value and is largely unaffected by issues related to detecting earthquakes. We extend the method by considering positive differences in magnitude, encompassing not only successive earthquakes but also different pairs of earthquakes. In particular we show that the b‐positive estimator is more efficient when the catalog is more incomplete. This allows us to develop modifications to the b‐positive method providing a more efficient tool to monitor the b‐value in ongoing seismic sequences.
Key Points
The conditional probability of detecting consecutive earthquakes leaves the distribution of positive magnitude differences only weakly affected by incompleteness
The b‐positive estimator can be enhanced by including more earthquake pairs, not only consecutive ones
The b‐positive estimator can be enhanced by making the catalog more incomplete
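The core of the b-positive idea described above can be sketched in a few lines. This is a minimal illustration on a synthetic, continuous-magnitude catalogue (real catalogues are binned, typically at 0.1 units, which requires a small correction not shown here); the difference threshold `dm_min` is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def b_positive(mags, dm_min=0.1):
    """Minimal sketch of a b-positive-style estimator (after van der Elst, 2021).

    Uses only positive magnitude differences between successive events.
    For GR magnitudes, the differences follow a Laplace distribution whose
    positive side is exponential with the same beta = b*ln(10), so an
    Aki-type estimator applies with dm_min playing the role of threshold."""
    d = np.diff(mags)
    d = d[d >= dm_min]  # keep only positive differences above dm_min
    return 1.0 / (np.log(10) * (d.mean() - dm_min))

# Synthetic GR catalogue with b = 1: the estimator recovers b from the
# differences without ever using the completeness magnitude.
b_true = 1.0
mags = 2.0 + rng.exponential(1 / (b_true * np.log(10)), 50000)
print(f"b-positive estimate: {b_positive(mags):.3f}")
```

The robustness discussed in the abstract comes from the differencing itself: a small event that follows a detected event is likely to be detected too, so the positive-difference sample is far less distorted by incompleteness than the raw magnitude sample.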
The ICARUS collaboration employed the 760-ton T600 detector in a successful 3-year physics run at the underground LNGS laboratory, performing a sensitive search for LSND-like anomalous ν_e appearance in the CERN Neutrino to Gran Sasso beam, which contributed to constraining the allowed neutrino oscillation parameters to a narrow region around 1 eV². After a significant overhaul at CERN, the T600 detector has been installed at Fermilab. In 2020 the cryogenic commissioning began with detector cool down, liquid argon filling and recirculation. ICARUS then started its operations, collecting the first neutrino events from the Booster Neutrino Beam (BNB) and the Neutrinos at the Main Injector (NuMI) beam off-axis, which were used to test the ICARUS event selection, reconstruction and analysis algorithms. ICARUS successfully completed its commissioning phase in June 2022. The first goal of the ICARUS data taking will be a study to either confirm or refute the claim by the Neutrino-4 short-baseline reactor experiment. ICARUS will also perform measurements of neutrino cross sections with the NuMI beam and several Beyond Standard Model searches. After the first year of operations, ICARUS will search for evidence of sterile neutrinos jointly with the Short-Baseline Near Detector, within the Short-Baseline Neutrino program. In this paper, the main activities carried out during the overhaul and installation phases are highlighted. Preliminary technical results from the ICARUS commissioning data with the BNB and NuMI beams are presented, both in terms of the performance of all ICARUS subsystems and of the capability to select and reconstruct neutrino events.
Glucocorticoids promote CXCR4 expression by T cells, monocytes, macrophages, and eosinophils, but it is not known whether glucocorticoids regulate CXCR4 in B cells. Considering the important contributions of CXCR4 to B cell development and function, we investigated the glucocorticoid/CXCR4 axis in mice. We demonstrate that glucocorticoids upregulate CXCR4 mRNA and protein in mouse B cells. Using a novel strain of mice lacking glucocorticoid receptors (GRs) specifically in B cells, we show that the reduced CXCR4 expression associated with GR deficiency results in impaired homing of mature B cells to bone marrow, whereas migration to other lymphoid tissues is independent of B cell GRs. The exchange of mature B cells between blood and bone marrow is sensitive to small, physiologic changes in glucocorticoid activity, as evidenced by the lack of circadian rhythmicity in GR-deficient B cell counts normally associated with diurnal patterns of glucocorticoid secretion. Mice with GR-deficient B cells mounted normal humoral responses to immunizations with T-dependent and T-independent (Type 1) Ags, but Ab responses to a multivalent T-independent (Type 2) Ag were impaired, a surprising finding considering the immunosuppressive properties commonly attributed to glucocorticoids. We propose that endogenous glucocorticoids regulate a dynamic mode of B cell migration specialized for rapid exchange between bone marrow and blood, perhaps as a means to optimize humoral immunity during diurnal periods of activity.
A good estimation of the b‐value is crucial for earthquake hazard assessment. Its evaluation can be strongly affected by an incorrect estimation of the completeness magnitude mc: a too small mc will result in an underestimated b‐value, whereas a too large mc will imply a larger standard deviation because of the reduced magnitude interval. Several methods for the estimation of mc exist; however, its evaluation is very delicate and in most cases requires critical decision making. Here we present a new, very rapid and simple method for mc estimation. It is based on the observation that the Gutenberg‐Richter distribution is exponential only for magnitudes larger than mc. As a consequence, the average magnitude ma should increase linearly with the threshold magnitude mth. The departure from this linear behavior allows a correct estimation of mc, whereas the linearity of ma versus mth allows a correct estimation of the b‐value.
Plain Language Summary
A very simple and rapid method for the estimation of the completeness magnitude and of the b‐value is here introduced. The method is based on the evaluation of the average magnitude as a function of a threshold one. The method does not require any decision and can be easily implemented in automatic procedures.
Key Points
The b and mc parameters are rapidly estimated
The two parameters are estimated independently
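The linearity argument above follows from the memorylessness of the exponential GR distribution: above mc, ma(mth) = mth + 1/(b ln 10), a line of slope 1. The sketch below illustrates this on a synthetic catalogue; the sub-mc contamination model is an illustrative assumption, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic catalogue: GR with b = 1 above mc_true = 2.0, plus a crude
# incomplete population below mc_true (illustrative assumption only).
b_true, mc_true = 1.0, 2.0
beta = b_true * np.log(10)
mags = np.concatenate([
    mc_true + rng.exponential(1 / beta, 30000),  # complete part
    rng.uniform(0.0, mc_true, 8000),             # sub-mc contamination
])

def ma_curve(mags, thresholds):
    """Average magnitude of events with m >= mth, for each threshold mth.

    Above mc the GR law is exponential, so ma(mth) = mth + 1/(b ln 10):
    a straight line of slope 1. Departures from this line flag mth < mc."""
    return np.array([mags[mags >= t].mean() for t in thresholds])

thresholds = np.arange(0.0, 3.01, 0.25)
ma = ma_curve(mags, thresholds)
slope = np.diff(ma) / np.diff(thresholds)  # approaches 1 once mth >= mc
for t, s in zip(thresholds[:-1], slope):
    print(f"mth = {t:.2f}  d(ma)/d(mth) = {s:.2f}")

# Once the curve is linear, b follows from its offset above the threshold:
b_est = 1.0 / (np.log(10) * (ma[thresholds >= mc_true][0] - mc_true))
print(f"b estimate at mth = mc: {b_est:.3f}")
```

The local slope of ma versus mth rising to 1 marks mc, and the same quantity ma − mth then yields b via the standard Aki relation, so the two parameters are indeed obtained independently, as the key points state.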
The seismic gap hypothesis states that fault regions where no large earthquake has recently occurred are more prone than others to host the next large earthquake. It leads to the idea of immunity after a local disaster which, although it sounds reasonable, has been frequently rejected by objective testing. More generally, estimating the occurrence probability of the next big shock on the basis of the time elapsed since the last earthquake still represents a big challenge. The problem is that this issue cannot be addressed only on the basis of historical catalogs, which contain too few well-documented big shocks, and decades of future observations appear necessary. On the other hand, recent results have shown that important insights can be obtained from the spatial organization of aftershocks and its relationship to the mainshock slip profile. Here, we address this issue by monitoring the stress evolution together with the occurrence of big shocks and their aftershocks in a model where the fault is described as an elastic interface embedded in a ductile medium. The model reproduces all relevant statistical features of earthquake occurrence and allows us to perform accurate testing of the seismic gap hypothesis and its consequences, particularly for aftershock spatial patterns. We show that large earthquakes do not repeat regularly in time, but that insights on the time until the next big shock can be obtained from the percentage of aftershocks occurring inside the mainshock slip contour.
Plain Language Summary
The seismic gap hypothesis states that large earthquakes preferentially occur in seismogenic fault regions, accordingly termed gap regions, where no large earthquake has been observed for a long time. The validity of the hypothesis implies that it is possible to achieve some insights on the timing of the next large earthquake on the basis of the previous seismic history. However, since the 1990s numerous statistical tests have failed to support this hypothesis, and the current state of the art is that many scientists consider the occurrence of large earthquakes fully unpredictable. In our study, we investigate the validity of the seismic gap hypothesis in a theoretical model which reproduces the main statistical features of real seismic occurrence in space, time, and magnitude. We show that, even if the model assumes a homogeneous and constant stress rate, the occurrence of large shocks is very irregular in time and space. Nonetheless, our findings support the hypothesis that accurate monitoring of the shear stress rate on the fault and of previous seismic activity can be useful to identify the regions which have a higher probability to host the next big shock.
Key Points
We investigate the hypothesis of alternation in a physical model of a seismic fault presenting realistic features of aftershock occurrence
Aftershocks do not occur in large‐slip areas which become relocked and the next mainshock occurs in different fault regions
The time until the next big shock is inversely proportional to the percentage of aftershocks inside the mainshock slip contour
The ICARUS-T600 Liquid Argon (LAr) Time Projection Chamber (TPC) is taking data with the Fermilab Booster Neutrino Beam-line (BNB) in the Short Baseline Neutrino (SBN) program to search for a possible LSND-like sterile neutrino signal. A light detection system, based on 360 Hamamatsu R5912-MOD Photo-Multiplier Tubes (PMTs) deployed behind the TPC wire chambers, has been realized to detect the vacuum ultraviolet (VUV) photons produced by ionizing particles in LAr. This system is fundamental for detector operation, providing an efficient trigger and contributing to the 3D reconstruction of events. Moreover, since the TPC is exposed to a huge flux of cosmic rays due to its operation at shallow depth, the light detection system allows for the time reconstruction of events, contributing to the identification and selection of neutrino interactions within the beam spill gates.