In December 2019, the International Association of Geomagnetism and Aeronomy (IAGA) Division V Working Group (V-MOD) adopted the thirteenth generation of the International Geomagnetic Reference Field (IGRF). This IGRF updates the previous generation with a definitive main field model for epoch 2015.0, a main field model for epoch 2020.0, and a predictive linear secular variation for 2020.0 to 2025.0. This letter provides the equations defining the IGRF, the spherical harmonic coefficients for this thirteenth generation model, maps of magnetic declination, inclination and total field intensity for the epoch 2020.0, and maps of their predicted rate of change for the 2020.0 to 2025.0 time period.
SUMMARY
For the time stationary global geomagnetic field, a new modelling concept is presented. A Bayesian non-parametric approach provides realistic location dependent uncertainty estimates. Modelling related variabilities are dealt with systematically by making few subjective a priori assumptions. Rather than parametrizing the model by Gauss coefficients, a functional analytic approach is applied. The geomagnetic potential is assumed to be a Gaussian process, describing a distribution over functions. A priori correlations are given by an explicit kernel function with non-informative dipole contribution. A refined modelling strategy is proposed that accommodates non-linearities of archeomagnetic observables: first, a rough field estimate is obtained considering only sites that provide full field vector records; subsequently, this estimate supports the linearization that incorporates the remaining incomplete records. Results for the archeomagnetic field over the past 1000 yr are in general agreement with previous models, while improved model uncertainty estimates are provided.
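The Gaussian process regression step described above can be sketched as follows. This is a minimal illustration only: the squared-exponential kernel, noise level, and all data below are illustrative stand-ins, not the paper's physically motivated kernel with its non-informative dipole contribution.

```python
import numpy as np

def gp_posterior(X_train, y, X_test, kernel, noise_var=1e-4):
    """Gaussian process regression: posterior mean and pointwise variance.

    The potential is modeled as a GP with an explicit a priori kernel;
    observations update it to a posterior distribution over functions,
    yielding location-dependent uncertainty estimates.
    """
    K = kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    Ks = kernel(X_test, X_train)
    Kss = kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

def sq_exp(A, B, scale=1.0, ell=0.5):
    # illustrative squared-exponential kernel (prior variance = `scale`)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return scale * np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 2))           # hypothetical observation sites
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])  # hypothetical field values
Xt = np.array([[0.0, 0.0], [0.9, -0.9]])       # prediction locations
mean, var = gp_posterior(X, y, Xt, sq_exp)
```

The posterior variance at any prediction point is bounded by the prior variance and shrinks where data are dense, which is the mechanism behind the location-dependent uncertainty estimates.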
In December 2019, the 13th revision of the International Geomagnetic Reference Field (IGRF) was released by the International Association of Geomagnetism and Aeronomy (IAGA) Division V Working Group (V-MOD). This revision comprises two new spherical harmonic main field models for epochs 2015.0 (DGRF-2015) and 2020.0 (IGRF-2020) and a model of the predicted secular variation for the interval 2020.0 to 2025.0 (SV-2020-2025). The models were produced from candidates submitted by fifteen international teams. These teams were led by the British Geological Survey (UK), China Earthquake Administration (China), Universidad Complutense de Madrid (Spain), University of Colorado Boulder (USA), Technical University of Denmark (Denmark), GFZ German Research Centre for Geosciences (Germany), Institut de physique du globe de Paris (France), Institut des Sciences de la Terre (France), Pushkov Institute of Terrestrial Magnetism, Ionosphere and Radio Wave Propagation (Russia), Kyoto University (Japan), University of Leeds (UK), Max Planck Institute for Solar System Research (Germany), NASA Goddard Space Flight Center (USA), University of Potsdam (Germany), and Université de Strasbourg (France). The candidate models were evaluated individually and compared to all other candidates as well as to the mean, median and a robust Huber-weighted model of all candidates. These analyses were used to identify, for example, the variation between the Gauss coefficients or the geographical regions where the candidate models strongly differed. The majority of candidates were sufficiently close that the differences can be explained primarily by individual modeling methodologies and data selection strategies. None of the candidates were so different as to warrant their exclusion from the final IGRF-13.
The IAGA V-MOD task force thus voted for two approaches: the median of the Gauss coefficients of the candidates for the DGRF-2015 and IGRF-2020 models and the robust Huber-weighted model for the predictive SV-2020-2025. In this paper, we document the evaluation of the candidate models and provide details of the approach used to derive the final IGRF-13 products. We also perform a retrospective analysis of the IGRF-12 SV candidates over their performance period (2015–2020). Our findings suggest that forecasting secular variation can benefit from combining physics-based core modeling with satellite observations.
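The two combination rules can be sketched as below: a coefficient-wise median (the rule used for the main field models) and an iteratively reweighted Huber mean (the rule used for the predictive secular variation). The transition scale, iteration count, and toy coefficient values are illustrative choices, not the task force's exact settings.

```python
import numpy as np

def huber_weighted_mean(candidates, delta=1.0, iters=50):
    """Robust Huber-weighted combination of candidate coefficient sets.

    `candidates` has shape (n_models, n_coeffs). Residuals within
    `delta` robust-spread units get full weight; larger residuals are
    downweighted proportionally, so outlying candidates contribute less.
    """
    x = np.median(candidates, axis=0)              # robust starting point
    for _ in range(iters):
        r = candidates - x
        scale = 1.4826 * np.median(np.abs(r), axis=0) + 1e-12  # MAD spread
        z = np.abs(r) / (delta * scale)
        w = np.where(z <= 1.0, 1.0, 1.0 / z)       # Huber weights
        x = (w * candidates).sum(axis=0) / w.sum(axis=0)
    return x

# four hypothetical candidate models, two Gauss coefficients each;
# the last model is an outlier in the first coefficient
models = np.array([[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [5.0, 2.0]])
median = np.median(models, axis=0)    # combination rule for DGRF/IGRF epochs
huber = huber_weighted_mean(models)   # combination rule for the SV product
```

Both rules pull the combined coefficient toward the cluster of agreeing candidates rather than toward the outlier, which is why they were preferred over a plain mean.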
We propose a global geomagnetic field model for the last 14 thousand years, based on thermoremanent records. We call the model ArchKalmag14k. ArchKalmag14k is constructed by modifying recently proposed algorithms based on space‐time correlations. Due to the amount of data and the complexity of the model, the full Bayesian posterior is numerically intractable. To tackle this, we sequentialize the inversion by implementing a Kalman‐filter with a fixed time step. Every step consists of a prediction, based on a degree dependent temporal covariance, and a correction via Gaussian process regression. Dating errors are treated via a noisy input formulation. Cross correlations are reintroduced by a smoothing algorithm and model parameters are inferred from the data. Due to the specific statistical nature of the proposed algorithms, the model comes with space and time‐dependent uncertainty estimates. The new model ArchKalmag14k shows less variation in the large‐scale degrees than comparable models. Local predictions represent the underlying data and agree with comparable models if the location is sampled well. Uncertainties are larger for earlier times and in regions of sparse data coverage. We also use ArchKalmag14k to analyze the appearance and evolution of the South Atlantic anomaly together with reverse flux patches at the core‐mantle boundary, considering the model uncertainties. While we find good agreement with earlier models for recent times, our model suggests a different evolution of intensity minima prior to 1650 CE. In general, our results suggest that prior to 6000 BCE the data is not sufficient to support global models.
Plain Language Summary
We use data of archeological and volcanic origin from the last 14 thousand years to construct a global geomagnetic field model. We call the model ArchKalmag14k. The database is uneven in space, with significantly more records in the Northern hemisphere and multiple clusters. Further, the number of available records decreases back in time, with a distinct drop at 6000 BCE. Previous studies introduced a modeling method that was adapted to these inhomogeneities, but it could not be applied to the whole database for computational reasons. To tackle this, we modify the method and implement an approach that handles only a limited number of records at a time. Relations between the individual steps are reintroduced later in the algorithm. Uncertainties in the data and in their ages contribute to estimating reasonable model uncertainties. The model parameters are inferred from the data. ArchKalmag14k shows less variation on a global scale than comparable models. On a local scale, predictions represent the underlying data and agree with comparable models if the location is covered well by data. Uncertainties are larger for times and regions of sparse data coverage. The results suggest that prior to 6000 BCE the data is not sufficient to support global models.
Key Points
We propose a new global geomagnetic field model for the Holocene based on thermoremanent records
Existing algorithms based on space‐time correlation are modified by sequentialization via a Kalman‐filter and smoothing
The results suggest that prior to 6000 BCE the data is not sufficient to support global models
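The predict/correct cycle described above can be sketched as a standard linear-Gaussian Kalman filter step. The transition, covariance, and observation matrices below are generic placeholders, not ArchKalmag14k's degree-dependent operators; the correction step has the same algebraic form as a Gaussian process regression update.

```python
import numpy as np

def kalman_step(m, P, F, Q, H, R, y):
    """One predict/correct cycle of a sequentialized inversion.

    m, P : prior state mean and covariance
    F, Q : state transition and process-noise covariance (prediction)
    H, R : observation operator and data-noise covariance (correction)
    y    : observations for this time step
    """
    # prediction: propagate the state with its temporal covariance
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # correction: Gaussian update with the Kalman gain
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred
    return m_new, P_new

# toy example: 2-component state, one observation of the first component
m0, P0 = np.zeros(2), np.eye(2)
F, Q = np.eye(2), 0.1 * np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.5]])
m1, P1 = kalman_step(m0, P0, F, Q, H, R, np.array([1.0]))
```

Observed components have their variance reduced by the correction, while unobserved components only accumulate process noise, which is how sparse data coverage translates directly into larger uncertainties.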
Bayesian analysis of the modified Omori law
Holschneider, M.; Narteau, C.; Shebalin, P.
Journal of Geophysical Research, June 2012, Volume 117, Issue B6
Journal Article
Peer reviewed
Open access
In order to examine variations in aftershock decay rate, we propose a Bayesian framework to estimate the {K, c, p}‐values of the modified Omori law (MOL), λ(t) = K(c + t)^(−p). The Bayesian setting allows us not only to produce a point estimator of these three parameters but also to assess their uncertainties and posterior dependencies with respect to the observed aftershock sequences. Using a new parametrization of the MOL, we identify the trade‐off between the c‐ and p‐value estimates and discuss its dependence on the number of aftershocks. Then, we analyze the influence of the catalog completeness interval [tstart, tstop] on the various estimates. To test this Bayesian approach on natural aftershock sequences, we use two independent and non‐overlapping aftershock catalogs of the same earthquakes in Japan. Taking into account the posterior uncertainties, we show that both the handpicked (short times) and the instrumental (long times) catalogs predict the same ranges of parameter values. We therefore conclude that the same MOL may be valid over short and long times.
Key Points
Bayesian analysis of the modified Omori law
Application to early aftershock sequences
Uncertainty analysis
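The likelihood underlying such a Bayesian analysis treats the aftershock times as an inhomogeneous Poisson process with MOL rate. The sketch below evaluates it on a parameter grid under a flat prior; the synthetic catalog, the fixed K and c values, and the grid bounds are illustrative assumptions, not the paper's parametrization.

```python
import numpy as np

def mol_log_likelihood(times, K, c, p, t0, t1):
    """Poisson log-likelihood of the modified Omori law
    lambda(t) = K (c + t)^(-p) over the completeness window [t0, t1]:
    sum of log-rates at the event times minus the integrated rate."""
    rate = K * (c + times) ** (-p)
    if np.isclose(p, 1.0):
        integral = K * (np.log(c + t1) - np.log(c + t0))
    else:
        integral = K * ((c + t1) ** (1 - p) - (c + t0) ** (1 - p)) / (1 - p)
    return np.log(rate).sum() - integral

# synthetic aftershock times with density roughly proportional to t^(-1/2)
rng = np.random.default_rng(1)
times = np.sort(rng.uniform(0.0, 1.0, 300) ** 2 * 100.0)

# grid posterior over p with a flat prior (K and c held fixed for illustration)
p_grid = np.linspace(0.1, 0.9, 81)
ll = np.array([mol_log_likelihood(times, 15.0, 0.01, p, 0.0, 100.0)
               for p in p_grid])
post = np.exp(ll - ll.max())
post /= post.sum()
```

In a full analysis all three parameters are varied jointly, which is what exposes the c–p trade-off and its dependence on the number of aftershocks.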
The primary data sources for reconstructing the geomagnetic field of the past millennia are archeomagnetic and sedimentary paleomagnetic data. Sediment records, in particular, are crucial in extending the temporal and spatial coverage of global geomagnetic field models, especially when archeomagnetic data are sparse. The exact process by which sediments acquire magnetization, including post‐depositional detrital remanent magnetization, is still poorly understood. However, it is widely accepted that these effects lead to a smoothing of the magnetic signal and to offsets with respect to the sediment age. They impede the direct inclusion of sediment records in global geomagnetic field modeling. As a first step, we model these effects for a single sediment core using a new class of flexible parameterized lock‐in functions. The parameters of the lock‐in function are estimated by the maximum likelihood method using archeomagnetic data as a reference. The effectiveness of the proposed method is evaluated through synthetic tests. Our results demonstrate that the proposed method is capable of estimating the parameters associated with the distortion caused by the lock‐in process.
Plain Language Summary
Our paper discusses the post‐depositional detrital remanent magnetization (pDRM) process and how it affects the magnetization measured in sediment records. When we study the geomagnetic field of the past, we rely on data from archeological and sedimentary sources. However, sediment records are affected by pDRM, which can smooth the magnetic signal and cause offsets with respect to the sediment age. To make sedimentary data more reliable, we developed a new method to correct the distortion caused by pDRM. Our method involves a new class of flexible parameterized lock‐in functions. Together with archeomagnetic data, we estimate the parameters of the lock‐in functions. Once we have determined the parameters of the lock‐in function, we can use them to correct the distortion caused by pDRM in sedimentary data. We test our method on synthetic data. Our results show that our method is effective in correcting the distortion caused by pDRM and making sedimentary data more reliable for reconstructing the geomagnetic field.
Key Points
A new class of parameterized lock‐in functions is presented
These lock‐in functions are capable of modeling the offset and smoothing associated with the post‐depositional remanent magnetization process
The proposed method is evaluated through several synthetic tests
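To illustrate how a parameterized lock-in function produces both the offset and the smoothing described above, here is a minimal sketch. The half-Gaussian lock-in family below is a hypothetical stand-in for the paper's flexible class, and all numerical values are illustrative.

```python
import numpy as np

def lockin_kernel(depth, offset, width):
    """Hypothetical half-Gaussian lock-in function: remanence is
    acquired only below `offset`, over a zone of scale `width`.
    Normalized to unit mass so it acts as an averaging filter."""
    k = np.where(depth >= offset,
                 np.exp(-0.5 * ((depth - offset) / width) ** 2), 0.0)
    return k / k.sum()

def apply_pdrm(signal, kernel):
    """Convolve the true field signal with the lock-in function,
    producing the smoothed and offset record measured in the core."""
    n = len(kernel)
    padded = np.concatenate([np.full(n - 1, signal[0]), signal])
    return np.convolve(padded, kernel, mode="valid")

x = np.arange(100, dtype=float)
signal = np.exp(-0.5 * ((x - 30.0) / 5.0) ** 2)    # a single sharp field feature
kernel = lockin_kernel(np.arange(20, dtype=float), offset=5.0, width=4.0)
record = apply_pdrm(signal, kernel)                # feature appears deeper and flatter
```

Estimating the lock-in parameters then amounts to maximizing the likelihood of such a forward-modeled record against an independent reference, as the abstract describes with archeomagnetic data.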
Extreme hydrological events are often triggered by exceptional co-variations of the relevant hydrometeorological processes and in particular by exceptional co-oscillations at various temporal scales. Wavelet and cross wavelet spectral analysis offers promising time-scale resolved methods to detect and analyze such exceptional co-oscillations. This paper presents the state-of-the-art methods of wavelet spectral analysis, discusses related subtleties, potential pitfalls and recently developed solutions to overcome them, and shows how wavelet spectral analysis, if combined with a rigorous significance test, can lead to reliable new insights into hydrometeorological processes for real-world applications. The presented methods are applied to detect potentially flood-triggering situations in a high Alpine catchment for which a recent re-estimation of design floods encountered significant problems simulating the observed high flows. For this case study, wavelet spectral analysis of precipitation, temperature and discharge offers a powerful tool to help detect potentially flood-producing meteorological situations and to distinguish between different types of floods with respect to the prevailing critical hydrometeorological conditions. This opens new perspectives for the analysis of model performance focusing on the occurrence and non-occurrence of different types of high flow events. Based on the obtained results, the paper summarizes important recommendations for future applications of wavelet spectral analysis in hydrology.
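A minimal Fourier-domain implementation of the continuous wavelet transform with a Morlet mother wavelet gives the time-scale decomposition on which such analyses rest. This sketch omits the significance testing and the cross-wavelet spectrum that the paper emphasizes, and the test signal is an arbitrary sinusoid.

```python
import numpy as np

def morlet_cwt(x, scales, dt=1.0, omega0=6.0):
    """Continuous wavelet transform of a real signal x, computed in the
    Fourier domain with an analytic Morlet wavelet of center frequency
    omega0. Returns an array of shape (len(scales), len(x))."""
    n = len(x)
    freqs = 2 * np.pi * np.fft.fftfreq(n, dt)
    X = np.fft.fft(x)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier transform of the Morlet wavelet at scale s
        # (analytic: support restricted to positive frequencies)
        psi_hat = (np.pi ** -0.25) * np.sqrt(2 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * freqs - omega0) ** 2) * (freqs > 0)
        out[i] = np.fft.ifft(X * np.conj(psi_hat))
    return out

t = np.arange(256)
x = np.sin(2 * np.pi * t / 16)          # oscillation with period 16 samples
scales = np.arange(2.0, 41.0)
W = morlet_cwt(x, scales)
power = (np.abs(W) ** 2).mean(axis=1)   # time-averaged wavelet power
```

For two signals, the cross-wavelet spectrum is simply `W1 * np.conj(W2)` on a common scale grid; its modulus and phase localize common oscillations in time and scale, which is what makes co-oscillation detection possible.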
Discovery of starspots on Vega
Bohm, T.; Holschneider, M.; Lignieres, F.
Astronomy and Astrophysics (Berlin), 5/2015, Volume 577
Journal Article
Peer reviewed
Open access
The theoretically studied impact of rapid rotation on stellar evolution needs to be compared with the results of high-resolution spectroscopy-velocimetry observations. Early-type stars present a perfect laboratory for these studies. The prototype A0 star Vega has been extensively monitored in recent years in spectro-polarimetry. The goal of this article is to present a thorough analysis of the line profile variations and associated estimators in the early-type standard star Vega (A0) in order to reveal potential activity tracers, exoplanet companions, and stellar oscillations. Vega was monitored in quasi-continuous high-resolution echelle spectroscopy with the highly stabilized velocimeter SOPHIE/OHP. A total of 2588 high signal-to-noise spectra were obtained during 34.7 h on five nights in high-resolution mode at R = 75 000, covering the visible domain from 3895–6270 Å. This first strong evidence that standard A-type stars can show surface structures opens a new field of research and raises the question of a potential link with the recently discovered weak magnetic fields in this category of stars.