ABSTRACT
Cosmological hydrodynamic simulations can accurately predict the properties of the intergalactic medium (IGM), but only under the condition of retaining the high spatial resolution necessary to resolve density fluctuations in the IGM. This resolution constraint prohibits simulating large volumes, such as those probed by BOSS and future surveys, like DESI and 4MOST. To overcome this limitation, we present "Iteratively Matched Statistics" (IMS), a novel method to accurately model the Lyα forest with collisionless N-body simulations, where the relevant density fluctuations are unresolved. We use a small-box, high-resolution hydrodynamic simulation to obtain the probability distribution function (PDF) and the power spectrum of the real-space Lyα forest flux. These two statistics are iteratively mapped onto a pseudo-flux field of an N-body simulation, which we construct from the matter density. We demonstrate that our method can reproduce the PDF, line-of-sight, and 3D power spectra of the Lyα forest with good accuracy (7%, 4%, and 7%, respectively). We quantify the performance of the commonly used Gaussian smoothing technique and show that it has significantly lower accuracy (20%–80%), especially for N-body simulations with achievable mean inter-particle separations in large-volume simulations. In addition, we show that IMS produces reasonable and smooth spectra, making it a powerful tool for modeling the IGM in large cosmological volumes and for producing realistic "mock" skies for Lyα forest surveys.
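The iterative mapping of a PDF and a power spectrum onto a field can be sketched along the lines below. This is a minimal, hypothetical illustration of the general technique (alternating Fourier-amplitude replacement with rank-order PDF mapping), not the paper's IMS implementation; the function and variable names are ours.

```python
import numpy as np

def iterative_matched_statistics(pseudo_flux, target_flux, n_iter=10):
    """Illustrative sketch: alternately impose a target power spectrum
    (Fourier amplitudes) and a target PDF (rank-order mapping) on a
    1D pseudo-flux field. Not the published IMS code."""
    target_sorted = np.sort(target_flux)            # target PDF, via sorted values
    target_amps = np.abs(np.fft.rfft(target_flux))  # target Fourier amplitudes
    field = pseudo_flux.copy()
    for _ in range(n_iter):
        # Step 1: keep the field's phases, impose the target amplitudes
        fourier = np.fft.rfft(field)
        phases = np.exp(1j * np.angle(fourier))
        field = np.fft.irfft(target_amps * phases, n=field.size)
        # Step 2: rank-order map the field onto the target PDF
        ranks = np.argsort(np.argsort(field))
        field = target_sorted[ranks]
    return field
```

Because the PDF mapping runs last, the output reproduces the target PDF exactly, while the power spectrum is matched only approximately; iterating reduces the residual mismatch.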
Abstract
The amplitude of the ionizing background that pervades the intergalactic medium (IGM) at the end of the epoch of reionization provides a valuable constraint on the emissivity of the sources that reionized the universe. While measurements of the ionizing background at lower redshifts rely on a simulation-calibrated mapping between the photoionization rate and the mean transmission of the Lyα forest, at z ≳ 6 the IGM becomes increasingly opaque and transmission arises solely in narrow spikes separated by saturated Gunn–Peterson troughs. In this regime, the traditional approach of measuring the average transmission over large ∼50 Mpc/h regions is less sensitive and suboptimal. In addition, the five times smaller oscillator strength of the Lyβ transition implies that the Lyβ forest is considerably more transparent at z ≳ 6, even in the presence of contamination by foreground z ∼ 5 Lyα forest absorption. In this work we present a novel statistical approach to analyze the joint distribution of transmission spikes in the cospatial z ∼ 6 Lyα and Lyβ forests. Our method relies on approximate Bayesian computation (ABC), which circumvents the necessity of computing the intractable likelihood function describing the highly correlated Lyα and Lyβ transmission. We apply ABC to mock data generated from a large-volume hydrodynamical simulation combined with a state-of-the-art model of ionizing background fluctuations in the post-reionization IGM and show that it is sensitive to higher IGM neutral hydrogen fractions than previous techniques. As a proof of concept, we apply this methodology to a real spectrum of a z = 6.54 quasar and measure the ionizing background from 5.4 ≤ z ≤ 6.4 along this sightline with ∼0.2 dex statistical uncertainties.
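The core idea of ABC, as invoked above, is to bypass the intractable likelihood by comparing simulated and observed summary statistics. A minimal rejection-sampling sketch follows; the function names and the scalar-statistic setup are illustrative assumptions, not the paper's pipeline, which uses the joint Lyα/Lyβ spike distribution as its summary statistic.

```python
import numpy as np

def abc_rejection(observed_stat, simulate, prior_sample, n_draws=10000, eps=0.1):
    """Illustrative ABC rejection sampler: draw parameters from the prior,
    forward-simulate a summary statistic, and keep draws whose statistic
    lies within a tolerance eps of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()               # draw a parameter from the prior
        stat = simulate(theta)               # forward-model the summary statistic
        if abs(stat - observed_stat) < eps:  # distance criterion
            accepted.append(theta)
    return np.array(accepted)                # approximate posterior samples
```

As a toy usage, inferring the mean of a Gaussian from its sample mean with a uniform prior yields accepted draws clustered around the true value; the accepted set approximates the posterior increasingly well as eps shrinks.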
• An extreme-scale precision cosmological simulation framework is described.
• New algorithmic features are described.
• Accuracy tests using multiple algorithms are reported.
• Includes selected results from the world’s largest high-resolution simulations.
Current and future surveys of large-scale cosmic structure are associated with a massive and complex data stream to study, characterize, and ultimately understand the physics behind the two major components of the ‘Dark Universe’, dark energy and dark matter. In addition, the surveys also probe primordial perturbations and carry out fundamental measurements, such as determining the sum of neutrino masses. Large-scale simulations of structure formation in the Universe play a critical role in the interpretation of the data and extraction of the physics of interest. Just as survey instruments continue to grow in size and complexity, so do the supercomputers that enable these simulations. Here we report on HACC (Hardware/Hybrid Accelerated Cosmology Code), a recently developed and evolving cosmology N-body code framework, designed to run efficiently on diverse computing architectures and to scale to millions of cores and beyond. HACC can run on all current supercomputer architectures and supports a variety of programming models and algorithms. It has been demonstrated at scale on Cell- and GPU-accelerated systems, standard multi-core node clusters, and Blue Gene systems. HACC’s design allows for ease of portability and, at the same time, high levels of sustained performance on the fastest supercomputers available. We present a description of the design philosophy of HACC, the underlying algorithms and code structure, and outline implementation details for several specific architectures. We show selected accuracy and performance results from some of the largest high-resolution cosmological simulations performed so far, including benchmarks evolving more than 3.6 trillion particles.
We present a new N-body and gas dynamics code, called Nyx, for large-scale cosmological simulations. Nyx follows the temporal evolution of a system of discrete dark matter particles gravitationally coupled to an inviscid ideal fluid in an expanding universe. The gas is advanced in an Eulerian framework with block-structured adaptive mesh refinement; a particle-mesh scheme using the same grid hierarchy is used to solve for self-gravity and advance the particles. Computational results demonstrating the validation of Nyx on standard cosmological test problems, and the scaling behavior of Nyx to 50,000 cores, are presented.
The distribution of diffuse gas in the intergalactic medium (IGM) imprints a series of hydrogen absorption lines on the spectra of distant background quasars known as the Lyman-α forest. Cosmological hydrodynamical simulations predict that IGM density fluctuations are suppressed below a characteristic scale where thermal pressure balances gravity. We measured this pressure-smoothing scale by quantifying absorption correlations in a sample of close quasar pairs. We compared our measurements to hydrodynamical simulations, where pressure smoothing is determined by the integrated thermal history of the IGM. Our findings are consistent with standard models for photoionization heating by the ultraviolet radiation backgrounds that reionized the universe.
ABSTRACT
We compare two state-of-the-art numerical codes to study the overall accuracy in modelling the intergalactic medium and reproducing Lyman-α forest observables for DESI and high-resolution data sets. The codes employ different approaches to solving both gravity and modelling the gas hydrodynamics. The first code, Nyx, solves the Poisson equation using the Particle-Mesh (PM) method and the Euler equations using a finite-volume method. The second code, CRK-HACC, uses a Tree-PM method to solve for gravity, and an improved Lagrangian smoothed particle hydrodynamics (SPH) technique, where fluid elements are modelled with particles, to treat the intergalactic gas. We compare the convergence behaviour of the codes in flux statistics as well as the degree to which the codes agree in the converged limit. We find good agreement overall, with differences being less than observational uncertainties, and a particularly notable ≲1 per cent agreement in the 1D flux power spectrum. This agreement was achieved by applying a tessellation methodology for reconstructing the density in CRK-HACC instead of using an SPH kernel as is standard practice. We show that use of the SPH kernel can lead to significant and unnecessary biases in flux statistics; this is especially prominent at high redshifts, z ∼ 5, as the Lyman-α forest mostly comes from lower-density regions that are intrinsically poorly sampled by SPH particles.
Inferring model parameters from experimental data is a grand challenge in many sciences, including cosmology. This often relies critically on high-fidelity numerical simulations, which are prohibitively computationally expensive. The application of deep learning techniques to generative modeling is renewing interest in using high-dimensional density estimators as computationally inexpensive emulators of fully-fledged simulations. These generative models have the potential to make a dramatic shift in the field of scientific simulations, but for that shift to happen we need to study the performance of such generators in the precision regime needed for science applications. To this end, in this work we apply Generative Adversarial Networks to the problem of generating weak lensing convergence maps. We show that our generator network produces maps that are described by, with high statistical confidence, the same summary statistics as the fully simulated maps.
The power spectrum of the Lyman-α Forest at z < 0.5
Khaire, Vikram; Walther, Michael; Hennawi, Joseph F ...
Monthly Notices of the Royal Astronomical Society, 06/2019, Volume 486, Issue 1
Journal Article · Peer reviewed · Open access
ABSTRACT
We present new measurements of the flux power spectrum P(k) of the z < 0.5 H i Lyman-α Forest spanning scales $k \sim 0.001\!-\!0.1\, \mathrm{s \, km}^{-1}$. These results were derived from 65 far-ultraviolet quasar spectra (resolution $R \sim 18\, 000$) observed with the Cosmic Origins Spectrograph (COS) on board the Hubble Space Telescope. The analysis required careful masking of all contaminating, coincident absorption from H i and metal-line transitions of the Galactic interstellar medium and intervening absorbers, as well as proper treatment of the complex COS line-spread function. From the P(k) measurements, we estimate the H i photoionization rate ($\Gamma _{\rm H\,{\small I}}$) in the z < 0.5 intergalactic medium. Our results confirm most of the previous $\Gamma _{\rm H\,{\small I}}$ estimates. We conclude that previous concerns of a photon underproduction crisis are now resolved by demonstrating that the measured $\Gamma _{\rm H\,{\small I}}$ can be accounted for by ultraviolet emission from quasars alone. In a companion paper, we will present constraints on the thermal state of the z < 0.5 intergalactic medium from the P(k) measurements presented here.
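The basic quantity measured above, the 1D flux power spectrum, can be sketched as follows. This is a minimal illustration of the standard estimator applied to a single sightline; the names and normalization convention are our assumptions, and the published measurement additionally handles masking of contaminants and deconvolution of the COS line-spread function, which are omitted here.

```python
import numpy as np

def flux_power_1d(flux, dv):
    """Illustrative 1D flux power spectrum estimate for one sightline.
    `flux` is the transmitted flux, `dv` the pixel size in km/s;
    returns wavenumbers k in s/km and P(k) in km/s."""
    delta = flux / flux.mean() - 1.0    # flux contrast δ_F = F/<F> - 1
    n = delta.size
    length = n * dv                     # sightline length in velocity units
    dft = np.fft.rfft(delta)
    pk = np.abs(dft)**2 * length / n**2 # P(k) = |δ̃_F(k)|² L / n²
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dv)
    return k[1:], pk[1:]                # drop the k = 0 (mean) mode
```

With this convention, white noise of variance σ² in δ_F gives a flat spectrum with P(k) ≈ σ² dv, a convenient sanity check on the normalization.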
ABSTRACT
The Ly α forest (LAF) at z > 5 probes the thermal and reionization history of the intergalactic medium (IGM) and the nature of dark matter, but its interpretation requires comparison to cosmological hydrodynamical simulations. At high z, convergence of these simulations is more exacting since transmission is dominated by underdense voids that are challenging to resolve. With evidence mounting for a late end to reionization, small structures down to the sub-kpc level may survive to later times than conventionally thought due to the reduced time for pressure smoothing to impact the gas, further tightening simulation resolution requirements. We perform a suite of simulations using the Eulerian cosmological hydrodynamics code Nyx, spanning domain sizes of 1.25–10 h⁻¹ Mpc and cell sizes of 5–80 h⁻¹ kpc, and explore the interaction of these variables with the timing of reionization on the properties of the matter distribution and the simulated LAF at z = 5.5. In observable Ly α power, convergence within 10 per cent is achieved for k < 0.1 s km⁻¹, but larger k shows deviation of up to 20 per cent. While a later reionization retains more small structure in the density field, because of the greater thermal broadening there is little difference in the convergence of LAF power between early (z = 9) and later (z = 6) reionizations. We conclude that at z ∼ 5.5, resolutions of 10 kpc are necessary for convergence of LAF power at k < 0.1 s km⁻¹, while higher-k modes require higher resolution, and that the timing of reionization does not significantly impact convergence given realistic photoheating.
ABSTRACT
The thermal state of the intergalactic medium contains vital information about the epoch of reionization, one of the most transformative yet poorly understood periods in the young Universe. This thermal state is encoded in the small-scale structure of Lyman-α (Ly α) absorption in quasar spectra. The 1D flux power spectrum measures the average small-scale structure along quasar sightlines. At high redshifts, where the opacity is large, averaging mixes high signal-to-noise ratio transmission spikes with noisy absorption troughs. Wavelet amplitudes are an alternate statistic that maintains spatial information while quantifying fluctuations at the same spatial frequencies as the power spectrum, giving them the potential to more sensitively measure the small-scale structure. Previous Ly α forest studies using wavelet amplitude probability density functions (PDFs) used limited spatial frequencies and neglected strong correlations between PDF bins and across wavelet scales, resulting in suboptimal and unreliable parameter inference. Here we present a novel method for performing statistical inference using wavelet amplitude PDFs that spans the full range of spatial frequencies probed by the power spectrum and that fully accounts for these correlations. We applied this procedure to realistic mock data drawn from a simple thermal model parametrized by the temperature at mean density, T0, and find that wavelets deliver 1σ constraints on T0 that are on average 7 per cent more sensitive at z = 5 (12 per cent at z = 6) than those from the power spectrum. We consider the possibility of combining wavelet PDFs with the power spectrum, but find that this does not lead to improved sensitivity.
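The wavelet-amplitude statistic described above can be illustrated with a short sketch: convolve the flux sightline with a localized oscillatory kernel at one spatial scale, take the squared amplitude to retain spatial information, and histogram the result into a PDF. The Morlet-like kernel and all names here are our illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def wavelet_amplitudes(flux, scale, dv=1.0):
    """Illustrative sketch: squared amplitude of a continuous wavelet
    transform of a flux sightline at one scale, using a Morlet-like
    (Gaussian-windowed complex exponential) kernel. `dv` is the pixel
    size in km/s."""
    x = np.arange(-4 * scale, 4 * scale + dv, dv)
    kernel = np.exp(-0.5 * (x / scale)**2) * np.exp(1j * 2 * np.pi * x / scale)
    kernel /= np.sqrt(np.sum(np.abs(kernel)**2))   # unit-norm kernel
    transform = np.convolve(flux - flux.mean(), kernel, mode="same")
    return np.abs(transform)**2                    # amplitude along the sightline

def amplitude_pdf(amps, bins):
    """Histogram the wavelet amplitudes into a normalized PDF."""
    pdf, _ = np.histogram(amps, bins=bins, density=True)
    return pdf
```

Unlike the power spectrum, which averages over the whole sightline, the amplitude field above is still position-resolved, which is what lets its PDF separate high signal-to-noise transmission spikes from noisy saturated troughs.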