Abstract
The intense radiation emitted by luminous quasars dramatically alters the ionization state of the surrounding intergalactic medium (IGM). This so-called proximity effect extends out to tens of Mpc and manifests as large coherent regions of enhanced Lyman-α (Lyα) forest transmission in absorption spectra of background sightlines. Here, we present a novel method based on Lyα forest tomography, which is capable of mapping these quasar "light echoes" in three dimensions. Using a dense grid (10–100) of faint background galaxies as absorption probes, one can measure the ionization state of the IGM in the vicinity of a foreground quasar, yielding detailed information about the quasar's radiative history and emission geometry. An end-to-end analysis, combining cosmological hydrodynamical simulations post-processed with a quasar emission model, realistic estimates of galaxy number densities, and instrument plus telescope throughput, is conducted to explore the feasibility of detecting quasar light echoes. We present a new, fully Bayesian statistical method that allows one to reconstruct quasar light echoes from thousands of individual low-S/N transmission measurements. Armed with this tool, we undertake an exhaustive parameter study and show that light echoes can be convincingly detected for luminous (M_1450 < −27.5 mag, corresponding to m_1450 < 18.4 mag) quasars at redshifts 3 < z_QSO < 5, and that a relative precision better than 20% on the quasar age can be achieved for individual objects in the expected range of ages between 1 and 100 Myr. The observational requirements are relatively modest: moderate-resolution (R ≳ 750) multiobject spectroscopy at a low signal-to-noise ratio (S/N > 5) is sufficient, requiring three-hour integrations with existing instruments on 8 m class telescopes.
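The Bayesian reconstruction described above can be caricatured in a few lines. The sketch below assumes a maximally simplified light-echo model (transmission is enhanced inside the radius c·t_age and constant outside; all numbers are hypothetical, not the paper's) and recovers a quasar age from thousands of noisy transmission measurements with a simple grid posterior:

```python
import numpy as np

rng = np.random.default_rng(0)
C_MPC_PER_MYR = 0.3066  # proper light travel distance per Myr, in Mpc

def echo_model(r_mpc, t_age_myr, f_bg=0.4, f_enh=0.7):
    """Toy light echo: transmission is enhanced wherever the ionization
    front (radius c * t_age) has already arrived. Levels are hypothetical."""
    return np.where(r_mpc < C_MPC_PER_MYR * t_age_myr, f_enh, f_bg)

# Thousands of individual low-S/N transmission measurements around the quasar.
r = rng.uniform(1.0, 40.0, size=2000)        # sightline distances [proper Mpc]
true_age = 30.0                              # Myr
sigma = 0.3                                  # per-measurement noise (low S/N)
obs = echo_model(r, true_age) + rng.normal(0.0, sigma, size=r.size)

# Grid posterior over the age, flat prior on 1-100 Myr.
ages = np.linspace(1.0, 100.0, 400)
log_like = np.array([-0.5 * np.sum((obs - echo_model(r, a)) ** 2) / sigma**2
                     for a in ages])
post = np.exp(log_like - log_like.max())
post /= post.sum() * (ages[1] - ages[0])     # normalize on the grid
map_age = ages[np.argmax(post)]              # posterior peak
```

Even though each individual measurement is barely informative, the ensemble pins the age down tightly; the real analysis replaces the step-function echo with a full emission-geometry and IGM model.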
The ever increasing size and complexity of data coming from simulations of cosmic structure formation demand equally sophisticated tools for their analysis. During the past decade, the art of object finding in these simulations has hence developed into an important discipline in itself. A multitude of codes based upon a huge variety of methods and techniques have been spawned, yet the question remains whether they all provide the same (physical) information about the structures of interest. Here we summarize and extend previous work of the 'halo finder comparison project': we investigate in detail the possible origins of any deviations across finders. To this end, we decipher and discuss differences in halo-finding methods, clearly separating them from the disparity in definitions of halo properties. We observe that different codes not only find different numbers of objects, leading to a scatter of up to 20 per cent in the halo mass and V_max functions, but also that the properties of those objects that are identified by all finders differ. The strength of the variation, however, depends on the property studied: the scatter in position, bulk velocity, mass and the peak value of the rotation curve is typically below a few per cent, whereas derived quantities such as spin and shape show larger deviations. Our study indicates that the prime contribution to differences in halo properties across codes stems from the distinct particle collection methods and, to a lesser extent, from the particular way in which the removal of unbound particles is implemented. We close with a discussion of the relevance and implications of this scatter for other fields such as semi-analytical galaxy formation models, gravitational lensing and observables in general.
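Several of the properties compared here derive from the circular velocity profile. As a self-contained illustration (halo mass, scale radius and particle number are hypothetical), V_max and the radius at which it is reached can be computed directly from a halo's particle list, checked against a Hernquist profile where the analytic answer is known:

```python
import numpy as np

G = 4.30091e-6  # Newton's constant in kpc (km/s)^2 / Msun

def vmax_from_particles(radii_kpc, particle_mass):
    """Peak of the circular velocity curve v_circ(r) = sqrt(G M(<r) / r),
    with the enclosed mass M(<r) counted from the sorted particle radii."""
    r = np.sort(np.asarray(radii_kpc))
    m_enc = particle_mass * np.arange(1, r.size + 1)
    v_circ = np.sqrt(G * m_enc / r)
    i = int(np.argmax(v_circ))
    return v_circ[i], r[i]

# Mock halo with a Hernquist profile, for which the analytic answers are
# r_max = a and v_max = sqrt(G M_tot / (4 a)).
rng = np.random.default_rng(1)
m_tot, a, n_part = 1e12, 30.0, 100_000        # Msun, kpc, particle count
u = np.sqrt(rng.uniform(0.0, 1.0, n_part))    # invert the Hernquist mass CDF
radii = a * u / (1.0 - u)
vmax, rmax = vmax_from_particles(radii, m_tot / n_part)
```

The scatter discussed in the abstract enters precisely through choices this sketch glosses over: which particles are collected into the halo, and whether unbound ones are removed before the profile is built.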
The passage of muons through matter is dominated by Coulomb interactions with the electrons and nuclei in the material. The interaction with electrons leads to continuous energy loss and stopping of the muons, while the interaction with nuclei leads to angular diffusion. Using both muon stopping and angular diffusion allows us to determine density and identify materials. Here we demonstrate material identification using data taken at Los Alamos with a particle tracker built from a set of sealed drift tubes with commercial electronics and software, the Mini Muon Tracker (MMT).
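As a rough sketch of the angular-diffusion side of such a measurement (not the MMT analysis itself), the RMS multiple-scattering angle from the Highland formula can be inverted into a radiation length and matched against a material table; the fixed momentum, the thickness, and the omission of the logarithmic correction are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Radiation lengths X0 in cm (standard values for a few candidate materials).
X0_CM = {"Al": 8.897, "Fe": 1.757, "Pb": 0.5612, "W": 0.3504}

P_MEV = 3000.0   # assumed muon momentum (cosmic-ray muons average a few GeV)
X_CM = 10.0      # assumed thickness traversed

def theta0(X0_cm):
    """RMS projected scattering angle [rad] from the Highland formula,
    omitting the logarithmic correction so it inverts trivially."""
    return 13.6 / P_MEV * np.sqrt(X_CM / X0_cm)

def identify(angles_rad):
    """Invert the measured RMS scattering angle to a radiation length
    and pick the closest material in the table."""
    rms = np.sqrt(np.mean(angles_rad**2))
    X0_est = X_CM * (13.6 / P_MEV / rms) ** 2
    return min(X0_CM, key=lambda m: abs(X0_CM[m] - X0_est)), X0_est

# Simulate 5000 muons scattering in lead, then identify the material.
angles = rng.normal(0.0, theta0(X0_CM["Pb"]), size=5000)
material, X0_est = identify(angles)
```

In practice the muon momentum is unknown event by event, which is one reason real material identification also uses the stopping information.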
Haloes gone MAD: The Halo-Finder Comparison Project
Knebe, Alexander; Knollmann, Steffen R.; Muldrew, Stuart I.; et al.
Monthly Notices of the Royal Astronomical Society, 08/2011, Volume 415, Issue 3
Journal Article · Peer reviewed · Open access
ABSTRACT
We present a detailed comparison of fundamental dark matter halo properties retrieved by a substantial number of different halo finders. These codes span a wide range of techniques, including friends-of-friends, spherical-overdensity and phase-space-based algorithms. We further introduce a robust (and publicly available) suite of test scenarios that allows halo finder developers to compare the performance of their codes against those presented here. This set includes mock haloes containing various levels and distributions of substructure at a range of resolutions, as well as a cosmological simulation of the large-scale structure of the universe.
All the halo-finding codes tested successfully recovered the spatial location of our mock haloes. They further returned lists of particles (potentially) belonging to the object that led to coinciding values for the maximum of the circular velocity profile and the radius at which it is reached. All the finders based in configuration space struggled to recover substructure located close to the centre of the host halo, and the radial dependence of the recovered mass varies from finder to finder. Finders based in phase space could resolve central substructure, although they had difficulty accurately recovering its properties. Through a resolution study, we found that most of the finders could not reliably recover substructure containing fewer than 30–40 particles. Here, too, the phase-space finders excelled, resolving substructure down to 10–20 particles. By comparing the halo finders on a high-resolution cosmological volume, we found that they agree remarkably well on fundamental properties of astrophysical significance (e.g. mass, position, velocity and peak of the rotation curve).
We further suggest utilizing the peak of the rotation curve, v_max, as a proxy for mass, given the arbitrariness in defining a proper halo edge.
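For concreteness, the simplest of the techniques compared here, friends-of-friends, can be written in a few lines. This O(N²) toy (the linking length and the mock clumps are arbitrary choices, not any production finder's defaults) joins particles closer than the linking length and returns the connected components:

```python
import numpy as np

def fof_groups(positions, linking_length):
    """Minimal friends-of-friends: particles closer than the linking length
    are linked, and groups are the connected components (union-find).
    Pairwise O(N^2) version, suitable only for small mocks."""
    n = len(positions)
    parent = list(range(n))

    def find(i):                      # root lookup with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        d = np.linalg.norm(positions - positions[i], axis=1)
        for j in np.nonzero(d < linking_length)[0]:
            parent[find(i)] = find(int(j))

    labels = np.array([find(i) for i in range(n)])
    return [np.nonzero(labels == g)[0] for g in np.unique(labels)]

# Two well-separated mock clumps plus one isolated particle.
rng = np.random.default_rng(5)
pts = np.vstack([rng.normal(0.0, 0.1, (50, 3)),
                 rng.normal(5.0, 0.1, (40, 3)),
                 [[20.0, 20.0, 20.0]]])
groups = sorted(fof_groups(pts, linking_length=0.5), key=len, reverse=True)
```

Note how nothing in this sketch defines a halo "edge": the group is whatever the linking percolates to, which is exactly why the comparison project favours v_max over a mass within some chosen boundary.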
ABSTRACT
We present the one-dimensional Lyα forest power spectrum measurement using the first data provided by the Dark Energy Spectroscopic Instrument (DESI). The data sample comprises 26 330 quasar spectra at redshift z > 2.1, contained in the DESI Early Data Release and the first two months of the main survey. We employ a Fast Fourier Transform (FFT) estimator and compare the resulting power spectrum to an alternative likelihood-based method in a companion paper. We investigate methodological and instrumental contaminants associated with the new DESI instrument, applying techniques similar to those of previous Sloan Digital Sky Survey (SDSS) measurements. We use synthetic data based on a lognormal approximation to validate and correct our measurement, and we compare the resulting power spectrum with previous SDSS and high-resolution measurements. Despite the relatively small statistics, the FFT measurement is already competitive in terms of the range of scales covered. By the end of the DESI survey, we expect a Lyα forest sample five times larger than that of SDSS, providing an unprecedentedly precise one-dimensional power spectrum measurement.
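The core of an FFT estimator of this kind is short. The sketch below is not the DESI pipeline (the pixel size and noise level are hypothetical, and the noise and resolution corrections the paper applies are omitted); it measures the 1D power spectrum of mock flux-contrast skewers and checks it against the flat spectrum expected for white noise:

```python
import numpy as np

rng = np.random.default_rng(3)

def p1d_fft(delta_flux, dv_kms):
    """1D flux power spectrum estimate: P(k) = L * |delta_k|^2, with
    delta_k = FFT(delta)/n and k in s/km. No noise/resolution correction."""
    n = delta_flux.size
    length = n * dv_kms                               # chunk length [km/s]
    dk = np.fft.rfft(delta_flux) / n                  # normalized modes
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=dv_kms)    # wavenumbers [s/km]
    return k[1:], length * np.abs(dk[1:]) ** 2        # drop the k = 0 mode

# Validate on white noise, for which P(k) is flat at the value dv * sigma^2.
dv, sigma = 69.0, 0.2                                 # hypothetical pixel, noise
skewers = rng.normal(0.0, sigma, size=(100, 2048))    # mock flux-contrast field
k, _ = p1d_fft(skewers[0], dv)
pk_mean = np.mean([p1d_fft(s, dv)[1] for s in skewers], axis=0)
```

Averaging the raw per-skewer spectra over many sightlines is what beats down the mode-by-mode scatter, which is why sample size drives the precision quoted in the abstract.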
The reionization of helium at z ~ 3 is the final phase transition of the intergalactic medium and is thought to be driven purely by quasars. The He II transverse proximity effect (enhanced He II transmission in a background sightline caused by the ionizing radiation of a foreground quasar) therefore offers a unique opportunity to probe the morphology of He II reionization and to investigate the emission properties of quasars, e.g., their ionizing emissivity, lifetime and beaming geometry. We use the most recent HST/COS far-UV data set of 22 He II absorption spectra and conduct our own dedicated optical spectroscopic survey to find foreground quasars around these He II sightlines. Based on a set of 66 foreground quasars, we perform the first statistical analysis of the He II transverse proximity effect. Despite a large object-to-object variance, our stacking analysis reveals an excess in the average He II transmission near the foreground quasars at 3σ significance. This statistical evidence for the transverse proximity effect is corroborated by a clear dependence of the signal strength on the inferred He II ionization rate at the background sightline. Based on the transverse light crossing time, our detection places a geometrical lower limit on the quasar lifetime of t_Q > 25 Myr. This evidence for sustained activity of luminous quasars is relevant for the morphology of H I and He II reionization and helps to constrain AGN triggering mechanisms, accretion physics and models of black hole mass assembly. We show how future modeling of the transverse proximity effect can additionally constrain quasar emission geometries and, for example, clarify whether the large observed object-to-object variance can be explained by current models of quasar obscuration.
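The quoted lifetime limit is, at heart, just the light crossing time of the transverse separation. A minimal helper makes the conversion explicit (the 7.7 Mpc example separation is chosen here to reproduce the quoted 25 Myr and is not a number from the abstract):

```python
# Transverse light crossing time: convert a proper separation into the
# minimum time the foreground quasar must have been shining for its
# radiation to have reached the background sightline.
MPC_KM = 3.0857e19     # kilometres per (proper) megaparsec
C_KMS = 2.9979e5       # speed of light [km/s]
SEC_PER_MYR = 3.156e13 # seconds per megayear

def min_lifetime_myr(r_perp_mpc):
    """Lower limit on the quasar lifetime t_Q from a proper transverse
    separation r_perp: t_Q > r_perp / c."""
    return r_perp_mpc * MPC_KM / C_KMS / SEC_PER_MYR

# One proper Mpc corresponds to about 3.26 Myr of light travel time, so a
# separation near 7.7 proper Mpc yields a limit of t_Q > ~25 Myr.
```

The actual constraint additionally depends on where along the background sightline the transmission excess appears, which this helper ignores.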
Muon tomography is a technique that uses cosmic-ray muons to generate three-dimensional images of volumes using the information contained in the Coulomb scattering of the muons. Advantages of this technique are the ability of cosmic rays to penetrate significant overburden and the absence of any additional dose delivered to subjects under study beyond the natural cosmic-ray flux. Disadvantages include relatively long exposure times, poor position resolution, and the complex algorithms needed for reconstruction. Here we demonstrate a new method for obtaining improved position resolution and statistical precision for objects with spherical symmetry.
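The paper's specific method is not reproduced here, but the statistical gain from an assumed spherical symmetry is easy to demonstrate generically: averaging a noisy 3D reconstruction in radial shells suppresses the per-voxel noise by roughly the square root of the shell population (grid size, noise level and the uniform test sphere below are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy voxelized reconstruction of a uniform sphere in a 64^3 grid.
n = 64
ax = np.arange(n) - (n - 1) / 2.0
x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)
truth = (r < 10.0).astype(float)                   # sphere of radius 10 voxels
noisy = truth + rng.normal(0.0, 1.0, truth.shape)  # per-voxel S/N of only ~1

# Exploit the assumed spherical symmetry: average voxels in radial shells,
# trading angular information (which symmetry says is redundant) for a
# far less noisy radial density profile.
edges = np.arange(0, 32)
shell = np.digitize(r.ravel(), edges)              # shell index per voxel
profile = np.array([noisy.ravel()[shell == i].mean()
                    for i in range(1, edges.size)])
```

Each shell at radius r contains of order 4πr² voxels, so even a barely detectable per-voxel signal yields a sharp profile, including a well-localized edge.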
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and optimizing survey strategy. The list of applications is growing with improvements in the quality of the catalogs and in the detail they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration, as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. To establish and demonstrate its full functionality, we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
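A harness of this style can be reduced to a small interface sketch (class, method and field names below are invented for illustration, not DESCQA's actual API): each validation test exposes a run method, and the framework applies every test to every catalog to build the comparison matrix:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Result:
    test: str
    catalog: str
    passed: bool
    summary: str

class StellarMassRangeTest:
    """Toy validation metric: stellar masses must lie in a plausible range."""
    name = "stellar_mass_range"

    def run(self, catalog_name: str, catalog: Dict[str, list]) -> Result:
        masses = catalog["stellar_mass"]
        ok = all(1e6 <= m <= 1e13 for m in masses)
        return Result(self.name, catalog_name, ok,
                      f"{len(masses)} galaxies checked")

def validate(catalogs: Dict[str, dict], tests: List) -> List[Result]:
    """Apply every test to every catalog: the validation comparison matrix."""
    return [t.run(name, cat) for t in tests for name, cat in catalogs.items()]

# Two toy catalogs, one of which contains an unphysical stellar mass.
catalogs = {"mock_A": {"stellar_mass": [1e8, 1e10]},
            "mock_B": {"stellar_mass": [1e8, 1e20]}}
results = validate(catalogs, [StellarMassRangeTest()])
```

The value of the common interface is that heterogeneous catalogs, produced by different groups with different internal formats, can all be screened by the same automated battery of tests.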
Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent 'offline' analysis. We demonstrate this approach in the compressible gasdynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own ('in situ'). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other 'sidecar' group, which post-processes it while the simulation continues ('in-transit'). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.
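The in-transit pattern can be mimicked in miniature with threads standing in for the two disjoint MPI groups (Nyx itself uses MPI; everything below is an illustrative analogy, not Nyx code): a simulation loop ships periodic snapshots to an asynchronous 'sidecar' that reduces each one to a small summary statistic:

```python
import queue
import threading

def sidecar(inbox, summaries):
    """The 'sidecar' group: receive snapshots and post-process each one
    while the simulation keeps running."""
    while True:
        snap = inbox.get()
        if snap is None:                        # sentinel: simulation finished
            break
        summaries.append(sum(snap) / len(snap)) # stand-in analysis (mean field)

def simulate(n_steps, inbox):
    """The 'simulation' group: advance the state, periodically handing a
    copy of it to the sidecar instead of writing it to disk."""
    state = [1.0] * 8
    for step in range(n_steps):
        state = [x * 1.1 for x in state]        # stand-in time step
        if step % 2 == 0:                       # periodic analysis cadence
            inbox.put(list(state))              # ship a snapshot, keep going
    inbox.put(None)

inbox, summaries = queue.Queue(), []
worker = threading.Thread(target=sidecar, args=(inbox, summaries))
worker.start()
simulate(6, inbox)
worker.join()
```

As in the real workflow, the producer only blocks if the consumer falls too far behind, and what survives the run is the list of small summaries rather than the full state at every analysis step.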