Euclid preparation Blanchard, A.; Camera, S.; Carbone, C. ...
Astronomy and astrophysics (Berlin), 10/2020, Volume: 642
Journal Article
Peer-reviewed
Open access
Aims. The Euclid space telescope will measure the shapes and redshifts of galaxies to reconstruct the expansion history of the Universe and the growth of cosmic structures. The estimation of the expected performance of the experiment, in terms of predicted constraints on cosmological parameters, has so far relied on various individual methodologies and numerical implementations, which were developed for different observational probes and for the combination thereof. In this paper we present validated forecasts, which combine both theoretical and observational ingredients for different cosmological probes. This work is presented to provide the community with reliable numerical codes and methods for Euclid cosmological forecasts.
Methods. We describe in detail the methods adopted for Fisher matrix forecasts, which were applied to galaxy clustering, weak lensing, and the combination thereof. We estimated the required accuracy for Euclid forecasts and outlined a methodology for their development. We then compared and improved different numerical implementations, reaching uncertainties on the errors of cosmological parameters that are less than the required precision in all cases. Furthermore, we provide details on the validated implementations, some of which are made publicly available, in different programming languages, together with a reference training set of input and output matrices for a set of specific models. These can be used by the reader to validate their own implementations if required.
Results. We present new cosmological forecasts for Euclid. We find that results depend on the specific cosmological model and on the remaining freedom in each setting, for example flat or non-flat spatial cosmologies, or different cuts at non-linear scales. The numerical implementations are now reliable for these settings. We present the results for an optimistic and a pessimistic choice for these types of settings. We demonstrate that the impact of cross-correlations is particularly relevant for models beyond a cosmological constant and may allow us to increase the dark energy figure of merit by at least a factor of three.
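As a rough illustration of the Fisher-matrix machinery validated above: the marginalized 1σ error on a parameter is the square root of the corresponding diagonal element of the inverse Fisher matrix, and independent probes combine by summing their Fisher matrices. The matrices below are made-up two-parameter examples, not Euclid values.

```python
import numpy as np

# Hypothetical 2-parameter Fisher matrices for two independent probes
# (illustrative numbers only, not Euclid forecasts).
F_gc = np.array([[4.0, 1.0],
                 [1.0, 2.0]])   # "galaxy clustering"
F_wl = np.array([[3.0, -0.5],
                 [-0.5, 5.0]])  # "weak lensing"

# For independent probes the Fisher information adds.
F_comb = F_gc + F_wl

def marg_errors(F):
    """Marginalized 1-sigma errors: sqrt of the diagonal of F^-1."""
    return np.sqrt(np.diag(np.linalg.inv(F)))

print(marg_errors(F_gc))    # errors from one probe alone
print(marg_errors(F_comb))  # tighter errors after combination
```

Because information adds, the combined errors are never larger than those of either probe alone, which is why probe combination drives the gains quoted in the abstract.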
ABSTRACT
We present a new, updated version of the EuclidEmulator (called EuclidEmulator2), a fast and accurate predictor for the nonlinear correction of the matter power spectrum. Two-per-cent-level accurate emulation is now supported in the eight-dimensional parameter space of w0waCDM+∑mν models between redshift z = 0 and z = 3 for spatial scales within the range $0.01 \, h\, {\rm Mpc}^{-1}\le k \le 10\, h\, {\rm Mpc}^{-1}$. In order to achieve this level of accuracy, we have had to improve the quality of the underlying N-body simulations used as training data: (i) we use self-consistent linear evolution of non-dark-matter species such as massive neutrinos, photons, dark energy, and the metric field; (ii) we perform the simulations in the so-called N-body gauge, which allows one to interpret the results in the framework of general relativity; (iii) we run over 250 high-resolution simulations with 3000³ particles in boxes of 1 (h⁻¹ Gpc)³ volume based on paired-and-fixed initial conditions; and (iv) we provide a resolution correction that can be applied to emulated results as a post-processing step in order to drastically reduce systematic biases on small scales due to residual resolution effects in the simulations. We find that the inclusion of the dynamical dark energy parameter wa significantly increases the complexity and expense of creating the emulator. The high fidelity of EuclidEmulator2 is tested in various comparisons against N-body simulations as well as alternative fast predictors such as HALOFIT, HMCode, and CosmicEmu. A blind test is successfully performed against the Euclid Flagship v2.0 simulation. Nonlinear correction factors emulated with EuclidEmulator2 are accurate at the level of $1{{\ \rm per\ cent}}$ or better for $0.01 \, h\, {\rm Mpc}^{-1}\le k \le 10\, h\, {\rm Mpc}^{-1}$ and z ≤ 3 compared to high-resolution dark-matter-only simulations. EuclidEmulator2 is publicly available at https://github.com/miknab/EuclidEmulator2.
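The "nonlinear correction" emulated here is a multiplicative boost B(k, z) = P_nl/P_lin applied to a linear power spectrum. The sketch below illustrates only that multiplicative structure; the spectrum and boost are toy functions, not EuclidEmulator2 output.

```python
import numpy as np

# Toy stand-in for an emulator call: a boost B(k, z) = P_nl / P_lin
# is applied multiplicatively to a linear power spectrum.
# All numbers below are placeholders, not emulator output.
k = np.logspace(-2, 1, 50)                  # h/Mpc, the emulated k-range
P_lin = 1.0e4 * k / (1.0 + (k / 0.05) ** 3)  # toy linear spectrum
boost = 1.0 + 0.5 * np.tanh(k)               # toy nonlinear boost, >= 1

P_nl = P_lin * boost  # nonlinear spectrum = linear spectrum x boost
```

The resolution correction mentioned in item (iv) would be a further multiplicative post-processing factor applied to `P_nl` in the same way.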
We present a tomographic weak lensing analysis of the Kilo Degree Survey Data Release 4 (KiDS-1000), using a new pseudo angular power spectrum estimator (pseudo-C_ℓ) under development for the ESA Euclid mission. Over 21 million galaxies with shape information are divided into five tomographic redshift bins, ranging from 0.1 to 1.2 in photometric redshift. We measured pseudo-C_ℓ using eight bands in the multipole range 76 < ℓ < 1500 for auto- and cross-power spectra between the tomographic bins. A series of tests were carried out to check for systematic contamination from a variety of observational sources, including stellar number density, variations in survey depth, and point spread function properties. While some marginal correlations with these systematic tracers were observed, there is no evidence of bias in the cosmological inference.
B-mode power spectra are consistent with zero signal, with no significant residual contamination from E/B-mode leakage. We performed a Bayesian analysis of the pseudo-C_ℓ estimates by forward modelling the effects of the mask. Assuming a spatially flat ΛCDM cosmology, we constrained the structure growth parameter S_8 = σ_8 (Ω_m/0.3)^{1/2} = 0.754^{+0.027}_{−0.029}. When combining cosmic shear from KiDS-1000 with baryon acoustic oscillation and redshift space distortion data from recent Sloan Digital Sky Survey (SDSS) measurements of luminous red galaxies, as well as the Lyman-α forest and its cross-correlation with quasars, we tightened these constraints to S_8 = 0.771^{+0.006}_{−0.032}. These results are in very good agreement with previous KiDS-1000 and SDSS analyses and confirm a ∼3σ tension with early-Universe constraints from cosmic microwave background experiments.
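The structure growth parameter constrained above is just a combination of σ_8 and Ω_m. The helper below evaluates S_8 = σ_8 (Ω_m/0.3)^{1/2}; the input values are illustrative, not the KiDS-1000 posterior.

```python
import numpy as np

def S8(sigma8, omega_m):
    """Structure growth parameter S_8 = sigma_8 * (Omega_m / 0.3)**0.5."""
    return sigma8 * np.sqrt(omega_m / 0.3)

# With Omega_m = 0.3 the prefactor is exactly 1, so S_8 = sigma_8.
print(S8(0.76, 0.30))  # illustrative inputs, not a measured posterior
```

The 0.3 pivot makes S_8 nearly decorrelated from Ω_m for cosmic shear, which is why shear surveys quote S_8 rather than σ_8 alone.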
Euclid is poised to survey galaxies across a cosmological volume of unprecedented size, providing observations of more than a billion objects distributed over a third of the full sky. Approximately 20 million of these galaxies will have their spectroscopy available, allowing us to map the three-dimensional large-scale structure of the Universe in great detail. This paper investigates prospects for the detection of cosmic voids therein and the unique benefit they provide for cosmological studies. In particular, we study the imprints of dynamic (redshift-space) and geometric (Alcock–Paczynski) distortions of average void shapes and their constraining power on the growth of structure and cosmological distance ratios. To this end, we made use of the Flagship mock catalog, a state-of-the-art simulation of the data expected to be observed with Euclid. We arranged the data into four adjacent redshift bins, each of which contains about 11 000 voids, and we estimated the stacked void-galaxy cross-correlation function in every bin. Fitting a linear-theory model to the data, we obtained constraints on f/b and D_M H, where f is the linear growth rate of density fluctuations, b the galaxy bias, D_M the comoving angular diameter distance, and H the Hubble rate. In addition, we marginalized over two nuisance parameters included in our model to account for unknown systematic effects in the analysis. With this approach, Euclid will be able to reach a relative precision of about 4% on measurements of f/b and 0.5% on D_M H in each redshift bin. Better modeling or calibration of the nuisance parameters may further increase this precision to 1% and 0.4%, respectively. Our results show that the exploitation of cosmic voids in Euclid will provide competitive constraints on cosmology even as a stand-alone probe. For example, the equation-of-state parameter, w, for dark energy will be measured with a precision of about 10%, consistent with previous more approximate forecasts.
Euclid preparation Barnett, R.; Warren, S. J.; Mortlock, D. J. ...
Astronomy and astrophysics (Berlin), 11/2019, Volume: 631
Journal Article
Peer-reviewed
Open access
We provide predictions of the yield of 7 < z < 9 quasars from the Euclid wide survey, updating the calculation presented in the Euclid Red Book in several ways. We account for revisions to the Euclid near-infrared filter wavelengths; we adopt steeper rates of decline of the quasar luminosity function (QLF; Φ) with redshift, Φ ∝ 10^{k(z−6)}, k = −0.72, and a further steeper rate of decline, k = −0.92; we use better models of the contaminating populations (MLT dwarfs and compact early-type galaxies); and we make use of an improved Bayesian selection method, compared to the colour cuts used for the Red Book calculation, allowing the identification of fainter quasars, down to J_AB ∼ 23. Quasars at z > 8 may be selected from Euclid O, Y, J, H photometry alone, but selection over the redshift interval 7 < z < 8 is greatly improved by the addition of z-band data from, e.g., Pan-STARRS and LSST. We calculate predicted quasar yields for the assumed values of the rate of decline of the QLF beyond z = 6. If the decline of the QLF accelerates beyond z = 6, with k = −0.92, Euclid should nevertheless find over 100 quasars with 7.0 < z < 7.5, and ∼25 quasars beyond the current record of z = 7.5, including ∼8 beyond z = 8.0. The first Euclid quasars at z > 7.5 should be found in the DR1 data release, expected in 2024. It will be possible to determine the bright-end slope of the QLF, 7 < z < 8, M_1450 < −25, using 8 m class telescopes to confirm candidates, but follow-up with JWST or E-ELT will be required to measure the faint-end slope. Contamination of the candidate lists is predicted to be modest even at J_AB ∼ 23. The precision with which k can be determined over 7 < z < 8 depends on the value of k, but assuming k = −0.72 it can be measured to a 1σ uncertainty of 0.07.
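The quoted evolution Φ ∝ 10^{k(z−6)} makes the sensitivity to k easy to see numerically: relative to z = 6, the quasar space density at z = 8 is 10^{2k}, i.e. about 3.6% for k = −0.72 but only about 1.4% for k = −0.92. A minimal sketch:

```python
# Relative quasar space density implied by the abstract's QLF evolution,
# Phi proportional to 10**(k * (z - 6)), normalised to 1 at z = 6.
def qlf_decline(z, k):
    return 10.0 ** (k * (z - 6.0))

for k in (-0.72, -0.92):
    print(k, qlf_decline(8.0, k))  # density at z = 8 relative to z = 6
```

This factor-of-two-plus difference between the two assumed slopes is what drives the spread in the predicted quasar yields.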
The Euclid mission – with its spectroscopic galaxy survey covering a sky area over 15 000 deg² in the redshift range 0.9 < z < 1.8 – will provide a sample of tens of thousands of cosmic voids. This paper thoroughly explores for the first time the constraining power of the void size function on the properties of dark energy (DE) from a survey mock catalogue, the official Euclid Flagship simulation. We identified voids in the Flagship light-cone, which closely matches the features of the upcoming Euclid spectroscopic data set. We modelled the void size function considering a state-of-the-art methodology: we relied on the volume-conserving (Vdn) model, a modification of the popular Sheth & van de Weygaert model for void number counts, extended by means of a linear function of the large-scale galaxy bias. We found an excellent agreement between model predictions and measured mock void number counts. We computed updated forecasts for the Euclid mission on DE from the void size function and provided reliable void number estimates to serve as a basis for further forecasts of cosmological applications using voids. We analysed two different cosmological models for DE: the first described by a constant DE equation of state parameter, w, and the second by a dynamic equation of state with coefficients w_0 and w_a. We forecast 1σ errors on w lower than 10% and we estimated an expected figure of merit (FoM) for the dynamical DE scenario FoM_{w0,wa} = 17 when considering only the neutrino mass as an additional free parameter of the model. The analysis is based on conservative assumptions to ensure full robustness, and is a pathfinder for future enhancements of the technique. Our results showcase the impressive constraining power of the void size function from the Euclid spectroscopic sample, both as a stand-alone probe and in combination with other Euclid cosmological probes.
Pair-instability supernovae are theorized supernovae that have not yet been observationally confirmed. They are predicted to exist in low-metallicity environments. Because overall metallicity becomes lower at higher redshifts, deep near-infrared transient surveys probing high-redshift supernovae are suitable to discover pair-instability supernovae. The Euclid satellite, which is planned for launch in 2023, has a near-infrared wide-field instrument that is suitable for a high-redshift supernova survey. The Euclid Deep Survey is planned to make regular observations of three Euclid Deep Fields (40 deg² in total) spanning Euclid's six-year primary mission period. While the observations of the Euclid Deep Fields are not frequent, we show that the predicted long duration of pair-instability supernovae would allow us to search for high-redshift pair-instability supernovae with the Euclid Deep Survey. Based on the current observational plan of the Euclid mission, we conduct survey simulations in order to estimate the expected numbers of pair-instability supernova discoveries. We find that up to several hundred pair-instability supernovae at z ≲ 3.5 can be discovered within the Euclid Deep Survey. We also show that pair-instability supernova candidates can be efficiently identified by their duration and color, which can be determined with the current Euclid Deep Survey plan. We conclude that the Euclid mission can lead to the first confirmation of pair-instability supernovae if their event rates are as high as those predicted by recent theoretical studies. We also update the expected numbers of superluminous supernova discoveries in the Euclid Deep Survey based on the latest observational plan.
Context. The standard cosmological model is based on the fundamental assumptions of a spatially homogeneous and isotropic universe on large scales. An observational detection of a violation of these assumptions at any redshift would immediately indicate the presence of new physics.
Aims. We quantify the ability of the Euclid mission, together with contemporary surveys, to improve the current sensitivity of null tests of the canonical cosmological constant Λ and the cold dark matter (ΛCDM) model in the redshift range 0 < z < 1.8.
Methods. We considered both currently available data and simulated Euclid and external data products based on a ΛCDM fiducial model, an evolving dark energy model assuming the Chevallier-Polarski-Linder parameterization, or an inhomogeneous Lemaître-Tolman-Bondi model with a cosmological constant Λ, and carried out two separate but complementary analyses: a machine learning reconstruction of the null tests based on genetic algorithms, and a theory-agnostic parametric approach based on Taylor expansion and binning of the data, in order to avoid assumptions about any particular model.
Results. We find that in combination with external probes, Euclid can improve current constraints on null tests of the ΛCDM by approximately a factor of three when using the machine learning approach and by a further factor of two in the case of the parametric approach. However, we also find that in certain cases, the parametric approach may be biased against or missing some features of models far from ΛCDM.
Conclusions. Our analysis highlights the importance of synergies between Euclid and other surveys. These synergies are crucial for providing tighter constraints over an extended redshift range for a plethora of different consistency tests of some of the main assumptions of the current cosmological paradigm.
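The Chevallier-Polarski-Linder parameterization used for the evolving dark energy model above is w(z) = w0 + wa z/(1 + z). A minimal sketch (the parameter values are illustrative):

```python
# Chevallier-Polarski-Linder (CPL) dark energy equation of state:
# w(z) = w0 + wa * z / (1 + z). The defaults recover a cosmological
# constant (w = -1 at all redshifts); other values are illustrative.
def w_cpl(z, w0=-1.0, wa=0.0):
    return w0 + wa * z / (1.0 + z)

print(w_cpl(0.0, -0.9, -0.3))  # equals w0 at z = 0
print(w_cpl(1.8, -0.9, -0.3))  # within Euclid's spectroscopic range
```

At z = 0 the equation of state is w0, while at high redshift it asymptotes to w0 + wa, which is why (w0, wa) together capture both the present value and the evolution of dark energy.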
The Euclid space telescope will survey a large dataset of cosmic voids traced by dense samples of galaxies. In this work we estimate its expected performance when exploiting angular photometric void clustering, galaxy weak lensing, and their cross-correlation. To this aim, we implemented a Fisher matrix approach tailored for voids from the Euclid photometric dataset and we present the first forecasts on cosmological parameters that include the void-lensing correlation. We examined two different probe settings, pessimistic and optimistic, both for void clustering and galaxy lensing. We carried out forecast analyses in four model cosmologies, accounting for a varying total neutrino mass, M_ν, and a dynamical dark energy (DE) equation of state, w(z), described by the popular Chevallier-Polarski-Linder parametrization. We find that void clustering constraints on h and Ω_b are competitive with galaxy lensing alone, while errors on n_s decrease thanks to the orthogonality of the two probes in the 2D-projected parameter space. We also note that, as a whole, with respect to assuming the two probes as independent, the inclusion of the void-lensing cross-correlation signal improves parameter constraints by 10−15%, and enhances the joint void clustering and galaxy lensing figure of merit (FoM) by 10% and 25% in the pessimistic and optimistic scenarios, respectively. Finally, when further combining with the spectroscopic galaxy clustering, assumed as an independent probe, we find that, in the most competitive case, the FoM increases by a factor of 4 with respect to the combination of weak lensing and spectroscopic galaxy clustering taken as independent probes. The forecasts presented in this work show that photometric void clustering and its cross-correlation with galaxy lensing deserve to be exploited in the data analysis of the Euclid galaxy survey and promise to improve its constraining power, especially on h, Ω_b, the neutrino mass, and the DE evolution.
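The dark energy figure of merit is commonly defined (DETF-style) as FoM = [det C(w0, wa)]^{-1/2}, where C is the marginalized covariance of (w0, wa), so any cross-correlation that shrinks that covariance block raises the FoM. A minimal sketch with made-up covariance numbers: shrinking the block by a uniform factor of 0.8 raises the FoM by exactly 25%.

```python
import numpy as np

# DETF-style dark energy figure of merit from a (w0, wa) covariance
# block: FoM = 1 / sqrt(det C). Covariance values are illustrative.
def fom(cov):
    return 1.0 / np.sqrt(np.linalg.det(cov))

C_indep = np.array([[0.04, 0.01],
                    [0.01, 0.25]])  # probes treated as independent
C_cross = 0.8 * C_indep            # tighter block once a cross-correlation
                                   # signal is folded in (assumed factor)

print(fom(C_indep), fom(C_cross))
```

Since det(aC) = a² det C for a 2x2 block, a uniform shrink by a gives a FoM gain of 1/a; the 10% and 25% gains quoted above correspond to modestly tighter (w0, wa) covariances.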
Current and future imaging surveys require photometric redshifts (photo-zs) to be estimated for millions of galaxies. Improving the photo-z quality is a major challenge, but is needed to advance our understanding of cosmology. In this paper we explore how the synergies between narrow-band photometric data and large imaging surveys can be exploited to improve broadband photometric redshifts. We used a multi-task learning (MTL) network to improve broadband photo-z estimates by simultaneously predicting the broadband photo-z and the narrow-band photometry from the broadband photometry. The narrow-band photometry is only required in the training field, which also enables better photo-z predictions for the galaxies without narrow-band photometry in the wide field. This technique was tested with data from the Physics of the Accelerating Universe Survey (PAUS) in the COSMOS field. We find that the method predicts photo-zs that are 13% more precise down to magnitude i_AB < 23; the outlier rate is also 40% lower when compared to the baseline network. Furthermore, MTL reduces the photo-z bias for high-redshift galaxies, improving the redshift distributions for tomographic bins with z > 1. Applying this technique to deeper samples is crucial for future surveys such as Euclid or LSST. For simulated data, training on a sample with i_AB < 23, the method reduces the photo-z scatter by 16% for all galaxies with i_AB < 25. We also studied the effects of extending the training sample with photometric galaxies using PAUS high-precision photo-zs, which reduces the photo-z scatter by 20% in the COSMOS field.
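The multi-task setup described above trains one network on two objectives at once, which is usually formulated as a weighted sum of per-task losses. The sketch below is an assumption-laden illustration of that objective only: the mean-squared-error losses and the weight `lam` are hypothetical choices, not the paper's architecture or loss.

```python
import numpy as np

# Sketch of a multi-task objective: one shared network predicts both the
# broadband photo-z and the narrow-band photometry, and the two losses
# are summed with a weight. MSE losses and "lam" are assumptions here.
def mtl_loss(z_pred, z_true, nb_pred, nb_true, lam=1.0):
    loss_z = np.mean((z_pred - z_true) ** 2)     # photo-z regression task
    loss_nb = np.mean((nb_pred - nb_true) ** 2)  # narrow-band prediction task
    return loss_z + lam * loss_nb

rng = np.random.default_rng(0)
z_t = rng.random(8)            # toy photo-z targets for 8 galaxies
nb_t = rng.random((8, 40))     # toy fluxes in 40 PAUS-like narrow bands
print(mtl_loss(z_t + 0.01, z_t, nb_t, nb_t))  # nb task perfect: only z term
```

The narrow-band head acts as an auxiliary task: it is only needed at training time, which matches the abstract's point that narrow-band data are required in the training field alone.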