Euclid preparation — Paykari, P.; Kitching, T.; Hoekstra, H. ...
Astronomy and astrophysics (Berlin),
03/2020, Volume:
635
Journal Article
Peer reviewed
Open access
Aims. Our aim is to quantify the impact of systematic effects on the inference of cosmological parameters from cosmic shear. Methods. We present an “end-to-end” approach that introduces sources of bias in a modelled weak lensing survey on a galaxy-by-galaxy level. We propagated residual biases through a pipeline from galaxy properties at one end to cosmic shear power spectra and cosmological parameter estimates at the other end. We did this to quantify how imperfect knowledge of the pipeline changes the maximum likelihood values of dark energy parameters. Results. We quantify the impact of an imperfect correction for charge transfer inefficiency and of modelling uncertainties in the point spread function for Euclid, and find that the biases introduced can be corrected to acceptable levels.
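The propagation of residual shear biases can be illustrated with the standard multiplicative/additive parametrization, γ_obs = (1 + m)γ + c, which at the power-spectrum level rescales Cℓ by (1 + m)² and adds a constant term. This is a generic textbook sketch, not the paper's full galaxy-by-galaxy pipeline; the toy spectrum shape and the bias values below are placeholders invented for illustration.

```python
import numpy as np

def biased_shear_spectrum(ells, cl, m=0.0, c=0.0):
    """Propagate residual multiplicative (m) and additive (c) shear biases
    to a cosmic shear power spectrum: gamma_obs = (1 + m) gamma + c implies
    C_obs(l) = (1 + m)^2 C(l) + c^2 (constant term for a constant c)."""
    return (1.0 + m) ** 2 * cl + c ** 2

# Toy spectrum (placeholder power-law shape) with illustrative bias values:
ells = np.arange(10, 2000)
cl = 1e-9 * (ells / 100.0) ** -1.2
cl_obs = biased_shear_spectrum(ells, cl, m=2e-3, c=3e-5)
```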
The material composition of asteroids is an essential piece of knowledge in the quest to understand the formation and evolution of the Solar System. Visual to near-infrared spectra or multiband photometry is required to constrain the material composition of asteroids, but we currently have such data, especially at near-infrared wavelengths, for only a limited number of asteroids. This is a significant limitation considering the complex orbital structures of the asteroid populations. Up to 150 000 asteroids will be visible in the images of the upcoming ESA Euclid space telescope, and the instruments of Euclid will offer multiband visual to near-infrared photometry and slitless near-infrared spectra of these objects. Most of the asteroids will appear as streaks in the images. Due to the large number of images and asteroids, automated detection methods are needed. A non-machine-learning approach based on the StreakDet software was previously tested, but the results were not optimal for short and/or faint streaks. We set out to improve the capability to detect asteroid streaks in Euclid images by using deep learning. We built, trained, and tested a three-step machine-learning pipeline with simulated Euclid images. First, a convolutional neural network (CNN) detected streaks and their coordinates in full images, aiming to maximize the completeness (recall) of detections. Then, a recurrent neural network (RNN) merged snippets of long streaks detected in several parts by the CNN. Lastly, gradient-boosted trees (XGBoost) linked detected streaks between different Euclid exposures to reduce the number of false positives and improve the purity (precision) of the sample. The deep-learning pipeline surpasses the completeness of, and reaches a similar level of purity as, a non-machine-learning pipeline based on the StreakDet software. Additionally, the deep-learning pipeline can detect asteroids 0.25–0.5 magnitudes fainter than StreakDet, which could result in a 50% increase in the number of detected asteroids compared to the StreakDet software. There is still scope for further refinement, particularly in improving the accuracy of streak coordinates and enhancing the completeness of the final stage of the pipeline, which involves linking detections across multiple exposures.
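The completeness (recall) and purity (precision) figures of merit used above follow the standard definitions. As a minimal sketch with hypothetical counts (not the paper's numbers):

```python
def completeness_and_purity(n_true, n_detected, n_matched):
    """Completeness (recall) = correctly matched detections / true streaks;
    purity (precision) = correctly matched detections / all detections."""
    completeness = n_matched / n_true if n_true else 0.0
    purity = n_matched / n_detected if n_detected else 0.0
    return completeness, purity

# Hypothetical example: 200 simulated streaks, 180 detections, 150 correct matches.
comp, pur = completeness_and_purity(200, 180, 150)
```

A detection stage tuned for completeness (like the CNN step) tolerates lower purity, which the later linking stage then recovers by rejecting false positives.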
Context.
The Copernican principle, the notion that we are not at a special location in the Universe, is one of the cornerstones of modern cosmology. Its violation would invalidate the Friedmann-Lemaître-Robertson-Walker metric, causing a major change in our understanding of the Universe. Thus, it is of fundamental importance to perform observational tests of this principle.
Aims.
We determine the precision with which future surveys will be able to test the Copernican principle and their ability to detect any possible violations.
Methods.
We forecast constraints on the inhomogeneous Lemaître-Tolman-Bondi model with a cosmological constant Λ (ΛLTB), essentially a ΛCDM model endowed with a spherical inhomogeneity. We consider combinations of currently available data and simulated Euclid data, together with external data products, based on both ΛCDM and ΛLTB fiducial models. These constraints are compared to the expectations from the Copernican principle.
Results.
When considering the ΛCDM fiducial model, we find that Euclid data, in combination with other current and forthcoming surveys, will improve the constraints on the Copernican principle by about 30%, with ±10% variations depending on the observables and scales considered. When considering a ΛLTB fiducial model instead, we find that future Euclid data, combined with other current and forthcoming datasets, will be able to detect gigaparsec-scale inhomogeneities of contrast −0.1.
Conclusions.
Next-generation surveys, such as Euclid, will thoroughly test homogeneity at large scales, tightening the constraints on possible violations of the Copernican principle.
Cosmological constraints from key probes of the Euclid imaging survey rely critically on the accurate determination of the true redshift distributions, n(z), of tomographic redshift bins. We determine whether the mean redshift, ⟨z⟩, of ten Euclid tomographic redshift bins can be calibrated to the Euclid target uncertainty of σ(⟨z⟩) < 0.002 (1 + z) via cross-correlation with spectroscopic samples akin to those from the Baryon Oscillation Spectroscopic Survey (BOSS), the Dark Energy Spectroscopic Instrument (DESI), and Euclid’s NISP spectroscopic survey. We construct mock Euclid and spectroscopic galaxy samples from the Flagship simulation and measure small-scale clustering redshifts up to redshift z < 1.8 with an algorithm that performs well on current galaxy survey data. The clustering measurements are then fitted to two n(z) models: one is the true n(z) with a free mean; the other is a Gaussian process modified to be restricted to non-negative values. We show that ⟨z⟩ is measured in each tomographic redshift bin to an accuracy of order 0.01 or better. By measuring the clustering redshifts on subsets of the full Flagship area, we construct scaling relations that allow us to extrapolate the method’s performance to larger sky areas than are currently available in the mock. For the full expected Euclid, BOSS, and DESI overlap region of approximately 6000 deg², the uncertainties attainable by clustering redshifts exceed the Euclid requirement by at least a factor of three for both n(z) models considered, although systematic biases limit the accuracy. Clustering redshifts are an extremely effective method of redshift calibration for Euclid if the sources of systematic bias can be determined and removed, or calibrated out with sufficiently realistic simulations. We outline possible future work, in particular an extension to higher redshifts with quasar reference samples.
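The calibration target quoted above, σ(⟨z⟩) < 0.002 (1 + z), can be checked per tomographic bin once calibrated mean redshifts and their uncertainties are in hand. A minimal sketch (the bin values below are invented for illustration, not Flagship results):

```python
def meets_euclid_mean_z_requirement(z_mean, sigma_z_mean):
    """Check the Euclid calibration target sigma(<z>) < 0.002 * (1 + <z>)
    for a single tomographic bin."""
    return sigma_z_mean < 0.002 * (1.0 + z_mean)

# Hypothetical bins: (mean redshift, uncertainty on the mean)
bins = [(0.3, 0.001), (0.9, 0.005), (1.5, 0.004)]
passes = [meets_euclid_mean_z_requirement(z, s) for z, s in bins]
```

Note that the tolerance loosens with redshift, so the same absolute uncertainty can pass at high z while failing at low z.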
An accurate covariance matrix is essential for obtaining reliable cosmological results when using a Gaussian likelihood. In this paper we study the covariance of pseudo-Cℓ estimates of tomographic cosmic shear power spectra. Using two existing publicly available codes in combination, we calculate the full covariance matrix, including mode-coupling contributions arising from both partial sky coverage and non-linear structure growth. For three different sky masks, we compare the theoretical covariance matrix to that estimated from publicly available N-body weak lensing simulations, finding good agreement. We find that as a more extreme sky cut is applied, a corresponding increase in both Gaussian off-diagonal covariance and non-Gaussian super-sample covariance is observed in both theory and simulations, in accordance with expectations. Studying the different contributions to the covariance in detail, we find that the Gaussian covariance dominates along the main diagonal and the closest off-diagonals, but farther away from the main diagonal the super-sample covariance is dominant. Forming mock constraints in parameters that describe matter clustering and dark energy, we find that neglecting non-Gaussian contributions to the covariance can lead to underestimating the true size of confidence regions by up to 70 per cent. The dominant non-Gaussian covariance component is the super-sample covariance, but neglecting the smaller connected non-Gaussian covariance can still lead to the underestimation of uncertainties by 10–20 per cent. A real cosmological analysis will require marginalisation over many nuisance parameters, which will decrease the relative importance of all cosmological contributions to the covariance, so these values should be taken as upper limits on the importance of each component.
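The Gaussian part of the covariance that dominates the main diagonal can be approximated, for a band-averaged spectrum on a cut sky, by the textbook f_sky (Knox) formula. This is a simplifying approximation, not the full mode-coupling calculation the paper performs; the multipole, spectrum, and mask values below are illustrative:

```python
def gaussian_cl_variance(ell, cl, noise_cl, fsky, delta_ell=1):
    """Knox approximation to the Gaussian variance of a power spectrum
    estimate in a band of width delta_ell centred at multipole ell:
    Var(C_l) ~ 2 (C_l + N_l)^2 / ((2 l + 1) delta_ell fsky)."""
    return 2.0 * (cl + noise_cl) ** 2 / ((2.0 * ell + 1.0) * delta_ell * fsky)

# A more extreme sky cut (smaller fsky) inflates the variance, consistent
# with the trend described in the text:
var_full = gaussian_cl_variance(100, 1e-9, 2e-10, fsky=1.0, delta_ell=50)
var_cut = gaussian_cl_variance(100, 1e-9, 2e-10, fsky=0.35, delta_ell=50)
```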
We present a method for fast evaluation of the covariance matrix for a two-point galaxy correlation function (2PCF) measured with the Landy–Szalay estimator. The standard way of evaluating the covariance matrix consists in running the estimator on a large number of mock catalogs and evaluating their sample covariance. With large random catalog sizes (random-to-data ratio M ≫ 1), the computational cost of the standard method is dominated by that of counting the data-random and random-random pairs, while the uncertainty of the estimate is dominated by that of the data-data pairs. We present a method called Linear Construction (LC), where the covariance is estimated for small random catalogs of size M = 1 and M = 2, and the covariance for arbitrary M is constructed as a linear combination of the two. We show that the LC covariance estimate is unbiased. We validated the method with PINOCCHIO simulations in the range r = 20–200 h⁻¹ Mpc. With M = 50 and with 2 h⁻¹ Mpc bins, the theoretical speedup of the method is a factor of 14. We discuss the impact on the precision matrix and parameter estimation, and present a formula for the covariance of the covariance.
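The Linear Construction idea can be sketched under a simplifying assumption: if the covariance depends on the random-catalog size as C(M) = A + B/M, then estimates at M = 1 and M = 2 determine A and B, and C at any M follows as a linear combination of the two. The published LC method derives its combination coefficients from the full Landy–Szalay pair-count noise terms, so the exact weights may differ from this toy version:

```python
import numpy as np

def lc_covariance(cov_m1, cov_m2, m):
    """Toy Linear Construction: combine 2PCF covariance estimates made with
    small random catalogs (M = 1 and M = 2) into an estimate for arbitrary M,
    assuming C(M) = A + B / M (an assumption of this sketch)."""
    b = 2.0 * (cov_m1 - cov_m2)  # from C(1) - C(2) = B / 2
    a = cov_m1 - b               # from C(1) = A + B
    return a + b / m

# Synthetic check: build C(M) = A + B/M exactly and reconstruct C(50).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)); A = A @ A.T  # symmetric M-independent part
B = rng.normal(size=(4, 4)); B = B @ B.T  # symmetric 1/M part
cov50 = lc_covariance(A + B, A + B / 2.0, 50)
```

The speedup comes from never counting pairs against a large random catalog: only the cheap M = 1 and M = 2 runs are needed, however large the target M.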