The first half of this paper explores the origin of systematic biases in the measurement of weak gravitational lensing. Compared to previous work, we expand the investigation of point spread function instability and fold in, for the first time, the effects of non-idealities in electronic imaging detectors and of imperfect galaxy shape measurement algorithms. Together, these now explain the additive and multiplicative systematics typically reported in current lensing measurements. We find that overall performance is driven by the product of a telescope/camera's absolute performance and our knowledge of that performance.
The second half of this paper propagates any residual shear measurement biases through to their effect on cosmological parameter constraints. Fully exploiting the statistical power of Stage IV weak lensing surveys will require stringent control of both additive and multiplicative biases. These requirements can be allocated between individual budgets in hardware, calibration data, and software, using results from the first half of the paper.
If instrumentation is stable and well calibrated, we find that extant shear measurement software from Gravitational Lensing Accuracy Testing 2010 (GREAT10) already meets requirements on galaxies detected at a signal-to-noise ratio of 40. Averaged over a population of galaxies with a realistic distribution of sizes, it also meets requirements for a 2D cosmic shear analysis from space. If used on fainter galaxies or for 3D cosmic shear tomography, existing algorithms would need to be calibrated on simulations to avoid introducing bias at a level similar to the statistical error. Requirements on hardware and calibration data are discussed in more detail in a companion paper. Our analysis is intentionally general, but it is specifically being used to drive the hardware and ground-segment performance budget for the design of the European Space Agency's recently selected Euclid mission.
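As a rough illustration of the bias parametrization used throughout this literature, the sketch below (a toy simulation, not the paper's code) applies the standard linear model g_obs = (1 + m) g_true + c and recovers the multiplicative and additive biases m and c by a linear fit; the bias values and noise level are arbitrary placeholders.

```python
import numpy as np

# Toy calibration of shear measurement biases, assuming the standard
# linear model g_obs = (1 + m) * g_true + c discussed in the abstract.
rng = np.random.default_rng(42)

g_true = rng.uniform(-0.05, 0.05, size=10_000)   # true shears of simulated galaxies
m_true, c_true = 2e-3, 3e-4                      # illustrative (assumed) bias values
noise = rng.normal(0.0, 0.01, size=g_true.size)  # per-galaxy measurement noise

g_obs = (1.0 + m_true) * g_true + c_true + noise

# Fit observed vs. true shear, as done when calibrating shape codes on
# simulations; slope - 1 estimates m, the intercept estimates c.
slope, intercept = np.polyfit(g_true, g_obs, deg=1)
print(f"m = {slope - 1.0:.2e}, c = {intercept:.2e}")
```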
Aims.
Stage IV lensing surveys promise to make an unprecedented amount of excellent data available. This will represent a huge leap in terms of quantity and quality and will open the way for the use of novel tools that surpass the standard second-order statistics for probing the high-order properties of the convergence field. Motivated by these considerations, some of us have started a long-term project aiming at using Minkowski functionals (MFs) as complementary and supplementary probes to increase the lensing figure of merit (FoM).
Methods.
As a second step on this path, we discuss the use of MFs for a survey consisting of a wide total area A_tot that is imaged at a limiting magnitude mag_W and contains a subset of area A_deep, where observations are pushed to a deeper limiting magnitude mag_D. We present an updated procedure to match the theoretically predicted MFs to the measured MFs, and take the effect of map reconstruction from noisy shear data into account. We validate this renewed method against simulated datasets with different source redshift distributions and total number density, setting these quantities in accordance with the depth of the survey. We can then rely on a Fisher matrix analysis to forecast the improvement in the FoM that is due to the joint use of shear tomography and MFs under different assumptions on (A_tot, A_deep, mag_D) and the prior on the MF nuisance parameters.
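A minimal sketch of the Fisher matrix bookkeeping behind such a forecast, assuming illustrative 2x2 Fisher matrices in the dark-energy parameters (w0, wa); the entries below are placeholders, not values from the paper.

```python
import numpy as np

# Fisher matrices for two (assumed independent) probes add; the dark-energy
# figure of merit is FoM = 1 / sqrt(det C), with C the (w0, wa) covariance.
# In a full analysis, nuisance parameters would be included in a larger
# matrix and marginalized by inverting it and keeping the (w0, wa) block.
F_shear = np.array([[ 40.0, -12.0],
                    [-12.0,   6.0]])   # shear tomography alone (placeholder)
F_mf    = np.array([[ 15.0,  -3.0],
                    [ -3.0,   2.5]])   # Minkowski functionals alone (placeholder)

def fom(F):
    """Dark-energy FoM from a 2x2 Fisher matrix in (w0, wa)."""
    cov = np.linalg.inv(F)
    return 1.0 / np.sqrt(np.linalg.det(cov))

print("FoM shear only :", fom(F_shear))
print("FoM shear + MFs:", fom(F_shear + F_mf))
```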
Results.
We find that MFs can provide valuable help in increasing the FoM of the lensing survey when the nuisance parameters are known with non-negligible precision. Even more interesting is the possibility of using MFs to compensate for the FoM that is lost when cutting the multipole range probed by shear tomography; such a cut makes the results more robust against uncertainties in the modeling of nonlinearities. MFs are therefore a promising tool both for increasing the FoM and for protecting the constraints on the cosmological parameters from theoretical systematic effects.
Minkowski functionals (MFs) quantify the topological properties of a given field, probing its departure from Gaussianity. We investigate their use on lensing convergence maps in order to see whether they can provide further insights on the underlying cosmology with respect to the standard second-order statistics, i.e., cosmic shear tomography. To this end, we first present a method to match theoretical predictions with measured MFs, taking care of the shape noise, imperfections in the map reconstruction, and the inaccurate description of the nonlinearities in the matter power spectrum and bispectrum. We validate this method against simulated maps reconstructed from shear fields generated by the MICE simulation. We then perform a Fisher matrix analysis to forecast the accuracy on cosmological parameters from a joint MFs and shear tomography analysis. It turns out that MFs are indeed helpful for breaking the Ωm–σ8 degeneracy, generating a sort of chain reaction that leads to an overall increase of the figure of merit.
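For concreteness, here is one common grid estimator of the three 2D Minkowski functionals of an excursion set, sketched on a Gaussian random map as a stand-in for a reconstructed convergence map; the discretization (threshold binning to approximate the delta function) is an assumption for illustration, not the paper's pipeline.

```python
import numpy as np

def minkowski_functionals(kappa, nu_grid):
    """Grid estimators of the 2D Minkowski functionals V0 (area), V1
    (boundary length), V2 (Euler characteristic) of excursion sets of
    a map kappa, at thresholds nu * sigma."""
    sigma = kappa.std()
    kx, ky = np.gradient(kappa)
    kxx, kxy = np.gradient(kx)
    _,   kyy = np.gradient(ky)
    grad = np.hypot(kx, ky)
    # Curvature integrand entering the Euler characteristic density.
    with np.errstate(divide="ignore", invalid="ignore"):
        curv = np.where(grad > 0,
                        (2*kx*ky*kxy - kx**2*kyy - ky**2*kxx) / grad**2, 0.0)
    dnu = nu_grid[1] - nu_grid[0]
    v0, v1, v2 = [], [], []
    for nu in nu_grid:
        thresh = nu * sigma
        above = kappa >= thresh
        # Delta function at the threshold, approximated by a bin of
        # width dnu * sigma around it.
        on_boundary = np.abs(kappa - thresh) < 0.5 * dnu * sigma
        v0.append(above.mean())
        v1.append((grad * on_boundary).mean() / (4 * dnu * sigma))
        v2.append((curv * on_boundary).mean() / (2 * np.pi * dnu * sigma))
    return np.array(v0), np.array(v1), np.array(v2)

# Example on a random map (in practice the map would first be smoothed):
rng = np.random.default_rng(1)
kappa = rng.normal(size=(256, 256))
nu = np.linspace(-3, 3, 25)
V0, V1, V2 = minkowski_functionals(kappa, nu)
```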
Context.
The unprecedented amount and the excellent quality of lensing data expected from upcoming ground- and space-based surveys present a great opportunity for shedding light on questions that remain unanswered with regard to our universe and the validity of the standard ΛCDM cosmological model. The development of new techniques capable of exploiting this vast quantity of data in the most effective way possible is of great importance.
Aims.
For this reason, we investigate a new method for treating weak-lensing higher-order statistics, which are known to break degeneracies among cosmological parameters thanks to their capacity to probe the non-Gaussian properties of the shear field. In particular, the proposed method applies directly to the observed quantity, namely, the noisy galaxy ellipticity.
Methods.
We produced simulated lensing maps with different sets of cosmological parameters and used them to measure higher-order moments, Minkowski functionals, Betti numbers, and other statistics related to graph theory. This allowed us to construct datasets with a range of sizes, levels of precision, and smoothing. We then applied several machine learning algorithms to determine which method best predicts the actual cosmological parameters associated with each simulation.
Results.
The best-performing model turned out to be a simple multidimensional linear regression. We used this model to compare the results coming from the different datasets and found that we can measure, with a good level of accuracy, the majority of the parameters considered in this study. We also investigated the relation between each higher-order estimator and the different cosmological parameters for several signal-to-noise thresholds and redshift bins.
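A toy version of this regression step, with synthetic feature vectors standing in for the measured higher-order statistics; scikit-learn's LinearRegression plays the role of the multidimensional linear regression named above, and all shapes and parameter ranges are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Each simulated map yields a feature vector of summary statistics
# (moments, MFs, Betti numbers, ...) and a label vector of cosmological
# parameters; here both are faked with a linear response plus noise.
rng = np.random.default_rng(0)
n_maps, n_feat, n_par = 500, 30, 2                        # e.g. (Omega_m, sigma_8)
theta = rng.uniform(0.2, 1.0, (n_maps, n_par))            # stand-in parameters
W = rng.normal(size=(n_par, n_feat))                      # fake linear response
X = theta @ W + 0.05 * rng.normal(size=(n_maps, n_feat))  # noisy summary stats

X_tr, X_te, y_tr, y_te = train_test_split(X, theta, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("R^2 on held-out maps:", model.score(X_te, y_te))
```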
Conclusions.
Given the promising results we obtained, we consider this approach a valuable resource that is worthy of further development.
Euclid is a European Space Agency medium-class mission, selected for launch in 2020 within the Cosmic Vision 2015–2025 program. The main goal of Euclid is to understand the origin of the accelerated expansion of the universe. Euclid will explore the expansion history of the universe and the evolution of cosmic structures by measuring shapes and redshifts of galaxies, as well as the distribution of clusters of galaxies, over a large fraction of the sky. Although the main driver for Euclid is the nature of dark energy, Euclid science covers a vast range of topics, from cosmology to galaxy evolution to planetary research. In this review we focus on cosmology and fundamental physics, with a strong emphasis on science beyond the current standard models. We discuss five broad topics: dark energy and modified gravity, dark matter, initial conditions, basic assumptions, and questions of methodology in the data analysis. This review has been planned and carried out within Euclid's Theory Working Group and is meant to provide a guide to the scientific themes that will underlie the activity of the group during the preparation of the Euclid mission.
The unprecedented quality, increased data volume, and wide area of ongoing and near-future weak lensing surveys allow one to move beyond standard two-point statistics, making it worthwhile to investigate higher-order probes. As an interesting step in this direction, we explore the use of higher-order moments (HOM) of the convergence field as a way to increase the lensing figure of merit (FoM). To this end, we rely on simulated convergence maps to first show that HOM can be measured and calibrated, so that it is indeed possible to predict them for a given cosmological model, provided suitable nuisance parameters are introduced and then marginalized over. We then forecast the accuracy on cosmological parameters from the use of HOM alone and in combination with standard shear power spectrum tomography. It turns out that HOM allow one to break some common degeneracies, thus significantly boosting the overall FoM. We also qualitatively discuss possible systematics and how they can be dealt with.
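As an illustration of what such a HOM data vector looks like, the sketch below computes third- and fourth-order moments of a (random stand-in) convergence map over a few assumed smoothing scales; it is not the calibrated estimator used in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Higher-order moments of a smoothed convergence map as a function of
# smoothing scale. The map here is Gaussian noise, so the skewness and
# excess kurtosis should scatter around zero.
rng = np.random.default_rng(3)
kappa = rng.normal(size=(512, 512))

for scale in (2, 4, 8):                    # smoothing scales in pixels (assumed)
    ks = gaussian_filter(kappa, sigma=scale)
    ks -= ks.mean()
    var = np.mean(ks**2)
    skew = np.mean(ks**3) / var**1.5       # third-order moment (skewness)
    kurt = np.mean(ks**4) / var**2 - 3.0   # fourth-order moment (excess kurtosis)
    print(f"scale={scale:2d}px  skew={skew:+.3f}  kurt={kurt:+.3f}")
```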
Defining a weak lensing experiment in space. Cropper, Mark; Hoekstra, Henk; Kitching, Thomas, et al. Monthly Notices of the Royal Astronomical Society, 06/2013, Volume 431, Issue 4. Journal article, peer reviewed, open access.
This paper describes the definition of a typical next-generation space-based weak gravitational lensing experiment. We first adopt a set of top-level science requirements from the literature, based on the scale and depth of the galaxy sample and the avoidance of systematic effects in the measurements that would bias the derived shear values. We then identify and categorize the factors contributing to the systematic effects, combining them with the correct weighting so as to fit within the top-level requirements. We present techniques that permit the performance to be evaluated, and we explore the limits to which the contributing factors can be managed. Besides the modelling biases resulting from the use of weighted moments, the main contributing factors are the reconstruction of the instrument point spread function, which is derived from the stellar images in the field, and the correction of the charge transfer inefficiency caused by radiation damage in the CCD detectors.
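The budgeting exercise described here can be illustrated with a toy quadrature combination of contributing systematics against an assumed top-level requirement; all numbers below are placeholders, not the paper's actual allocations.

```python
import numpy as np

# Combine independent systematic contributions in quadrature and check
# the total against an assumed top-level multiplicative-bias budget.
requirement = 2e-3                            # assumed top-level budget
contributions = {
    "PSF model residuals":        1.0e-3,     # illustrative values only
    "CTI correction":             0.8e-3,
    "weighted-moment model bias": 1.1e-3,
}

total = np.sqrt(sum(v**2 for v in contributions.values()))
print(f"combined: {total:.2e}  (budget {requirement:.0e})",
      "OK" if total <= requirement else "over budget")
```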
The zCOSMOS 10k-Bright Spectroscopic Sample. Lilly, Simon J; Le Brun, Vincent; Maier, Christian, et al. The Astrophysical Journal Supplement Series, 10/2009, Volume 184, Issue 2. Journal article, peer reviewed, open access.
We present spectroscopic redshifts of a large sample of galaxies with I_AB < 22.5 in the COSMOS field, measured from spectra of 10,644 objects obtained in the first two years of observations in the zCOSMOS-bright redshift survey. These include a statistically complete subset of 10,109 objects. The average accuracy of individual redshifts is 110 km s⁻¹, independent of redshift. The reliability of individual redshifts is described by a Confidence Class that has been empirically calibrated through repeat spectroscopic observations of over 600 galaxies. There is very good agreement between spectroscopic and photometric redshifts for the most secure Confidence Classes. For the less secure Confidence Classes, there is a good correspondence between the fraction of objects with a consistent photometric redshift and the spectroscopic repeatability, suggesting that the photometric redshifts can be used to indicate which of the less secure spectroscopic redshifts are likely right and which are probably wrong, and to give an indication of the nature of objects for which we failed to determine a redshift. Using this approach, we can construct a spectroscopic sample that is 99% reliable, 88% complete in the sample as a whole, and 95% complete in the redshift range 0.5 < z < 0.8. The luminosity and mass completeness levels of the zCOSMOS-bright sample of galaxies are also discussed.
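The quoted accuracy of 110 km s⁻¹ translates into a redshift error via sigma_z = (1 + z) * sigma_v / c, which is why the accuracy is naturally quoted as a velocity rather than a redshift; a quick check at two redshifts in the survey's range:

```python
# Convert the quoted velocity accuracy into a redshift error at two
# illustrative redshifts: sigma_z = (1 + z) * sigma_v / c.
C_KMS = 299_792.458          # speed of light in km/s

for z in (0.5, 0.8):
    sigma_z = (1 + z) * 110.0 / C_KMS
    print(f"z={z}: sigma_z = {sigma_z:.5f}")
```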