Weak gravitational lensing provides a sensitive probe of cosmology by measuring the mass distribution and the geometry of the low-redshift Universe. We show how an all-sky weak lensing tomographic survey can jointly constrain different sets of cosmological parameters describing dark energy, massive neutrinos (hot dark matter) and the primordial power spectrum. In order to put all sectors on an equal footing, we introduce a new parameter β, the second-order running spectral index. Using the Fisher matrix formalism with and without cosmic microwave background (CMB) priors, we examine how the constraints vary as the parameter set is enlarged. We find that weak lensing with CMB priors provides robust constraints on dark energy parameters and can simultaneously provide strong constraints on all three sectors. We find that the dark energy sector is largely insensitive to the inclusion of the other cosmological sectors. Implications for the planning of future surveys are discussed.
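The Fisher-matrix marginalisation this abstract relies on can be sketched in a few lines. The matrix values below are invented for illustration (a hypothetical three-parameter dark energy sector), not taken from the paper:

```python
import numpy as np

# Hypothetical 3-parameter Fisher matrix; the numbers are illustrative only.
F = np.array([[4.0e2, 1.1e2, 3.0e1],
              [1.1e2, 6.0e1, 1.2e1],
              [3.0e1, 1.2e1, 2.5e1]])

# Marginalised 1-sigma errors: square roots of the diagonal of F^-1.
cov = np.linalg.inv(F)
marginalised = np.sqrt(np.diag(cov))

# Conditional errors (all other parameters held fixed): 1/sqrt(F_ii).
conditional = 1.0 / np.sqrt(np.diag(F))

# Enlarging the parameter set can only inflate (or preserve) the
# marginalised errors, which is why the paper tracks how constraints
# degrade as sectors are added.
assert np.all(marginalised >= conditional)
print(marginalised)
```

Adding a CMB prior amounts to summing the two experiments' Fisher matrices before inverting, which is how priors tighten the marginalised errors.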
Full text
Available for:
BFBNIB, FZAB, GIS, IJS, IZUM, KILJ, NLZOH, NUK, OILJ, PILJ, PNG, SAZU, SBCE, SBMB, UL, UM, UPUK
A major quest in cosmology is to understand the nature of dark energy. It is now well known that the use of several cosmological probes is required to break the underlying degeneracies on cosmological parameters. In this paper, we present a method based on a frequentist approach that combines probes without any prior constraints. As one application, the current supernovae type Ia and cosmic microwave background data are analyzed with an evolving dark energy component, and our results are first compared to other analyses. We emphasize the consequences of implementing the dark energy perturbations for an equation of state that varies with time. We then simulate the expectations of different future projects. The constraints from weak lensing surveys on the measurement of dark energy evolution are combined with the measurements from the cosmic microwave background and type Ia supernovae. We present the impacts for mid-term and long-term surveys and confirm that the combination with weak lensing is very powerful in breaking parameter degeneracies. A second generation of experiments is, however, required to achieve an error of 0.1 on the parameters describing the evolution of dark energy.
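A frequentist combination of probes of the kind described here can be sketched as summing chi-square surfaces on a shared parameter grid and then profiling (minimising) over the unwanted parameter instead of marginalising with a prior. The two toy "probes" below are illustrative Gaussians, not the paper's SN Ia or CMB likelihoods:

```python
import numpy as np

# Shared (w0, Omega_m) grid; ranges are illustrative.
w0 = np.linspace(-1.5, -0.5, 201)
om = np.linspace(0.1, 0.5, 201)
W0, OM = np.meshgrid(w0, om, indexing="ij")

# Toy chi-square surfaces for two probes with different degeneracies.
chi2_sn = ((W0 + 1.0) / 0.15) ** 2 + ((OM - 0.30) / 0.10) ** 2
chi2_cmb = ((W0 + 1.0) / 0.40) ** 2 + ((OM - 0.30) / 0.02) ** 2

# Frequentist combination: add chi-squares, no prior applied.
chi2_tot = chi2_sn + chi2_cmb

# Profile chi-square for w0: minimise over Omega_m at each w0.
profile = chi2_tot.min(axis=1)
best_w0 = w0[np.argmin(profile)]
print(f"best-fit w0 = {best_w0:.3f}")
```

The narrower combined contour relative to either probe alone is the degeneracy breaking the abstract refers to.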
As wide-field surveys yield ever more precise measurements, cosmology has entered a phase of high precision requiring highly accurate and fast theoretical predictions. At the heart of most cosmological model predictions is a numerical solution of the Einstein–Boltzmann equations governing the evolution of linear perturbations in the Universe. We present PyCosmo, a new Python-based framework to solve this set of equations using a special purpose solver based on symbolic manipulations, automatic generation of C++ code and sparsity optimisation. The code uses a consistency relation of the field equations to adapt the time step and does not rely on physical approximations for speed-up. After reviewing the system of first-order linear homogeneous differential equations to be solved, we describe the numerical scheme implemented in PyCosmo. We then compare the predictions and performance of the code for the computation of the transfer functions of cosmological perturbations and compare it to existing cosmological Boltzmann codes. While PyCosmo does not yet have all the features of other codes, our approach is complementary to other fast cosmological Boltzmann solvers and can be used as an independent test of their numerical solutions. The symbolic representation of the Einstein–Boltzmann equation system in PyCosmo provides a convenient interface for implementing extended cosmological models. We also discuss how PyCosmo can be used as a general framework to compute cosmological quantities and observables, both interactively and in high-performance batch jobs. Information about the PyCosmo package and future code releases is available at http://www.cosmology.ethz.ch/research/software-lab.html.
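The symbolic-to-compiled-code workflow described here can be illustrated with SymPy's C-code printer. The equation below is a schematic stand-in, not PyCosmo's actual Einstein–Boltzmann system:

```python
import sympy as sp

# Schematic perturbation variables; this is NOT the PyCosmo equation set,
# just a toy continuity-equation-like right-hand side.
a, delta, theta, H = sp.symbols('a delta theta H', positive=True)
ddelta_da = -theta / (a**2 * H)

# Emit a C expression for the derivative; a real pipeline would assemble
# many such expressions into a compiled ODE right-hand-side function.
c_expr = sp.ccode(ddelta_da)
print(c_expr)
```

Generating the full right-hand side symbolically is what lets such a solver exploit sparsity and avoid hand-coded physical approximations.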
We present the first constraints on cosmology from the Dark Energy Survey (DES), using weak lensing measurements from the preliminary Science Verification (SV) data. We use 139 square degrees of SV data, which is less than 3% of the full DES survey area. Using cosmic shear 2-point measurements over three redshift bins we find σ_8(Ω_m/0.3)^0.5 = 0.81 ± 0.06 (68% confidence), after marginalizing over 7 systematics parameters and 3 other cosmological parameters. We examine the robustness of our results to the choice of data vector and systematics assumed, and find them to be stable. About 20% of our error bar comes from marginalizing over shear and photometric redshift calibration uncertainties. The current state-of-the-art cosmic shear measurements from CFHTLenS are mildly discrepant with the cosmological constraints from Planck CMB data; our results are consistent with both data sets. Our uncertainties are ~30% larger than those from CFHTLenS when we carry out a comparable analysis of the two data sets, which we attribute largely to the lower number density of our shear catalogue. We investigate constraints on dark energy and find that, with this small fraction of the full survey, the DES SV constraints make negligible impact on the Planck constraints. The moderate disagreement between the CFHTLenS and Planck values of σ_8(Ω_m/0.3)^0.5 is present regardless of the value of w.
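The degeneracy combination σ_8(Ω_m/0.3)^0.5 quoted here is straightforward to evaluate; a minimal sketch, with illustrative parameter values rather than the DES SV posteriors:

```python
# The lensing amplitude combination sigma8*(Omega_m/0.3)**alpha; cosmic
# shear surveys constrain this product much better than either parameter
# alone. alpha = 0.5 is the exponent used in the abstract above.
def s8(sigma8, omega_m, alpha=0.5):
    return sigma8 * (omega_m / 0.3) ** alpha

# At the pivot Omega_m = 0.3 the combination reduces to sigma8 itself.
print(s8(0.81, 0.30))   # equals sigma8
print(s8(0.81, 0.35))   # higher Omega_m raises the combination
```

Quoting the constraint along this degeneracy direction is what makes the comparison with CFHTLenS and Planck meaningful despite each survey's individual σ_8–Ω_m degeneracy.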
We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine-learning-based photometric redshift methods are analyzed: annz2, bpz calibrated against BCC-Ufig simulations, skynet, and tpz. For training, calibration, and testing of these methods, we construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-z's. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with mean redshifts z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δ_z ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ_8 of approximately 3%. This shift is within the one-sigma statistical errors on σ_8 for the DES SV shear catalogue. We further study the potential impact of systematic differences on the critical surface density, Σ_crit, finding levels of bias safely less than the statistical power of DES SV data. We recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.
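The key metric here, the bias in the mean redshift of each tomographic bin, can be sketched on synthetic data. The redshift range and three-bin layout loosely mimic the abstract; the bin edges, scatter, and galaxy sample are invented:

```python
import numpy as np

# Synthetic spectroscopic truth and photo-z estimates (Gaussian scatter,
# no injected bias); these are NOT DES SV data.
rng = np.random.default_rng(0)
z_spec = rng.uniform(0.3, 1.3, size=10_000)
z_phot = z_spec + rng.normal(0.0, 0.08, size=z_spec.size)

# Three illustrative tomographic bins selected on photo-z.
edges = [0.3, 0.55, 0.8, 1.3]
deltas = []
for lo, hi in zip(edges[:-1], edges[1:]):
    sel = (z_phot >= lo) & (z_phot < hi)
    # Bias in the mean: <z_phot> - <z_spec> within the bin.
    deltas.append(z_phot[sel].mean() - z_spec[sel].mean())
    print(f"bin [{lo}, {hi}): delta_z = {deltas[-1]:+.3f}")
```

Even with unbiased scatter, bins at the edges of the redshift range pick up small mean shifts from selection effects, which is one reason a per-bin Gaussian prior on δ_z is propagated into the cosmology analysis.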
PyCosmo is a Python-based framework for the fast computation of cosmological model predictions. One of its core features is the symbolic representation of the Einstein–Boltzmann system of equations. Efficient C/C++ code is generated from the SymPy symbolic expressions making use of the sympy2c package. This enables easy extensions of the equation system for the implementation of new cosmological models. We illustrate this with three extensions of the PyCosmo Boltzmann solver to include a dark energy component with a constant equation of state, massive neutrinos and a radiation streaming approximation. We describe the PyCosmo framework, highlighting new features, and the symbolic implementation of the new models. We compare the PyCosmo predictions for the ΛCDM model extensions with CLASS, both in terms of accuracy and computational speed. We find good agreement, to better than 0.1% when using high-precision settings, and comparable computational speed. Links to the Python Package Index (PyPI) page of the code release and to the PyCosmo Hub, an online platform where the package is installed, are available at: https://cosmology.ethz.ch/research/software-lab/PyCosmo.html.
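An accuracy comparison like the one quoted here reduces to the maximum fractional difference between two codes' outputs on a shared grid. The arrays below are synthetic stand-ins, not PyCosmo or CLASS output:

```python
import numpy as np

# Shared wavenumber grid and a toy "reference" transfer function.
k = np.logspace(-4, 1, 200)
t_ref = 1.0 / (1.0 + (k / 0.02) ** 2)

# Second code's output, with a small synthetic deviation injected.
t_test = t_ref * (1.0 + 5e-4 * np.sin(k))

# Headline accuracy figure: worst-case fractional difference over the grid.
max_frac_diff = np.max(np.abs(t_test / t_ref - 1.0))
print(f"max fractional difference: {max_frac_diff:.2e}")
```

Checking this statistic across transfer functions and power spectra is the standard way code comparisons establish sub-0.1% agreement.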
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UL, UM, UPCLJ, UPUK, ZRSKP
Summary of the DUNE mission concept
Proceedings of SPIE, the International Society for Optical Engineering/Proceedings of SPIE
Conference Proceeding
We present a three-dimensional cosmic shear analysis of the Hubble Space Telescope COSMOS survey, the largest optical imaging program ever performed in space. We have measured the shapes of galaxies for the telltale distortions caused by weak gravitational lensing and traced the growth of that signal as a function of redshift. Using both 2D and 3D analyses, we measure the cosmological parameters Ω_m, the density of matter in the universe, and σ_8, the normalization of the matter power spectrum. The introduction of redshift information tightens the constraints by a factor of 3 and also reduces the relative sampling (or "cosmic") variance compared to recent surveys that may be larger but are only two-dimensional. From the 3D analysis, we find that σ_8(Ω_m/0.3)^0.44 = 0.866 +0.085/−0.068 (68% confidence limits), including both statistical and potential systematic sources of error in the total budget. Indeed, the absolute calibration of shear measurement methods is now the dominant source of uncertainty. Assuming instead a baseline cosmology to fix the geometry of the universe, we have measured the growth of structure on both linear and nonlinear physical scales. Our results thus demonstrate a proof of concept for tomographic analysis techniques that have been proposed for future weak-lensing surveys by a dedicated wide-field telescope in space.
Weak gravitational lensing is now established as a powerful method to measure mass fluctuations in the universe. It relies on the measurement of small coherent distortions of the images of background galaxies. Even low-level correlations in the intrinsic shapes of galaxies could, however, produce a significant spurious lensing signal. These correlations are also interesting in their own right, since their detection would constrain models of galaxy formation. Using haloes found in N-body simulations, we compute the correlation functions of the intrinsic ellipticity of spiral galaxies assuming that the disc is perpendicular to the angular momentum of the dark matter halo. We also consider a simple model for elliptical galaxies, in which the shape of the dark matter halo is assumed to be the same as that of the light. For deep lensing surveys with median redshifts ∼1, we find that intrinsic correlations of ∼10^-4 on angular scales are generally below the expected lensing signal, and contribute only a small fraction of the excess signals reported on these scales. On larger scales we find limits to the intrinsic correlation function at a level ∼10^-5, which gives a (model-dependent) range of separations for which the intrinsic signal is about an order of magnitude below the ellipticity correlation function expected from weak lensing. Intrinsic correlations are thus negligible on these scales for dedicated weak lensing surveys. For wider but shallower surveys such as SuperCOSMOS, APM and SDSS, we cannot exclude the possibility that intrinsic correlations could dominate the lensing signal. We discuss how such surveys could be used to calibrate the importance of this effect, as well as study spin–spin correlations of spiral galaxies.
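A minimal estimator for the ellipticity correlation function discussed here averages products of ellipticity components over galaxy pairs in separation bins. The catalogue below is synthetic with no injected alignment, so the estimate should scatter around zero; a real analysis would rotate to tangential/cross components about each pair's separation vector:

```python
import numpy as np

# Synthetic catalogue: positions (degrees) and intrinsic (e1, e2) shapes.
rng = np.random.default_rng(1)
n = 500
x = rng.uniform(0, 10, (n, 2))
e = rng.normal(0, 0.03, (n, 2))

# Pairwise separations and ellipticity products e1_i*e1_j + e2_i*e2_j.
d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
prod = e @ e.T
iu = np.triu_indices(n, k=1)  # each pair counted once

# Correlation function estimate in a few separation bins (degrees).
bins = np.array([0.1, 0.5, 1.0, 2.0])
xi_vals = []
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (d[iu] >= lo) & (d[iu] < hi)
    xi_vals.append(prod[iu][sel].mean())
    print(f"[{lo}, {hi}) deg: xi = {xi_vals[-1]:+.2e}")
```

Running the same estimator on simulated haloes with correlated spins is what yields the ∼10^-4 to 10^-5 intrinsic-alignment amplitudes quoted above.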
The Shear Testing Programme (STEP) is a collaborative project to improve the accuracy and reliability of all weak lensing measurements in preparation for the next generation of wide-field surveys. In this first STEP paper, we present the results of a blind analysis of simulated ground-based observations of relatively simple galaxy morphologies. The most successful methods are shown to achieve percent level accuracy. From the cosmic shear pipelines that have been used to constrain cosmology, we find weak lensing shear measured to an accuracy that is within the statistical errors of current weak lensing analyses, with shear measurements accurate to better than 7 per cent. The dominant source of measurement error is shown to arise from calibration uncertainties where the measured shear is over or underestimated by a constant multiplicative factor. This is of concern as calibration errors cannot be detected through standard diagnostic tests. The measured calibration errors appear to result from stellar contamination, false object detection, the shear measurement method itself, selection bias and/or the use of biased weights. Additive systematics (false detections of shear) resulting from residual point-spread function anisotropy are, in most cases, reduced to below an equivalent shear of 0.001, an order of magnitude below cosmic shear distortions on the scales probed by current surveys. Our results provide a snapshot view of the accuracy of current ground-based weak lensing methods and a benchmark upon which we can improve. To this end we provide descriptions of each method tested and include details of the eight different implementations of the commonly used Kaiser, Squires & Broadhurst method (KSB+) to aid the improvement of future KSB+ analyses.
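The multiplicative and additive biases described here are conventionally modelled as g_obs = (1 + m)·g_true + c and recovered by a linear fit against known input shears. A sketch on synthetic shears (the injected bias values are illustrative, not STEP results):

```python
import numpy as np

# Synthetic true shears and observed shears with injected biases.
rng = np.random.default_rng(2)
g_true = rng.uniform(-0.05, 0.05, 1000)
m_in, c_in = 0.03, 0.001  # illustrative multiplicative/additive biases
g_obs = (1 + m_in) * g_true + c_in + rng.normal(0, 1e-4, g_true.size)

# Linear fit: slope - 1 gives m, intercept gives c.
slope, intercept = np.polyfit(g_true, g_obs, 1)
m_fit, c_fit = slope - 1.0, intercept
print(f"m = {m_fit:.4f}, c = {c_fit:.5f}")
```

Because a constant multiplicative factor rescales the whole signal coherently, m cannot be caught by null tests on real data, which is why blind simulation challenges like STEP are needed to measure it.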