Aims. We propose a new mass-mapping algorithm, specifically designed to recover small-scale information from a combination of gravitational shear and flexion. Including flexion allows us to supplement the shear on small scales in order to increase the sensitivity to substructures and the overall resolution of the convergence map without relying on strong-lensing constraints. Methods. To preserve all available small-scale information, we avoid any binning of the irregularly sampled input shear and flexion fields and treat the mass-mapping problem as a general ill-posed inverse problem, which is regularised using a robust multi-scale wavelet sparsity prior. The resulting algorithm incorporates redshift, reduced shear, and reduced flexion measurements for individual galaxies and is made highly efficient by the use of fast Fourier estimators. Results. We tested our reconstruction method on a set of realistic weak-lensing simulations corresponding to typical HST/ACS cluster observations and demonstrated our ability to recover, with the inclusion of flexion, substructures that are otherwise lost if only shear information is used. In particular, we can detect substructures on the 15′′ scale well outside the critical region of the clusters. In addition, flexion also helps to constrain the shape of the central regions of the main dark matter halos.
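The core of such a sparsity-regularised inversion can be illustrated with a minimal sketch: iterative soft thresholding (ISTA) applied to a generic ill-posed linear problem. This is a toy in the direct (identity) dictionary with a made-up random operator, not the paper's shear/flexion operator or its wavelet prior.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm: shrink towards zero by lam.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam=0.1, n_iter=500):
    # Iterative soft thresholding for min_x 0.5*||A x - y||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Toy ill-posed problem: 40 noiseless measurements of an 80-dim sparse signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80)) / np.sqrt(40.0)
x_true = np.zeros(80)
x_true[[3, 17, 42]] = [5.0, -4.0, 3.0]
x_rec = ista(A, A @ x_true, lam=0.05)
```

The sparsity prior is what allows recovery of 80 unknowns from 40 measurements; a plain least-squares inverse would be underdetermined.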
We present FlowPM, a Particle-Mesh (PM) cosmological N-body code implemented in Mesh-TensorFlow for GPU-accelerated, distributed, and differentiable simulations. We implement and validate the accuracy of a novel multi-grid scheme based on multiresolution pyramids to compute large-scale forces efficiently on distributed platforms. We explore the scaling of the simulation on large-scale supercomputers and compare it with a corresponding Python-based PM code, finding an average 10x speed-up in terms of wall-clock time. We also demonstrate how this novel tool can be used to efficiently solve large-scale cosmological inference problems, in particular the reconstruction of cosmological fields in a forward-model Bayesian framework with a hybrid PM and neural-network forward model. We provide skeleton code for these examples, and the entire code is publicly available at https://github.com/modichirag/flowpm.
•End-to-end differentiable cosmological N-body simulations.
•GPU-based simulations with 10x speed gain over current CPU simulations.
•First N-body simulation written in TensorFlow, interfacing with ML and DL components.
•Novel multi-grid force algorithm for distributed computing of large-scale forces.
•Support for large-scale distribution on supercomputers with Mesh-TensorFlow.
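The basic PM force step underlying such codes can be sketched in a few lines: deposit mass on a grid, solve the Poisson equation with FFTs, and differentiate the potential spectrally. This is a single-grid NumPy toy with unit box size and unit prefactor, not FlowPM's distributed multi-grid scheme in Mesh-TensorFlow.

```python
import numpy as np

def pm_gravity(delta):
    """Toy single-grid PM force: solve nabla^2 phi = delta on a periodic
    grid via FFTs, then return the force field -grad(phi).
    `delta` is a 3D density-contrast array; grid spacing and the 4*pi*G*rho
    prefactor are set to one for simplicity."""
    n = delta.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                     # avoid division by zero
    phi_k = -np.fft.fftn(delta) / k2      # Poisson: -k^2 phi_k = delta_k
    phi_k[0, 0, 0] = 0.0                  # remove the mean (zero mode)
    # force = -grad(phi); spectral derivative is multiplication by i*k
    force = [np.real(np.fft.ifftn(-1j * kk * phi_k)) for kk in (kx, ky, kz)]
    return np.stack(force)
```

For a single plane wave delta = cos(k0 x), the exact answer is force_x = -sin(k0 x)/k0, which the spectral solver reproduces to machine precision.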
Large-scale imaging surveys will increase the number of galaxy-scale strong lensing candidates by perhaps three orders of magnitude beyond the number known today. Finding these rare objects will require picking them out of at least tens of millions of images, and deriving scientific results from them will require quantifying the efficiency and bias of any search method. To achieve these objectives, automated methods must be developed. Because gravitational lenses are rare objects, reducing false positives will be particularly important. We present a description and results of an open gravitational lens finding challenge. Participants were asked to classify 100 000 candidate objects as to whether they were gravitational lenses or not, with the goal of developing better automated methods for finding lenses in large data sets. A variety of methods were used, including visual inspection, arc and ring finders, support vector machines (SVM), and convolutional neural networks (CNN). We find that many of the methods are easily fast enough to analyse the anticipated data flow. In test data, several methods are able to identify upwards of half the lenses without making a single false-positive identification, after applying thresholds on lens characteristics such as lensed-image brightness, size, or contrast with the lens galaxy. This is significantly better than humans achieved by direct visual inspection. Having multi-band, ground-based data is found to be better for this purpose than single-band space-based data with lower noise and higher resolution, suggesting that multi-colour data is crucial. Multi-band space-based data will be superior to ground-based data. The most difficult challenge for a lens finder is differentiating between rare, irregular, and ring-like face-on galaxies and true gravitational lenses.
The degree to which the efficiency and biases of lens finders can be quantified largely depends on the realism of the simulated data on which the finders are trained.
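The figure of merit implied by "identifying lenses without making a single false-positive identification" can be computed directly from classifier scores: the true-positive rate at the threshold sitting just above the best-scoring non-lens. A hypothetical re-implementation of that idea, not the challenge's official scoring code:

```python
import numpy as np

def tpr_at_zero_fpr(labels, scores):
    """Fraction of true lenses recovered when the detection threshold is set
    just above the highest-scoring non-lens, i.e. the true-positive rate at
    zero false positives. `labels` is 1 for lens, 0 for non-lens; `scores`
    are classifier outputs (higher = more lens-like)."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    threshold = scores[~labels].max()      # best-scoring non-lens
    return float(np.mean(scores[labels] > threshold))
```

A single confidently misranked non-lens can collapse this metric to zero, which is why controlling false positives dominates lens-finder design.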
Context.
Weak-lensing mass-mapping is a useful tool for accessing the full distribution of dark matter on the sky, but because of intrinsic galaxy ellipticities, finite fields, and missing data, the recovery of dark matter maps constitutes a challenging, ill-posed inverse problem.
Aims.
We introduce a novel methodology that enables the efficient sampling of the high-dimensional Bayesian posterior of the weak-lensing mass-mapping problem, relying on simulations to define a fully non-Gaussian prior. We aim to demonstrate the accuracy of the method on simulated fields, and then proceed to apply it to the mass reconstruction of the HST/ACS COSMOS field.
Methods.
The proposed methodology combines elements of Bayesian statistics, analytic theory, and a recent class of deep generative models based on neural score matching. This approach allows us to make full use of analytic cosmological theory to constrain the two-point statistics of the solution, to understand any differences between this analytic prior and full cosmological simulations, and to obtain samples from the full Bayesian posterior of the problem for robust uncertainty quantification.
Results.
We demonstrate the method on the κTNG simulations and find that the posterior mean significantly outperforms previous methods (Kaiser–Squires, Wiener filter, sparsity priors) both in terms of root-mean-square error and Pearson correlation. We further illustrate the interpretability of the recovered posterior by establishing a close correlation between posterior convergence values and the S/N of clusters artificially introduced into a field. Finally, we apply the method to the reconstruction of the HST/ACS COSMOS field, which yields the highest-quality convergence map of this field to date.
Conclusions.
We find the proposed approach superior to previous algorithms: it is scalable, provides uncertainty quantification, and relies on a fully non-Gaussian prior.
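The sampling machinery behind such posterior estimates can be sketched with unadjusted Langevin dynamics on a toy linear-Gaussian problem. Here the prior score is supplied analytically; the paper instead learns a non-Gaussian prior score with neural score matching. All sizes and step parameters below are illustrative.

```python
import numpy as np

def langevin_posterior_mean(y, A, sigma_n, prior_score,
                            n_steps=20000, eps=5e-3, seed=0):
    """Unadjusted Langevin dynamics targeting p(x | y) for y = A x + noise,
    with Gaussian noise of std sigma_n. Each step follows the total score
    (analytic likelihood score plus prior_score(x)) plus Gaussian noise.
    Returns the average over the second half of the chain as a crude
    posterior-mean estimate. A toy sketch, not the paper's sampler."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    samples = []
    for t in range(n_steps):
        score = A.T @ (y - A @ x) / sigma_n**2 + prior_score(x)
        x = x + eps * score + np.sqrt(2.0 * eps) * rng.standard_normal(x.shape)
        if t >= n_steps // 2:               # discard burn-in
            samples.append(x.copy())
    return np.mean(samples, axis=0)
```

For a standard-normal prior (score -x), identity operator, and unit noise, the analytic posterior mean is y/2, which the chain average approaches.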
Deep learning (DL) has shown remarkable results in solving inverse problems in various domains. In particular, the Tikhonet approach is very powerful for deconvolving optical astronomical images. However, this approach only uses the ℓ2 loss, which does not guarantee the preservation of the physical information (e.g., flux and shape) of the object reconstructed in the image. A new loss function has been proposed in the framework of sparse deconvolution that better preserves the shape of galaxies and reduces the pixel error. In this paper, we extend the Tikhonet approach to take this shape constraint into account and apply our new DL method, called ShapeNet, to simulated optical and radio-interferometry datasets. The originality of the paper lies in i) the shape constraint we use in the neural-network framework, ii) the first application of DL to radio-interferometry image deconvolution, and iii) the generation of a simulated radio dataset that we make available to the community. A range of examples illustrates the results.
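The idea of augmenting a pixel loss with a shape term can be sketched with unweighted second-order image moments. This is a simplified stand-in for the windowed-moment constraint used in shape-preserving deconvolution losses; the weight `gamma` and the moment definition are illustrative assumptions.

```python
import numpy as np

def ellipticity(img):
    """Complex ellipticity e = (q11 - q22 + 2i*q12) / (q11 + q22) from
    unweighted, centroid-subtracted second-order moments of the image."""
    img = np.asarray(img, dtype=float)
    y, x = np.indices(img.shape)
    f = img.sum()
    xc, yc = (x * img).sum() / f, (y * img).sum() / f
    q11 = ((x - xc) ** 2 * img).sum() / f
    q22 = ((y - yc) ** 2 * img).sum() / f
    q12 = ((x - xc) * (y - yc) * img).sum() / f
    return complex(q11 - q22, 2.0 * q12) / (q11 + q22)

def shape_loss(pred, target, gamma=1.0):
    """l2 pixel loss plus a moment-based shape penalty: images that match
    pixel-wise on average but are sheared differently are penalised more."""
    mse = np.mean((pred - target) ** 2)
    return mse + gamma * abs(ellipticity(pred) - ellipticity(target)) ** 2
```

Two transposed elliptical Gaussians have opposite e1, so the shape term strictly increases the loss beyond the plain pixel error.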
Aims.
We introduce a novel approach to reconstructing dark matter mass maps from weak gravitational lensing measurements. The cornerstone of the proposed method lies in a new modelling of the matter density field in the Universe as a mixture of two components: (1) a sparsity-based component that captures the non-Gaussian structure of the field, such as peaks or halos at different spatial scales, and (2) a Gaussian random field, which is known to represent the linear characteristics of the field well.
Methods.
We propose an algorithm called MCALens that jointly estimates these two components. MCALens is based on an alternating minimisation scheme incorporating both sparse recovery and proximal iterative Wiener filtering.
Results.
Experimental results on simulated data show that the proposed method exhibits improved estimation accuracy compared to customised mass-map reconstruction methods.
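The two-component decomposition can be illustrated with a much-simplified alternating scheme: a Wiener filter (with known power spectra) estimates the Gaussian component, and soft thresholding in the direct domain estimates the sparse one. This toy assumes the identity forward operator and thresholds pixels rather than wavelet coefficients; it is a sketch of the idea, not the published MCALens algorithm.

```python
import numpy as np

def wiener(y, p_signal, p_noise):
    # Diagonal Wiener filter in Fourier space: H = S / (S + N).
    H = p_signal / (p_signal + p_noise)
    return np.real(np.fft.ifft2(H * np.fft.fft2(y)))

def mca_split(y, p_signal, p_noise, lam, n_iter=50):
    """Alternating estimation of y ~ g + s: a Gaussian component g
    (Wiener filter) and a sparse component s (soft thresholding of the
    residual). Each update holds the other component fixed."""
    g = np.zeros_like(y)
    s = np.zeros_like(y)
    for _ in range(n_iter):
        g = wiener(y - s, p_signal, p_noise)          # Gaussian part
        r = y - g
        s = np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)  # sparse part
    return g, s
```

With a weak Gaussian prior (p_signal much smaller than p_noise) an isolated peak is absorbed almost entirely by the sparse component, as intended.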
Context. Upcoming spectroscopic galaxy surveys are extremely promising for addressing the major challenges of cosmology, in particular understanding the nature of the dark universe. The strength of these surveys, naturally described in spherical geometry, comes from their unprecedented depth and width, but an optimal extraction of their three-dimensional information is of utmost importance to best constrain the properties of the dark universe. Aims. Although there is theoretical motivation, and novel tools exist, to explore these surveys using the 3D spherical Fourier-Bessel (SFB) power spectrum of galaxy number counts Cℓ(k,k′), most survey optimisations and forecasts are based on the tomographic spherical harmonics power spectrum Cℓ(ij). The goal of this paper is to perform a new investigation of the information that can be extracted from these two analyses in the context of planned stage-IV wide-field galaxy surveys. Methods. We compared tomographic and 3D SFB techniques by comparing the forecast cosmological parameter constraints obtained from a Fisher analysis. The comparison was made possible by a careful and coherent treatment of non-linear scales in the two analyses, which makes this study the first to compare 3D SFB and tomographic constraints on an equal footing. Nuisance parameters related to a scale- and redshift-dependent galaxy bias were also included in the computation of the 3D SFB and tomographic power spectra for the first time. Results. Tomographic and 3D SFB methods can recover similar constraints in the absence of systematics. This requires choosing an optimal number of redshift bins for the tomographic analysis, which we computed to be N = 26 for zmed ≃ 0.4, N = 30 for zmed ≃ 1.0, and N = 42 for zmed ≃ 1.7. When marginalising over nuisance parameters related to the galaxy bias, the forecast 3D SFB constraints are less affected by this source of systematics than the tomographic constraints.
In addition, the rate of increase of the figure of merit as a function of median redshift is higher for the 3D SFB method than for the 2D tomographic method. Conclusions. Constraints from the 3D SFB analysis are less sensitive to unavoidable systematics stemming from a redshift- and scale-dependent galaxy bias. Even for surveys that are optimised with tomography in mind, a 3D SFB analysis is more powerful. In addition, for survey optimisation, the figure of merit for the 3D SFB method increases more rapidly with redshift, especially at higher redshifts, suggesting that the 3D SFB method should be preferred for designing and analysing future wide-field spectroscopic surveys. CosmicPy, the Python package developed for this paper, is freely available at https://cosmicpy.github.io.
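The Fisher-forecast machinery used in such comparisons reduces to a small amount of linear algebra once the derivatives of the data vector with respect to the cosmological parameters are available. A generic sketch with made-up inputs (not the paper's SFB or tomographic data vectors):

```python
import numpy as np

def fisher_matrix(derivs, cov):
    """Gaussian Fisher matrix F_ab = (d mu / d theta_a)^T C^-1 (d mu / d theta_b)
    for a data vector mu with covariance `cov`; `derivs` has shape
    (n_params, n_data), one row of derivatives per parameter."""
    cinv = np.linalg.inv(cov)
    D = np.asarray(derivs, dtype=float)
    return D @ cinv @ D.T

def figure_of_merit(F, i=0, j=1):
    """Figure of merit for parameters (i, j): inverse square root of the
    determinant of their marginalised 2x2 covariance block, i.e. inverse
    area of the confidence ellipse up to a constant factor."""
    cov_par = np.linalg.inv(F)              # marginalises over all others
    sub = cov_par[np.ix_([i, j], [i, j])]
    return 1.0 / np.sqrt(np.linalg.det(sub))
```

Halving the per-point errors quadruples this two-parameter figure of merit, which is why the rate of increase of the FoM with survey depth is the natural comparison metric.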
Spherical 3D isotropic wavelets. Lanusse, F.; Rassat, A.; Starck, J.-L. Astronomy and Astrophysics (Berlin), 04/2012, Volume 540. Peer-reviewed journal article, open access.
Context. Future cosmological surveys will provide 3D large-scale structure maps with large sky coverage, for which a 3D spherical Fourier-Bessel (SFB) analysis in spherical coordinates is natural. Wavelets are particularly well-suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist for analysing spherical 3D data. Aims. The aim of this paper is to present a new formalism for a spherical 3D isotropic wavelet, i.e. one based on the SFB decomposition of a 3D field, and to accompany the formalism with a public code to perform wavelet transforms. Methods. We describe a new 3D isotropic spherical wavelet decomposition based on the undecimated wavelet transform (UWT) described in Starck et al. (2006). We also present a new fast discrete spherical Fourier-Bessel transform (DSFBT) based on both a discrete Bessel transform and the HEALPix angular pixelisation scheme. We test the 3D wavelet transform and, as a toy application, apply a denoising algorithm in wavelet space to the Virgo large-box cosmological simulations, finding that we can successfully remove noise without much loss to the large-scale structure. Results. We have described a new spherical 3D isotropic wavelet transform, ideally suited to analysing and denoising future 3D spherical cosmological surveys, which uses a novel DSFBT. We illustrate its potential use for denoising with a toy model. All the algorithms presented in this paper are available for download as a public code called MRS3D at http://jstarck.free.fr/mrs3d.html.
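The undecimated (à trous) wavelet construction on which such transforms are built can be shown in one dimension: each scale smooths with a B3-spline kernel whose taps are spread by inserting zeros ("holes"), and the wavelet band is the difference between successive smoothings, so the bands plus the coarse residual sum back to the input exactly. A 1D periodic analogue of the 3D spherical transform, for illustration only.

```python
import numpy as np

def atrous_1d(x, n_scales=4):
    """1D starlet (a trous) transform with the B3-spline kernel.
    Returns (bands, coarse) with sum(bands) + coarse == x exactly."""
    h = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # B3-spline filter
    c = np.asarray(x, dtype=float)
    bands = []
    for j in range(n_scales):
        step = 2 ** j
        # Spread the 5 taps with 2^j - 1 zeros between them (the "holes").
        k = np.zeros(4 * step + 1)
        k[::step] = h
        padded = np.pad(c, 2 * step, mode="wrap")     # periodic boundary
        smooth = np.convolve(padded, k, mode="same")[2 * step:-2 * step]
        bands.append(c - smooth)                      # wavelet band at scale j
        c = smooth
    return bands, c

def denoise(x, n_scales=4, lam=0.0):
    """Reconstruct keeping only wavelet coefficients with |w| > lam."""
    bands, coarse = atrous_1d(x, n_scales)
    out = coarse.copy()
    for w in bands:
        out += np.where(np.abs(w) > lam, w, 0.0)      # hard thresholding
    return out
```

Setting the threshold to a few times the noise standard deviation per scale suppresses noise while the telescoping sum guarantees exact reconstruction when no coefficients are removed.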
Aims. The primordial power spectrum describes the initial perturbations in the Universe which eventually grew into the large-scale structure we observe today, and thereby provides an indirect probe of inflation or other structure-formation mechanisms. Here, we introduce a new method to estimate this spectrum from the empirical power spectrum of cosmic microwave background maps. Methods. A sparsity-based linear inversion method, named PRISM, is presented. This technique leverages a sparsity prior on features of the primordial power spectrum in a wavelet basis to regularise the inverse problem. This non-parametric approach does not assume a strong prior on the shape of the primordial power spectrum, yet is able to correctly reconstruct its global shape as well as localised features. These advantages make the method robust for detecting deviations from the currently favoured scale-invariant spectrum. Results. We investigate the strength of this method on a set of WMAP nine-year simulated data for three types of primordial power spectra: a near scale-invariant spectrum, a spectrum with a small running of the spectral index, and a spectrum with a localised feature. We show that this technique can easily detect deviations from a pure scale-invariant power spectrum and is suitable for distinguishing between simple models of inflation. We process the WMAP nine-year data and find no significant departure from a near scale-invariant power spectrum with spectral index ns = 0.972. Conclusions. A high-resolution primordial power spectrum can be reconstructed with this technique, and any strong local deviations or small global deviations from a pure scale-invariant spectrum can easily be detected.
Aims. The primordial power spectrum describes the initial perturbations that seeded the large-scale structure we observe today. It provides an indirect probe of inflation or other structure-formation mechanisms. In this Letter, we recover the primordial power spectrum from the Planck PR1 dataset, using our recently published algorithm PRISM. Methods. PRISM is a sparsity-based inversion method that aims at recovering features in the primordial power spectrum from the empirical power spectrum of the cosmic microwave background (CMB). This ill-posed inverse problem is regularised using a sparsity prior on features of the primordial power spectrum in a wavelet dictionary. Although this non-parametric method does not assume a strong prior on the shape of the primordial power spectrum, it is able to recover both its general shape and localised features. As a result, this approach presents a reliable way of detecting deviations from the currently favoured scale-invariant spectrum. Results. We applied PRISM to 100 simulated Planck datasets to investigate its performance on Planck-like data. We then applied PRISM to the Planck PR1 power spectrum to recover the primordial power spectrum. We also tested the algorithm's ability to recover a small localised feature at k ~ 0.125 Mpc-1, which would cause a large dip at ℓ ~ 1800 in the angular power spectrum. Conclusions. We find no significant departures from the fiducial Planck PR1 near scale-invariant primordial power spectrum with As = 2.215 × 10-9 and ns = 0.9624.