Feedback processes from baryons are expected to strongly affect weak-lensing observables of current and future cosmological surveys. In this paper we present a new parametrisation of halo profiles based on gas, stellar, and dark matter density components. This parametrisation is used to modify outputs of gravity-only \(N\)-body simulations (following the prescription of Schneider and Teyssier [1]) in order to mimic baryonic effects on the matter density field. The resulting baryonic correction model relies on a few well-motivated physical parameters and is able to reproduce the redshift-zero clustering signal of hydrodynamical simulations to two percent accuracy below \(k\sim10\) h/Mpc. A detailed study of the baryon suppression effects on the matter power spectrum and the weak-lensing shear correlation reveals that the signal is dominated by two parameters describing the slope of the gas profile in haloes and the maximum radius of gas ejection. We show that these parameters can be constrained with the observed gas fraction of galaxy groups and clusters from X-ray data. Based on these observations we predict an effect exceeding one percent on the power spectrum above \(k=0.2-1.0\) h/Mpc, with a maximum suppression of 15-25 percent around \(k\sim 10\) h/Mpc. As a result, the weak-lensing angular shear power spectrum is suppressed by 15-25 percent at scales beyond \(\ell\sim 100-600\), and the shear correlations \(\xi_{+}\) and \(\xi_{-}\) are affected at the 10-25 percent level below 5 and 50 arc-minutes, respectively. The relatively large uncertainties of these predictions result from the poorly known hydrostatic mass bias of current X-ray observations, as well as the generic difficulty of observing the low-density gas outside of haloes.
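As a rough illustration of how the two dominant parameters act, the following Python sketch evaluates a toy truncated gas profile. The functional form, the parameter names (`beta`, `r_core`, `r_ej`), and all numerical values are assumptions made for illustration only; they are not the parametrisation used in the paper.

```python
# Toy sketch (not the paper's parametrisation): a cored power-law gas profile
# with slope `beta`, truncated at an ejection radius `r_ej`, integrated to show
# how these two parameters control how much gas remains bound to the halo.
import numpy as np
from scipy.integrate import quad

def gas_density(r, rho0=1.0, r_core=0.1, beta=2.0, r_ej=5.0):
    """Toy gas density profile, set to zero beyond the ejection radius."""
    if r > r_ej:
        return 0.0
    return rho0 / (1.0 + r / r_core) ** beta

def enclosed_gas_mass(r_max, **kwargs):
    """Spherical mass integral of the toy profile out to r_max."""
    mass, _ = quad(lambda r: 4.0 * np.pi * r**2 * gas_density(r, **kwargs), 0.0, r_max)
    return mass

# A steeper slope (or a smaller ejection radius) pushes a larger share of the
# gas outside the halo, which is the qualitative effect of the two parameters
# identified as dominant in the abstract.
for beta in (1.5, 2.0, 3.0):
    m_inside = enclosed_gas_mass(1.0, beta=beta)  # within a nominal halo radius
    m_total = enclosed_gas_mass(5.0, beta=beta)   # out to the ejection radius
    print(f"beta = {beta}: bound gas fraction ~ {m_inside / m_total:.2f}")
```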
The problem of the detection of a target with random grey levels appearing on a random background is addressed. The optimal processor for any white statistics of the target's and the background's grey levels is determined. If the statistics of the grey levels are unknown, it is shown that one can approach the performance of the optimal processor by correlating the silhouette of the target with preprocessed versions of the scene and by fusing the resulting correlation planes. This result opens new perspectives for optical correlation.
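A minimal sketch of the idea follows, assuming simple illustrative choices for the pre-processings (raw, log-compressed, and binarised scenes) and for the fusion rule (a sum of normalised correlation planes); none of these specific choices is taken from the paper.

```python
# Illustrative sketch: correlate a binary target silhouette with several
# pre-processed versions of the scene and fuse the resulting correlation planes.
import numpy as np

def correlate(scene, template):
    """FFT-based cross-correlation of a scene with a (zero-padded) template."""
    S = np.fft.fft2(scene)
    T = np.fft.fft2(template, s=scene.shape)
    return np.real(np.fft.ifft2(S * np.conj(T)))

def fused_detection_plane(scene, silhouette):
    preprocessed = [
        scene,                                      # raw grey levels
        np.log1p(scene - scene.min()),              # compressive non-linearity
        (scene > np.median(scene)).astype(float),   # binarised scene
    ]
    planes = []
    for img in preprocessed:
        c = correlate(img, silhouette)
        planes.append((c - c.mean()) / (c.std() + 1e-12))  # normalise each plane
    return np.sum(planes, axis=0)                   # fusion by summation

rng = np.random.default_rng(0)
scene = rng.normal(size=(128, 128))
silhouette = np.zeros((16, 16))
silhouette[4:12, 4:12] = 1.0
scene[30:38, 50:58] += 2.0                          # embed a bright target-like patch
plane = fused_detection_plane(scene, silhouette)
print("peak location:", np.unravel_index(plane.argmax(), plane.shape))
```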
Bayesian inference is often used in cosmology and astrophysics to derive constraints on model parameters from observations. This approach relies on the ability to compute the likelihood of the data given a choice of model parameters. In many practical situations, however, the likelihood function may be unavailable or intractable due to non-Gaussian errors, non-linear measurement processes, or complex data formats such as catalogs and maps. In these cases, mock data sets can often be generated through forward modeling. We discuss how Approximate Bayesian Computation (ABC) can be used in these cases to derive an approximation to the posterior constraints using simulated data sets. This technique relies on sampling the parameter set, a distance metric to quantify the difference between the observation and the simulations, and summary statistics to compress the information in the data. We first review the principles of ABC and discuss its implementation using a Population Monte-Carlo (PMC) algorithm and the Mahalanobis distance metric. We test the performance of the implementation using a Gaussian toy model. We then apply the ABC technique to the practical case of calibrating image simulations for wide-field cosmological surveys. We find that the ABC analysis is able to provide reliable parameter constraints for this problem and is therefore a promising technique for other applications in cosmology and astrophysics. Our implementation of the ABC PMC method is made available via a public code release.
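A simplified sketch of the approach on a Gaussian toy model is shown below, using plain rejection ABC with a Mahalanobis distance between summary statistics; the paper's actual implementation uses a Population Monte-Carlo sampler, and the priors, tolerance, and summary choices here are illustrative assumptions.

```python
# Rejection-ABC sketch on a Gaussian toy model with a Mahalanobis distance.
# Simplified stand-in for the ABC-PMC method described above.
import numpy as np

rng = np.random.default_rng(42)

def simulate(mu, sigma, n=100):
    """Forward model: draw a mock data set for parameters (mu, sigma)."""
    return rng.normal(mu, sigma, size=n)

def summaries(data):
    """Compress a data set into summary statistics (here: mean and std)."""
    return np.array([data.mean(), data.std()])

# "Observed" data generated from the true parameters we want to recover.
true_mu, true_sigma = 1.0, 2.0
s_obs = summaries(simulate(true_mu, true_sigma))

# Covariance of the summaries, estimated from repeated simulations at a
# fiducial point, defines the Mahalanobis distance.
fiducial = np.array([summaries(simulate(true_mu, true_sigma)) for _ in range(200)])
cov_inv = np.linalg.inv(np.cov(fiducial, rowvar=False))

def mahalanobis(s_sim, s_obs):
    d = s_sim - s_obs
    return float(np.sqrt(d @ cov_inv @ d))

# Rejection ABC: sample from the prior and keep parameters whose simulated
# summaries fall within a tolerance of the observed summaries.
accepted = []
for _ in range(30000):
    mu = rng.uniform(-5, 5)        # flat priors (illustrative choice)
    sigma = rng.uniform(0.1, 5)
    if mahalanobis(summaries(simulate(mu, sigma)), s_obs) < 2.0:
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print("posterior mean:", accepted.mean(axis=0), "from", len(accepted), "samples")
```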
Reproductive isolating mechanisms that are stronger for sympatric populations than for allopatric populations of a given species pair are indicative of reproductive character displacement, that is, selection for increased barriers to avoid the costly production of hybrid offspring. Evidence of reproductive character displacement in nature remains equivocal and requires further experimental studies. The genus Microbotryum includes species of anther-smut fungi, which are castrating pathogens specialized on different plants in the Caryophyllaceae and which serve as excellent models for studying mechanisms of speciation. Microbotryum lychnidis-dioicae and Microbotryum silenes-dioicae are sister species that show no assortative mating and relatively high hybrid viability and, yet, display a lack of gene flow in natural populations. We wanted to test whether these apparently contradictory results could be explained by reproductive character displacement. We first confirmed the absence of detectable gene flow between the two species in two sympatric populations. Then, using experimental crosses and inoculations of host plants (Silene latifolia and Silene dioica), we compared intrinsic reproductive barriers between M. lychnidis-dioicae and M. silenes-dioicae for sympatric versus allopatric populations. We found no evidence for strong reproductive character displacement at any of the following stages: selfing propensity, assortative mating, or hybrid infectivity. Altogether, our results suggest that ecological differences and a tendency for high selfing rates constitute barriers that are strong enough to effectively prevent interspecific gene flow.
As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these data sets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system - from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope and outputs pixel-level time-ordered data. SEEK processes the time-ordered data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and data sets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels, including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory.
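The sketch below illustrates, in a self-contained way, the kind of processing SEEK performs on time-ordered data. The iterative sigma-clipping RFI flagging and the `gain` factor are illustrative stand-ins only; they are not the actual HIDE/SEEK implementation or API.

```python
# Simplified stand-in for single-dish TOD processing: flag RFI-contaminated
# samples and apply a flux-calibration factor.
import numpy as np

rng = np.random.default_rng(1)

# Simulated time-ordered data: smooth sky signal + noise + sporadic RFI spikes.
n_time = 2000
sky = 5.0 + np.sin(np.linspace(0, 4 * np.pi, n_time))
tod = sky + rng.normal(0, 0.3, n_time)
rfi_idx = rng.choice(n_time, size=30, replace=False)
tod[rfi_idx] += rng.uniform(5, 20, size=30)

def flag_rfi(data, n_sigma=4.0, iterations=5):
    """Iteratively flag samples more than n_sigma away from the unflagged mean."""
    mask = np.zeros_like(data, dtype=bool)
    for _ in range(iterations):
        clean = data[~mask]
        mask |= np.abs(data - clean.mean()) > n_sigma * clean.std()
    return mask

mask = flag_rfi(tod)
gain = 1.7  # hypothetical flux-calibration factor (e.g. from a calibration source)
calibrated = np.where(mask, np.nan, tod) / gain
print(f"flagged {mask.sum()} samples; caught {np.count_nonzero(mask[rfi_idx])} of 30 RFI spikes")
```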
We describe a general framework to design optimal image processing algorithms for polarimetric images formed with coherent radiation, which can be optical or microwave. Starting from the classical speckle model for coherent signals, we show that a wide class of algorithms to perform such tasks as detection, localization, and segmentation depends on a simple statistic: the determinant of the coherency matrix estimated on a given region of the image. We use this property to design computationally efficient techniques for target/edge detection and image segmentation using statistical active contours and the minimum description length principle.
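A minimal sketch of that statistic follows, assuming a two-channel complex polarimetric image; the synthetic speckle data and the choice of regions are illustrative.

```python
# Sketch: determinant of the 2x2 coherency matrix estimated over a region of a
# two-channel complex polarimetric image.
import numpy as np

rng = np.random.default_rng(2)

def coherency_determinant(ex, ey):
    """Determinant of the coherency matrix estimated from the complex fields
    E_x and E_y over a region (all pixels in the given arrays)."""
    j_xx = np.mean(ex * np.conj(ex)).real
    j_yy = np.mean(ey * np.conj(ey)).real
    j_xy = np.mean(ex * np.conj(ey))
    return j_xx * j_yy - np.abs(j_xy) ** 2

# Fully developed speckle: independent circular complex Gaussian channels.
shape = (64, 64)
ex = (rng.normal(size=shape) + 1j * rng.normal(size=shape)) / np.sqrt(2)
ey = (rng.normal(size=shape) + 1j * rng.normal(size=shape)) / np.sqrt(2)

# A strongly polarised patch (correlated channels) lowers the determinant.
ey[20:30, 20:30] = ex[20:30, 20:30]

det_background = coherency_determinant(ex[:10, :10], ey[:10, :10])
det_target = coherency_determinant(ex[20:30, 20:30], ey[20:30, 20:30])
print(f"det(background) = {det_background:.3f}, det(polarised patch) = {det_target:.3f}")
```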
Dark matter in the universe evolves through gravity to form a complex network of halos, filaments, sheets, and voids that is known as the cosmic web. Computational models of the underlying physical processes, such as classical N-body simulations, are extremely resource intensive, as they track the action of gravity in an expanding universe using billions of particles as tracers of the cosmic matter distribution. Therefore, upcoming cosmology experiments will face a computational bottleneck that may limit the exploitation of their full scientific potential. To address this challenge, we demonstrate the application of a machine learning technique called Generative Adversarial Networks (GAN) to learn models that can efficiently generate new, physically realistic realizations of the cosmic web. Our training set is a small, representative sample of 2D image snapshots from N-body simulations of box size 500 and 100 Mpc. We show that the GAN-generated samples are qualitatively and quantitatively very similar to the originals. For the larger boxes of size 500 Mpc, it is very difficult to distinguish them visually. The agreement of the power spectrum \(P(k)\) is 1–2% for most of the range, between \(k=0.06\) and \(k=0.4\). For the remaining values of \(k\), the agreement is within 15%, with the error increasing for \(k>0.8\). For the smaller boxes of size 100 Mpc, we find the visual agreement to be good, but some differences are noticeable. The error on the power spectrum is of the order of 20%. We attribute this loss of performance to the fact that the matter distribution in the 100 Mpc cutouts was very inhomogeneous between images, a situation in which the performance of GANs is known to deteriorate. We find a good match for the correlation matrix over the full \(P(k)\) range for the 100 Mpc data and at small scales for the 500 Mpc data, with ∼20% disagreement at large scales. An important advantage of generating cosmic web realizations with a GAN is the considerable gain in computation time. Each new sample generated by a GAN takes a fraction of a second, compared to the many hours needed by traditional N-body techniques. We anticipate that generative models such as GANs will therefore play an important role in providing extremely fast and precise simulations of the cosmic web in the era of large cosmological surveys, such as Euclid and the Large Synoptic Survey Telescope (LSST).
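The sketch below illustrates the kind of summary statistic used for this comparison: a radially averaged 2D power spectrum of image samples. The `radial_power_spectrum` helper, the binning, and the random test images are illustrative assumptions rather than the analysis code of the paper.

```python
# Sketch: radially averaged 2D power spectrum, used here to compare two image
# samples (e.g. an N-body snapshot vs a GAN-generated one).
import numpy as np

def radial_power_spectrum(image, box_size=500.0):
    """Radially averaged power spectrum of a square 2D density field.
    box_size sets the physical side length (e.g. in Mpc) for the k units."""
    n = image.shape[0]
    delta = image - image.mean()
    power = np.abs(np.fft.fftn(delta)) ** 2
    k_1d = np.fft.fftfreq(n, d=box_size / n) * 2 * np.pi
    kx, ky = np.meshgrid(k_1d, k_1d, indexing="ij")
    k_mag = np.sqrt(kx**2 + ky**2).ravel()
    bins = np.linspace(k_mag[k_mag > 0].min(), k_mag.max(), 50)
    which = np.digitize(k_mag, bins)
    pk = np.array([power.ravel()[which == i].mean() if np.any(which == i) else np.nan
                   for i in range(1, len(bins))])
    return 0.5 * (bins[1:] + bins[:-1]), pk

# Compare two samples by the fractional difference of their power spectra.
rng = np.random.default_rng(3)
img_a, img_b = rng.normal(size=(128, 128)), rng.normal(size=(128, 128))
k, pk_a = radial_power_spectrum(img_a)
_, pk_b = radial_power_spectrum(img_b)
print("median fractional difference:", np.nanmedian(np.abs(pk_b / pk_a - 1)))
```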