Context.
Stage IV weak lensing experiments will offer more than an order-of-magnitude leap in precision. We must therefore ensure that our analyses remain accurate in this new era. Accordingly, previously ignored systematic effects must be addressed.
Aims.
In this work, we evaluate the impact of the reduced shear approximation and magnification bias on information obtained from the angular power spectrum. To first order, the statistics of reduced shear, a combination of shear and convergence, are taken to be equal to those of shear. However, this approximation can induce a bias in the cosmological parameters that can no longer be neglected. A separate bias arises because the statistics of shear are altered by the preferential selection of galaxies, and the dilution of their surface densities, in high-magnification regions.
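For reference, the reduced shear g combines the shear γ and convergence κ, and expanding it shows why the first-order approximation breaks down at Stage IV precision. This is the standard weak-lensing definition, sketched here rather than this paper's full derivation:

```latex
g \;\equiv\; \frac{\gamma}{1-\kappa}
  \;=\; \gamma\left(1 + \kappa + \kappa^{2} + \dots\right)
  \;\approx\; \gamma + \gamma\kappa .
```

The neglected γκ cross-term sources the leading correction to the measured shear power spectrum; the magnification-bias correction enters at the same order, which is why the two effects admit a joint treatment.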
Methods.
The corrections for these systematic effects take similar forms, allowing them to be treated together. We calculated the impact of neglecting these effects on the cosmological parameters that would be determined from Euclid, using cosmic shear tomography. To do so, we employed the Fisher matrix formalism and included the impact of the super-sample covariance. We also demonstrate how the reduced shear correction can be calculated using a lognormal field forward-modelling approach.
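As an illustration of the Fisher-based bias estimate used here, the standard recipe propagates a neglected correction δC of the data vector into parameter shifts via b = F⁻¹B. The formula is standard; the toy numbers below are purely hypothetical placeholders, not this paper's data vectors:

```python
import numpy as np

# Standard Fisher-bias formula: b_alpha = (F^-1)_{alpha beta} B_beta,
# with B_beta = sum over data of dC . Cov^-1 . (dC/dtheta_beta).
def fisher_bias(fisher, dC_dtheta, delta_C, inv_cov):
    """Parameter bias induced by a neglected data-vector correction delta_C.

    fisher    : (p, p) Fisher matrix
    dC_dtheta : (p, n) derivatives of the data vector w.r.t. parameters
    delta_C   : (n,)   neglected correction to the data vector
    inv_cov   : (n, n) inverse data covariance
    """
    B = dC_dtheta @ inv_cov @ delta_C   # (p,) projection of the correction
    return np.linalg.solve(fisher, B)   # b = F^-1 B

# Toy 2-parameter example: a correction aligned with parameter 0
# should be absorbed entirely by that parameter.
rng = np.random.default_rng(0)
dC_dtheta = rng.normal(size=(2, 10))
inv_cov = np.eye(10)
fisher = dC_dtheta @ inv_cov @ dC_dtheta.T
delta_C = 0.01 * dC_dtheta[0]
bias = fisher_bias(fisher, dC_dtheta, delta_C, inv_cov)
print(bias)  # approximately [0.01, 0.0]
```

Dividing each bias by the corresponding Fisher forecast error then gives the bias in units of σ, which is how the Results below are quoted.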
Results.
These effects cause significant biases in Ωm, σ8, ns, ΩDE, w0, and wa of −0.53σ, 0.43σ, −0.34σ, 1.36σ, −0.68σ, and 1.21σ, respectively. We then show that these lensing biases interact with another systematic effect: the intrinsic alignment of galaxies. Accordingly, we have developed the formalism for an intrinsic-alignment-enhanced lensing bias correction. Applying this to Euclid, we find that the additional terms introduced by this correction are sub-dominant.
Context.
Future weak lensing surveys, such as the Euclid mission, will attempt to measure the shapes of billions of galaxies in order to derive cosmological information. These surveys will attain very low levels of statistical error, and systematic errors must be extremely well controlled. In particular, the point spread function (PSF) must be estimated using stars in the field, and recovered with high accuracy.
Aims.
The aims of this paper are twofold. Firstly, we took steps toward a nonparametric method to address the issue of recovering the PSF field, namely that of finding the correct PSF at the position of any galaxy in the field, applicable to Euclid. Our approach relies solely on the data, as opposed to parametric methods that make use of our knowledge of the instrument. Secondly, we studied the impact of imperfect PSF models on the shape measurement of the galaxies themselves, and whether common assumptions about this impact hold true in a Euclid scenario.
Methods.
We extended the recently proposed resolved components analysis approach, which performs super-resolution on a field of under-sampled observations of a spatially varying, image-valued function. We added a spatial interpolation component to the method, making it a true two-dimensional PSF model. We compared our approach to PSFEx, then quantified the impact of PSF recovery errors on galaxy shape measurements through image simulations.
Results.
Our approach yields an improvement over PSFEx in terms of both the PSF model and the observed galaxy shape errors, though it is at present far from reaching the required Euclid accuracy. We also find that the usual formalism used for the propagation of PSF model errors to weak lensing quantities no longer holds in the case of a Euclid-like PSF. In particular, different shape measurement approaches can react differently to the same PSF modeling errors.
Context.
In the last decade, astronomers have found a new type of supernova, termed superluminous supernovae (SLSNe) owing to their high peak luminosities and long light curves. The hydrogen-free events (SLSNe-I) can be seen to z ~ 4 and therefore offer the possibility of probing the distant Universe.
Aims.
We aim to investigate the possibility of detecting SLSNe-I using ESA's Euclid satellite, scheduled for launch in 2020. In particular, we study the Euclid Deep Survey (EDS), which will provide a unique combination of area, depth and cadence over the mission.
Methods.
We estimated the redshift distribution of Euclid SLSNe-I using the latest information on their rates and spectral energy distributions, as well as known Euclid instrument and survey parameters, including the cadence and depth of the EDS. To estimate the uncertainties, we calculated their distribution with two different set-ups, namely optimistic and pessimistic, adopting different star formation densities and rates. We also applied a standardization method to the peak magnitudes to create a simulated Hubble diagram and explore possible cosmological constraints.
Results.
We show that Euclid should detect approximately 140 high-quality SLSNe-I to z ~ 3.5 over the first five years of the mission (with an additional 70 if we lower our photometric classification criteria). This sample could revolutionize the study of SLSNe-I at z > 1 and open up their use as probes of star-formation rates, galaxy populations, and the interstellar and intergalactic medium. In addition, a sample of such SLSNe-I could improve constraints on a time-dependent dark energy equation of state, namely w(a), when combined with local SLSNe-I and the expected SN Ia sample from the Dark Energy Survey.
Conclusions.
We show that Euclid will observe hundreds of SLSNe-I for free. These luminous transients will be in the Euclid data stream, and we should prepare now to identify them, as they offer a new probe of the high-redshift Universe for both astrophysics and cosmology.
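The Hubble-diagram step described above reduces to converting each redshift into a distance modulus. A minimal sketch in flat ΛCDM follows; the cosmology values (H0 = 70 km/s/Mpc, Ωm = 0.3) are illustrative assumptions, not this paper's fitted parameters:

```python
import math

C_KM_S = 299792.458  # speed of light [km/s]

def distance_modulus(z, h0=70.0, om=0.3, steps=10_000):
    """mu = 5 log10(d_L / 10 pc) in flat LambdaCDM (illustrative cosmology)."""
    # Comoving distance: trapezoidal integration of c / H(z').
    dz = z / steps
    integral = 0.0
    for i in range(steps + 1):
        zi = i * dz
        ez = math.sqrt(om * (1 + zi) ** 3 + (1 - om))  # H(z)/H0
        weight = 0.5 if i in (0, steps) else 1.0
        integral += weight / ez
    d_c = C_KM_S / h0 * integral * dz   # comoving distance [Mpc]
    d_l = (1 + z) * d_c                 # luminosity distance [Mpc]
    return 5 * math.log10(d_l * 1e6 / 10)  # modulus w.r.t. 10 pc

print(distance_modulus(0.01))  # about 33.2
print(distance_modulus(3.5))   # the high-z SLSN-I regime probed by Euclid
```

Standardized SLSN-I peak magnitudes minus this predicted modulus give the Hubble-diagram residuals from which w(a) constraints are derived.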
Context.
The ESA Euclid space telescope could observe up to 150 000 asteroids as a side product of its primary cosmological mission. Asteroids appear as trailed sources, that is, streaks, in the images. Owing to the survey area of 15 000 square degrees and the number of sources, automated methods have to be used to find them. Euclid is equipped with a visible camera, VIS (VISual imager), and a near-infrared camera, NISP (Near-Infrared Spectrometer and Photometer), with three filters.
Aims.
We aim to develop a pipeline to detect fast-moving objects in Euclid images, with both high completeness and high purity.
Methods.
We tested the StreakDet software to find asteroids in simulated Euclid images. We optimized the parameters of StreakDet to maximize completeness, and developed a post-processing algorithm to improve the purity of the sample of detected sources by removing false-positive detections.
Results.
StreakDet finds 96.9% of the synthetic asteroid streaks with apparent magnitudes brighter than 23 and streak lengths longer than 15 pixels (10 arcsec h⁻¹), but this comes at the cost of finding a high number of false positives. The number of false positives can be radically reduced with multi-streak analysis, which utilizes all four dithers obtained by Euclid.
Conclusions.
StreakDet is a good tool for identifying asteroids in Euclid images, but there is still room for improvement, in particular for finding short streaks (less than 13 pixels, corresponding to 8 arcsec h⁻¹) and/or faint streaks (fainter than apparent magnitude 23).
Context.
We present the first results from the project Galactic cold cores, in which the cold interstellar clouds detected by the Planck satellite are studied with Herschel photometric observations. The final Planck catalogue is expected to contain several thousand sources. The Herschel observations during the science demonstration phase provided the first glimpse into the nature of these sources.
Aims.
The main goal of the project is to derive the physical properties of the cold core population revealed by Planck. We examine three fields and confirm the Planck detections with Herschel data, which we also use to establish the evolutionary stage of the identified cores.
Methods.
We study the morphology and spectral energy distributions of the sources using the combined wavelength coverage of Planck and Herschel. The dust colour temperatures and emissivity indices are determined. The masses of the cores are determined with distance estimates that are taken from the literature and confirmed by kinematic and extinction information.
Results.
The observations reveal extended regions of cold dust with dust colour temperatures down to Tdust ~ 11 K. The fields represent different evolutionary stages, ranging from a quiescent, cold filament in Musca to regions of active star formation in Cepheus.
Conclusions.
The Herschel observations confirm that the all-sky survey of Planck is capable of making a large number of new cold core detections. Our results suggest that many of the sources may already have left the pre-stellar phase, or are at least closely associated with active star formation. High-resolution Herschel observations are needed to establish the true nature of the Planck detections.
Euclid is an ESA mission designed to understand why the expansion of the Universe is accelerating and what the nature of the dark energy responsible for this acceleration is. By measuring two cosmological probes simultaneously, weak gravitational lensing and galaxy clustering (BAO and redshift-space distortions), Euclid will constrain dark energy, general relativity, dark matter and the initial conditions of the Universe with unprecedented accuracy. Euclid will be equipped with a 1.2 m diameter SiC mirror telescope feeding two instruments: the visible imager and the Near-Infrared Spectro-Photometer. Here, Euclid's observational probes and main aims are recalled, and the NISP instrument and its expected performance are presented.
Abstract
The reliability of astronomical observations at millimetre and submillimetre wavelengths depends closely on a low vertical content of water vapour, as well as on high atmospheric emission stability. Although Concordia station at Dome C (Antarctica) enjoys good observing conditions in these atmospheric spectral windows, as shown by preliminary site-testing campaigns at different bands and over periods that did not always overlap in time, a dedicated instrument able to continuously determine atmospheric performance over a wide spectral range is not yet planned. In the absence of such measurements, in this paper we suggest a semi-empirical approach to analysing atmospheric transmission and emission at Dome C, comparing the performance of seven photometric bands ranging from 100 GHz to 2 THz. Radiosounding data provided by the Routine Meteorological Observations Research Project at Concordia station are corrected for temperature and humidity errors and dry biases, and are then used to feed the Atmospheric Transmission at Microwaves (ATM) code to generate synthetic spectra over the wide spectral range from 100 GHz to 2 THz. This approach is applied to the 2005-2007 data set in order to check its feasibility. To quantify the atmospheric contribution to millimetre and submillimetre observations, we consider several photometric bands, largely explored by ground-based telescopes, over which the atmospheric quantities are integrated. The observational capabilities of this site in all the selected spectral bands are analysed by considering monthly averaged transmissions together with the corresponding fluctuations. Transmission and precipitable water vapour statistics at Dome C derived with our semi-empirical approach are consistent with previous works, and a clear decrease in performance at high frequencies is evident.
We propose a new parameter, the site photometric quality ratio, to compare the quality of a site at different spectral bands in terms of high transmission and emission stability. We also consider the effect of the instrument filter bandwidth on the estimate of the optical depth derived from knowledge of the water vapour content.
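The band-averaged quantities above rest on the usual radiative-transfer relation between optical depth and transmission, T = exp(−τ·X) with airmass X ≈ 1/sin(elevation). A minimal sketch follows; the linear τ(PWV) parameterization and its coefficients are hypothetical placeholders, not the values derived in this work:

```python
import math

def transmission(tau_zenith, elevation_deg):
    """Atmospheric transmission at a given elevation for a zenith optical depth."""
    airmass = 1.0 / math.sin(math.radians(elevation_deg))  # plane-parallel approx.
    return math.exp(-tau_zenith * airmass)

def tau_from_pwv(pwv_mm, a=0.05, b=0.04):
    """Common linear tau(PWV) form; a (dry term) and b (wet slope) are
    band-dependent placeholders, not fitted coefficients."""
    return a + b * pwv_mm

# Example: a very dry Dome C winter column (PWV ~ 0.3 mm) at 60 deg elevation.
t = transmission(tau_from_pwv(0.3), elevation_deg=60.0)
print(round(t, 3))  # about 0.93
```

Repeating this over a season of radiosounding-derived PWV values yields the monthly transmission statistics and fluctuations discussed above.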