Context.
Data from ground-based, high-resolution solar telescopes can only be used for science after calibration and processing, which requires detailed knowledge of the instrumentation. Space-based solar telescopes provide science-ready data, which are easier to work with for researchers whose expertise is in the interpretation of data. Recently, data-processing pipelines for ground-based instruments have been constructed.
Aims.
We aim to provide observers with a user-friendly data pipeline for data from the Swedish 1-meter Solar Telescope (SST) that delivers science-ready data together with the metadata needed for proper interpretation and archiving.
Methods.
We briefly describe the CHROMospheric Imaging Spectrometer (CHROMIS) instrument, including its (pre)filters, as well as recent upgrades to the CRisp Imaging SpectroPolarimeter (CRISP) prefilters and polarization optics. We summarize the processing steps from raw data to science-ready data cubes in FITS files. We report calibrations and compensations for data imperfections in detail. Misalignment of Ca II data due to wavelength-dependent dispersion is identified, characterized, and compensated for. We describe intensity calibrations that remove or reduce the effects of filter transmission profiles as well as solar elevation changes. We present REDUX, a new version of the MOMFBD image restoration code, with multiple enhancements and new features. It uses projective transforms for the registration of multiple detectors. We describe how image restoration is used with CRISP and CHROMIS data. The science-ready output is delivered in FITS files, with metadata compliant with the SOLARNET recommendations. Data-cube coordinates are specified within the World Coordinate System (WCS). Cavity errors are specified as distortions of the WCS wavelength coordinate with an extension of existing WCS notation. We establish notation for specifying the reference system for Stokes vectors with reference to WCS coordinate directions. The CRIsp SPectral EXplorer (CRISPEX) data-cube browser has been extended to accept SSTRED output and to take advantage of the SOLARNET metadata.
Results.
SSTRED is a mature data-processing pipeline for imaging instruments, developed and used for the SST/CHROMIS imaging spectrometer and the SST/CRISP spectropolarimeter. SSTRED delivers well-characterized, science-ready, archival-quality FITS files with well-defined metadata. The SSTRED code, as well as REDUX and CRISPEX, is freely available through git repositories.
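As a pointer for prospective users, the science-ready cubes can be opened with standard FITS tooling. The following minimal Python sketch (the file name and HDU layout are placeholders, not a documented SSTRED interface) illustrates reading such a cube and its WCS metadata:

    from astropy.io import fits
    from astropy.wcs import WCS

    # Placeholder file name for an SSTRED science-ready data cube.
    with fits.open("sst_crisp_cube.fits") as hdul:
        cube = hdul[0].data        # e.g. (Stokes, wavelength, y, x)
        header = hdul[0].header
        wcs = WCS(header)          # spatial, spectral, and temporal axes

    print(header.get("SOLARNET"))  # SOLARNET compliance level keyword
    print(wcs.world_axis_physical_types)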
Context.
In metric theories of gravity with photon number conservation, the luminosity and angular diameter distances are related via the Etherington relation, also known as the distance duality relation (DDR). A violation of this relation would rule out the standard cosmological paradigm and point to the presence of new physics.
Aims.
We quantify the ability of Euclid, in combination with contemporary surveys, to improve the current constraints on deviations from the DDR in the redshift range 0 < z < 1.6.
Methods.
We start with an analysis of the latest available data, improving previously reported constraints by a factor of 2.5. We then present a detailed analysis of simulated Euclid and external data products, using both standard parametric methods (relying on phenomenological descriptions of possible DDR violations) and a machine learning reconstruction using genetic algorithms.
Results.
We find that for parametric methods Euclid can (in combination with external probes) improve current constraints by approximately a factor of six, while for non-parametric methods Euclid can improve current constraints by a factor of three.
Conclusions.
Our results highlight the importance of surveys like Euclid in accurately testing the pillars of the current cosmological paradigm and constraining physics beyond the standard cosmological model.
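For reference, the Etherington relation and one common phenomenological parametrization of its violation can be written as follows (the abstract does not spell out the authors' exact parametrization):

    % distance duality relation (DDR)
    \eta(z) \equiv \frac{D_{\rm L}(z)}{(1+z)^{2}\,D_{\rm A}(z)} = 1 ,
    % a common parametrization of deviations; \epsilon_0 = 0 recovers
    % photon number conservation:
    \eta(z) = 1 + \epsilon_{0}\,\frac{z}{1+z} .

Here D_L and D_A are the luminosity and angular diameter distances, and a nonzero ε_0 would signal new physics such as photon number non-conservation.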
Aims. The Spectral Imaging of the Coronal Environment (SPICE) instrument is a high-resolution imaging spectrometer operating at extreme ultraviolet wavelengths. In this paper, we present the concept, design, and pre-launch performance of this facility instrument on the ESA/NASA Solar Orbiter mission.
Methods. The goal of this paper is to give prospective users a better understanding of the possible types of observations, the data acquisition, and the sources that contribute to the instrument’s signal.
Results. The paper discusses the science objectives, with a focus on the SPICE-specific aspects, before presenting the instrument’s design, including optical, mechanical, thermal, and electronics aspects. This is followed by a characterisation and calibration of the instrument’s performance. The paper concludes with descriptions of the operations concept and data processing.
Conclusions. The performance measurements of the various instrument parameters meet the requirements derived from the mission’s science objectives. The SPICE instrument is ready to perform measurements that will provide vital contributions to the scientific success of the Solar Orbiter mission.
Aims.
We investigate the contribution of shot noise and sample variance to uncertainties in the cosmological parameter constraints inferred from cluster number counts, in the context of the Euclid survey.
Methods.
By analysing 1000 Euclid-like light cones, produced with the PINOCCHIO approximate method, we validated the analytical model of Hu & Kravtsov (2003, ApJ, 584, 702) for the covariance matrix, which takes both sources of statistical error into account. We then used this covariance to define a likelihood function better equipped to extract cosmological information from cluster number counts at the level of precision that will be reached by the future Euclid photometric catalogs of galaxy clusters. We also studied the impact of the cosmology dependence of the covariance matrix on the parameter constraints.
Results.
The analytical covariance matrix reproduces the variance measured from the simulations to within 10 percent; this difference has no sizeable effect on the errors of the cosmological parameter constraints at this level of statistics. We also find that the Gaussian likelihood with the full covariance is the only model that provides unbiased inference of the cosmological parameters without underestimating the errors, and that the cosmology dependence of the covariance must be taken into account.
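A minimal sketch of a Gaussian likelihood with a full covariance for binned number counts, of the kind discussed above (illustrative Python; the function and variable names are ours, not the pipeline's):

    import numpy as np

    def gaussian_lnlike(n_obs, n_model, cov):
        # n_obs, n_model: observed and predicted counts per bin;
        # cov: full covariance (shot noise + sample variance), e.g. from
        # the Hu & Kravtsov (2003) analytical model.
        resid = n_obs - n_model
        # The log-determinant term matters when the covariance itself
        # depends on the cosmological parameters being sampled.
        _, logdet = np.linalg.slogdet(cov)
        chi2 = resid @ np.linalg.solve(cov, resid)
        return -0.5 * (chi2 + logdet + resid.size * np.log(2 * np.pi))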
Euclid is poised to survey galaxies across a cosmological volume of unprecedented size, providing observations of more than a billion objects distributed over a third of the full sky. Approximately 20 million of these galaxies will have spectra available, allowing us to map the three-dimensional large-scale structure of the Universe in great detail. This paper investigates prospects for the detection of cosmic voids therein and the unique benefit they provide for cosmological studies. In particular, we study the imprints of dynamic (redshift-space) and geometric (Alcock–Paczynski) distortions of average void shapes and their constraining power on the growth of structure and cosmological distance ratios. To this end, we made use of the Flagship mock catalog, a state-of-the-art simulation of the data expected to be observed with Euclid. We arranged the data into four adjacent redshift bins, each containing about 11 000 voids, and estimated the stacked void-galaxy cross-correlation function in every bin. Fitting a linear-theory model to the data, we obtained constraints on f/b and D_M H, where f is the linear growth rate of density fluctuations, b the galaxy bias, D_M the comoving angular diameter distance, and H the Hubble rate. In addition, we marginalized over two nuisance parameters included in our model to account for unknown systematic effects in the analysis. With this approach, Euclid will be able to reach a relative precision of about 4% on measurements of f/b and 0.5% on D_M H in each redshift bin. Better modeling or calibration of the nuisance parameters may further increase this precision to 1% and 0.4%, respectively. Our results show that the exploitation of cosmic voids in Euclid will provide competitive constraints on cosmology even as a stand-alone probe. For example, the equation-of-state parameter, w, for dark energy will be measured with a precision of about 10%, consistent with previous more approximate forecasts.
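The linear-theory model referred to above is commonly written in the following form in the void redshift-space-distortion literature (the abstract does not give the authors' exact expression):

    \xi^{s}(s_{\parallel}, s_{\perp}) = \xi(r)
      + \frac{f}{b}\,\frac{\bar{\xi}(r)}{3}
      + \frac{f}{b}\,\mu^{2}\,\bigl[\xi(r) - \bar{\xi}(r)\bigr],
    \qquad
    \bar{\xi}(r) = \frac{3}{r^{3}}\int_{0}^{r}\xi(q)\,q^{2}\,\mathrm{d}q ,

where ξ(r) is the real-space void-galaxy cross-correlation function, \bar{ξ} its volume average, and μ the cosine of the angle between the separation vector and the line of sight. The Alcock–Paczynski effect enters through the mapping of observed angles and redshifts to comoving separations, which is what makes the stacked void shape sensitive to D_M H.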
In physically realistic, scalar-field-based dynamical dark energy models (including, e.g., quintessence), one naturally expects the scalar field to couple to the rest of the model's degrees of freedom. In particular, a coupling to the electromagnetic sector leads to a time (redshift) dependence in the fine-structure constant and a violation of the weak equivalence principle. Here we extend the previous Euclid forecast constraints on dark energy models to this enlarged (but physically more realistic) parameter space, and forecast how well Euclid, together with high-resolution spectroscopic data and local experiments, can constrain these models. Our analysis combines simulated Euclid data products with astrophysical measurements of the fine-structure constant, α, and local experimental constraints, and it includes both parametric and non-parametric methods. For the astrophysical measurements of α, we consider both the currently available data and a simulated dataset representative of Extremely Large Telescope measurements that are expected to be available in the 2030s. Our parametric analysis shows that in the latter case, the inclusion of astrophysical and local data improves the Euclid dark energy figure of merit by between 8% and 26%, depending on the fiducial model, with the improvements being larger in the null case where the fiducial coupling to the electromagnetic sector vanishes. These improvements would be smaller with the current astrophysical data. Moreover, we illustrate how a reconstruction based on genetic algorithms provides a null test for the presence of the coupling. Our results highlight the importance of complementing surveys like Euclid with external data products in order to accurately test the wider parameter spaces of physically motivated paradigms.
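In this class of models, the redshift dependence of α traces the dark energy dynamics; a relation widely used in this literature (the abstract does not spell out the authors' exact parametrization) is

    \frac{\Delta\alpha}{\alpha}(z) = \zeta \int_{0}^{z}
      \sqrt{3\,\Omega_{\phi}(z')\,\bigl[1 + w(z')\bigr]}\;
      \frac{\mathrm{d}z'}{1+z'} ,

where ζ is the dimensionless coupling of the scalar field to the electromagnetic sector, Ω_φ its energy density fraction, and w its equation of state. Setting ζ = 0 recovers a constant α, and the same coupling sources weak equivalence principle violations, with the Eötvös parameter scaling as ζ².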
Context. ALMA observations show that dusty, distant, massive (M* ≳ 10^11 M⊙) galaxies usually have remarkable star-formation activity, contributing on the order of 25% of the cosmic star-formation rate density at z ≈ 3–5, and up to 30% at z ∼ 7. Nonetheless, they are elusive in classical optical surveys, and current near-IR surveys are able to detect them only in very small sky areas. Since these objects have low space densities, deep and wide surveys are necessary to obtain statistically relevant results about them. Euclid will potentially be capable of delivering the required information but, given the lack of spectroscopic features at these distances within its bands, it is still unclear whether Euclid will be able to identify and characterise these objects.
Aims. The goal of this work is to assess the capability of Euclid, together with ancillary optical and near-IR data, to identify these distant, dusty, and massive galaxies on the basis of broadband photometry.
Methods. We used a gradient-boosting algorithm to predict both the redshift and spectral type of objects at high z. To perform this analysis, we made use of simulated photometric observations that mimic the Euclid Deep Survey, derived using the state-of-the-art Spectro-Photometric Realizations of Infrared-selected Targets at all-z (SPRITZ) software.
Results. The gradient-boosting algorithm was found to be accurate in predicting both the redshift and spectral type of objects within the simulated Euclid Deep Survey catalogue at z > 2, while drastically decreasing the runtime with respect to spectral-energy-distribution-fitting methods. In particular, we studied the analogues of HIEROs (i.e. sources selected on the basis of a red H − [4.5] > 2.25 colour), combining Euclid and Spitzer data at the depth of the Deep Fields. These sources include the bulk of obscured and massive galaxies in a broad redshift range, 3 < z < 7. We find that the dusty population at 3 ≲ z ≲ 7 is well identified, with a redshift root mean square error and a catastrophic outlier fraction of only 0.55 and 8.5% (H_E ≤ 26), respectively. Our findings suggest that with Euclid we will obtain meaningful insights into the impact of massive and dusty galaxies on the cosmic star-formation rate over time.
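As an illustration of the kind of gradient-boosting regression described in the Methods (a minimal Python sketch; the bands, features, and hyperparameters are our assumptions, not the SPRITZ/Euclid setup, and the random targets mean only the API is being illustrated):

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    # Hypothetical catalogue: broadband magnitudes -> redshift.
    rng = np.random.default_rng(0)
    n = 5000
    mags = rng.uniform(20, 27, size=(n, 6))   # e.g. I_E, Y_E, J_E, H_E, [3.6], [4.5]
    z_true = rng.uniform(0, 7, size=n)        # stand-in for catalogue redshifts

    # Colours are often more informative than raw magnitudes.
    colours = mags[:, :-1] - mags[:, 1:]
    X = np.hstack([mags, colours])

    X_tr, X_te, z_tr, z_te = train_test_split(X, z_true, random_state=0)
    model = HistGradientBoostingRegressor(max_iter=300)
    model.fit(X_tr, z_tr)
    z_pred = model.predict(X_te)

    # Standard photo-z metrics: RMSE and catastrophic outlier fraction.
    rmse = np.sqrt(np.mean((z_pred - z_te) ** 2))
    outlier_frac = np.mean(np.abs(z_pred - z_te) / (1 + z_te) > 0.15)

A classifier of the same family can be trained analogously for the spectral type.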
Context. The Euclid mission of the European Space Agency will perform a survey of weak lensing cosmic shear and galaxy clustering in order to constrain cosmological models and fundamental physics.
Aims. We expand and adjust the mock Euclid likelihoods of the MontePython software in order to match the exact recipes used in previous Euclid Fisher matrix forecasts for several probes: weak lensing cosmic shear, photometric galaxy clustering, the cross-correlation between these two observables, and spectroscopic galaxy clustering. We also establish which precision settings are required when running the Einstein–Boltzmann solvers CLASS and CAMB in the context of Euclid.
Methods. For the minimal cosmological model, extended to include dynamical dark energy, we perform Fisher matrix forecasts based directly on a numerical evaluation of second derivatives of the likelihood with respect to the model parameters. We compare our results with those of previously validated Fisher codes that use an independent method based on first derivatives of the Euclid observables.
Results. We show that such MontePython forecasts agree very well with previous Fisher forecasts published by the Euclid Collaboration, and also with new forecasts produced by the CosmicFish code, now interfaced directly with the two Einstein–Boltzmann solvers CAMB and CLASS. Moreover, to establish the validity of the Gaussian approximation, we show that the Fisher matrix marginal error contours coincide with the credible regions obtained from Markov chain Monte Carlo runs with MontePython using the exact same mock likelihoods.
Conclusions. The new Euclid forecast pipelines presented here are ready for use with additional cosmological parameters, in order to explore extended cosmological models.
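A minimal sketch of a Fisher matrix obtained from second derivatives of a log-likelihood, as in the method described above (illustrative Python; the toy likelihood and step sizes are placeholders):

    import numpy as np

    def fisher_from_loglike(loglike, theta0, steps):
        # F_ij = -d^2 ln L / d theta_i d theta_j at the fiducial point,
        # estimated with central finite differences.
        p = len(theta0)
        F = np.zeros((p, p))
        for i in range(p):
            for j in range(i, p):
                ei = np.eye(p)[i] * steps[i]
                ej = np.eye(p)[j] * steps[j]
                d2 = (loglike(theta0 + ei + ej) - loglike(theta0 + ei - ej)
                      - loglike(theta0 - ei + ej) + loglike(theta0 - ei - ej))
                F[i, j] = F[j, i] = -d2 / (4 * steps[i] * steps[j])
        return F

    # Toy Gaussian likelihood: the recovered Fisher matrix should equal
    # the inverse covariance.
    cov = np.array([[1.0, 0.3], [0.3, 2.0]])
    lnL = lambda th: -0.5 * th @ np.linalg.solve(cov, th)
    F = fisher_from_loglike(lnL, np.zeros(2), np.array([1e-3, 1e-3]))
    errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized 1-sigma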
The material composition of asteroids is an essential piece of knowledge in the quest to understand the formation and evolution of the Solar System. Visual to near-infrared spectra or multiband photometry is required to constrain the material composition of asteroids, but we currently have such data, especially at near-infrared wavelengths, for only a limited number of asteroids. This is a significant limitation considering the complex orbital structures of the asteroid populations. Up to 150 000 asteroids will be visible in the images of the upcoming ESA Euclid space telescope, and the instruments of Euclid will offer multiband visual to near-infrared photometry and slitless near-infrared spectra of these objects. Most of the asteroids will appear as streaks in the images. Due to the large number of images and asteroids, automated detection methods are needed. A non-machine-learning approach based on the StreakDet software was previously tested, but the results were not optimal for short and/or faint streaks. We set out to improve the capability to detect asteroid streaks in Euclid images by using deep learning. We built, trained, and tested a three-step machine-learning pipeline with simulated Euclid images. First, a convolutional neural network (CNN) detected streaks and their coordinates in full images, aiming to maximize the completeness (recall) of detections. Then, a recurrent neural network (RNN) merged snippets of long streaks detected in several parts by the CNN. Lastly, gradient-boosted trees (XGBoost) linked detected streaks between different Euclid exposures to reduce the number of false positives and improve the purity (precision) of the sample. The deep-learning pipeline surpasses the completeness of, and reaches a similar level of purity as, the non-machine-learning pipeline based on the StreakDet software. Additionally, the deep-learning pipeline can detect asteroids 0.25–0.5 magnitudes fainter than StreakDet. The deep-learning pipeline could thus yield a 50% increase in the number of detected asteroids compared to the StreakDet software. There is still scope for further refinement, particularly in improving the accuracy of streak coordinates and in enhancing the completeness of the final stage of the pipeline, which involves linking detections across multiple exposures.
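As an illustration of the first stage, a minimal CNN that classifies image cutouts as containing a streak or not (PyTorch sketch; the architecture and shapes are our assumptions, not the pipeline's actual network):

    import torch
    import torch.nn as nn

    class StreakCNN(nn.Module):
        # Toy binary classifier for 64x64 cutouts (streak / no streak).
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 1),   # logit; sigmoid applied in the loss
            )

        def forward(self, x):
            return self.head(self.features(x))

    model = StreakCNN()
    x = torch.randn(8, 1, 64, 64)   # batch of image cutouts
    loss = nn.BCEWithLogitsLoss()(model(x).squeeze(1), torch.ones(8))
    # A low decision threshold on the sigmoid output favours completeness
    # (recall); the later RNN and XGBoost stages restore purity (precision).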
The analysis of weak gravitational lensing in wide-field imaging surveys is considered to be a major cosmological probe of dark energy. Our capacity to constrain the dark energy equation of state relies on an accurate knowledge of the galaxy mean redshift ⟨z⟩. We investigate the possibility of measuring ⟨z⟩ with an accuracy better than 0.002(1 + z) in ten tomographic bins spanning the redshift interval 0.2 < z < 2.2, the requirement for the cosmic shear analysis of Euclid. We implement a sufficiently realistic simulation in order to understand the advantages and complementarity, as well as the shortcomings, of two standard approaches: the direct calibration of ⟨z⟩ with a dedicated spectroscopic sample and the combination of the photometric redshift probability distribution functions (zPDFs) of individual galaxies. We base our study on the Horizon-AGN hydrodynamical simulation, which we analyse with a standard galaxy spectral energy distribution template-fitting code. Such a procedure produces photometric redshifts with realistic biases, precisions, and failure rates. We find that the current Euclid design for direct calibration is sufficiently robust to reach the requirement on the mean redshift, provided that the purity level of the spectroscopic sample is maintained at an extremely high level of > 99.8%. The zPDF approach can also be successful if the zPDFs are de-biased using a spectroscopic training sample. This approach requires deep imaging data but is weakly sensitive to spectroscopic redshift failures in the training sample. We improve the de-biasing method and confirm our findings by applying it to real-world weak-lensing datasets (COSMOS and KiDS+VIKING-450).
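A minimal sketch of the zPDF route to ⟨z⟩, with a simple shift-based de-biasing step standing in for the paper's actual method (illustrative Python; all inputs are synthetic):

    import numpy as np

    rng = np.random.default_rng(1)
    z_grid = np.linspace(0.0, 3.0, 601)

    # Synthetic per-galaxy photometric redshift PDFs: Gaussians with a
    # small systematic bias, mimicking template-fitting output.
    z_true = rng.uniform(0.2, 2.2, size=2000)
    bias, sigma = 0.01, 0.05
    pdfs = np.exp(-0.5 * ((z_grid - (z_true[:, None] + bias)) / sigma) ** 2)
    pdfs /= np.trapz(pdfs, z_grid, axis=1)[:, None]

    def mean_z(p):
        # <z> from the stacked PDF of a tomographic bin.
        stacked = p.mean(axis=0)
        stacked /= np.trapz(stacked, z_grid)
        return np.trapz(z_grid * stacked, z_grid)

    # De-bias with a spectroscopic training sample: measure the offset
    # between PDF-based and true <z> on the training set, then subtract it.
    train = slice(0, 500)            # galaxies with spec-z
    offset = mean_z(pdfs[train]) - z_true[train].mean()
    z_mean_debiased = mean_z(pdfs) - offset
    # Euclid requirement: |<z>_est - <z>_true| < 0.002 (1 + <z>) per bin.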