Abstract
RFX-mod2 (R = 2.0 m, a = 0.49 m), the upgraded version of the previous RFX-mod fusion device, will be equipped with a new reflectometry system specifically designed for plasma position control purposes. Featuring high temporal and spatial resolution and being suitable for harsh fusion-reactor experimental conditions (long pulses, high neutron fluxes), reflectometry has been proposed as a good candidate for this task. On RFX-mod2 the diagnostic system will consist of four bistatic, ultrafast, independent reflectometric units working in the 16–26.5 GHz frequency range and installed at four different poloidal locations at the same toroidal angle: two on the equatorial plane (high-field side/low-field side) and two at the vertical top/bottom ports. Standard pyramidal horns will be installed at the external midplane and in the vertical ports, while parabolic hoghorn reflectors have been designed for the internal midplane. Different technical solutions for integration in the machine, such as additive manufacturing for antenna production, are presented. Despite the differences with respect to the application on large tokamaks like DEMO, the RFX-mod2 plasma position reflectometer can help test, on a simple machine, some of the issues related to the development of reflectometry-based plasma position and shape control.
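As a rough illustration of the densities such a band probes, the ordinary-mode cutoff condition f = f_pe maps each probing frequency to a cutoff density. This is a generic sketch, assuming O-mode propagation (the abstract does not state the polarisation scheme):

```python
import math

# Generic sketch, assuming O-mode propagation: the 16-26.5 GHz band probes
# densities up to the cutoff n_c where f = f_pe, i.e.
# n_c = (2*pi*f)^2 * eps0 * m_e / e^2.
EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]
M_E = 9.1093837015e-31    # electron mass [kg]
E_CH = 1.602176634e-19    # elementary charge [C]

def cutoff_density(f_hz):
    """O-mode cutoff density [m^-3] for probing frequency f_hz [Hz]."""
    return (2.0 * math.pi * f_hz) ** 2 * EPS0 * M_E / E_CH ** 2

n_low = cutoff_density(16.0e9)    # lower band edge
n_high = cutoff_density(26.5e9)   # upper band edge
```

The band edges correspond to cutoff densities of order a few times 10¹⁸ m⁻³, i.e. the edge region where the position of the reflecting layer is measured.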
Cosmological constraints from key probes of the Euclid imaging survey rely critically on the accurate determination of the true redshift distributions, n(z), of tomographic redshift bins. We determine whether the mean redshift, ⟨z⟩, of ten Euclid tomographic redshift bins can be calibrated to the Euclid target uncertainty of σ(⟨z⟩) < 0.002 (1 + z) via cross-correlation with spectroscopic samples akin to those from the Baryon Oscillation Spectroscopic Survey (BOSS), the Dark Energy Spectroscopic Instrument (DESI), and Euclid's NISP spectroscopic survey. We construct mock Euclid and spectroscopic galaxy samples from the Flagship simulation and measure small-scale clustering redshifts up to redshift z < 1.8 with an algorithm that performs well on current galaxy survey data. The clustering measurements are then fitted to two n(z) models: one is the true n(z) with a free mean; the other is a Gaussian process modified to be restricted to non-negative values. We show that ⟨z⟩ is measured in each tomographic redshift bin to an accuracy of order 0.01 or better. By measuring the clustering redshifts on subsets of the full Flagship area, we construct scaling relations that allow us to extrapolate the method's performance to sky areas larger than those currently available in the mock. For the full expected Euclid, BOSS, and DESI overlap region of approximately 6000 deg², the uncertainties attainable with clustering redshifts exceed the Euclid requirement by at least a factor of three for both n(z) models considered, although systematic biases limit the accuracy. Clustering redshifts are an extremely effective method of redshift calibration for Euclid, provided the sources of systematic bias can be determined and removed, or calibrated out with sufficiently realistic simulations. We outline possible future work, in particular an extension to higher redshifts with quasar reference samples.
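For context on the requirement quoted above, a minimal sketch of how the mean redshift of a tomographic bin and the corresponding tolerance 0.002 (1 + ⟨z⟩) could be evaluated from a normalised n(z); the grid and the toy Gaussian n(z) are illustrative assumptions, not Flagship values:

```python
import numpy as np

# Illustrative only: mean redshift <z> of a tomographic bin from a
# normalised n(z), and the Euclid-style tolerance 0.002 * (1 + <z>).
z_grid = np.linspace(0.0, 1.8, 181)                 # assumed redshift grid
dz = z_grid[1] - z_grid[0]
n_z = np.exp(-0.5 * ((z_grid - 0.9) / 0.15) ** 2)   # toy Gaussian n(z)
n_z /= n_z.sum() * dz                               # normalise to unit area

mean_z = np.sum(z_grid * n_z) * dz                  # <z> of the bin
tolerance = 0.002 * (1.0 + mean_z)                  # target sigma(<z>)
```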
Euclid preparation. Castro, T.; Fumagalli, A.; Angulo, R. E. ...
Astronomy and Astrophysics (Berlin), 03/2023, Volume 671
Journal Article, Peer-reviewed, Open access
Euclid's photometric galaxy cluster survey has the potential to be a very competitive cosmological probe. The main cosmological probe based on cluster observations is their number count, within which the halo mass function (HMF) is a key theoretical quantity. We present a new calibration of the analytic HMF, at the level of accuracy and precision required for the uncertainty in this quantity to be subdominant with respect to other sources of uncertainty in recovering cosmological parameters from Euclid cluster counts. Our model is calibrated against a suite of N-body simulations using a Bayesian approach that takes into account systematic errors arising from numerical effects in the simulations. First, we test the convergence of HMF predictions from different N-body codes by using initial conditions generated with different orders of Lagrangian perturbation theory and adopting different simulation box sizes and mass resolutions. Then, we quantify the effect of using different halo-finder algorithms and how the resulting differences propagate to the cosmological constraints. In order to trace the violation of universality in the HMF, we also analyse simulations based on initial conditions characterised by scale-free power spectra with different spectral indices, assuming both Einstein–de Sitter and standard ΛCDM expansion histories. Based on these results, we construct a fitting function for the HMF that we demonstrate to be sub-percent accurate in reproducing results from nine different variants of the ΛCDM model, including massive-neutrino cosmologies. The calibration systematic uncertainty is largely subdominant with respect to the expected precision of future mass–observable relations, with the only notable exception being the effect due to the halo finder, which could lead to biased cosmological inference.
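The paper's own calibrated fitting function is not reproduced in the abstract; as a generic illustration of what an analytic HMF fitting function looks like, here is the widely used Tinker et al. (2008) multiplicity function f(σ), with parameter values for a mean overdensity of Δ = 200:

```python
import numpy as np

# Not the paper's calibrated function: the Tinker et al. (2008) halo
# multiplicity f(sigma) for Delta = 200 (mean), shown as a generic example
# of the analytic HMF fitting functions discussed in the abstract.
def tinker_f(sigma, A=0.186, a=1.47, b=2.57, c=1.19):
    # f(sigma) = A * [(sigma / b)^-a + 1] * exp(-c / sigma^2)
    return A * ((sigma / b) ** (-a) + 1.0) * np.exp(-c / sigma ** 2)

f_at_1 = float(tinker_f(1.0))   # multiplicity at sigma = 1
```

The full mass function then follows as dn/dlnM = f(σ) (ρ_m/M) |dlnσ⁻¹/dlnM|; calibrations of the kind described above refit the parameters (and their cosmology dependence) against simulations.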
In mild cognitive impairment (MCI), the study of brain metabolism provided by 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET) can be integrated with brain perfusion through pseudo-continuous arterial spin labeling magnetic resonance sequences (MR pCASL). Identification of cortical hypometabolism generally relies on large control-group datasets; pCASL control groups are instead not publicly available yet, due to the lack of standardization in acquisition parameters. This study presents a quantitative pipeline to be applied to PET and pCASL data to coherently analyze metabolism and perfusion inside 16 matching cortical regions of interest (ROIs) derived from the AAL3 atlas. The PET line is tuned on 36 MCI patients and 107 healthy control subjects to agree with clinical reference methods (visual analysis supported by a vendor tool and Statistical Parametric Mapping, SPM, with two parametrizations identified here as SPM-A and SPM-B) in identifying hypometabolic regions. The analysis was conducted for each ROI separately. The proposed PET analysis pipeline obtained an accuracy of 78% and a Cohen's κ of 60% vs visual analysis, an accuracy of 79% and a Cohen's κ of 58% vs SPM-A, and an accuracy of 77% and a Cohen's κ of 54% vs SPM-B. Cohen's κ was not significantly different from the SPM-A and SPM-B values when assuming visual analysis as the reference method (p-values 0.61 and 0.31, respectively). Considering SPM-A as the reference method, Cohen's κ is not significantly different from the SPM-B value either (p-value = 1.00). The complete PET–pCASL pipeline was then preliminarily applied to 5 MCI patients, and metabolism–perfusion regional correlations were assessed. The proposed approach can be considered a promising tool for joint PET–pCASL analyses in MCI, even in the absence of a pCASL control group, to perform metabolism–perfusion regional correlation studies and to assess and compare perfusion in hypometabolic or normo-metabolic areas.
• A pipeline to analyze 18F-FDG PET and pCASL in 16 cortical regions is presented.
• The PET line is validated on 36 MCI patients vs SPM and visual analysis (reference).
• An agreement vs reference in line with intra-reference agreement is obtained.
• A PET/pCASL analysis relying on PET normative data could drive pCASL interpretation.
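The agreement statistic quoted above can be reproduced in form (not on the paper's actual data) with a minimal Cohen's κ implementation over binary per-ROI hypometabolism labels; the ratings below are invented for illustration:

```python
# Minimal sketch, not the paper's pipeline: Cohen's kappa between two
# binary hypometabolism classifications over a set of ROIs.
def cohens_kappa(a, b):
    n = len(a)
    labels = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n                   # observed agreement
    p_exp = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

pipeline_flags = [1, 1, 0, 0, 1, 0, 1, 0]   # hypothetical per-ROI labels
visual_flags   = [1, 0, 0, 0, 1, 0, 1, 1]
kappa = cohens_kappa(pipeline_flags, visual_flags)
```

Unlike raw accuracy, κ discounts the agreement expected by chance, which is why the abstract reports both.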
Context. The ESA Euclid mission will produce photometric galaxy samples over 15 000 square degrees of the sky that will be rich for clustering and weak-lensing statistics. The accuracy of the cosmological constraints derived from these measurements will depend on knowledge of the underlying redshift distributions based on photometric redshift calibrations.
Aims. A new approach is proposed to use the stacked spectra from Euclid slitless spectroscopy to augment broad-band photometric information and constrain the redshift distribution with spectral-energy-distribution fitting. The high spectral resolution available in the stacked spectra complements the photometry and helps break the colour–redshift degeneracy and constrain the redshift distribution of galaxy samples.
Methods. We modelled the stacked spectra as a linear mixture of spectral templates. The mixture may be inverted to infer the underlying redshift distribution using constrained regression algorithms. We demonstrate the method on simulated Vera C. Rubin Observatory and Euclid mock survey data sets based on the Euclid Flagship mock galaxy catalogue. We assess the accuracy of the reconstruction by considering the inference of the baryon acoustic scale from angular two-point correlation function measurements.
Results. We selected mock photometric galaxy samples at redshift z > 1 using the self-organising-map algorithm. Considering the idealised case without dust attenuation, we find that the redshift distributions of these samples can be recovered with 0.5% accuracy on the baryon acoustic scale. The estimates are not significantly degraded by spectroscopic measurement noise, thanks to the large sample size. However, the error degrades to 2% when the dust attenuation model is left free. We find that the colour degeneracies introduced by attenuation limit the accuracy, given the wavelength coverage of Euclid near-infrared spectroscopy.
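The template-mixture inversion described under Methods can be sketched with non-negative least squares, one possible constrained-regression choice (the paper's exact algorithm is not specified in the abstract); the templates and weights below are toy values:

```python
import numpy as np
from scipy.optimize import nnls

# Toy version of the "linear mixture of spectral templates" inversion:
# a stacked spectrum s = T @ w with non-negative weights w, recovered by
# non-negative least squares.
rng = np.random.default_rng(0)
T = np.abs(rng.normal(size=(50, 4)))       # 4 templates, 50 wavelength bins
w_true = np.array([0.7, 0.0, 0.3, 0.0])    # true mixture weights
s = T @ w_true                              # noiseless stacked spectrum

w_fit, _ = nnls(T, s)                       # constrained inversion
```

The non-negativity constraint encodes the physical requirement that no template contributes with negative weight, which also stabilises the inversion when templates are strongly correlated.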
We present a method for fast evaluation of the covariance matrix for a two-point galaxy correlation function (2PCF) measured with the Landy–Szalay estimator. The standard way of evaluating the covariance matrix consists in running the estimator on a large number of mock catalogs and evaluating their sample covariance. With large random-catalog sizes (random-to-data ratio M ≫ 1), the computational cost of the standard method is dominated by that of counting the data–random and random–random pairs, while the uncertainty of the estimate is dominated by that of the data–data pairs. We present a method called Linear Construction (LC), where the covariance is estimated for small random catalogs of size M = 1 and M = 2, and the covariance for arbitrary M is constructed as a linear combination of the two. We show that the LC covariance estimate is unbiased. We validated the method with PINOCCHIO simulations in the range r = 20–200 h⁻¹ Mpc. With M = 50 and with 2 h⁻¹ Mpc bins, the theoretical speedup of the method is a factor of 14. We discuss the impact on the precision matrix and parameter estimation, and present a formula for the covariance of the covariance.
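The Landy–Szalay estimator named above can be sketched in one dimension for brevity; catalogue sizes, the ratio M = 2, and the bins are illustrative choices, not the paper's setup:

```python
import numpy as np

# Sketch of the Landy-Szalay estimator: xi = (DD - 2 DR + RR) / RR,
# with pair counts normalised by the number of pairs, reduced to 1D.
def pair_counts(x, y, bins):
    d = np.abs(x[:, None] - y[None, :]).ravel()     # all pair separations
    return np.histogram(d, bins=bins)[0].astype(float)

rng = np.random.default_rng(1)
data = rng.uniform(0.0, 100.0, 200)
rand = rng.uniform(0.0, 100.0, 400)                 # random-to-data ratio M = 2
bins = np.linspace(1.0, 20.0, 11)

nd, nr = len(data), len(rand)
dd = pair_counts(data, data, bins) / (nd * nd)
dr = pair_counts(data, rand, bins) / (nd * nr)
rr = pair_counts(rand, rand, bins) / (nr * nr)
xi = (dd - 2.0 * dr + rr) / rr                      # Landy-Szalay estimator
```

The M-dependence of the cost is visible here: the RR term scales as nr², which is why shrinking the random catalog (as LC does) gives the quoted speedup.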
An accurate covariance matrix is essential for obtaining reliable cosmological results when using a Gaussian likelihood. In this paper we study the covariance of pseudo-Cℓ estimates of tomographic cosmic shear power spectra. Using two existing publicly available codes in combination, we calculate the full covariance matrix, including mode-coupling contributions arising from both partial sky coverage and non-linear structure growth. For three different sky masks, we compare the theoretical covariance matrix to that estimated from publicly available N-body weak-lensing simulations, finding good agreement. We find that as a more extreme sky cut is applied, a corresponding increase in both Gaussian off-diagonal covariance and non-Gaussian super-sample covariance is observed in both theory and simulations, in accordance with expectations. Studying the different contributions to the covariance in detail, we find that the Gaussian covariance dominates along the main diagonal and the closest off-diagonals, but farther from the main diagonal the super-sample covariance is dominant. Forming mock constraints on parameters that describe matter clustering and dark energy, we find that neglecting non-Gaussian contributions to the covariance can lead to underestimating the true size of confidence regions by up to 70 per cent. The dominant non-Gaussian component is the super-sample covariance, but neglecting the smaller connected non-Gaussian covariance can still lead to the underestimation of uncertainties by 10–20 per cent. A real cosmological analysis will require marginalisation over many nuisance parameters, which will decrease the relative importance of all cosmological contributions to the covariance, so these values should be taken as upper limits on the importance of each component.
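The Gaussian term found to dominate near the diagonal has a simple single-bin, fsky-scaled form, Var(Cℓ) = 2 (Cℓ + Nℓ)² / ((2ℓ + 1) fsky Δℓ); the numbers below are illustrative, not Euclid values:

```python
import numpy as np

# Simplest (single tomographic bin, binned, fsky-scaled) form of the
# Gaussian power-spectrum covariance discussed in the abstract.
ell = np.arange(100, 1000, 100)        # band-power centres (assumed)
delta_ell = 100                        # band width
f_sky = 0.35                           # sky fraction (assumed)
c_ell = 1e-9 * (ell / 100.0) ** -1.0   # toy shear power spectrum
n_ell = 1e-10                          # toy shape-noise power

gauss_var = 2.0 * (c_ell + n_ell) ** 2 / ((2 * ell + 1) * f_sky * delta_ell)
```

Mask-induced mode coupling and the super-sample and connected non-Gaussian terms studied in the paper add off-diagonal structure that this diagonal-only approximation misses entirely, which is the point of the 70 per cent figure quoted above.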
Euclid preparation. Paykari, P.; Hoekstra, H.; Azzollini, R. ...
Astronomy and Astrophysics (Berlin), 03/2020, Volume 635
Journal Article, Peer-reviewed, Open access
Aims. Our aim is to quantify the impact of systematic effects on the inference of cosmological parameters from cosmic shear.
Methods. We present an "end-to-end" approach that introduces sources of bias in a modelled weak-lensing survey on a galaxy-by-galaxy level. We propagated residual biases through a pipeline, from galaxy properties at one end to cosmic shear power spectra and cosmological parameter estimates at the other, in order to quantify how imperfect knowledge of the pipeline changes the maximum-likelihood values of dark energy parameters.
Results. We quantify the impact of an imperfect correction for charge transfer inefficiency and of modelling uncertainties of the point spread function for Euclid, and find that the biases introduced can be corrected to acceptable levels.
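Residual shear biases of the kind propagated in such studies are commonly summarised with the multiplicative/additive parametrisation g_obs = (1 + m) g_true + c; the values of m and c below are toy assumptions, not the paper's results:

```python
import numpy as np

# Generic shear-bias parametrisation often used to summarise residual
# biases: g_obs = (1 + m) * g_true + c. The paper's pipeline works at the
# galaxy-property level; m and c here are illustrative.
rng = np.random.default_rng(2)
g_true = rng.normal(0.0, 0.02, 10_000)     # toy true shears
m, c = 2e-3, 1e-4                           # assumed residual biases
g_obs = (1.0 + m) * g_true + c

slope, intercept = np.polyfit(g_true, g_obs, 1)
m_hat, c_hat = slope - 1.0, intercept       # recovered biases
```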
VANDELS is an ESO Public Spectroscopic Survey designed to build a sample of high signal-to-noise ratio, medium-resolution spectra of galaxies at redshifts between 1 and 6.5. Here we present the final public data release of the VANDELS survey, comprising 2087 redshift measurements. We provide a detailed description of the sample selection, observations, and data reduction procedures. The final catalogue reaches a target-selection completeness of 40% at i_AB = 25. The high signal-to-noise ratio of the spectra (above 7 in 80% of the spectra) and the dispersion of 2.5 Å allowed us to measure redshifts with high precision, with the redshift measurement success rate reaching almost 100%. Together with the redshift catalogue and the reduced spectra, we also provide optical to mid-infrared photometry and physical parameters derived through spectral energy distribution fitting. The observed galaxy sample comprises both passive and star-forming galaxies covering a stellar mass range of 8.3 < log(M*/M⊙) < 11.7.
Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information, and ensure that all requirements are met. A fundamental element of these pipelines is to associate with each galaxy redshift measurement a quality, or reliability, estimate. Aims. In this work, we introduce a new approach to automating the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods. We propose to rephrase the spectroscopic redshift estimation within a Bayesian framework, in order to incorporate all sources of information and uncertainty related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features of the redshift posterior PDF and machine learning algorithms. Results. As a working example, public data from the VIMOS VLT Deep Survey are exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy ~58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy ~98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions. Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis for an automated reliability assessment of spectroscopic redshift measurements. This newly defined method is very promising for next-generation large spectroscopic surveys from the ground and in space, such as Euclid and WFIRST.
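The "key features" of a redshift posterior PDF are not listed in the abstract; as a purely hypothetical example of the kind of summary statistics such a classifier might consume, here is a bimodal toy p(z) mimicking a degenerate redshift solution:

```python
import numpy as np

# Hypothetical feature extraction from a redshift posterior PDF; the
# paper's actual feature set is not specified in the abstract.
z = np.linspace(0.0, 6.5, 651)
pdf = (np.exp(-0.5 * ((z - 2.0) / 0.05) ** 2)
       + 0.2 * np.exp(-0.5 * ((z - 3.1) / 0.05) ** 2))   # bimodal toy p(z)
pdf /= pdf.sum()                              # discrete normalisation

z_map = z[np.argmax(pdf)]                     # maximum a posteriori redshift
mean_z = np.sum(pdf * z)                      # posterior mean
sigma_z = np.sqrt(np.sum(pdf * (z - mean_z) ** 2))       # dispersion
entropy = -np.sum(pdf[pdf > 0] * np.log(pdf[pdf > 0]))   # Shannon entropy
```

A strongly peaked, unimodal posterior yields a small dispersion and low entropy (a reliable measurement), while secondary modes inflate both, which is the intuition behind flagging reliability from PDF shape.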