Euclid preparation Blanchard, A.; Camera, S.; Carbone, C. ...
Astronomy and astrophysics (Berlin), 10/2020, Volume 642
Journal Article
Peer-reviewed
Open access
Aims. The Euclid space telescope will measure the shapes and redshifts of galaxies to reconstruct the expansion history of the Universe and the growth of cosmic structures. The estimation of the expected performance of the experiment, in terms of predicted constraints on cosmological parameters, has so far relied on various individual methodologies and numerical implementations, which were developed for different observational probes and for the combination thereof. In this paper we present validated forecasts, which combine both theoretical and observational ingredients for different cosmological probes. This work is presented to provide the community with reliable numerical codes and methods for Euclid cosmological forecasts.
Methods. We describe in detail the methods adopted for Fisher matrix forecasts, which were applied to galaxy clustering, weak lensing, and the combination thereof. We estimate the required accuracy for Euclid forecasts and outline a methodology for their development. We then compare and improve different numerical implementations, reaching uncertainties on the errors of cosmological parameters that are less than the required precision in all cases. Furthermore, we provide details on the validated implementations, some of which are made publicly available, in different programming languages, together with a reference training set of input and output matrices for a set of specific models. These can be used by the reader to validate their own implementations if required.
Results. We present new cosmological forecasts for Euclid. We find that results depend on the specific cosmological model and remaining freedom in each setting, for example flat or non-flat spatial cosmologies, or different cuts at non-linear scales. The numerical implementations are now reliable for these settings. We present the results for an optimistic and a pessimistic choice for these types of settings. We demonstrate that the impact of cross-correlations is particularly relevant for models beyond a cosmological constant and may allow us to increase the dark energy figure of merit by at least a factor of three.
ABSTRACT
We present a new, updated version of the EuclidEmulator (called EuclidEmulator2), a fast and accurate predictor for the nonlinear correction of the matter power spectrum. Emulation accurate at the 2 per cent level is now supported in the eight-dimensional parameter space of w0waCDM+∑mν models between redshift z = 0 and z = 3 for spatial scales within the range 0.01 h Mpc⁻¹ ≤ k ≤ 10 h Mpc⁻¹. In order to achieve this level of accuracy, we have had to improve the quality of the underlying N-body simulations used as training data: (i) we use self-consistent linear evolution of non-dark-matter species such as massive neutrinos, photons, dark energy, and the metric field; (ii) we perform the simulations in the so-called N-body gauge, which allows one to interpret the results in the framework of general relativity; (iii) we run over 250 high-resolution simulations with 3000³ particles in boxes of 1 (h⁻¹ Gpc)³ volume based on paired-and-fixed initial conditions; and (iv) we provide a resolution correction that can be applied to emulated results as a post-processing step in order to drastically reduce systematic biases on small scales due to residual resolution effects in the simulations. We find that the inclusion of the dynamical dark energy parameter wa significantly increases the complexity and expense of creating the emulator. The high fidelity of EuclidEmulator2 is tested in various comparisons against N-body simulations as well as alternative fast predictors such as HALOFIT, HMCode, and CosmicEmu. A blind test is successfully performed against the Euclid Flagship v2.0 simulation. Nonlinear correction factors emulated with EuclidEmulator2 are accurate at the level of 1 per cent or better for 0.01 h Mpc⁻¹ ≤ k ≤ 10 h Mpc⁻¹ and z ≤ 3 compared to high-resolution dark-matter-only simulations. EuclidEmulator2 is publicly available at https://github.com/miknab/EuclidEmulator2.
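A boost-type emulator such as the one described above is used by multiplying a linear power spectrum by the emulated nonlinear correction, P_nl(k, z) = B(k, z) · P_lin(k, z). The sketch below illustrates only this calling pattern; both `toy_boost` and `toy_linear_pk` are made-up stand-ins, not the real EuclidEmulator2 (which is available at the GitHub URL above).

```python
import numpy as np

# Toy illustration of the boost-emulator usage pattern:
#   P_nl(k, z) = B(k, z) * P_lin(k, z)
# toy_boost and toy_linear_pk are invented stand-ins, NOT EuclidEmulator2.

def toy_boost(k, z):
    """Made-up boost: ~1 on large scales, growing on small scales."""
    k_nl = 0.3 * (1.0 + z)  # hypothetical nonlinear scale in h/Mpc
    return 1.0 + (k / k_nl) ** 1.5 / (1.0 + (k / k_nl) ** 0.5)

def toy_linear_pk(k):
    """Made-up linear spectrum with a turnover (arbitrary units)."""
    return k / (1.0 + (k / 0.02) ** 3) ** 0.65

k = np.logspace(-2, 1, 50)  # 0.01 <= k <= 10 h/Mpc, the emulated range
z = 0.5
p_nl = toy_boost(k, z) * toy_linear_pk(k)

# On large scales the correction is negligible; deep in the nonlinear
# regime it dominates.
print(f"B at k=0.01: {toy_boost(k[0], z):.3f}")
print(f"B at k=10:   {toy_boost(k[-1], z):.1f}")
```

The real emulator additionally takes the full cosmological parameter vector and, per the abstract, a resolution correction applied in post-processing.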
In physically realistic, scalar-field-based dynamical dark energy models (including, e.g., quintessence), one naturally expects the scalar field to couple to the rest of the model's degrees of freedom. In particular, a coupling to the electromagnetic sector leads to a time (redshift) dependence of the fine-structure constant and a violation of the weak equivalence principle. Here we extend the previous Euclid forecast constraints on dark energy models to this enlarged (but physically more realistic) parameter space, and forecast how well Euclid, together with high-resolution spectroscopic data and local experiments, can constrain these models. Our analysis combines simulated Euclid data products with astrophysical measurements of the fine-structure constant, α, and local experimental constraints, and it includes both parametric and non-parametric methods. For the astrophysical measurements of α, we consider both the currently available data and a simulated dataset representative of the Extremely Large Telescope measurements that are expected to be available in the 2030s. Our parametric analysis shows that in the latter case, the inclusion of astrophysical and local data improves the Euclid dark energy figure of merit by between 8% and 26%, depending on the fiducial model, with the improvements being larger in the null case, where the fiducial coupling to the electromagnetic sector vanishes. These improvements would be smaller with the current astrophysical data. Moreover, we illustrate how a genetic-algorithm-based reconstruction provides a null test for the presence of the coupling. Our results highlight the importance of complementing surveys like Euclid with external data products in order to accurately test the wider parameter spaces of physically motivated paradigms.
Planck intermediate results Akrami, Y.; Andersen, K. J.; Baccigalupi, C. ...
Astronomy and astrophysics (Berlin), 11/2020, Volume 643
Journal Article
Peer-reviewed
Open access
We present the NPIPE processing pipeline, which produces calibrated frequency maps in temperature and polarization from data from the Planck Low Frequency Instrument (LFI) and High Frequency Instrument (HFI) using high-performance computers. NPIPE represents a natural evolution of previous Planck analysis efforts, and combines some of the most powerful features of the separate LFI and HFI analysis pipelines. For example, following the LFI 2018 processing procedure, NPIPE uses foreground polarization priors during the calibration stage in order to break scanning-induced degeneracies. Similarly, NPIPE employs the HFI 2018 time-domain processing methodology to correct for bandpass mismatch at all frequencies. In addition, NPIPE introduces several improvements, including, but not limited to: inclusion of the 8% of data collected during repointing manoeuvres; smoothing of the LFI reference load data streams; in-flight estimation of detector polarization parameters; and construction of maximally independent detector-set split maps. For component-separation purposes, important improvements include: maps that retain the CMB Solar dipole, allowing for high-precision relative calibration in higher-level analyses; well-defined single-detector maps, allowing for robust CO extraction; and HFI temperature maps between 217 and 857 GHz that are binned into 0.9′ pixels (N_side = 4096), ensuring that the full angular information in the data is represented in the maps even at the highest Planck resolutions. The net effect of these improvements is lower levels of noise and systematics in both frequency and component maps at essentially all angular scales, as well as notably improved internal consistency between the various frequency channels. Based on the NPIPE maps, we present the first estimate of the Solar dipole determined through component separation across all nine Planck frequencies. The amplitude is (3366.6 ± 2.7) μK, consistent with, albeit slightly higher than, earlier estimates. From the large-scale polarization data, we derive an updated estimate of the optical depth of reionization of τ = 0.051 ± 0.006, which appears robust with respect to data and sky cuts. There are 600 complete signal, noise, and systematics simulations of the full-frequency and detector-set maps. As a Planck first, these simulations include full time-domain processing of the beam-convolved CMB anisotropies. The release of NPIPE maps and simulations is accompanied by a complete suite of raw and processed time-ordered data and the software, scripts, auxiliary data, and parameter files needed to improve further on the analysis and to run matching simulations.
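The 0.9′ pixel size quoted above follows directly from the HEALPix scheme, in which a map at resolution N_side contains 12 N_side² equal-area pixels; a minimal sketch:

```python
import math

# HEALPix pixelization: a map at resolution N_side has 12 * N_side**2
# equal-area pixels covering the sphere, so a characteristic pixel
# width is the square root of the pixel solid angle.
def healpix_pixel_arcmin(nside: int) -> float:
    npix = 12 * nside ** 2
    omega = 4.0 * math.pi / npix                  # pixel solid angle [sr]
    return math.degrees(math.sqrt(omega)) * 60.0  # pixel width [arcmin]

# N_side = 4096 gives the ~0.9 arcmin pixels quoted for the NPIPE
# HFI temperature maps.
print(f"{healpix_pixel_arcmin(4096):.2f} arcmin")  # → 0.86 arcmin
```

In practice one would use a HEALPix library for this, but the arithmetic shows why the abstract pairs N_side = 4096 with sub-arcminute pixels.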
FUNDACÁNCER reports that skin cancer ranks fourth among the cancers with the highest incidence in Panama. Upon investigating current diagnostic methods, it was found that diagnosis is still a purely qualitative test, based solely on visual inspection. After a cooling effect is applied to the epidermis, benign lesions show a thermal recovery similar to that of normal skin, whereas malignant lesions thermoregulate over a shorter time interval. The main objective is to design a cost-effective device that adds a quantitative value to current methods through physical contact with the skin.
Context. In the last decade, astronomers have found a new type of supernova, called superluminous supernovae (SLSNe) due to their high peak luminosity and long light curves. These hydrogen-free explosions (SLSNe-I) can be seen to z ~ 4 and therefore offer the possibility of probing the distant Universe. Aims. We aim to investigate the possibility of detecting SLSNe-I using ESA's Euclid satellite, scheduled for launch in 2020. In particular, we study the Euclid Deep Survey (EDS), which will provide a unique combination of area, depth, and cadence over the mission. Methods. We estimated the redshift distribution of Euclid SLSNe-I using the latest information on their rates and spectral energy distribution, as well as known Euclid instrument and survey parameters, including the cadence and depth of the EDS. To estimate the uncertainties, we calculated their distribution with two different set-ups, namely optimistic and pessimistic, adopting different star formation densities and rates. We also applied a standardization method to the peak magnitudes to create a simulated Hubble diagram to explore possible cosmological constraints. Results. We show that Euclid should detect approximately 140 high-quality SLSNe-I to z ~ 3.5 over the first five years of the mission (with an additional 70 if we lower our photometric classification criteria). This sample could revolutionize the study of SLSNe-I at z > 1 and open up their use as probes of star-formation rates, galaxy populations, and the interstellar and intergalactic medium. In addition, a sample of such SLSNe-I could improve constraints on a time-dependent dark energy equation of state, namely w(a), when combined with local SLSNe-I and the expected SN Ia sample from the Dark Energy Survey. Conclusions. We show that Euclid will observe hundreds of SLSNe-I for free. These luminous transients will be in the Euclid data stream, and we should prepare now to identify them, as they offer a new probe of the high-redshift Universe for both astrophysics and cosmology.
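The Hubble-diagram construction mentioned above compares standardized peak magnitudes with a model distance modulus, μ(z) = 5 log₁₀(d_L/10 pc). A minimal flat-ΛCDM sketch follows; the fiducial values (H0 = 70 km s⁻¹ Mpc⁻¹, Ωm = 0.3) are illustrative assumptions, not the paper's.

```python
import math

# Illustrative flat-LambdaCDM distance modulus, the quantity plotted on
# a Hubble diagram against standardized peak magnitudes. H0 and Omega_m
# below are assumed fiducial values for the sketch only.
C_KMS = 299792.458  # speed of light [km/s]

def distance_modulus(z, h0=70.0, om=0.3, n=2000):
    """mu(z) = 5 log10(d_L / 10 pc) for a flat LCDM cosmology."""
    # Comoving distance via trapezoidal integration of c / H(z')
    dz = z / n
    integral = 0.0
    for i in range(n + 1):
        zp = i * dz
        e = math.sqrt(om * (1.0 + zp) ** 3 + (1.0 - om))
        w = 0.5 if i in (0, n) else 1.0
        integral += w * dz / e
    d_c = C_KMS / h0 * integral          # comoving distance [Mpc]
    d_l = (1.0 + z) * d_c                # luminosity distance [Mpc]
    return 5.0 * math.log10(d_l) + 25.0  # d_L in Mpc -> mu

for z in (0.5, 1.0, 3.0):
    print(f"z = {z}: mu = {distance_modulus(z):.2f}")
```

Residuals of standardized SLSN-I magnitudes about such a curve are what would carry the cosmological constraining power at z > 1.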
Medulloblastoma is extremely rare in adults. The role of chemotherapy for average-risk adult patients remains controversial. Surgery and radiotherapy provide significant disease control and a good prognosis, but about 25% of average-risk patients relapse and die of disease progression. No data in average-risk adult patients are available to compare radiotherapy alone with radiotherapy followed by adjuvant chemotherapy.
We analyzed 48 average-risk patients according to Chang classification diagnosed from 1988 to 2016.
Median age was 29 years (range 16-61). Based on histological subtype, 15 patients (31.3%) had classic, 15 patients (31.3%) had desmoplastic, 5 patients (10.4%) had extensive-nodularity, and 2 patients (4.2%) had large-cell/anaplastic medulloblastoma. Twenty-four patients (50%) received adjuvant radiotherapy alone and 24 (50%) received radiotherapy and chemotherapy. After a median follow-up of 12.5 years, we found that chemotherapy increases progression-free survival (PFS-15: 82.3% ± 8.0% in patients treated with radiotherapy and chemotherapy vs. 38.5% ± 13.0% in patients treated with radiotherapy alone, p = 0.05) and overall survival (OS-15: 89.3% ± 7.2% vs. 52.0% ± 13.1%, p = 0.02). Among patients receiving chemotherapy, the reported grade ≥ 3 adverse events were: 9 cases of neutropenia (6 cases of G3 neutropenia, 25%, and 3 cases of G4 neutropenia, 13%), 1 case of G3 thrombocytopenia (4%), and 2 cases of G3 nausea (8%).
Our study, with its long follow-up period, suggests that adding adjuvant chemotherapy to radiotherapy might improve PFS and OS in average-risk adult medulloblastoma patients.
Euclid preparation Desprez, G.; Coupon, J.; Almosallam, I. ...
Astronomy and astrophysics (Berlin), 12/2020, Volume 644
Journal Article
Peer-reviewed
Open access
Forthcoming large photometric surveys for cosmology require precise and accurate photometric redshift (photo-z) measurements for the success of their main science objectives. However, to date, no method has been able to produce photo-zs at the required accuracy using only the broad-band photometry that those surveys will provide. An assessment of the strengths and weaknesses of current methods is a crucial step in the eventual development of an approach to meet this challenge. We report on the performance of 13 photometric redshift codes, in terms of their single-value redshift estimates and redshift probability distributions (PDZs), on a common set of data, focusing particularly on the 0.2−2.6 redshift range that the Euclid mission will probe. We designed a challenge using emulated Euclid data drawn from three photometric surveys of the COSMOS field. The data were divided into two samples: a calibration sample, for which photometry and redshifts were provided to the participants, and a validation sample, containing only the photometry, to ensure a blinded test of the methods. Participants were invited to provide a single-value redshift estimate and a PDZ for each source in the validation sample, along with a rejection flag indicating the sources they consider unfit for use in cosmological analyses. The performance of each method was assessed through a set of informative metrics, using cross-matched spectroscopic and highly accurate photometric redshifts as the ground truth. We show that the rejection criteria set by participants are efficient in removing strong outliers, that is to say, sources for which the photo-z deviates by more than 0.15(1 + z) from the spectroscopic redshift (spec-z). We also show that, while all methods are able to provide reliable single-value estimates, several machine-learning methods do not manage to produce useful PDZs. We find that no machine-learning method provides good results in the regions of galaxy color-space that are sparsely populated by spectroscopic redshifts, for example z > 1. However, they generally perform better than template-fitting methods at low redshift (z < 0.7), indicating that template-fitting methods do not use all of the information contained in the photometry. We introduce metrics that quantify both photo-z precision and the completeness of the samples (post-rejection), since both contribute to the final figure of merit of the science goals of the survey (e.g., cosmic shear from Euclid). Template-fitting methods provide the best results in these metrics, but we show that a combination of template-fitting results and machine-learning results with rejection criteria can outperform any individual method. On this basis, we argue that further work in identifying how to best select between machine-learning and template-fitting approaches for each individual galaxy should be pursued as a priority.
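The strong-outlier criterion used above, |z_phot − z_spec| > 0.15(1 + z_spec), is straightforward to compute; the sketch below applies it to a synthetic sample (the data are invented purely for illustration):

```python
import numpy as np

# Sketch of the strong-outlier criterion from the text: a photo-z is an
# outlier when |z_phot - z_spec| > 0.15 * (1 + z_spec). The sample below
# is synthetic, invented only to illustrate the metric.
rng = np.random.default_rng(42)
z_spec = rng.uniform(0.2, 2.6, size=1000)
# Most photo-zs scatter mildly around the truth...
z_phot = z_spec + rng.normal(0.0, 0.03 * (1 + z_spec), size=1000)
# ...but a few are catastrophic failures
z_phot[:30] = rng.uniform(0.0, 3.0, size=30)

def outlier_fraction(z_phot, z_spec, threshold=0.15):
    dz = np.abs(z_phot - z_spec) / (1.0 + z_spec)
    return np.mean(dz > threshold)

print(f"outlier fraction: {outlier_fraction(z_phot, z_spec):.3f}")
```

The per-participant rejection flags discussed in the abstract aim to remove exactly the catastrophic tail that drives this fraction.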
Context. In metric theories of gravity with photon number conservation, the luminosity and angular diameter distances are related via the Etherington relation, also known as the distance duality relation (DDR). A violation of this relation would rule out the standard cosmological paradigm and point to the presence of new physics.
Aims. We quantify the ability of Euclid, in combination with contemporary surveys, to improve the current constraints on deviations from the DDR in the redshift range 0 < z < 1.6.
Methods. We start with an analysis of the latest available data, improving previously reported constraints by a factor of 2.5. We then present a detailed analysis of simulated Euclid and external data products, using both standard parametric methods (relying on phenomenological descriptions of possible DDR violations) and a machine-learning reconstruction using genetic algorithms.
Results. We find that for parametric methods Euclid can (in combination with external probes) improve current constraints by approximately a factor of six, while for non-parametric methods Euclid can improve current constraints by a factor of three.
Conclusions. Our results highlight the importance of surveys like Euclid in accurately testing the pillars of the current cosmological paradigm and constraining physics beyond the standard cosmological model.
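The distance duality relation referred to above is d_L(z) = (1 + z)² d_A(z); violations are commonly parametrized via η(z) = d_L / [(1 + z)² d_A], which equals 1 in standard physics. A minimal sketch verifying this identity in a toy flat-ΛCDM model (the fiducial values are illustrative assumptions):

```python
import math

# Etherington (distance duality) relation: d_L(z) = (1+z)**2 * d_A(z).
# Violations are often parametrized as eta(z) = d_L / ((1+z)**2 * d_A),
# e.g. eta(z) = 1 + epsilon0 * z, with epsilon0 = 0 in standard physics.
# Below: a toy flat-LCDM check that eta is identically 1 when both
# distances derive from the same comoving distance.
C_KMS = 299792.458  # speed of light [km/s]

def comoving_distance(z, h0=70.0, om=0.3, n=2000):
    """Trapezoidal integration of c / H(z'), in Mpc (fiducial values assumed)."""
    dz = z / n
    s = 0.0
    for i in range(n + 1):
        zp = i * dz
        e = math.sqrt(om * (1.0 + zp) ** 3 + (1.0 - om))
        s += (0.5 if i in (0, n) else 1.0) * dz / e
    return C_KMS / h0 * s

def eta(z):
    d_c = comoving_distance(z)
    d_l = (1.0 + z) * d_c          # luminosity distance
    d_a = d_c / (1.0 + z)          # angular diameter distance
    return d_l / ((1.0 + z) ** 2 * d_a)

for z in (0.5, 1.0, 1.6):
    print(f"eta({z}) = {eta(z):.6f}")
```

A measured η(z) ≠ 1 over the 0 < z < 1.6 range probed here is exactly the signal the parametric and genetic-algorithm analyses are designed to detect.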
Upcoming surveys will map the growth of large-scale structure with unprecedented precision, improving our understanding of the dark sector of the Universe. Unfortunately, much of the cosmological information is encoded on small scales, where the clustering of dark matter and the effects of astrophysical feedback processes are not fully understood. This can bias the estimates of cosmological parameters, which we study here for a joint analysis of mock Euclid cosmic shear and Planck cosmic microwave background data. We use different implementations for the modelling of the signal on small scales and find that they result in significantly different predictions. Moreover, the different non-linear corrections lead to biased parameter estimates, especially when the analysis is extended into the highly non-linear regime, with the Hubble constant, H0, and the clustering amplitude, σ8, affected the most. Improvements in the modelling of non-linear scales will therefore be needed if we are to resolve the current tension with more and better data. For a given prescription for the non-linear power spectrum, using different corrections for baryon physics does not significantly impact the precision of Euclid, but neglecting these corrections does lead to large biases in the cosmological parameters. In order to extract precise and unbiased constraints on cosmological parameters from Euclid cosmic shear data, it is therefore essential to improve the accuracy of the recipes that account for non-linear structure formation, as well as the modelling of the impact of astrophysical processes that redistribute the baryons.