Invasive pulmonary aspergillosis (IPA) has always been a challenging diagnosis, and risk factors are an important guide for investigating specific populations, especially in the intensive care unit (ICU). Traditionally recognized risk factors for IPA have been haematological diseases or conditions associated with severe immunosuppression, more recently complemented by chronic conditions (such as chronic obstructive pulmonary disease, liver cirrhosis, chronic kidney disease, and diabetes), influenza infection, and ICU admission. Recently, a new association with SARS-CoV-2 infection, named COVID-19-associated pulmonary aspergillosis (CAPA), has been reported worldwide, even though its basic epidemiological characteristics have not yet been completely established. In this narrative review, we aimed to explore the potential risk factors for the development of CAPA and to evaluate whether previous host factors or therapeutic approaches used in the treatment of critically ill COVID-19 patients (such as mechanical ventilation, intensive care management, corticosteroids, broad-spectrum antibiotics, and immunomodulatory agents) may impact this new diagnostic category. Reviewing all English-language articles published from December 2019 to December 2020, we identified 21 papers describing risk factors related to host comorbidities, ICU management, and COVID-19 therapies. Although limited by the quality of the available literature, the data seem to confirm the role of previous host risk factors, especially respiratory diseases. However, attention is shifting from patient-related risk factors to factors characterizing the hospital and intensive care course, which are deeply influenced by specific features of COVID-19 treatment itself. Prolonged invasive or non-invasive respiratory support, as well as the impact of corticosteroids and/or immunobiological therapies, seem to play a pivotal role.
ICU-setting-related factors, such as environmental factors, isolation conditions, ventilation systems, building renovation works, and temporal spread with respect to pandemic waves, also need to be considered. Large, prospective studies based on new risk factors specific to CAPA are warranted to guide surveillance and decisions on when and how to treat this particular population.
The increasing prevalence of colistin-resistant (ColR), Klebsiella pneumoniae carbapenemase (KPC)-producing K. pneumoniae (Kp) is a matter of concern because of its unfavourable impact on the mortality of KPC-Kp bloodstream infections (BSI) and the shortage of alternative therapeutic options. A matched case–control–control analysis was conducted. The primary study end point was to assess risk factors for ColR KPC-Kp BSI. The secondary end point was to describe the mortality and clinical characteristics of these infections. To assess risk factors for ColR, 142 patients with ColR KPC-Kp BSI were compared to two control groups: 284 controls without infections caused by KPC-Kp (control group A) and 284 controls with colistin-susceptible (ColS) KPC-Kp BSI (control group B). In the first multivariate analysis (cases vs. group A), previous colistin therapy, previous KPC-Kp colonization, ≥3 previous hospitalizations, a Charlson score ≥3, and neutropenia were found to be associated with the development of ColR KPC-Kp BSI. In the second multivariate analysis (cases vs. group B), only previous colistin therapy, previous KPC-Kp colonization, and a Charlson score ≥3 were associated with ColR. Overall, ColR among KPC-Kp blood isolates increased more than threefold during the 4.5-year study period, and the 30-day mortality of ColR KPC-Kp BSI was as high as 51%. Strict rules for the use of colistin are mandatory to staunch the dissemination of ColR in KPC-Kp-endemic hospitals.
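As a minimal illustration of the kind of association measure underlying such risk-factor analyses, the crude odds ratio for a single candidate exposure (the counts below are invented for illustration, not the study's actual data) can be computed from a 2×2 table:

```python
import math

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Crude odds ratio with a 95% CI (Woolf/log method) for a 2x2 table."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    # Standard error of log(OR): sqrt of the sum of reciprocal cell counts
    se_log = math.sqrt(1 / exposed_cases + 1 / unexposed_cases +
                       1 / exposed_controls + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts for prior colistin therapy among 142 cases vs. 284 controls
or_, ci = odds_ratio(60, 82, 40, 244)
print(f"OR = {or_:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

A multivariate analysis, as in the study, would instead fit a (conditional) logistic regression to adjust each factor for the others; the crude odds ratio above is only the single-exposure building block.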
In physically realistic, scalar-field-based dynamical dark energy models (including, e.g., quintessence), one naturally expects the scalar field to couple to the rest of the model's degrees of freedom. In particular, a coupling to the electromagnetic sector leads to a time (redshift) dependence in the fine-structure constant and a violation of the weak equivalence principle. Here we extend the previous Euclid forecast constraints on dark energy models to this enlarged (but physically more realistic) parameter space, and forecast how well Euclid, together with high-resolution spectroscopic data and local experiments, can constrain these models. Our analysis combines simulated Euclid data products with astrophysical measurements of the fine-structure constant, α, and local experimental constraints, and it includes both parametric and non-parametric methods. For the astrophysical measurements of α, we consider both the currently available data and a simulated dataset representative of Extremely Large Telescope measurements that are expected to be available in the 2030s. Our parametric analysis shows that in the latter case, the inclusion of astrophysical and local data improves the Euclid dark energy figure of merit by between 8% and 26%, depending on the fiducial model, with the improvements being larger in the null case where the fiducial coupling to the electromagnetic sector vanishes. These improvements would be smaller with the current astrophysical data. Moreover, we illustrate how a genetic-algorithm-based reconstruction provides a null test for the presence of the coupling. Our results highlight the importance of complementing surveys like Euclid with external data products, in order to accurately test the wider parameter spaces of physically motivated paradigms.
Euclid preparation. Blanchard, A.; Camera, S.; Carbone, C. ...
Astronomy and Astrophysics (Berlin), 10/2020, Volume: 642
Journal Article; Peer reviewed; Open access
Aims. The Euclid space telescope will measure the shapes and redshifts of galaxies to reconstruct the expansion history of the Universe and the growth of cosmic structures. The estimation of the expected performance of the experiment, in terms of predicted constraints on cosmological parameters, has so far relied on various individual methodologies and numerical implementations, which were developed for different observational probes and for the combination thereof. In this paper we present validated forecasts, which combine both theoretical and observational ingredients for different cosmological probes. This work is presented to provide the community with reliable numerical codes and methods for Euclid cosmological forecasts.
Methods. We describe in detail the methods adopted for Fisher matrix forecasts, which were applied to galaxy clustering, weak lensing, and the combination thereof. We estimated the required accuracy for Euclid forecasts and outlined a methodology for their development. We then compared and improved different numerical implementations, reaching uncertainties on the errors of cosmological parameters that are less than the required precision in all cases. Furthermore, we provide details on the validated implementations, some of which are made publicly available, in different programming languages, together with a reference training set of input and output matrices for a set of specific models. These can be used by the reader to validate their own implementations if required.
Results. We present new cosmological forecasts for Euclid. We find that the results depend on the specific cosmological model and the remaining freedom in each setting, for example flat or non-flat spatial cosmologies, or different cuts at non-linear scales. The numerical implementations are now reliable for these settings. We present the results for an optimistic and a pessimistic choice of these settings. We demonstrate that the impact of cross-correlations is particularly relevant for models beyond a cosmological constant and may allow us to increase the dark energy figure of merit by at least a factor of three.
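The core of a Fisher-matrix forecast of the kind described above is inverting the Fisher matrix to obtain marginalised parameter errors and a figure of merit. A toy numerical sketch (the 2×2 matrix for two dark-energy parameters is invented for illustration, not Euclid's actual forecast):

```python
import numpy as np

# Toy Fisher matrix for two dark-energy parameters (w0, wa).
# The entries are invented for illustration.
F = np.array([[40.0, -12.0],
              [-12.0,   5.0]])

cov = np.linalg.inv(F)                   # parameter covariance matrix
sigma = np.sqrt(np.diag(cov))            # marginalised 1-sigma errors
fom = 1.0 / np.sqrt(np.linalg.det(cov))  # dark-energy figure of merit;
                                         # equals sqrt(det F) here
print(f"sigma(w0) = {sigma[0]:.3f}, sigma(wa) = {sigma[1]:.3f}, FoM = {fom:.2f}")
```

Adding a probe adds its Fisher matrix to `F`, which shrinks `cov` and raises the figure of merit, which is how statements like "cross-correlations increase the FoM by a factor of three" are quantified.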
Context. Stage IV weak lensing experiments will offer more than an order of magnitude leap in precision. We must therefore ensure that our analyses remain accurate in this new era. Accordingly, previously ignored systematic effects must be addressed.
Aims. In this work, we evaluate the impact of the reduced shear approximation and magnification bias on the information obtained from the angular power spectrum. To first order, the statistics of reduced shear, a combination of shear and convergence, are taken to be equal to those of shear. However, this approximation can induce a bias in the cosmological parameters that can no longer be neglected. A separate bias arises from the statistics of shear being altered by the preferential selection of galaxies and the dilution of their surface densities in high-magnification regions.
Methods. The corrections for these systematic effects take similar forms, allowing them to be treated together. We calculated the impact of neglecting these effects on the cosmological parameters that would be determined from Euclid, using cosmic shear tomography. To do so, we employed the Fisher matrix formalism and included the impact of the super-sample covariance. We also demonstrate how the reduced shear correction can be calculated using a lognormal field forward-modelling approach.
Results. These effects cause significant biases in Ωm, σ8, ns, ΩDE, w0, and wa of −0.53σ, 0.43σ, −0.34σ, 1.36σ, −0.68σ, and 1.21σ, respectively. We then show that these lensing biases interact with another systematic effect: the intrinsic alignment of galaxies. Accordingly, we have developed the formalism for an intrinsic-alignment-enhanced lensing bias correction. Applying this to Euclid, we find that the additional terms introduced by this correction are sub-dominant.
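Parameter biases of this kind are conventionally propagated at first order as b_i = Σ_j (F⁻¹)_ij B_j, where B_j contracts the neglected systematic's shift in the data vector with the inverse data covariance and the model derivatives. A toy numerical sketch (all numbers invented for illustration; a real analysis would use tomographic angular power spectra):

```python
import numpy as np

# First-order bias on parameters from a neglected systematic:
#   b_i = sum_j (F^-1)_ij B_j
# All values below are invented placeholders.
dmu = np.array([0.02, -0.01, 0.015])      # systematic shift of the data vector
cov_data = np.diag([1e-3, 2e-3, 1.5e-3])  # data covariance (diagonal toy case)
dmu_dtheta = np.array([[1.0, 0.5],        # derivatives d(data)/d(parameter)
                       [0.2, 1.1],
                       [0.7, 0.3]])

icov = np.linalg.inv(cov_data)
F = dmu_dtheta.T @ icov @ dmu_dtheta      # Fisher matrix
B = dmu_dtheta.T @ icov @ dmu             # projection of the systematic
bias = np.linalg.solve(F, B)              # parameter biases b_i
# Express each bias in units of the marginalised 1-sigma error
bias_in_sigma = bias / np.sqrt(np.diag(np.linalg.inv(F)))
print(bias_in_sigma)
```

The paper's point is that this standard first-order formula can fail for a Euclid-like PSF and shear signal; the sketch only shows the formalism being tested, not its validity.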
To use Multi-Criteria Decision Analysis (MCDA) to determine weights for eleven criteria in order to prioritize non-critical COVID-19 patients for admission to hospital in healthcare settings with limited resources.
The MCDA was applied in two main steps: specification of criteria for prioritizing COVID-19 patients (and levels within each criterion); and determination of weights for the criteria based on experts’ knowledge and experience in managing COVID-19 patients, via an online survey. Criteria were selected based on available COVID-19 evidence with a focus on low- and middle-income countries (LMICs).
The most important criteria (mean weights, summing to 100%) are: PaO2 (16.3%); peripheral O2 saturation (15.9%); chest X-ray (14.1%); Modified Early Warning Score-MEWS (11.4%); respiratory rate (9.5%); comorbidities (6.5%); living with vulnerable people (6.4%); body mass index (5.6%); duration of symptoms before hospital evaluation (5.4%); CRP (5.1%); and age (3.8%).
At the beginning of a new pandemic, when evidence for disease predictors is limited or unavailable and effective national contingency plans are difficult to establish, the MCDA prioritization model could play a pivotal role in improving the response of health systems.
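An MCDA model of this kind reduces, at scoring time, to a weighted sum over the criteria. A minimal sketch using the study's published weights (the 0-to-1 severity scoring of each criterion's levels is hypothetical here; the study defines discrete levels per criterion):

```python
# Criterion weights from the study, in per cent (they sum to 100).
WEIGHTS = {
    "PaO2": 16.3, "SpO2": 15.9, "chest_xray": 14.1, "MEWS": 11.4,
    "respiratory_rate": 9.5, "comorbidities": 6.5,
    "lives_with_vulnerable": 6.4, "BMI": 5.6,
    "symptom_duration": 5.4, "CRP": 5.1, "age": 3.8,
}

def priority_score(levels):
    """levels: dict mapping criterion -> severity in [0, 1] (1 = most severe).
    Returns a weighted-sum priority score in [0, 1]."""
    return sum(WEIGHTS[c] * levels[c] for c in WEIGHTS) / 100.0

# A hypothetical patient scored at mid severity on every criterion
patient = {c: 0.5 for c in WEIGHTS}
print(priority_score(patient))  # ~0.5 by construction
```

Patients would then be ranked by this score, so that someone with severe hypoxaemia (high-weight criteria) outranks someone whose severity comes only from low-weight criteria such as age.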
Euclid preparation. Tereno, I.; Dupac, X.; Gómez-Álvarez, P. ...
Astronomy and Astrophysics (Berlin), 06/2022, Volume: 662
Journal Article; Peer reviewed; Open access
Euclid is a mission of the European Space Agency that is designed to constrain the properties of dark energy and gravity via weak gravitational lensing and galaxy clustering. It will carry out a wide-area imaging and spectroscopy survey (the Euclid Wide Survey: EWS) in visible and near-infrared bands, covering approximately 15 000 deg^2 of extragalactic sky in six years. The wide-field telescope and instruments are optimised for a pristine point spread function and reduced stray light, producing very crisp images. This paper presents the building of the Euclid reference survey: the sequence of pointings of the EWS, deep fields, and calibration fields, as well as the spacecraft movements followed by Euclid as it operates in a step-and-stare mode from its orbit around the Lagrange point L2. Each EWS pointing has four dithered frames; we simulated the dither pattern at the pixel level to analyse the effective coverage. We used up-to-date models of the sky background to define the Euclid region of interest (RoI). The building of the reference survey is highly constrained by calibration cadences, spacecraft constraints, and background levels; synergies with ground-based coverage were also considered. Via purposely built software, we first generated a schedule for the calibration and deep-field observations. In a second stage, the RoI was tiled and scheduled with EWS observations, using an algorithm optimised to prioritise the best sky areas, produce a compact coverage, and ensure thermal stability. The result is the optimised reference survey RSD_2021A, which fulfils all constraints and is a good proxy for the final solution. The current EWS covers ≈14 500 deg^2. The limiting AB magnitudes (5σ, point-like source) achieved in its footprint are estimated to be 26.2 (visible band I_E) and 24.5 (near-infrared bands Y_E, J_E, H_E); for spectroscopy, the Hα line flux limit is 2 × 10^-16 erg cm^-2 s^-1 at 1600 nm; and for diffuse emission, the surface brightness limits are 29.8 (visible band) and 28.4 (near-infrared bands) mag arcsec^-2.
ABSTRACT
We present a new, updated version of the EuclidEmulator (called EuclidEmulator2), a fast and accurate predictor for the nonlinear correction of the matter power spectrum. Emulation accurate at the 2 per cent level is now supported in the eight-dimensional parameter space of w0waCDM+∑mν models between redshift z = 0 and z = 3 for spatial scales within the range 0.01 h Mpc^-1 ≤ k ≤ 10 h Mpc^-1. In order to achieve this level of accuracy, we have had to improve the quality of the underlying N-body simulations used as training data: (i) we use self-consistent linear evolution of non-dark-matter species such as massive neutrinos, photons, dark energy, and the metric field; (ii) we perform the simulations in the so-called N-body gauge, which allows one to interpret the results in the framework of general relativity; (iii) we run over 250 high-resolution simulations with 3000^3 particles in boxes of 1 (h^-1 Gpc)^3 volume based on paired-and-fixed initial conditions; and (iv) we provide a resolution correction that can be applied to emulated results as a post-processing step in order to drastically reduce systematic biases on small scales due to residual resolution effects in the simulations. We find that the inclusion of the dynamical dark energy parameter wa significantly increases the complexity and expense of creating the emulator. The high fidelity of EuclidEmulator2 is tested in various comparisons against N-body simulations as well as alternative fast predictors such as HALOFIT, HMCode, and CosmicEmu. A blind test is successfully performed against the Euclid Flagship v2.0 simulation. Nonlinear correction factors emulated with EuclidEmulator2 are accurate at the level of 1 per cent or better for 0.01 h Mpc^-1 ≤ k ≤ 10 h Mpc^-1 and z ≤ 3 compared to high-resolution dark-matter-only simulations. EuclidEmulator2 is publicly available at https://github.com/miknab/EuclidEmulator2.
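The "nonlinear correction" the emulator predicts is a multiplicative boost applied to a linear spectrum, P_nl(k) = B(k) · P_lin(k). A generic sketch of that step (the boost and linear spectrum below are invented placeholders; a real analysis would obtain B(k) from EuclidEmulator2 for a chosen cosmology and redshift, not from the formula here):

```python
import numpy as np

# Apply an emulated nonlinear boost to a linear matter power spectrum:
#   P_nl(k) = B(k) * P_lin(k)
k = np.logspace(-2, 1, 50)              # h/Mpc, within the emulated k-range
p_lin = 1e4 * k / (1 + (k / 0.1) ** 3)  # toy linear power spectrum (placeholder)
boost = 1.0 + 0.5 * np.tanh(k)          # placeholder boost B(k) >= 1
p_nl = boost * p_lin                    # nonlinear prediction
print(p_nl[:3])
```

Keeping the boost separate from the linear spectrum is what lets a resolution correction, as described in point (iv), be applied to the emulated B(k) as a post-processing step.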
Context. Future weak lensing surveys, such as the Euclid mission, will attempt to measure the shapes of billions of galaxies in order to derive cosmological information. These surveys will attain very low levels of statistical error, and systematic errors must be extremely well controlled. In particular, the point spread function (PSF) must be estimated using stars in the field, and recovered with high accuracy.
Aims. The aims of this paper are twofold. Firstly, we took steps toward a nonparametric method to address the issue of recovering the PSF field, namely that of finding the correct PSF at the position of any galaxy in the field, applicable to Euclid. Our approach relies solely on the data, as opposed to parametric methods that make use of our knowledge of the instrument. Secondly, we studied the impact of imperfect PSF models on the shape measurement of galaxies themselves, and whether common assumptions about this impact hold true in a Euclid scenario.
Methods. We extended the recently proposed resolved components analysis approach, which performs super-resolution on a field of under-sampled observations of a spatially varying, image-valued function. We added a spatial interpolation component to the method, making it a true two-dimensional PSF model. We compared our approach to PSFEx, then quantified the impact of PSF recovery errors on galaxy shape measurements through image simulations.
Results. Our approach yields an improvement over PSFEx in terms of the PSF model and the observed galaxy shape errors, though it is at present far from reaching the required Euclid accuracy. We also find that the usual formalism used for the propagation of PSF model errors to weak lensing quantities no longer holds in the case of a Euclid-like PSF. In particular, different shape measurement approaches can react differently to the same PSF modelling errors.