We implement support for a cosmological parameter estimation algorithm in Commander and quantify its computational efficiency and cost. For a semi-realistic simulation similar to Planck LFI 70 GHz, we find that the computational cost of producing a single sample is about 20 CPU-hours and that the typical Markov chain correlation length is ∼100 samples. The net effective cost per independent sample is ∼2000 CPU-hours, in comparison with all low-level processing costs of 812 CPU-hours for Planck LFI and WMAP in COSMOGLOBE Data Release 1. Thus, although the algorithm can technically be run already in its current state, future work should aim to reduce the effective cost per independent sample by one order of magnitude to avoid excessive runtimes, for instance through multi-grid preconditioners and/or derivative-based Markov chain sampling schemes. This work demonstrates the computational feasibility of true Bayesian cosmological parameter estimation with end-to-end error propagation for high-precision CMB experiments without likelihood approximations, but it also highlights the need for additional optimizations before it is ready for full production-level analysis.
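The headline numbers combine multiplicatively: the chain must run for roughly one correlation length to yield one statistically independent sample. A minimal arithmetic check, using only the two figures quoted in the abstract (the script itself is purely illustrative):

```python
# Effective cost per independent sample = (cost per Gibbs sample)
# x (Markov chain correlation length), using the figures quoted above.

cost_per_sample = 20.0      # CPU-hours per sample
correlation_length = 100    # samples per independent sample

effective_cost = cost_per_sample * correlation_length
print(effective_cost)       # 2000.0 CPU-hours per independent sample

# For scale: all low-level processing in COSMOGLOBE DR1 cost 812 CPU-hours,
# so one independent parameter sample costs roughly 2.5x the entire
# low-level pipeline -- the motivation for the order-of-magnitude speed-up.
print(round(effective_cost / 812, 1))  # 2.5
```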
BEYONDPLANCK — Brilenkov, M.; Fornazier, K. S. F.; Hergt, L. T.; et al.
Astronomy and astrophysics (Berlin), 07/2023, Volume 675
Journal Article · Peer reviewed · Open access
End-to-end simulations play a key role in the analysis of any high-sensitivity cosmic microwave background (CMB) experiment, providing high-fidelity systematic error propagation capabilities that are unmatched by any other means. In this paper, we address an important issue regarding such simulations, namely, how to define the inputs in terms of sky model and instrument parameters. These may either be taken as a constrained realization derived from the data or as a random realization independent of the data. We refer to these as posterior and prior simulations, respectively. We show that the two options lead to significantly different correlation structures, as prior simulations (contrary to posterior simulations) effectively include cosmic variance, but they exclude realization-specific correlations from non-linear degeneracies. Consequently, they quantify fundamentally different types of uncertainties. We argue that, as a result, they also have different and complementary scientific uses, even if this dichotomy is not absolute. In particular, posterior simulations are in general more convenient for parameter estimation studies, while prior simulations are generally more convenient for model testing. Before BEYONDPLANCK, most pipelines used a mix of constrained and random inputs and applied the same hybrid simulations for all applications, even though the statistical justification for this is not always evident. BEYONDPLANCK represents the first end-to-end CMB simulation framework that is able to generate both types of simulations, and these new capabilities have brought this topic to the forefront. The BEYONDPLANCK posterior simulations and their uses are described extensively in a suite of companion papers. In this work, we consider one important application of the corresponding prior simulations, namely, code validation. Specifically, we generated a set of one-year LFI 30 GHz prior simulations with known inputs, and we used these to validate the core low-level BEYONDPLANCK algorithms dealing with gain estimation, correlated noise estimation, and mapmaking.
BEYONDPLANCK — Herman, D.; Watson, R. A.; Andersen, K. J.; et al.
Astronomy and astrophysics (Berlin), 06/2023, Volume 675
Journal Article · Peer reviewed · Open access
We describe the correction procedure for Analog-to-Digital Converter (ADC) differential non-linearities (DNL) adopted in the Bayesian end-to-end BEYONDPLANCK analysis framework. This method is nearly identical to that developed for the official Planck Low Frequency Instrument (LFI) Data Processing Center (DPC) analysis, and relies on the binned rms noise profile of each detector data stream. However, rather than building the correction profile directly from the raw rms profile, we first fit a Gaussian to each significant ADC-induced rms decrement, and then derive the corresponding correction from this smooth model. The main advantage of this approach is that only samples which are significantly affected by ADC DNLs are corrected, as opposed to the DPC approach, in which the correction is applied to all samples, thereby filtering out signals not associated with ADC DNLs. The new corrections are only applied to data for which there is a clear detection of the non-linearities, and for which they perform at least comparably with the DPC corrections. Out of a total of 88 LFI data streams (sky and reference load for each of the 44 detectors), we apply the new minimal ADC corrections in 25 cases and maintain the DPC corrections in 8 cases. All these corrections are applied to 44 or 70 GHz channels, while, as in previous analyses, none of the 30 GHz ADCs show significant evidence of non-linearity. By comparing the BEYONDPLANCK and DPC ADC correction methods, we estimate that the residual ADC uncertainty is about two orders of magnitude below the total noise of both the 44 and 70 GHz channels, and that its impact on current cosmological parameter estimation is small. However, we also show that non-idealities in the ADC corrections can generate sharp stripes in the final frequency maps, and these could be important for future joint analyses with the Planck High Frequency Instrument (HFI), Wilkinson Microwave Anisotropy Probe (WMAP), or other datasets. We therefore conclude that, although the existing corrections are adequate for LFI-based cosmological parameter analysis, further work on LFI ADC corrections is still warranted.
BEYONDPLANCK — Keihänen, E.; Suur-Uski, A.-S.; Andersen, K. J.; et al.
Astronomy and astrophysics (Berlin), 06/2023, Volume 675
Journal Article · Peer reviewed · Open access
We present a Gibbs sampling solution to the mapmaking problem for cosmic microwave background (CMB) measurements that builds on existing destriping methodology. Gibbs sampling breaks the computationally heavy destriping problem into two separate steps: noise filtering and map binning. Considered as two separate steps, both are computationally much cheaper than solving the combined problem. This provides a huge performance benefit as compared to traditional methods, and it allows us, for the first time, to bring the destriping baseline length down to a single sample. Here, we applied the Gibbs procedure to simulated Planck 30 GHz data. We find that gaps in the time-ordered data are handled efficiently by filling them in with simulated noise as part of the Gibbs process. The Gibbs procedure yields a chain of map samples, from which we are able to compute the posterior mean as a best-estimate map. The variation in the chain provides information on the correlated residual noise, without the need to construct a full noise covariance matrix. However, if only a single maximum-likelihood frequency map estimate is required, we find that traditional conjugate gradient solvers converge much faster than a Gibbs sampler in terms of the total number of iterations. The conceptual advantage of the Gibbs sampling approach lies in statistically well-defined error propagation and systematic error correction. This methodology thus forms the conceptual basis for the mapmaking algorithm employed in the BEYONDPLANCK framework, which implements the first end-to-end Bayesian analysis pipeline for CMB observations.
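The two-step structure described above can be illustrated with a toy destriper: conditional on the map, each correlated-noise baseline has a simple Gaussian posterior; conditional on the baselines, the map is obtained by binning the cleaned data. The sketch below is purely illustrative, not the BEYONDPLANCK code; all sizes, the pointing, and the noise model are invented for the example.

```python
import numpy as np

# Toy two-step Gibbs mapmaker: alternately (1) sample one noise baseline
# offset per data chunk given the current map, and (2) bin the
# baseline-cleaned time-ordered data (TOD) into a map sample.
rng = np.random.default_rng(0)
npix, nbase, samples_per_base = 8, 50, 40
ntod = nbase * samples_per_base

true_map = rng.normal(size=npix)
pointing = rng.integers(0, npix, size=ntod)      # pixel hit by each sample
true_offsets = np.repeat(rng.normal(scale=0.5, size=nbase), samples_per_base)
sigma = 0.1                                      # white-noise rms
tod = true_map[pointing] + true_offsets + rng.normal(scale=sigma, size=ntod)

m = np.zeros(npix)
for it in range(200):
    # Step 1: given the map, each baseline offset is Gaussian with mean
    # equal to the baseline average of the residual.
    resid = (tod - m[pointing]).reshape(nbase, samples_per_base)
    a = resid.mean(axis=1) + rng.normal(
        scale=sigma / np.sqrt(samples_per_base), size=nbase)
    # Step 2: given the offsets, bin the cleaned TOD and add the
    # per-pixel posterior fluctuation.
    cleaned = tod - np.repeat(a, samples_per_base)
    hits = np.bincount(pointing, minlength=npix)
    m = np.bincount(pointing, weights=cleaned, minlength=npix) / hits
    m += rng.normal(scale=sigma / np.sqrt(hits))

# A late-chain sample scatters closely around the truth (up to the usual
# monopole degeneracy between the map mean and the baselines).
print(np.std((m - m.mean()) - (true_map - true_map.mean())))
```

Each conditional draw is trivially cheap (a mean per baseline, a binning per map), which is the performance property the abstract exploits in place of solving the combined destriping system.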
BEYONDPLANCK — Herman, D.; Hensley, B.; Andersen, K. J.; et al.
Astronomy and astrophysics (Berlin), 07/2023, Volume 675
Journal Article · Peer reviewed · Open access
We constrained the level of polarized anomalous microwave emission (AME) on large angular scales using Planck Low-Frequency Instrument (LFI) and WMAP polarization data within a Bayesian cosmic microwave background (CMB) analysis framework. We modeled synchrotron emission with a power-law spectral energy distribution, as well as the sum of AME and thermal dust emission through linear regression with the Planck High-Frequency Instrument (HFI) 353 GHz data. This template-based dust emission model allowed us to constrain the level of polarized AME while making minimal assumptions on its frequency dependence. We neglected CMB fluctuations, but show through simulations that these fluctuations have a minor impact on the results. We find that the resulting AME polarization fraction confidence limit is sensitive to the polarized synchrotron spectral index prior. In addition, for prior means β_s < −3.1 we find an upper limit of p_AME^max ≲ 0.6% (95% confidence). In contrast, for means of β_s = −3.0, we find a nominal detection of p_AME = 2.5 ± 1.0% (95% confidence). These data are thus not strong enough to simultaneously and robustly constrain both polarized synchrotron emission and AME, and our main result is therefore a constraint on the AME polarization fraction explicitly as a function of β_s. Combining the current Planck and WMAP observations with measurements from high-sensitivity low-frequency experiments such as C-BASS and QUIJOTE will be critical to improve these limits further.
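The template regression step described above amounts to ordinary linear least squares with one column per sky component. The toy version below is a simplified sketch with synthetic inputs: the frequencies, spectral index, amplitudes, and the assumption that the synchrotron template is known are all placeholders, not the paper's actual data or pipeline.

```python
import numpy as np

# Sketch of template-based component regression: model a frequency map as
# a synchrotron power law plus a scaled dust template (standing in for the
# Planck HFI 353 GHz map), and solve for the amplitudes by least squares.
rng = np.random.default_rng(1)
npix = 1000
nu, nu0 = 30.0, 23.0        # hypothetical observing / pivot frequency (GHz)
beta_s = -3.1                # assumed synchrotron spectral index

sync_amp = rng.normal(size=npix)       # synchrotron amplitude map at nu0
dust_template = rng.normal(size=npix)  # stand-in for the 353 GHz template
alpha_true = 0.02                      # true dust template coefficient
data = (sync_amp * (nu / nu0) ** beta_s
        + alpha_true * dust_template
        + rng.normal(scale=0.01, size=npix))   # white noise

# Design matrix: one column per component template.
A = np.column_stack([sync_amp * (nu / nu0) ** beta_s, dust_template])
coeffs, *_ = np.linalg.lstsq(A, data, rcond=None)
print(coeffs)  # ~[1.0, 0.02]: recovers the input amplitudes
```

Because the dust column is a measured template rather than a parametric spectrum, the fitted coefficient constrains the summed AME-plus-dust polarization with minimal assumptions on its frequency dependence, which is the point made in the abstract.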
Abstract We estimate the efficiency of mitigating the lensing B-mode polarization, the so-called delensing, for the LiteBIRD experiment with multiple external data sets of lensing-mass tracers. The current best bound on the tensor-to-scalar ratio, r, is limited by lensing rather than Galactic foregrounds, and delensing will be a critical step to improve sensitivity to r as measurements become increasingly lensing-limited. In this paper, we extend the analysis of the recent LiteBIRD forecast paper to include multiple mass tracers, i.e., the CMB lensing maps from LiteBIRD and a CMB-S4-like experiment, the cosmic infrared background, and galaxy number density from Euclid- and LSST-like surveys. We find that multi-tracer delensing will further improve the constraint on r by about 20%. In LiteBIRD, the residual Galactic foregrounds also contribute significantly to the uncertainties of the B-modes, and delensing becomes more important if the residual foregrounds are further reduced by an improved component separation method.
Abstract We study the possibility of using the LiteBIRD satellite B-mode survey to constrain models of inflation producing specific features in the CMB angular power spectra. We explore a particular model example, i.e., spectator axion-SU(2) gauge field inflation. This model can source parity-violating gravitational waves from the amplification of gauge field fluctuations driven by a pseudoscalar “axionlike” field, rolling for a few e-folds during inflation. The sourced gravitational waves can exceed the vacuum contribution at reionization bump scales by about an order of magnitude and can be comparable to the vacuum contribution at recombination bump scales. We argue that a satellite mission with full sky coverage and access to the reionization bump scales is necessary to understand the origin of the primordial gravitational wave signal and to distinguish between two production mechanisms: quantum vacuum fluctuations of spacetime and matter sources during inflation. We present the expected constraints on model parameters from LiteBIRD satellite simulations, which complement and expand previous studies in the literature. We find that LiteBIRD will be able to exclude with high significance standard single-field slow-roll models, such as the Starobinsky model, if the true model is the axion-SU(2) model with a feature at CMB scales. We further investigate the possibility of using the parity-violating signature of the model, such as the TB and EB angular power spectra, to disentangle it from the standard single-field slow-roll scenario. We find that most of the discriminating power of LiteBIRD will reside in the BB angular power spectra rather than in the TB and EB correlations.
Abstract We explore the capability of measuring lensing signals in LiteBIRD full-sky polarization maps. With a 30 arcmin beam width and an impressively low polarization noise of 2.16 μK-arcmin, LiteBIRD will be able to measure the full-sky polarization of the cosmic microwave background (CMB) very precisely. This unique sensitivity also enables the reconstruction of a nearly full-sky lensing map using only polarization data, even considering its limited capability to capture small-scale CMB anisotropies. In this paper, we investigate the ability to construct a full-sky lensing measurement in the presence of Galactic foregrounds, finding that several possible biases from Galactic foregrounds should be negligible after component separation by harmonic-space internal linear combination. We find that the signal-to-noise ratio of the lensing measurement is approximately 40 using only polarization data measured over 80% of the sky. This is comparable to Planck's recent lensing measurement with both temperature and polarization, and represents a four-fold improvement over Planck's polarization-only lensing measurement. The LiteBIRD lensing map will complement the Planck lensing map and provide several opportunities for cross-correlation science, especially in the northern hemisphere.
Abstract We present a study of the impact of imperfect knowledge of the beam far side-lobes on the measurement of the cosmic microwave background B-mode signal at large scales. Beam far side-lobes induce a mismatch in the transfer function of Galactic foregrounds between the dipole and higher multipoles, which degrades the performance of component separation methods. This leads to foreground residuals in the CMB map and is expected to be one of the main sources of systematic effects in future CMB polarization observations. It thus becomes crucial for all-sky survey missions to take into account the interplay between beam systematic effects and all the data analysis steps. LiteBIRD is the ISAS/JAXA second strategic large-class satellite mission, dedicated to the measurement of primordial CMB B modes with a sensitivity on the tensor-to-scalar ratio r of σ(r) ≤ 10^-3, assuming r = 0. The primary goal of this paper is to provide the methodology and develop the framework to carry out an end-to-end study of beam far side-lobe effects for a space-borne CMB experiment. We introduce uncertainties in the beam model and propagate the beam effects through all the steps of the analysis pipeline, most importantly including component separation, up to the cosmological results in the form of a bias δr. As a demonstration of our framework, we derive requirements on the calibration and modeling of LiteBIRD's beams under given assumptions on design, simulation, component separation method, and allocated error budget. In particular, we assume a parametric method of component separation with no mitigation of the far side-lobe effects at any stage of the analysis pipeline. We show that δr is mostly due to the integrated fractional power difference between the estimated beams and the true beams in the far side-lobe region, with little dependence on the actual shape of the beams, for low enough δr.
Under our set of assumptions, and in particular considering the specific foreground cleaning method we used, we find that the integrated fractional power in the far side-lobes should be known at the level of ∼10^-4 to achieve the required limit on the bias, δr < 1.9 × 10^-5. The framework and tools developed for this study can easily be adapted to provide requirements under different designs and data analysis frameworks, and for other future space-borne experiments, such as PICO or CMB-Bharat. We further discuss the limitations of this framework and potential extensions to circumvent them.
Here, a methodology to derive the polarization angle requirements for different sets of detectors, at a given frequency of a CMB polarization experiment, is presented. The uncertainties in the polarization angle of each detector set are related to a given bias on the tensor-to-scalar ratio r. The approach is grounded in using a linear combination of the detector sets to obtain the CMB polarization signal. In addition, assuming that the uncertainties on the polarization angle are in the small-angle limit (lower than a few degrees), it is possible to derive analytic expressions to establish the requirements. The methodology also accounts for possible correlations among detectors, which may originate from the optics, wafers, etc. The approach is applied to the LiteBIRD space mission. We show that, for the most restrictive case (i.e., full correlation of the polarization angle systematics among detector sets), the requirements on the polarization angle uncertainties are around 1 arcmin at the most sensitive frequency bands (≈150 GHz) and a few tens of arcmin at the lowest (≈40 GHz) and highest (≈400 GHz) observational bands. Conversely, for the least restrictive case (i.e., no correlation of the polarization angle systematics among detector sets), the requirements are ≈5 times less restrictive. At the global and telescope levels, polarization angle knowledge of a few arcmin is sufficient for correlated global systematic errors, and can be relaxed by a factor of two for fully uncorrelated errors in detector polarization angle. The reported uncertainty levels are needed in order to keep the bias on r due to systematics below the limit established by the LiteBIRD collaboration.
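The small-angle regime invoked above can be made concrete with the standard leakage formula: a global polarization angle error α rotates (Q, U) and leaks E-mode power into B-modes as C_ℓ^BB,spur = sin²(2α) · C_ℓ^EE, which is quadratic in α and hence tiny for arcminute-level errors. A short numerical sketch (the unit E-mode power below is arbitrary, not a LiteBIRD value):

```python
import numpy as np

def spurious_bb(c_ee, alpha_rad):
    """B-mode power leaked from E-modes by a global rotation alpha.

    Standard result for a uniform polarization angle miscalibration:
    C_BB_spur = sin^2(2 * alpha) * C_EE.
    """
    return np.sin(2 * alpha_rad) ** 2 * c_ee

alpha = np.deg2rad(1.0 / 60.0)   # 1 arcmin miscalibration, in radians
c_ee = 1.0                       # E-mode power in arbitrary units
print(spurious_bb(c_ee, alpha))  # ~3.4e-7 of the E-mode power
```

The quadratic scaling is why arcminute-level angle knowledge suffices: halving the allowed angle error reduces the spurious B-mode power, and hence the bias on r, by a factor of four.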