ABSTRACT The Planck mission detected thousands of extragalactic radio sources at frequencies from 28 to 857 GHz. Planck's calibration is absolute (in the sense that it is based on the satellite's annual motion around the Sun and the temperature of the cosmic microwave background), and its beams are well characterized at sub-percent levels. Thus, Planck's flux density measurements of compact sources are absolute in the same sense. We have made coordinated Very Large Array (VLA) and Australia Telescope Compact Array (ATCA) observations of 65 strong, unresolved Planck sources in order to transfer Planck's calibration to ground-based instruments at 22, 28, and 43 GHz. The results are compared to microwave flux density scales currently based on planetary observations. Despite the scatter introduced by the variability of many of the sources, the flux density scales are determined to 1%-2% accuracy. At 28 GHz, the flux density scale used by the VLA runs 2%-3% below Planck values, with an uncertainty of 1.0%; at 43 GHz, the discrepancy increases to 5%-6%, with an uncertainty of 1.4%, for both ATCA and the VLA.
We present the first application of the Cosmoglobe analysis framework by analyzing the nine-year WMAP time-ordered observations, using machinery similar to that of BeyondPlanck for the Planck Low Frequency Instrument (LFI). We analyzed only the Q-band (41 GHz) data and report on the low-level analysis process from uncalibrated time-ordered data to calibrated maps. Most of the existing BeyondPlanck pipeline may be reused for WMAP analysis with minimal changes to the existing codebase. The main modification is the implementation of the same preconditioned biconjugate gradient mapmaker used by the WMAP team. Producing a single WMAP Q1-band sample requires 22 CPU-hrs, slightly more than the 17 CPU-hrs of a Planck 44 GHz sample; this demonstrates that a full end-to-end Bayesian processing of the WMAP data is computationally feasible. In general, our recovered maps are very similar to the maps released by the WMAP team, although with two notable differences. In temperature, we find a ∼2 μK quadrupole difference that is most likely caused by different gain modeling, while in polarization we find a distinct 2.5 μK signal that has previously been referred to as poorly measured modes by the WMAP team. In the Cosmoglobe processing, this pattern arises from temperature-to-polarization leakage through the coupling between the CMB Solar dipole, transmission imbalance, and sidelobes. No traces of this pattern are found in either the frequency map or the TOD residual map, suggesting that the current processing has succeeded in modeling these poorly measured modes within the assumed parametric model by using Planck information to break the sky-synchronous degeneracies inherent in the WMAP scanning strategy.
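As an illustration of the kind of linear system such a mapmaker solves, here is a minimal sketch of preconditioned conjugate-gradient mapmaking on a toy problem (this is not the Cosmoglobe or WMAP code: it assumes a single-detector scan with white noise, where plain CG with a Jacobi preconditioner suffices, whereas WMAP's differential data motivate a biconjugate solver and a more elaborate preconditioner):

```python
import numpy as np

def make_map_cg(pix, tod, npix, tol=1e-12, max_iter=100):
    """Solve the mapmaking normal equations (P^T N^-1 P) m = P^T N^-1 d
    with Jacobi-preconditioned conjugate gradients.

    pix : pixel index hit by each time sample (the pointing matrix P)
    tod : time-ordered data; N^-1 is taken as the identity (white noise).
    """
    b = np.bincount(pix, weights=tod, minlength=npix)      # P^T d
    hits = np.bincount(pix, minlength=npix).astype(float)  # diag of P^T P
    precond = np.where(hits > 0, 1.0 / hits, 0.0)          # Jacobi preconditioner

    def A(v):                                              # v -> P^T P v
        return np.bincount(pix, weights=v[pix], minlength=npix)

    m = np.zeros(npix)
    r = b - A(m)
    z = precond * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A(p)
        alpha = rz / (p @ Ap)
        m += alpha * p
        r -= alpha * Ap
        if r @ r < tol:
            break
        z = precond * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return m

# Toy scan of a 3-pixel sky with white noise
rng = np.random.default_rng(0)
sky = np.array([1.0, -2.0, 0.5])
pix = rng.integers(0, 3, size=6000)
tod = sky[pix] + 0.01 * rng.standard_normal(pix.size)
est = make_map_cg(pix, tod, npix=3)    # ≈ sky, the noise-weighted average
```

For white noise the system is diagonal and the solver converges immediately; the CG machinery only pays off once N^-1 couples samples (correlated 1/f noise), which is the regime both pipelines operate in.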
BEYONDPLANCK Brilenkov, M.; Fornazier, K. S. F.; Hergt, L. T. ...
Astronomy and Astrophysics (Berlin), 07/2023, Volume 675
Journal Article
Peer reviewed
Open access
End-to-end simulations play a key role in the analysis of any high-sensitivity cosmic microwave background (CMB) experiment, providing high-fidelity systematic error propagation capabilities that are unmatched by any other means. In this paper, we address an important issue regarding such simulations, namely how to define the inputs in terms of sky model and instrument parameters. These may either be taken as a constrained realization derived from the data or as a random realization independent of the data. We refer to these as posterior and prior simulations, respectively. We show that the two options lead to significantly different correlation structures, as prior simulations (contrary to posterior simulations) effectively include cosmic variance, but they exclude realization-specific correlations from non-linear degeneracies. Consequently, they quantify fundamentally different types of uncertainties. We argue that, as a result, they also have different and complementary scientific uses, even if this dichotomy is not absolute. In particular, posterior simulations are in general more convenient for parameter estimation studies, while prior simulations are generally more convenient for model testing. Before BeyondPlanck, most pipelines used a mix of constrained and random inputs and applied the same hybrid simulations for all applications, even though the statistical justification for this is not always evident. BeyondPlanck represents the first end-to-end CMB simulation framework able to generate both types of simulations, and these new capabilities have brought this topic to the forefront. The BeyondPlanck posterior simulations and their uses are described extensively in a suite of companion papers. In this work, we consider one important application of the corresponding prior simulations, namely code validation. Specifically, we generated a set of one-year LFI 30 GHz prior simulations with known inputs, and we used these to validate the core low-level BeyondPlanck algorithms dealing with gain estimation, correlated noise estimation, and mapmaking.
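The distinction between the two simulation types can be made concrete in a one-dimensional Gaussian toy model (an illustrative sketch only, not the BeyondPlanck machinery; the signal and noise variances S and N are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
S, N = 4.0, 1.0                          # toy signal and noise variances
nsims = 5000

# One observed "sky": d = s + n with s ~ N(0, S), n ~ N(0, N)
d = rng.normal(0, np.sqrt(S)) + rng.normal(0, np.sqrt(N))

# Posterior (constrained) simulations: draws from P(s|d), a Gaussian
# with Wiener-filter mean W*d and variance (1/S + 1/N)^-1
W = S / (S + N)
post_var = 1.0 / (1.0 / S + 1.0 / N)     # = S*N/(S+N)
post = W * d + rng.normal(0, np.sqrt(post_var), nsims)

# Prior (random) simulations: draws from P(s), independent of the data
prior = rng.normal(0, np.sqrt(S), nsims)
```

The prior draws scatter with the full variance S, i.e., they carry cosmic variance; the posterior draws scatter only by the residual uncertainty S*N/(S+N) and track the observed realization through their mean W*d, mirroring the correlation-structure difference described in the abstract.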
The BeyondPlanck and Cosmoglobe collaborations have implemented the first integrated Bayesian end-to-end analysis pipeline for CMB experiments. The primary long-term motivation for this work is to develop a common analysis platform that supports efficient global joint analysis of complementary radio, microwave, and sub-millimeter experiments. A strict prerequisite for this to succeed is broad participation from the CMB community, and two foundational aspects of the program are therefore reproducibility and Open Science. In this paper, we discuss our efforts toward this aim, including measures to facilitate easy code and data distribution, community-based code documentation, and user-friendly compilation procedures. This work represents the first publicly released end-to-end CMB analysis pipeline that includes raw data, source code, parameter files, and documentation. We argue that such a complete pipeline release should be a requirement for all major future and publicly funded CMB experiments: a full public release significantly increases data longevity by ensuring that the data quality can be improved whenever better processing techniques, complementary datasets, or more computing power become available, thereby also increasing taxpayers' value for money. Providing only raw data and final products is not sufficient to guarantee full reproducibility in the future.
The Planck satellite's in-orbit mission ended in October 2013. Between the end of Low Frequency Instrument (LFI) routine mission operations and the satellite decommissioning, a dedicated test was performed to measure the Planck telescope emissivity. The scope of the test was twofold: i) to provide, for the first time in flight, a direct measurement of the telescope emissivity; and ii) to evaluate possible degradation of the emissivity by comparing data taken in flight at the end of the mission with those taken during the ground telescope characterization. The emissivity was determined by heating the Planck telescope and disentangling the resulting system temperature excess measured by the LFI radiometers. Results show End of Life (EOL) performance in good agreement with the results from the ground optical tests and with in-flight indirect estimates obtained during the Commissioning and Performance Verification (CPV) phase. Methods and results are presented and discussed.
We consider the application of high-pass Fourier filters to remove periodic systematic fluctuations from full-sky survey CMB datasets. We compare the filter performance with the destriping codes commonly used to remove the effect of residual 1/f noise from timelines. As a realistic working case, we use simulations of the typical Planck scanning strategy and Planck Low Frequency Instrument noise performance, with spurious periodic fluctuations that mimic a typical thermal disturbance. We show that applying Fourier high-pass filters in chunks always requires subsequent normalisation of the induced offsets by means of destriping. For a complex signal containing all the astrophysical and instrumental components, the result obtained by applying filter and destriping in series is comparable to the result obtained by destriping alone, which makes the usefulness of Fourier filters questionable for removing this kind of effect.
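A minimal sketch of such a high-pass filter applied to a single timeline chunk (the sampling rate, cutoff, and disturbance frequency below are invented toy numbers, not Planck/LFI values). Note that zeroing the low-frequency modes also removes the chunk's mean, which is exactly the kind of induced offset that subsequent destriping must normalise:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 10.0, 4096                       # sampling rate [Hz], chunk length
t = np.arange(n) / fs
f_sky = 328 * fs / n                     # "sky" tone, ~0.80 Hz
f_th = 16 * fs / n                       # slow thermal disturbance, ~0.04 Hz
signal = np.sin(2 * np.pi * f_sky * t)
thermal = 0.5 * np.sin(2 * np.pi * f_th * t)
tod = signal + thermal + 0.05 * rng.standard_normal(n)

# High-pass: zero every Fourier mode below the cutoff frequency
f_cut = 0.1                              # Hz, hypothetical
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
spec = np.fft.rfft(tod)
spec[freqs < f_cut] = 0.0
clean = np.fft.irfft(spec, n=n)          # thermal line removed, sky tone kept
```

When the data are filtered chunk by chunk, each chunk loses its own DC level and lowest modes independently, leaving chunk-to-chunk offsets in the map domain; this is why the abstract concludes that filtering must still be followed by destriping.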
Quick Detection System for Planck satellite Aatrokoski, J.; Lähteenmäki, A.; Tornikoski, M. ...
Monthly Notices of the Royal Astronomical Society, 01/2010, Volume 401, Issue 1
Journal Article
Peer reviewed
Open access
The Quick Detection System (QDS) is a software package developed to detect point sources in Planck satellite data as soon as the data become available, a few days after transmission to the Earth. Point sources are detected by filtering the data with a filter defined by the Mexican hat wavelet. An alert is generated for those detections found to be interesting, such as prominent flaring, according to criteria specified to the software. The goal is to detect spectral or flux variability in active galactic nuclei so that instant multifrequency follow-up observations with other instruments can be arranged to study the interesting behaviour.
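The detection step can be sketched as follows (a toy illustration, not the QDS code, which operates on Planck data with calibrated beam widths; the map size, source position, filter scale, and amplitudes here are all invented). The Mexican hat wavelet is a zero-mean, band-pass kernel, so it suppresses both smooth large-scale foregrounds and pixel-scale noise while boosting beam-sized point sources:

```python
import numpy as np

def mexican_hat_2d(n, sigma):
    """Zero-mean 2-D Mexican hat (Laplacian-of-Gaussian shape) kernel."""
    ax = np.arange(n) - n // 2
    x, y = np.meshgrid(ax, ax)
    r2 = (x**2 + y**2) / sigma**2
    k = (2.0 - r2) * np.exp(-r2 / 2.0)
    return k - k.mean()                  # insensitive to a constant background

def mhw_filter(img, sigma):
    """Filter a (periodic) map with the Mexican hat via FFT convolution."""
    kern = np.fft.ifftshift(mexican_hat_2d(img.shape[0], sigma))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern)))

# Toy map: smooth large-scale foreground + one point source + white noise
rng = np.random.default_rng(3)
n, sigma = 128, 3.0
x, y = np.meshgrid(np.arange(n), np.arange(n))
foreground = 1.0 * np.sin(2 * np.pi * x / n)   # large scale, strongly suppressed
source = 5.0 * np.exp(-((x - 40)**2 + (y - 90)**2) / (2 * sigma**2))
img = foreground + source + 0.2 * rng.standard_normal((n, n))

filt = mhw_filter(img, sigma)
row, col = np.unravel_index(np.argmax(filt), filt.shape)  # brightest response
```

In a real pipeline the filter scale would be matched to the instrument beam and detections thresholded in signal-to-noise before raising an alert; here the brightest filtered pixel simply recovers the injected source position.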