Abstract We present a detailed study of the observational biases of the DECam Ecliptic Exploration Project’s B1 data release and survey simulation software that enables direct statistical comparisons between models and our data. We inject a synthetic population of objects into the images and subsequently recover them in the same processing as our real detections. This enables us to characterize the survey’s completeness as a function of apparent magnitude and on-sky rate of motion. We study the statistically optimal functional form for the magnitude efficiency, and develop a methodology that can estimate the magnitude and rate efficiencies for all of the survey’s pointing groups simultaneously. We have determined that our peak completeness is on average 80% in each pointing group, and that our completeness drops to 25% of this value at m₂₅ = 26.22. We describe the freely available survey simulation software and its methodology. We conclude by using it to infer that our effective search area for objects at 40 au is 14.8 deg², and that our lack of dynamically cold distant objects means that there are at most 8 × 10³ objects with 60 < a < 80 au and absolute magnitudes H ≤ 8.
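The abstract does not reproduce the fitted efficiency function, but a logistic rollover is a standard choice in survey completeness modeling. The Python sketch below assumes that form with illustrative parameters (eps0, m0, and sigma are hypothetical, not the paper's fitted values) and shows how a summary statistic like m₂₅, the magnitude at which completeness falls to 25% of peak, follows from such a fit.

    import numpy as np

    def efficiency(m, eps0=0.80, m0=25.9, sigma=0.29):
        # Logistic completeness model: eps0 = peak completeness, m0 = magnitude
        # of half-peak efficiency, sigma = width of the rollover. Illustrative
        # form only; the paper derives its own optimal functional form.
        return eps0 / (1.0 + np.exp((m - m0) / sigma))

    # Setting efficiency(m25) = 0.25 * eps0 and solving gives
    # m25 = m0 + sigma * ln(3), independent of the peak value eps0.
    m25 = 25.9 + 0.29 * np.log(3.0)
    print(f"m25 = {m25:.2f}")  # 26.22 with these illustrative parameters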
Abstract We present the first set of trans-Neptunian objects (TNOs) observed on multiple nights in data taken from the DECam Ecliptic Exploration Project. Of these 110 TNOs, 105 do not coincide with previously known TNOs and appear to be new discoveries. Each individual detection for our objects resulted from a digital tracking search at TNO rates of motion, using two-to-four-hour exposure sets, and the detections were subsequently linked across multiple observing seasons. This procedure allows us to find objects with magnitudes m_VR ≈ 26. The object discovery processing also included a comprehensive population of objects injected into the images, with a recovery and linking rate of at least 94%. The final orbits were obtained using a specialized orbit-fitting procedure that accounts for the positional errors derived from the digital tracking procedure. Our results include robust orbits and magnitudes for classical TNOs with absolute magnitudes H ∼ 10, as well as a dynamically detached object found at 76 au (semimajor axis a ≈ 77 au). We find a disagreement between our population of classical TNOs and the CFEPS-L7 three-component model for the Kuiper Belt.
Abstract
We present a scalable, cloud-based science platform solution designed to enable next-to-the-data analyses of terabyte-scale astronomical tabular data sets. The presented platform is built on Amazon Web Services (over Kubernetes and S3 abstraction layers), utilizes Apache Spark and the Astronomy eXtensions for Spark for parallel data analysis and manipulation, and provides the familiar JupyterHub web-accessible front end for user access. We outline the architecture of the analysis platform, provide implementation details and rationale for (and against) technology choices, verify scalability through strong and weak scaling tests, and demonstrate usability through an example science analysis of data from the Zwicky Transient Facility’s 1Bn+ light-curve catalog. Furthermore, we show how this system enables an end user to iteratively build analyses (in Python) that transparently scale processing with no need for end-user interaction. The system is designed to be deployable by astronomers with moderate cloud engineering knowledge, or (ideally) IT groups. Over the past 3 yr, it has been utilized to build science platforms for the DiRAC Institute, the ZTF partnership, the LSST Solar System Science Collaboration, and the LSST Interdisciplinary Network for Collaboration and Computing, as well as for numerous short-term events (with over 100 simultaneous users). A live demo instance, the deployment scripts, source code, and cost calculators are accessible at http://hub.astronomycommons.org/.
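As a concrete illustration of the "transparently scaling" analysis style the platform targets, here is a minimal PySpark sketch; the S3 path and column names are hypothetical placeholders, and the Astronomy eXtensions for Spark layer adds astronomy-specific operations (e.g., catalog crossmatching) on top of this same DataFrame API.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ztf-lightcurves").getOrCreate()

    # Hypothetical path; on the platform, data sits behind an S3 abstraction.
    lc = spark.read.parquet("s3a://example-bucket/ztf_lightcurves/")

    # Per-object variability summary; Spark parallelizes this transparently
    # across however many workers the cluster currently has.
    summary = (lc.groupBy("objectid")
                 .agg(F.count("mag").alias("nobs"),
                      F.mean("mag").alias("mean_mag"),
                      F.stddev("mag").alias("std_mag"))
                 .where(F.col("nobs") >= 20))

    summary.orderBy(F.desc("std_mag")).show(10)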
Photoelectron yields of extruded scintillation counters with titanium dioxide coating and embedded wavelength shifting fibers read out by silicon photomultipliers have been measured at the Fermilab Test Beam Facility using 120 GeV protons. The yields were measured as a function of transverse, longitudinal, and angular positions for a variety of scintillator compositions, reflective coating mixtures, and fiber diameters. Timing performance was also studied. These studies were carried out by the Cosmic Ray Veto Group of the Mu2e collaboration as part of their R&D program.
ABSTRACT
This paper presents a new optical imaging survey of four deep drilling fields (DDFs), two Galactic and two extragalactic, with the Dark Energy Camera (DECam) on the 4-m Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO). During the first year of observations in 2021, >4000 images covering 21 deg² (seven DECam pointings), with ∼40 epochs (nights) per field and 5 to 6 images per night per filter in g, r, i, and/or z, have become publicly available (the proprietary period for this program is waived). We describe the real-time difference-image pipeline and how alerts are distributed to brokers via the same distribution system as the Zwicky Transient Facility (ZTF). In this paper, we focus on the two extragalactic deep fields (COSMOS and ELAIS-S1), characterizing the detected sources and demonstrating that the survey design is effective for probing the discovery space of faint and fast variable and transient sources. We describe and make publicly available 4413 calibrated light curves based on difference-image detection photometry of transients and variables in the extragalactic fields. We also present preliminary scientific analysis regarding Solar system small bodies, stellar flares and variables, Galactic anomaly detection, fast-rising transients and variables, supernovae, and active galactic nuclei.
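To give a sense of how the released light curves could be screened for the fast-rising sources the survey targets, here is a minimal Python sketch; the file name and column names (object_id, mjd, mag) are hypothetical placeholders, so consult the data release for the published schema.

    import pandas as pd

    # Hypothetical export of the 4413 calibrated difference-image light curves.
    lc = pd.read_csv("ddf_lightcurves.csv")

    # Brightening rate between consecutive epochs of the same object.
    lc = lc.sort_values(["object_id", "mjd"])
    lc["dmag"] = lc.groupby("object_id")["mag"].diff()
    lc["dt"] = lc.groupby("object_id")["mjd"].diff()
    lc["rate"] = lc["dmag"] / lc["dt"]  # mag per day; negative = brightening

    # Flag objects that brightened faster than 0.5 mag/day (threshold illustrative).
    fast = lc.loc[lc["rate"] < -0.5, "object_id"].unique()
    print(fast)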
Fitting a theoretical model to experimental data in a Bayesian manner using Markov chain Monte Carlo typically requires one to evaluate the model thousands (or millions) of times. When the model is a slow-to-compute physics simulation, Bayesian model fitting becomes infeasible. To remedy this, a second statistical model that predicts the simulation output -- an "emulator" -- can be used in lieu of the full simulation during model fitting. A typical emulator of choice is the Gaussian process (GP), a flexible, non-linear model that provides both a predictive mean and variance at each input point. Gaussian process regression works well for small amounts of training data (\(n < 10^3\)), but becomes slow to train and use for prediction when the data set size becomes large. Various methods can be used to speed up the Gaussian process in the medium-to-large data set regime (\(n > 10^5\)), trading away predictive accuracy for drastically reduced runtime. This work examines the accuracy-runtime trade-off of several approximate Gaussian process models -- the sparse variational GP, stochastic variational GP, and deep kernel learned GP -- when emulating the predictions of density functional theory (DFT) models. Additionally, we use the emulators to calibrate, in a Bayesian manner, the DFT model parameters using observed data, resolving the computational barrier imposed by the data set size, and compare calibration results to previous work. The utility of these calibrated DFT models is to make predictions, based on observed data, about the properties of experimentally unobserved nuclides of interest, e.g., super-heavy nuclei.
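The emulator workflow itself is simple to sketch. The Python example below uses an exact GP from scikit-learn with a toy stand-in for the expensive simulator; the paper's approximate variants (sparse variational, stochastic variational, and deep kernel GPs) swap in scalable GP implementations but serve the same role during calibration.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def expensive_simulator(theta):
        # Toy placeholder for a slow physics code (e.g., a DFT evaluation).
        return np.sin(3.0 * theta) + 0.5 * theta

    # Train the emulator on a small design of simulator runs (the n < 10^3 regime).
    X_train = np.linspace(0.0, 2.0, 25).reshape(-1, 1)
    y_train = expensive_simulator(X_train).ravel()
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)

    # Inside MCMC, query the emulator instead of the simulator: it returns a
    # predictive mean and standard deviation that can enter the likelihood.
    theta_proposal = np.array([[1.234]])
    mean, std = gp.predict(theta_proposal, return_std=True)
    print(mean, std)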
We demonstrate a fully functional implementation of (per-user) checkpoint, restore, and live migration capabilities for JupyterHub platforms. Checkpointing -- the ability to freeze and suspend to disk the running state (contents of memory, registers, open files, etc.) of a set of processes -- enables the system to snapshot a user's Jupyter session to permanent storage. The restore functionality brings a checkpointed session back to a running state, to continue where it left off at a later time and potentially on a different machine. Finally, live migration enables moving running Jupyter notebook servers between different machines, transparent to the analysis code and without disconnecting the user. Our implementation of these capabilities works at the system level, with few limitations, and typical checkpoint/restore times of O(10s) with a pathway to O(1s) live migrations. It opens a myriad of interesting use cases, especially for cloud-based deployments: from checkpointing idle sessions without interruption of the user's work (achieving cost reductions of 4x or more), to execution on spot instances with transparent migration on eviction (with additional cost reductions up to 3x), to automated migration of workloads to ideally suited instances (e.g., moving an analysis to a machine with more or less RAM or cores based on observed resource utilization). The capabilities we demonstrate can make science platforms fully elastic while retaining excellent user experience.
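The abstract does not name the underlying mechanism; the Python sketch below assumes CRIU, the standard Linux userspace checkpoint/restore tool, as the system-level primitive, with the per-user JupyterHub orchestration layered on top.

    import subprocess

    def checkpoint(pid: int, images_dir: str) -> None:
        # Freeze the process tree rooted at pid and dump its state (memory,
        # registers, open files) to images_dir; --leave-stopped keeps the
        # original processes around instead of killing them after the dump.
        subprocess.run(
            ["criu", "dump", "--tree", str(pid), "--images-dir", images_dir,
             "--shell-job", "--leave-stopped"],
            check=True,
        )

    def restore(images_dir: str) -> None:
        # Bring a previously dumped session back to a running state,
        # potentially on a different machine that can read images_dir.
        subprocess.run(
            ["criu", "restore", "--images-dir", images_dir, "--shell-job"],
            check=True,
        )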