Abstract
The Vera C. Rubin Observatory will, over a period of 10 yr, repeatedly survey the southern sky. To ensure that images generated by Rubin meet the quality requirements for precision science, the observatory will use an active-optics system (AOS) to correct for alignment and mirror-surface perturbations introduced by gravity and temperature gradients in the optical system. To accomplish this, Rubin will use out-of-focus images from sensors located at the edge of the focal plane to learn and correct for perturbations to the wavefront. We have designed and integrated a deep-learning (DL) model for wavefront estimation into the AOS pipeline. In this paper, we compare the performance of this DL approach to Rubin's baseline algorithm when applied to images from two different simulations of the Rubin optical system. We show that the DL approach is faster and more accurate, reaching the atmospheric error floor both for high-quality images and for low-quality images with heavy blending and vignetting. Compared to the baseline algorithm, the DL model is 40× faster, and its median error is 2× better under ideal conditions, 5× better in the presence of vignetting by the Rubin camera, and 14× better in the presence of blending in crowded fields. In addition, the DL model surpasses the required optical quality in simulations of the AOS closed loop. This system promises to increase the survey area useful for precision science by up to 8%. We discuss how this system might be deployed when commissioning and operating Rubin.
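Wavefront estimates of this kind are conventionally expressed as coefficients of Zernike polynomials. As a minimal, hedged illustration of that representation (not the Rubin pipeline, its baseline algorithm, or the DL model), the sketch below synthesizes a noisy wavefront from known low-order Zernike coefficients on a circular pupil and recovers them with an ordinary least-squares fit; all values are invented for the example:

```python
import numpy as np

# Sample the pupil on a grid and keep points inside the unit circle
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r, theta = np.hypot(x, y), np.arctan2(y, x)
mask = r <= 1.0

# A few low-order Zernike terms (tilt, defocus, astigmatism); a circular
# (rather than annular) pupil is used purely to keep the example short
basis = [
    2.0 * r * np.cos(theta),                  # tilt x
    2.0 * r * np.sin(theta),                  # tilt y
    np.sqrt(3.0) * (2.0 * r**2 - 1.0),        # defocus
    np.sqrt(6.0) * r**2 * np.cos(2 * theta),  # astigmatism
]
A = np.stack([z[mask] for z in basis], axis=1)

# Synthesize a wavefront with known coefficients plus measurement noise,
# then recover the coefficients with an ordinary least-squares fit
true_coeffs = np.array([0.10, -0.05, 0.30, 0.15])
rng = np.random.default_rng(0)
w = A @ true_coeffs + rng.normal(0.0, 0.01, A.shape[0])
fit, *_ = np.linalg.lstsq(A, w, rcond=None)
print(np.round(fit, 3))
```

With thousands of pupil samples the fit recovers the input coefficients to well within the noise level, which is why low-order wavefront errors are a natural regression target for both classical and learned estimators.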
Accurate photometric redshift (photo-z) estimates are essential to the cosmological science goals of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). In this work, we use simulated photometry for mock galaxy catalogs to explore how LSST photo-z estimates can be improved by the addition of near-infrared (NIR) and/or ultraviolet (UV) photometry from the Euclid, Wide-Field Infrared Survey Telescope (WFIRST), and/or Cosmological Advanced Survey Telescope for Optical and ultraviolet Research (CASTOR) space telescopes. Generally, we find that deeper optical photometry can reduce the standard deviation of the photo-z estimates more than adding NIR or UV filters, but that additional filters are the only way to significantly lower the fraction of galaxies with catastrophically under- or overestimated photo-z. For Euclid, we find that the addition of JH 5σ photometric detections can reduce the standard deviation for galaxies with z > 1 (z > 0.3) by ∼20% (∼10%), and the fraction of outliers by ∼40% (∼25%). For WFIRST, we show how the addition of deep YJHK photometry could reduce the standard deviation by 50% at z > 1.5 and drastically reduce the fraction of outliers to just ∼2% overall. For CASTOR, we find that the addition of its UV- and u-band photometry could reduce the standard deviation by ∼30% and the fraction of outliers by ∼50% for galaxies with z < 0.5. We also evaluate the photo-z results within sky areas that overlap with both the NIR and UV surveys and when spectroscopic training sets built from the surveys' small-area deep fields are used.
Abstract
We report a study exploring how the use of deep neural networks with astronomical Big Data may help us find and uncover new insights into underlying phenomena: through our experiments towards unsupervised knowledge extraction from astronomical Big Data, we serendipitously found that deep convolutional autoencoders tend to reject telluric lines in stellar spectra. With further experiments, we found that only when the spectra are in the barycentric frame does the network automatically identify the statistical independence between the two components, stellar versus telluric, and reject the latter. We exploit this finding and turn it into a proof-of-concept method for removal of the telluric lines from stellar spectra in a fully unsupervised fashion: we increase the interobservation entropy of telluric absorption lines by imposing a random, virtual radial velocity on the observed spectrum. This technique results in a non-standard form of ‘whitening’ in the atmospheric components of the spectrum, decorrelating them across multiple observations. We process more than 250 000 spectra from the High Accuracy Radial velocity Planet Searcher (HARPS) and, with qualitative and quantitative evaluations against a database of known telluric lines, show that most of the telluric lines are successfully rejected. Our approach, ‘Stellar Karaoke’, has zero need for prior knowledge about parameters such as observation time, location, or the distribution of atmospheric molecules, and processes each spectrum in milliseconds. We also train and test on Sloan Digital Sky Survey spectra and see a significant performance drop due to the lower resolution. We discuss directions for developing tools on top of the introduced method in the future.
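The core trick, imposing a random virtual radial velocity so that telluric lines land at a different apparent position in every observation while stellar features remain mutually consistent, can be sketched in a few lines of NumPy. The RV range and the toy spectrum below are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

C_KMS = 299_792.458  # speed of light in km/s

def apply_virtual_rv(wave, flux, rv_kms=None, rng=None):
    """Doppler-shift a spectrum by a (possibly random) virtual radial
    velocity and resample it back onto the original wavelength grid."""
    if rv_kms is None:
        rng = rng or np.random.default_rng()
        rv_kms = rng.uniform(-300.0, 300.0)  # assumed RV range, illustrative
    shifted_wave = wave * (1.0 + rv_kms / C_KMS)
    # Stellar features move with the shift; telluric lines, fixed in the
    # observer's frame, end up at a different apparent position in every
    # randomly shifted observation and so decorrelate across the data set.
    return np.interp(wave, shifted_wave, flux), rv_kms

# Toy spectrum: flat continuum with a single Gaussian absorption line
wave = np.linspace(5000.0, 5010.0, 2000)
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 5005.0) / 0.05) ** 2)
shifted, rv = apply_virtual_rv(wave, flux, rv_kms=100.0)
print(wave[np.argmin(shifted)])  # line moves redward by ~1.7 Å
```

Drawing a fresh random `rv_kms` per observation is what raises the interobservation entropy of the telluric component while leaving the stellar component coherent.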
Estimating Spectra from Photometry
Kalmbach, J. Bryce; Connolly, Andrew J.
The Astronomical Journal, 12/2017, Volume 154, Issue 6
Journal Article · Peer reviewed · Open access
Measuring the physical properties of galaxies such as redshift frequently requires the use of spectral energy distributions (SEDs). SED template sets are, however, often small in number and cover limited portions of photometric color space. Here we present a new method to estimate SEDs as a function of color from a small training set of template SEDs. We first cover the mathematical background behind the technique before demonstrating our ability to reconstruct spectra based upon colors, and then compare our results to other common interpolation and extrapolation methods. When the photometric filters and spectra overlap, we show that the error in the estimated spectra is reduced by more than 65% compared to the more commonly used techniques. We also show an expansion of the method to wavelengths beyond the range of the photometric filters. Finally, we demonstrate the usefulness of our technique by generating 50 additional SED templates from an original set of 10 and by applying the new set to photometric redshift estimation. We are able to reduce the photometric redshift standard deviation by at least 22.0% and the outlier-rejected bias by over 86.2% compared to the original set for z ≤ 3.
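As a rough illustration of the general idea of estimating a spectrum at an arbitrary color from a small template set (a simple inverse-distance interpolator over toy templates, not the method developed in the paper), one might write:

```python
import numpy as np

# Toy "templates": each has a single color (e.g. g-r) and a coarse spectrum
# that, for this example, varies linearly with color.
template_colors = np.array([0.2, 0.5, 0.8, 1.1])
wave = np.linspace(4000.0, 9000.0, 50)
templates = np.array([1.0 + c * (wave / 9000.0) for c in template_colors])

def estimate_sed(color, eps=1e-6):
    """Estimate a spectrum at an arbitrary color by inverse-distance
    weighting of the template spectra (illustrative stand-in only)."""
    d = np.abs(template_colors - color) + eps
    wgt = (1.0 / d) / np.sum(1.0 / d)
    return wgt @ templates

# Interpolate a spectrum at a color not present in the template set
sed = estimate_sed(0.65)
```

Because the toy template family is linear in color and the query sits symmetrically between templates, the interpolated spectrum here matches the underlying family exactly; real SED sets are far less well behaved, which is what motivates more careful estimators.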
Abstract
Trans-Neptunian objects provide a window into the history of the solar system, but they can be challenging to observe due to their distance from the Sun and relatively low brightness. Here we report the detection, using the Kernel-Based Moving Object Detection (KBMOD) platform, of 75 moving objects that we could not link to any other known objects, the faintest of which has a VR magnitude of 25.02 ± 0.93. We recover an additional 24 sources with previously known orbits. We place constraints on the barycentric distance, inclination, and longitude of the ascending node of these objects. The unidentified objects have a median barycentric distance of 41.28 au, placing them in the outer solar system. The observed inclination and magnitude distributions of all detected objects are consistent with previously published KBO distributions. We describe extensions to KBMOD, including a robust percentile-based lightcurve filter, an in-line graphics-processing unit filter, new coadded stamp generation, and a convolutional neural network stamp filter, which allow KBMOD to take advantage of difference images. These enhancements mark a significant improvement in the readiness of KBMOD for deployment on future big data surveys such as LSST.
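A percentile-based lightcurve filter of the kind mentioned above might, as a hedged sketch, clip per-epoch fluxes against interquartile-range bounds and keep a trajectory only if most epochs survive; the thresholds here are assumptions, not KBMOD's actual values:

```python
import numpy as np

def percentile_lightcurve_filter(fluxes, low=25.0, high=75.0, k=5.0):
    """Flag per-epoch fluxes that are extreme outliers relative to the
    interquartile range, and decide whether to keep the trajectory.
    Illustrative stand-in; parameter values are assumptions."""
    q1, q3 = np.percentile(fluxes, [low, high])
    iqr = q3 - q1
    good = (fluxes >= q1 - k * iqr) & (fluxes <= q3 + k * iqr)
    # Keep the trajectory only if most epochs survive the clip
    return good, good.mean() >= 0.8

# A steady faint source with one cosmic-ray-like spike at epoch 4
lc = np.array([2.1, 1.9, 2.0, 2.2, 50.0, 1.8, 2.0, 2.1])
good, keep = percentile_lightcurve_filter(lc)
```

Percentile-based bounds are robust to the very outliers being rejected, unlike a mean-and-sigma clip, which a single bright artifact can inflate.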
Abstract
We present “Tracklet-less Heliocentric Orbit Recovery” (THOR), an algorithm for linking observations of Solar System objects across multiple epochs that does not require intranight tracklets or a predefined cadence of observations within a search window. By sparsely covering regions of interest in the phase space with “test orbits,” transforming nearby observations over a few nights into the corotating frame of the test orbit at each epoch, and then performing a generalized Hough transform on the transformed detections followed by orbit-determination filtering, candidate clusters of observations belonging to the same objects can be recovered at moderate computational cost and with little to no constraint on cadence. We validate the effectiveness of this approach by running on simulations as well as on real data from the Zwicky Transient Facility (ZTF). Applied to a short, two-week slice of ZTF observations, we demonstrate that THOR can recover 97.4% of all previously known and discoverable objects in the targeted (a > 1.7 au) population with five or more observations, and with purity between 97.7% and 100%. This includes 10 likely new discoveries and a recovery of the e ∼ 1 comet C/2018 U1 (the comet would have been a ZTF discovery had THOR been running in 2018 when the data were taken). The THOR package and demo Jupyter notebooks are open source and available at https://github.com/moeyensj/thor.
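The clustering step can be illustrated with a toy 2D stand-in for the corotating-frame transform and generalized Hough transform (the rates, noise levels, and vote threshold below are invented for the example): subtract a test orbit's assumed mean sky-plane motion from every detection so that objects moving at similar rates collapse into tight clumps, then bin the shifted positions and keep well-populated cells:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated detections over five nights: two slow movers plus unrelated
# noise. Each row: (t [days], ra [deg], dec [deg]); rates are deg/day.
objects = [((10.02, 5.02), (0.102, -0.013)),
           ((10.33, 5.22), (0.097, -0.016))]
dets = []
for (ra0, dec0), (dra, ddec) in objects:
    for ti in range(5):
        dets.append((ti, ra0 + dra * ti + rng.normal(0, 1e-3),
                         dec0 + ddec * ti + rng.normal(0, 1e-3)))
for _ in range(30):  # unassociated single-night detections
    dets.append((rng.uniform(0, 5), rng.uniform(9.5, 11), rng.uniform(4.5, 6)))
dets = np.array(dets)

# "Test orbit" stand-in: an assumed mean sky-plane rate. Subtracting its
# motion makes detections of objects with similar rates collapse together.
test_rate = np.array([0.10, -0.015])
rot = dets[:, 1:] - test_rate * dets[:, :1]

# Crude Hough-style vote: bin the shifted positions on a coarse grid and
# keep cells with at least 4 detections as candidate linkages.
cell = 0.05
keys = np.floor(rot / cell).astype(int)
uniq, counts = np.unique(keys, axis=0, return_counts=True)
clusters = uniq[counts >= 4]
print(len(clusters))  # the two simulated movers should be recovered
```

One test orbit only needs to be close to the motion of nearby objects, not exact, which is why sparse coverage of the phase space with test orbits suffices; THOR then confirms candidate clusters with orbit determination.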
We introduce a new computational technique for searching for faint moving sources in astronomical images. Starting from a maximum-likelihood estimate for the probability of the detection of a source within a series of images, we develop a massively parallel algorithm for searching through candidate asteroid trajectories that utilizes graphics processing units (GPUs). This technique can search over 10¹⁰ possible asteroid trajectories in stacks of the order of 10–15 4K × 4K images in under a minute using a single consumer-grade GPU. We apply this algorithm to data from the 2015 campaign of the High Cadence Transient Survey (HiTS) obtained with the Dark Energy Camera (DECam). We find 39 previously unknown Kuiper belt objects (KBOs) in the 150 square degrees of the survey. Comparing these asteroids to an existing model for the inclination distribution of the Kuiper belt, we demonstrate that we recover a KBO population above our detection limit consistent with previous studies. Software used in this analysis is made available as an open-source package.
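The likelihood search can be sketched as a brute-force NumPy version: plant a source too faint to trust in any single frame, then co-add pixel values along every candidate constant-velocity trajectory and keep the highest signal-to-noise peak. This toy omits the PSF matched filtering and GPU parallelism of the real implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stack of unit-variance noise images containing one faint source moving at
# constant velocity; per-frame S/N of 3 is unreliable in a single image, but
# co-adding along the right trajectory boosts it by roughly sqrt(N).
n_img, size = 10, 64
vx_true, vy_true = 1, 2          # pixels per epoch
x0, y0 = 5, 8                    # starting pixel
stack = rng.normal(0.0, 1.0, (n_img, size, size))
for k in range(n_img):
    stack[k, y0 + vy_true * k, x0 + vx_true * k] += 3.0

# Brute-force search: for each candidate velocity, shift every image back
# along the implied trajectory and co-add, then take the best
# signal-to-noise peak over all starting pixels and velocities.
best_snr, best_vel, best_start = -np.inf, None, None
for vx in range(-3, 4):
    for vy in range(-3, 4):
        total = np.zeros((size, size))
        for k in range(n_img):
            total += np.roll(stack[k], (-vy * k, -vx * k), axis=(0, 1))
        snr = total / np.sqrt(n_img)  # unit-variance pixels
        if snr.max() > best_snr:
            iy, ix = np.unravel_index(snr.argmax(), snr.shape)
            best_snr, best_vel, best_start = snr.max(), (vx, vy), (ix, iy)
print(best_vel, best_start)
```

The nested loops make the trajectory count multiplicative (starting pixels × velocities), which is exactly the structure that maps well onto a GPU: each trajectory's sum is independent of all the others.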
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and optimizing survey strategy. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality, we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
Abstract
This paper presents a new optical imaging survey of four deep drilling fields (DDFs), two Galactic and two extragalactic, with the Dark Energy Camera (DECam) on the 4-m Blanco telescope at the Cerro Tololo Inter-American Observatory (CTIO). During the first year of observations in 2021, >4000 images covering 21 deg² (seven DECam pointings), with ∼40 epochs (nights) per field and 5 to 6 images per night per filter in g, r, i, and/or z, have become publicly available (the proprietary period for this program is waived). We describe the real-time difference-image pipeline and how alerts are distributed to brokers via the same distribution system as the Zwicky Transient Facility (ZTF). In this paper, we focus on the two extragalactic deep fields (COSMOS and ELAIS-S1), characterizing the detected sources and demonstrating that the survey design is effective for probing the discovery space of faint and fast variable and transient sources. We describe and make publicly available 4413 calibrated light curves based on difference-image detection photometry of transients and variables in the extragalactic fields. We also present preliminary scientific analysis regarding Solar System small bodies, stellar flares and variables, Galactic anomaly detection, fast-rising transients and variables, supernovae, and active galactic nuclei.
Abstract
The DECam Ecliptic Exploration Project (DEEP) is a deep survey of the trans-Neptunian solar system being carried out on the 4 m Blanco telescope at the Cerro Tololo Inter-American Observatory in Chile using the Dark Energy Camera (DECam). By using a shift-and-stack technique to achieve a mean limiting magnitude of r ∼ 26.2, DEEP achieves an unprecedented combination of survey area and depth, enabling quantitative leaps forward in our understanding of the Kuiper Belt populations. This work reports results from an analysis of twenty 3 deg² DECam fields along the invariable plane. We characterize the efficiency and false-positive rates for our moving-object detection pipeline, and use this information to construct a Bayesian signal probability for each detected source. This procedure allows us to treat all of our Kuiper Belt object (KBO) detections statistically, simultaneously accounting for efficiency and false positives. We detect approximately 2300 candidate sources with KBO-like motion at signal-to-noise ratios > 6.5. We use a subset of these objects to compute the luminosity function of the Kuiper Belt as a whole, as well as of the cold classical (CC) population. We also investigate the absolute magnitude (H) distribution of the CCs and find consistency with both an exponentially tapered power law, which is predicted by streaming instability models of planetesimal formation, and a rolling power law. Finally, we provide an updated mass estimate for the CC Kuiper Belt of M_CC(H_r < 12) = 0.0017 (+0.0010, −0.0004) M⊕, assuming albedo p = 0.15 and density ρ = 1 g cm⁻³.