We present Zwicky Transient Facility (ZTF) observations of the tidal disruption flare AT2018zr/PS18kh, reported by Holoien et al. and detected during ZTF commissioning. The ZTF light curve of the tidal disruption event (TDE) samples the rise to peak exceptionally well, with 50 days of g- and r-band detections before the time of maximum light. We also present our multi-wavelength follow-up observations, including the detection of a thermal (kT ≈ 100 eV) X-ray source that is two orders of magnitude fainter than the contemporaneous optical/UV blackbody luminosity, and a stringent upper limit on the radio emission. We use observations of 128 known active galactic nuclei (AGNs) to assess the quality of the ZTF astrometry, finding a median host-flare distance of 0.″2 for genuine nuclear flares. Using ZTF observations of variability from known AGNs and supernovae, we show how these sources can be separated from TDEs. A combination of light-curve shape, color, and location in the host galaxy can be used to select a clean TDE sample from multi-band optical surveys such as ZTF or the Large Synoptic Survey Telescope.
The IceCube Neutrino Observatory is a cubic-kilometer neutrino telescope located at the geographic South Pole. For every observed neutrino event, there are over 10⁶ background events caused by cosmic-ray air-shower muons. To properly separate signal from background, it is necessary to produce Monte Carlo simulations of these air showers. Although IceCube has produced large quantities of background simulation to date, these studies remain statistics-limited. The first stage of the simulation requires heavy CPU usage, while the second stage requires heavy GPU usage. Processing both stages on the same node results in an underutilized GPU, while using different nodes encounters bandwidth bottlenecks. Furthermore, due to the power-law energy spectrum of cosmic rays, the memory footprint of the detector response often exceeds its limit in unpredictable ways. This proceeding presents new client–server code which parallelizes the first stage across multiple CPUs on the same node and then passes the output to the GPU for photon propagation. This results in GPU utilization greater than 90%, more predictable memory usage, and an overall factor-of-20 speed improvement over previous techniques.
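The client–server pattern described above can be illustrated with a minimal sketch: several CPU "client" workers generate air-shower chunks in parallel and push them onto a shared queue, while a single "server" drains the queue, standing in for the GPU photon-propagation stage that batches work to keep the GPU busy. Threads stand in for separate CPU processes here, and all names and payloads are illustrative, not the actual IceCube code.

```python
import queue
import threading

def cpu_client(shower_ids, q):
    """Stand-in for the CPU-heavy air-shower generation stage."""
    for sid in shower_ids:
        # Real code would run the shower simulation here; we emit a
        # dummy list of "photons to propagate" for each shower.
        q.put((sid, [sid * 10 + k for k in range(3)]))
    q.put(None)  # sentinel: this client is finished

def gpu_server(q, n_clients):
    """Single consumer standing in for GPU photon propagation."""
    results = {}
    finished = 0
    while finished < n_clients:
        item = q.get()
        if item is None:
            finished += 1
            continue
        sid, photons = item
        results[sid] = len(photons)  # real code: batch these onto the GPU
    return results

def run_pipeline(n_clients=2, per_client=4):
    q = queue.Queue()
    clients = [
        threading.Thread(
            target=cpu_client,
            args=(range(c * per_client, (c + 1) * per_client), q),
        )
        for c in range(n_clients)
    ]
    for t in clients:
        t.start()
    results = gpu_server(q, n_clients)
    for t in clients:
        t.join()
    return results

print(sorted(run_pipeline().items()))
```

Because the single consumer never idles while any client still has work queued, the expensive stage stays saturated; this is the property that yields the >90% GPU utilization quoted above.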
While tidal disruption events (TDEs) have long been heralded as laboratories for the study of quiescent black holes, the small number of known TDEs and uncertainties in their emission mechanism have hindered progress toward this promise. Here we present 17 new TDEs that have been detected recently by the Zwicky Transient Facility, along with Swift UV and X-ray follow-up observations. Our homogeneous analysis of the optical/UV light curves, including 22 previously known TDEs from the literature, reveals a clean separation of light-curve properties with spectroscopic class. The TDEs with Bowen fluorescence features in their optical spectra have smaller blackbody radii, lower optical luminosities, and higher disruption rates compared to the rest of the sample. The small subset of TDEs that show only helium emission lines in their spectra have the longest rise times, the highest luminosities, and the lowest rates. A high detection rate of Bowen lines in TDEs with small photometric radii could be explained by the high density that is required for this fluorescence mechanism. The stellar debris can provide a source for this dense material. Diffusion of photons through this debris may explain why the rise and fade timescales of the TDEs in our sample are not correlated. We also report, for the first time, the detection of soft X-ray flares from a TDE on ∼day timescales. Based on the fact that the X-ray flares peak at a luminosity similar to the optical/UV blackbody luminosity, we attribute them to brief glimpses through a reprocessing layer that otherwise obscures the inner accretion flow.
The Zwicky Transient Facility (ZTF) performs a systematic neutrino follow-up programme, searching for optical counterparts to high-energy neutrinos with dedicated Target-of-Opportunity (ToO) observations. Since first light in March 2018, ZTF has taken prompt observations for 24 high-quality neutrino alerts from the IceCube Neutrino Observatory, with a median latency of 12.2 h from initial neutrino detection. From two of these campaigns, we have already reported the tidal disruption event (TDE) AT 2019dsg and the likely TDE AT 2019fdr as probable counterparts, suggesting that TDEs contribute >7.8 per cent of the astrophysical neutrino flux. We here present the full results of our programme through to December 2021. No additional candidate neutrino sources were identified by our programme, allowing us to place the first constraints on the underlying optical luminosity function of astrophysical neutrino sources. Transients with optical absolute magnitudes brighter than −21 can contribute no more than 87 per cent of the total neutrino flux, while transients brighter than −22 can contribute no more than 58 per cent, neglecting the effect of extinction and assuming they follow the star formation rate. These are the first observational constraints on the neutrino emission of bright populations such as superluminous supernovae. None of the neutrinos were coincident with bright optical AGN flares comparable to that observed for TXS 0506+056/IC170922A; such optical blazar flares can produce no more than 26 per cent of the total neutrino flux. We highlight the outlook for electromagnetic neutrino follow-up programmes, including the expected potential of the Rubin Observatory.
Detector response to a high-energy physics process is often estimated by Monte Carlo simulation. For purposes of data analysis, the results of this simulation are typically stored in large multi-dimensional histograms, which can quickly become both too large to easily store and manipulate and numerically problematic due to unfilled bins or interpolation artifacts. We describe here an application of the penalized spline technique (Marx and Eilers, 1996) to efficiently compute B-spline representations of such tables, and discuss aspects of the resulting B-spline fits that simplify many common tasks in handling tabulated Monte Carlo data in high-energy physics analysis, in particular their use in maximum-likelihood fitting.
Program title: Photospline
Catalogue identifier: AEPK_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEPK_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: 2-clause BSD
No. of lines in distributed program, including test data, etc.: 9723
No. of bytes in distributed program, including test data, etc.: 156138
Distribution format: tar.gz
Programming language: C, Python
Computer: 32- and 64-bit x86, 32- and 64-bit PowerPC
Operating system: Linux, Mac OS X, FreeBSD
Has the code been vectorized or parallelized?: Both
RAM: Approximately proportional to number of knots used in fitting, depends on problem condition
Classification: 4.9
External routines: SuiteSparse (http://www.cise.ufl.edu/research/sparse/SuiteSparse/), Python (http://www.python.org/), BLAS (http://www.netlib.org/blas/), Numpy (http://www.numpy.org/)
Nature of problem:
An algorithm to smoothly represent histograms, including mathematical operations and convolutions. Using histograms of Monte Carlo simulation for likelihood fitting can be unstable due to binning artifacts from statistical fluctuations and hard bin-to-bin transitions. This package provides a toolkit for using penalized spline fits on extremely large multi-dimensional datasets to reduce or eliminate such issues.
Solution method:
Uses sparse matrix operations, non-negative least-squares fitting, and generalized linear array models in conjunction with a number of other algorithms to allow fits to be made, manipulated, and saved with very low computational requirements. This enables even very large problems to be solved on commercially available machines.
Restrictions:
Required computation time and memory increase very rapidly with the number of dimensions. Fits without stacking involving more than 5 dimensions and 20 knots on each are usually not practical on 2012-era hardware.
Running time:
Roughly proportional to the cube of the number of knots used, depends strongly on conditioning of the problem.
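The penalized-spline idea underlying the package can be sketched in one dimension: fit B-spline coefficients to noisy tabulated data by least squares, with a difference penalty on adjacent coefficients supplying the smoothness that plain histogram interpolation lacks. This is a minimal NumPy/SciPy illustration of the technique, not the Photospline API; the knot placement, penalty order, and smoothing strength λ are all illustrative choices.

```python
import numpy as np
from scipy.interpolate import BSpline

# Toy noisy "histogram": bin centers x and contents y drawn from a
# smooth curve plus statistical fluctuations.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
y = np.exp(-0.5 * ((x - 0.5) / 0.15) ** 2) + rng.normal(0.0, 0.02, x.size)

k = 3            # cubic B-splines
n_interior = 12  # illustrative knot count
# Knot vector with (k+1)-fold repeated boundary knots.
t = np.concatenate([
    np.full(k + 1, 0.0),
    np.linspace(0.0, 1.0, n_interior + 2)[1:-1],
    np.full(k + 1, 1.0),
])
B = BSpline.design_matrix(x, t, k).toarray()  # basis matrix, (60, n_coef)

# Second-order difference penalty on the coefficients, as in the
# P-spline formulation: solve (B'B + lam * D'D) c = B'y.
n_coef = B.shape[1]
D = np.diff(np.eye(n_coef), n=2, axis=0)
lam = 1.0
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)

fit = BSpline(t, coef, k)
print(np.abs(fit(x) - y).max())  # residuals at the level of the noise
```

The penalty term damps bin-to-bin fluctuations without the hard transitions of a raw histogram, which is why such fits behave well inside a maximum-likelihood minimizer. The package extends this scheme to many dimensions with sparse-matrix machinery, where the normal-equation solve above would otherwise be intractable.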
The origins of the high-energy cosmic neutrino flux remain largely unknown. Recently, one high-energy neutrino was associated with a tidal disruption event (TDE). Here we present AT2019fdr, an exceptionally luminous TDE candidate coincident with another high-energy neutrino. Our observations, including a bright dust echo and soft late-time X-ray emission, further support a TDE origin for this flare. The probability of finding two such bright events by chance is just 0.034%. We evaluate several models for neutrino production and show that AT2019fdr is capable of producing the observed high-energy neutrino, reinforcing the case for TDEs as neutrino sources.
The IceCube Neutrino Observatory [1] was designed primarily to search for high-energy (≳100 TeV) neutrinos produced in distant astrophysical objects, and a search for high-energy neutrinos interacting inside the instrumented volume has recently provided evidence for a diffuse flux of such neutrinos [2]. Its energy threshold is, however, low enough to detect large numbers of neutrinos from the weak decays of pions and kaons produced in air showers; both the atmospheric νμ flux [3] and the sub-dominant atmospheric νe flux [4] have been observed by IceCube. A second, harder component of the atmospheric neutrino flux, from decays of short-lived charmed mesons in air showers, has yet to be conclusively observed. Here, we present a strategy for extending the search for neutrino interactions in the instrumented volume down to ∼1 TeV and discuss the challenges of disentangling possible contributions of atmospheric charm decay from the high-energy extraterrestrial neutrino flux.
Neutrinos offer a unique window to the distant, high-energy universe. Several next-generation instruments are being designed and proposed to characterize the flux of TeV–EeV neutrinos. The projected physics reach of these detectors is often quantified with simulation studies. However, a complete Monte Carlo estimate of detector performance is computationally costly, restricting the number of detector configurations considered when designing the instruments. In this paper, we present a new Python-based software framework, toise, which forecasts the performance of a high-energy neutrino detector using parameterizations of the detector performance, such as effective areas and angular and energy resolutions. The framework can be used to forecast the performance of a variety of physics analyses, including sensitivities to diffuse neutrino fluxes and to both transient and steady-state point sources. This parameterized approach reduces the need for extensive simulation studies to estimate detector performance, and allows the user to study the influence of a single performance metric, such as the angular resolution, in isolation. The framework is designed to allow for multiple detector components, each with different responses and exposure times, and supports parameterization of both optical- and radio-Cherenkov (Askaryan) neutrino telescopes. In the paper, we describe the mathematical concepts behind toise and provide detailed instructive examples to introduce the reader to the use of the framework.
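The core of such a parameterized forecast can be sketched in a few lines: instead of running full Monte Carlo, fold an assumed diffuse flux through a parameterized effective area and a livetime to obtain an expected event count. The power-law flux and the toy effective-area curve below are illustrative assumptions for the sketch, not toise's actual inputs or API.

```python
import numpy as np

def diffuse_flux(E):
    """Benchmark E^-2 astrophysical flux, GeV^-1 cm^-2 s^-1 sr^-1 (E in GeV).
    The normalization is illustrative only."""
    return 1e-8 * E ** -2

def effective_area(E):
    """Toy effective-area parameterization in cm^2, growing with energy."""
    return 1e4 * (E / 1e5) ** 0.5

def expected_events(E_min, E_max, livetime_s, solid_angle_sr=4 * np.pi, n=1000):
    """Expected events: integral of flux x area over energy, times solid
    angle and livetime, evaluated on a log-spaced grid by the trapezoid rule."""
    E = np.logspace(np.log10(E_min), np.log10(E_max), n)
    integrand = diffuse_flux(E) * effective_area(E)
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E))
    return integral * solid_angle_sr * livetime_s

# ~10 years of livetime over 100 TeV - 100 EeV (in GeV)
n_ev = expected_events(1e5, 1e8, livetime_s=10 * 3.15e7)
print(f"{n_ev:.1f} expected events in 10 years")
```

Because the detector enters only through `effective_area`, swapping in a different parameterization (or scaling the livetime for a second detector component) immediately updates the forecast, which is exactly the kind of configuration scan that full simulation makes expensive.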