We study the implications of ultrahigh-energy cosmic-ray (UHECR) data from the Pierre Auger Observatory for potential accelerator candidates and cosmogenic neutrino fluxes for different combinations of nuclear disintegration and air-shower models. We exploit the most recent spectral and mass composition data (2017) with a new, computationally efficient simulation code, PriNCe. We extend a systematic framework, which has previously been applied in a combined fit by the Pierre Auger Collaboration, with the cosmological source evolution as an additional free parameter. In this framework, an ensemble of generalized UHECR accelerators is characterized by a universal spectral index (equal for all injection species), a maximal rigidity, and the normalizations of five nuclear element groups. We find that the 2017 data favor a small but constrained contribution of heavy elements (iron) at the source. We demonstrate that the results depend moderately on the nuclear disintegration model (Puget-Stecker-Bredekamp, PEANUT, or TALYS) and more strongly on the air-shower model (EPOS-LHC, Sibyll 2.3, or QGSjetII-04). Variations of these models result in different source evolutions and spectral indices, limiting the interpretation in terms of a particular class of cosmic accelerators. Better-constrained parameters include the maximal rigidity and the mass composition at the source. Hence, the cosmogenic neutrino flux can be robustly predicted. Depending on the source evolution at high redshifts, the flux is likely out of reach of future neutrino observatories in most cases, and a minimal cosmogenic neutrino flux cannot be claimed from data without assuming a cosmological distribution of the sources.
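The generalized injection model described above, with a universal spectral index, a rigidity-dependent cutoff, and per-species normalizations, can be sketched as follows. This is a minimal illustration, not the fit setup itself: the broken-exponential cutoff shape is one common parametrization, and all parameter values and function names are assumptions.

```python
import numpy as np

# Sketch of a generalized UHECR injection model: each nuclear species is
# emitted with a universal power law of index gamma and a rigidity-dependent
# cutoff at R_max, so nuclei of larger charge Z cut off at proportionally
# higher energies.  The per-species normalizations f_Z are free parameters;
# all numbers here are purely illustrative.

Z_GROUPS = {"H": 1, "He": 2, "N": 7, "Si": 14, "Fe": 26}

def injection(E_EeV, Z, gamma=1.0, R_max_EV=2.0, norm=1.0):
    """Injected spectrum dN/dE with a broken-exponential rigidity cutoff."""
    E_cut = Z * R_max_EV  # cutoff energy in EeV scales with charge Z
    shape = np.where(E_EeV < E_cut, 1.0, np.exp(1.0 - E_EeV / E_cut))
    return norm * E_EeV ** (-gamma) * shape

E = np.logspace(0, 2.5, 50)  # 1 to ~300 EeV
spectra = {el: injection(E, Z) for el, Z in Z_GROUPS.items()}
```

At a fixed energy above the proton cutoff, heavier species are suppressed far less, which is how the rigidity dependence imprints itself on the mass composition at the highest energies.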
We investigate whether the neutrino emission observed in 2014-2015 from the direction of the blazar TXS 0506+056 can be accommodated with leptohadronic multiwavelength models of the source commonly adopted for the 2017 flare. While multiwavelength data during the neutrino flare are sparse, the large number of neutrino events (13 ± 5) is hard to reconcile with the missing activity in gamma-rays. We illustrate that two to five neutrino events during the flare can be explained with leptohadronic models of different categories: a one-zone model, a compact-core model, and an external radiation field model. If, however, significantly more events were to be accommodated, the predicted multiwavelength emission levels would be in conflict with observational X-ray constraints, or with the high-energy gamma-ray fluxes observed by the Fermi Large Area Telescope, depending on the model. For example, while the external radiation field model can predict up to five neutrino events without violating X-ray constraints, the absorption of high-energy gamma-rays is in minor tension with the data. We therefore do not find any model that can simultaneously explain the high event number quoted by IceCube and the (sparse) electromagnetic data during the neutrino flare.
We discuss the production of ultra-high-energy cosmic-ray (UHECR) nuclei and neutrinos from blazars. We compute the nuclear cascade in the jet for both BL Lac objects and flat-spectrum radio quasars (FSRQs), and in the ambient radiation zones for FSRQs as well. By modeling representative spectral energy distributions along the blazar sequence, two distinct regimes are identified, which we call "nuclear survival" (typically found in low-luminosity BL Lacs) and "nuclear cascade" (typically found in high-luminosity FSRQs). We quantify how the neutrino and cosmic-ray (CR) emission efficiencies evolve over the blazar sequence, and we demonstrate that neutrinos and CRs come from very different object classes. For example, high-frequency-peaked BL Lacs (HBLs) tend to produce CRs, and high-luminosity FSRQs are the more efficient neutrino emitters. This conclusion does not depend on the CR escape mechanism, for which we discuss two alternatives (diffusive and advective escape). Finally, the neutrino spectrum from blazars is shown to depend significantly on the injection composition into the jet, especially in the nuclear cascade case: injection compositions heavier than protons lead to reduced neutrino production at the peak, which at the same time shifts to lower energies. Thus, these sources will exhibit better compatibility with the observed IceCube and UHECR data.
High-energy cosmic rays are observed indirectly by detecting the extensive air showers initiated in Earth’s atmosphere. The interpretation of these observations relies on accurate models of air-shower physics, which is a challenge and an opportunity to test QCD under extreme conditions. Air showers are hadronic cascades, which give rise to a muon component through hadron decays. The muon number is a key observable to infer the mass composition of cosmic rays. Air shower simulations with state-of-the-art QCD models show a significant muon deficit with respect to measurements; this is called the Muon Puzzle. By eliminating other possibilities, we conclude that the most plausible cause for the muon discrepancy is a deviation in the composition of secondary particles produced in high-energy hadronic interactions from current model predictions. The muon discrepancy starts at the TeV scale, which suggests that this deviation is observable at the Large Hadron Collider. An enhancement of strangeness production has been observed at the LHC in high-density events, which can potentially explain the puzzle, but the impact of the effect on forward produced hadrons needs further study, in particular with future data from oxygen beam collisions.
The determination of the injection composition of cosmic ray nuclei within astrophysical sources requires sufficiently accurate descriptions of the source physics and the propagation, apart from controlling astrophysical uncertainties. We therefore study the implications of nuclear data and models for cosmic ray astrophysics, which involves the photo-disintegration of nuclei up to iron in astrophysical environments. We demonstrate that the impact of nuclear model uncertainties is potentially larger in environments with non-thermal radiation fields than in the cosmic microwave background. We also study the impact of nuclear models on the nuclear cascade in a gamma-ray burst radiation field, simulated at a level of complexity comparable to the most precise cosmic ray propagation code. We conclude with an isotope chart describing which information is in principle necessary to describe nuclear interactions in cosmic ray sources and propagation.
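The statement that non-thermal radiation fields can probe nuclear model uncertainties differently than the cosmic microwave background follows from simple kinematics: a photon of lab-frame energy eps reaches the giant dipole resonance (GDR) region, around 10-20 MeV in the nucleus rest frame, once the nucleus Lorentz factor is roughly gamma ~ eps_gdr / (2 eps). A minimal sketch, with a representative (assumed) GDR energy:

```python
# Threshold kinematics for photo-disintegration: a head-on photon of
# lab energy eps_eV is boosted to ~2*gamma*eps in the nucleus rest
# frame, so the giant dipole resonance (GDR) is reached at a Lorentz
# factor gamma ~ eps_gdr / (2 * eps).  The GDR energy below is a
# representative assumed value, not a fitted nuclear parameter.

EPS_GDR_MEV = 15.0  # typical GDR energy in the nucleus rest frame (assumed)

def lorentz_factor_at_threshold(eps_eV):
    """Nucleus Lorentz factor at which head-on photons reach the GDR."""
    return EPS_GDR_MEV * 1e6 / (2.0 * eps_eV)

print(lorentz_factor_at_threshold(6.3e-4))  # typical CMB photon (~0.63 meV)
print(lorentz_factor_at_threshold(10.0))    # a 10 eV photon in a source field
```

Because source photons are many orders of magnitude more energetic than CMB photons, disintegration in sources sets in at much lower Lorentz factors, where different parts of the cross sections (and hence different nuclear models) matter.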
We discuss the production of multiple astrophysical messengers (neutrinos, cosmic rays, gamma-rays) in the Gamma-Ray Burst (GRB) internal shock scenario, focusing on the impact of the collision dynamics between two shells on the fireball evolution. In addition to the inelastic case, in which plasma shells merge when they collide, we study the Ultra Efficient Shock scenario, in which a fraction of the internal energy is re-converted into kinetic energy and, consequently, the two shells survive and remain in the system. We find that in all cases, a quasi-diffuse neutrino flux from GRBs at the level of 10^-11 GeV cm^-2 s^-1 sr^-1 (per flavor) is expected for protons and a baryonic loading of 10, which is potentially within the reach of IceCube-Gen2. The highest impact of the collision model on multi-messenger production is observed for the Ultra Efficient Shock scenario, which promises high conversion efficiencies from kinetic to radiated energy. However, the assumption that the plasma shells separate after a collision and survive as separate shells within the fireball is found to be rarely justified in a multicollision model that uses hydrodynamical simulations with the PLUTO code for individual shell collisions.
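The inelastic ("merged shell") collision mentioned above can be sketched with the standard ultrarelativistic approximation: two shells of masses m1, m2 and Lorentz factors G1 > G2 merge into one shell, and the kinetic energy lost in the merger becomes internal (dissipated) energy available for radiation. This is a textbook estimate under stated assumptions, not the multicollision model of the paper, and the values are illustrative.

```python
import math

# Inelastic two-shell collision in the ultrarelativistic approximation:
# the merged Lorentz factor follows from energy-momentum conservation,
# and the drop in bulk kinetic energy is the dissipated internal energy.

def merged_lorentz_factor(m1, G1, m2, G2):
    """Lorentz factor of the merged shell (ultrarelativistic limit)."""
    return math.sqrt((m1 * G1 + m2 * G2) / (m1 / G1 + m2 / G2))

def dissipated_fraction(m1, G1, m2, G2):
    """Fraction of the initial bulk kinetic energy converted to internal energy."""
    Gm = merged_lorentz_factor(m1, G1, m2, G2)
    e_initial = m1 * G1 + m2 * G2    # energies in units of c^2
    e_final = (m1 + m2) * Gm
    return (e_initial - e_final) / e_initial

# A fast shell (G = 500) catching up with a slow one (G = 100), equal masses:
print(dissipated_fraction(1.0, 500.0, 1.0, 100.0))
```

Large Lorentz-factor contrasts between shells dissipate a sizable fraction of the kinetic energy, which is why the collision model (merge versus survive) matters so much for the radiated multi-messenger output.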
An efficient method for calculating inclusive conventional and prompt atmospheric lepton fluxes is presented. The coupled cascade equations are solved numerically by formulating them as a matrix equation. The presented approach is very flexible and allows the use of different hadronic interaction models, realistic parametrizations of the primary cosmic-ray flux and the Earth's atmosphere, and a detailed treatment of particle interactions and decays. The power of the developed method is illustrated by calculating lepton flux predictions for a number of different scenarios.
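The matrix formulation of the cascade equations can be illustrated on a toy system: discretize the fluxes of all species on an energy grid into one state vector, collect interaction losses and secondary production into one operator matrix, and step in slant depth. This is a deliberately minimal sketch with assumed numbers and a forward-Euler step, not the full treatment (which also includes decays, energy losses, and realistic yields).

```python
import numpy as np

# Toy matrix form of the coupled cascade equations, d(phi)/dX = M @ phi,
# where M collects interaction losses (diagonal) and secondary-particle
# production (off-diagonal).  Two toy species ("nucleons" feeding
# "mesons") on a 3-bin energy grid; all values are illustrative.

n_species, n_bins = 2, 3
dim = n_species * n_bins

lambda_int = np.array([90.0, 120.0])         # interaction lengths in g/cm^2 (assumed)
M = np.diag(-np.repeat(1.0 / lambda_int, n_bins))
for i in range(n_bins):
    # crude production term: nucleons produce mesons in the same energy bin
    M[n_bins + i, i] = 0.3 / lambda_int[0]

phi = np.zeros(dim)
phi[:n_bins] = [1.0, 0.5, 0.1]               # injected primary spectrum

dX = 1.0                                     # slant-depth step in g/cm^2
for _ in range(1000):                        # propagate to X = 1000 g/cm^2
    phi = phi + dX * (M @ phi)               # forward-Euler step

print(phi)                                   # attenuated nucleons, produced mesons
```

The payoff of the matrix form is that swapping hadronic interaction models or atmospheres only changes the entries of the operator, while the solver stays the same.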
The origin of ultra-high-energy cosmic rays is a 60-year-old mystery. We show that with more events at the highest energies (above 150 EeV) it may be possible to constrain the character of the sources and learn about the intervening magnetic fields. Individual sources become more prominent, relative to the background, as the horizon diminishes. An observatory with event-by-event composition sensitivity would allow a “tomography” of the sources, as different mass and energy groups probe different GZK horizons. A major goal here is to provide a methodology to distinguish between steady and transient or highly variable sources. Using recent Galactic magnetic field models, we calculate “treasure” sky maps to identify the most promising directions for detecting extreme-energy cosmic-ray doublets, events that are close in arrival time and direction. On this basis, we predict the incidence of doublets as a function of the nature of the source host galaxy. Based on the asymmetry in the distribution of time delays, we show that the observation of doublets might distinguish source models. In particular, the Telescope Array hotspot could exhibit temporal variability, as it lies in a “magnetic window” of small time delays. These considerations could improve the use of data from existing facilities and the planning of future ones such as the Global Cosmic Ray Observatory (GCOS).
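The time-delay argument above rests on an order-of-magnitude estimate: small magnetic deflections theta accumulated over a distance D spread the arrival times of a transient's particles by roughly Dt ~ D * theta^2 / (4c). The deflection scaling used below, for a turbulent field of rms strength B and coherence length lam, is a standard approximation, and all parameter values are illustrative assumptions rather than results from the paper.

```python
import math

# Order-of-magnitude arrival-time spread for UHECRs from a transient
# source: Dt ~ D * theta^2 / (4c), with theta the rms deflection in a
# turbulent extragalactic field.  The 0.8-degree prefactor is a common
# approximate scaling; all inputs are illustrative.

MPC_IN_M = 3.086e22   # meters per megaparsec
C = 2.998e8           # speed of light in m/s
YEAR = 3.156e7        # seconds per year

def rms_deflection_deg(Z, E_EeV, D_Mpc, B_nG, lam_Mpc):
    """Approximate rms deflection angle in degrees (turbulent field)."""
    return 0.8 * Z * (100.0 / E_EeV) * math.sqrt(D_Mpc / 10.0) \
        * math.sqrt(lam_Mpc) * B_nG

def time_delay_yr(Z, E_EeV, D_Mpc, B_nG, lam_Mpc):
    """Spread of arrival times, Dt ~ D * theta^2 / (4c), in years."""
    theta = math.radians(rms_deflection_deg(Z, E_EeV, D_Mpc, B_nG, lam_Mpc))
    return D_Mpc * MPC_IN_M * theta ** 2 / (4.0 * C) / YEAR

# A 150 EeV proton from a source at 10 Mpc in a 1 nG field:
print(time_delay_yr(Z=1, E_EeV=150, D_Mpc=10, B_nG=1, lam_Mpc=1))
```

Because the delay scales as Z^2 and E^-2, light nuclei at the highest energies arrive nearly on time while heavier or lower-energy events lag by centuries, which is the handle that lets doublets discriminate steady from transient sources.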