The search for astrophysical high-energy neutrinos is one of the most important approaches to pinpoint the sources of cosmic rays. The advantage of using these neutral and only weakly interacting particles as messengers to look deep into the sources themselves is at the same time the main challenge: extremely large detectors are needed to measure a significant signal. With the completion of the large-volume detectors IceCube and ANTARES, the quantity and quality of the recorded data have reached a stage where many analyses are limited by systematic rather than statistical uncertainties. One such source of systematic error is the Monte Carlo description of the lepton energy losses, both before a lepton reaches the detector and for all leptons within the detector. A very accurate simulation of the propagation of muons through large amounts of matter is needed because a muon may sustain hundreds of interactions before it is detected by the experiment. The requirements on the precision of the muon propagation code are therefore very stringent: a stochastically correct description of the series of lepton interactions within the detector is needed to infer the lepton energy, and hence the neutrino energy, correctly from the measured signature. In this paper, the Monte Carlo code PROPOSAL (PRopagator with Optimal Precision and Optimized Speed for All Leptons) is presented as a public tool for muon propagation through transparent media. Up-to-date cross sections for ionization, bremsstrahlung, photonuclear interactions, electron pair production, the Landau–Pomeranchuk–Migdal and Ter-Mikaelian effects, muon and tau decay, as well as Molière scattering are implemented for different parametrizations. Thus, a full study of the systematic uncertainties arising from the theoretical description of lepton energy loss is possible, both for high-energy neutrino analyses and for other astroparticle physics experiments that rely on a proper description of lepton propagation. A numerical precision of better than 10⁻⁶ is achieved, reducing the numerical contribution to the systematic error of high-energy neutrino analyses to a minimum.
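To make the continuous/stochastic split concrete, the following is a deliberately simplified toy sketch of the propagation scheme such codes implement: relative energy losses below a threshold v_cut are absorbed into an average dE/dx = a + bE, while larger losses are sampled individually. All constants and the 1/v sampling shape are schematic assumptions for illustration; this is not PROPOSAL's API or its parametrizations.

```python
import math
import random

# Toy sketch of the continuous/stochastic split used by muon propagation
# codes. All numbers are schematic assumptions for standard rock, not
# PROPOSAL's actual parametrizations.
A = 0.26      # GeV per m.w.e.: ionization-like continuous loss
B = 3.5e-4    # 1/m.w.e.: radiative continuous-loss coefficient
V_CUT = 0.05  # relative-loss threshold between continuous and stochastic
RATE = 0.07   # 1/m.w.e.: assumed rate of stochastic interactions above V_CUT

def sample_relative_loss():
    """Sample v from a ~1/v spectrum on [V_CUT, 1) via inverse transform."""
    return V_CUT * (1.0 / V_CUT) ** random.random()

def propagate(energy, distance, step=10.0):
    """Propagate a muon over `distance` (m.w.e.); returns the final energy
    in GeV (0 if the muon stopped) and the list of stochastic losses."""
    losses, x = [], 0.0
    while x < distance and energy > 1.0:
        # continuous, sub-V_CUT losses over one step: dE/dx = A + B*E
        energy -= (A + B * energy) * step
        # at most one stochastic loss per step (coarse approximation)
        if energy > 1.0 and random.random() < 1.0 - math.exp(-RATE * step):
            v = sample_relative_loss()
            losses.append(v * energy)
            energy -= v * energy
        x += step
    return max(energy, 0.0), losses

final_energy, losses = propagate(1e5, 1000.0)  # 100 TeV muon, 1 km w.e.
print(f"final energy: {final_energy:.1f} GeV "
      f"after {len(losses)} stochastic losses")
```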
A growing empirical literature uses patent citations as a quality-adjusted measure of innovation, despite concerns about the validity of this measure. This paper links patents with objective measures of improvements in the quality of patented inventions, measured through performance in field trials for hybrid corn, to examine three potential factors that influence citations: (1) improvements in performance, (2) the citing practices of patent attorneys, and (3) the citing practices of patent examiners. The analysis reveals that citations are robustly correlated with performance, which confirms that citations are a useful quality-adjusted measure of innovation. The citing practices of patent attorneys and examiners, however, also influence citations. Patent attorneys cite early patents, which help establish the patentability of an invention; this practice may inflate citation counts for early patents, particularly for inventions that have only recently become patentable. Attorneys also add self-citations; our analysis indicates that self-citations are an indicator of follow-on invention. By comparison, examiner-added citations are typically unrelated to improvements in performance or to follow-on invention.
Data and the online appendix are available at https://doi.org/10.1287/mnsc.2016.2688.
Muons are the dominant event signature for neutrino telescopes like IceCube, and they are the main background for neutrino searches. Furthermore, they are used to investigate extensive air showers. In both cases, the stochasticity of muon propagation plays a key role in the data extraction step, and an accurate understanding, even of the edge cases, is crucial. The main process driving stochastic losses for TeV-scale muons is bremsstrahlung. In this paper, a feasibility study is presented to measure the cross section of stochastic losses using neutrino-induced muons. The simulation study is based on the propagation of muons using the Monte Carlo library PROPOSAL. For different reconstruction methods and resolutions, the energy loss distribution at different muon energies is used to estimate the sensitivity for measuring the bremsstrahlung cross section. Two further systematic parameters, the detection efficiency, which scales the amount of detected light, and the spectral index, are also estimated in order to analyze their correlation with the fitted bremsstrahlung normalization. The statistics of the simulated dataset correspond to 10 years of up-going muon neutrino data in IceCube.
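The fitting idea can be sketched as a binned Poisson likelihood in which a bremsstrahlung template is scaled against a fixed remainder (ionization, pair production, photonuclear). Everything below, the templates, bin contents, and the single-parameter fit, is an invented placeholder; the actual analysis is based on PROPOSAL simulations and fits the detection efficiency and spectral index alongside the normalization.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical per-bin expectations for the muon energy-loss histogram,
# split into a bremsstrahlung template and the sum of all other processes.
# Placeholder numbers, purely for illustration.
brems = np.array([120.0, 300.0, 180.0, 60.0, 15.0])
other = np.array([400.0, 250.0, 90.0, 20.0, 4.0])

rng = np.random.default_rng(42)
observed = rng.poisson(1.1 * brems + other)  # pseudo-data, true scale = 1.1

def nll(scale):
    """Binned Poisson negative log-likelihood (constant terms dropped)."""
    mu = scale * brems + other
    return np.sum(mu - observed * np.log(mu))

fit = minimize_scalar(nll, bounds=(0.1, 3.0), method="bounded")
print(f"fitted bremsstrahlung normalization: {fit.x:.3f}")
```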
By studying the variability of blazars across the electromagnetic spectrum, it is possible to resolve the underlying processes responsible for rapid flux increases, so-called flares. We report on an extremely bright X-ray flare in the high-peaked BL Lacertae object Markarian 421 (Mrk 421) that occurred simultaneously with enhanced γ-ray activity detected at very high energies by the First G-APD Cherenkov Telescope (FACT) on 2019 June 9. We triggered an observation with XMM–Newton, which observed the source quasi-continuously for 25 h. We find that the source was in the brightest state ever observed with XMM–Newton, reaching a flux of 2.8 × 10⁻⁹ erg cm⁻² s⁻¹ over the 0.3–10 keV energy range. We perform a spectral and timing analysis to reveal the mechanisms of particle acceleration and to search for the shortest source-intrinsic time-scales. Mrk 421 exhibits the typical harder-when-brighter behaviour throughout the observation and shows a clockwise hysteresis pattern, which indicates that cooling dominates over the acceleration process. While the X-ray emission in different sub-bands is highly correlated, we can exclude large time lags, as the computed z-transformed discrete correlation functions are consistent with zero lag. We find rapid variability on time-scales of 1 ks in the 0.3–10 keV band and down to 300 s in the hard X-ray band (4–10 keV). Taking these time-scales into account, we discuss different models to explain the observed X-ray flare and find that a plasmoid-dominated magnetic reconnection process describes our observation best.
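As a rough illustration of the lag analysis, the sketch below computes a plain discrete correlation function (Edelson & Krolik 1988) for two unevenly sampled light curves; the z-transformed variant used in the paper additionally applies Fisher's z-transform per lag bin. The light curves here are synthetic placeholders with zero intrinsic lag.

```python
import numpy as np

def dcf(t1, f1, t2, f2, lag_bins):
    """Plain discrete correlation function for unevenly sampled light
    curves. Returns bin centers and the mean correlation per lag bin."""
    u1 = (f1 - f1.mean()) / f1.std()
    u2 = (f2 - f2.mean()) / f2.std()
    pairs = u1[:, None] * u2[None, :]   # all standardized flux products
    lags = t2[None, :] - t1[:, None]    # pairwise time lags
    centers, values = [], []
    for lo, hi in zip(lag_bins[:-1], lag_bins[1:]):
        mask = (lags >= lo) & (lags < hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            values.append(pairs[mask].mean())
    return np.array(centers), np.array(values)

# Synthetic example: a soft and a hard X-ray light curve with zero lag.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 90.0, 300))     # ks, uneven sampling
signal = np.sin(2 * np.pi * t / 20.0)
soft = signal + 0.3 * rng.normal(size=t.size)
hard = signal + 0.3 * rng.normal(size=t.size)
centers, corr = dcf(t, soft, t, hard, np.arange(-10.0, 10.5, 2.0))
print(f"peak correlation at lag {centers[np.argmax(corr)]:.1f} ks")
```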
This paper argues that long-run trends in geographic segregation are inconsistent with models in which residential choice depends solely on local public goods (the Tiebout hypothesis). We develop an extension of the Tiebout model which predicts that, as mobility costs fall, the heterogeneity across communities of individual public good preferences and of public good provision must (weakly) increase. Given the secular decline in mobility costs, these predictions can be evaluated using historical data. We find decreasing heterogeneity in policies and in proxies for preferences across (i) a sample of U.S. municipalities (1870–1990); (ii) all Boston-area municipalities (1870–1990); and (iii) all U.S. counties (1850–1990).
This article evaluates the high-profile claim that enslaved African Americans produced over 50 percent of US national product in the pre-Civil War period. The accounting exercise shows the fraction was closer to (and indeed likely slightly below) the enslaved share of the population, about 12.6 percent in 1860. The enslaved population had higher rates of labor force participation, but they were also forced to work in sectors (agriculture and domestic service) with below-average output per worker. The economic surplus generated by the enslaved was due chiefly to the low value of the very basic consumption bundle provided to them, rather than to exceptionally high values of production per capita.
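The accounting logic reduces to a simple decomposition: the enslaved share of output equals the population share times relative labor force participation times relative output per worker. With purely illustrative numbers (not the paper's estimates), a participation rate 1.5 times the average combined with output per worker about two thirds of the average leaves the output share essentially at the population share:

$$\underbrace{0.126}_{\text{population share}} \times \underbrace{1.5}_{\text{relative participation}} \times \underbrace{0.67}_{\text{relative output per worker}} \approx 0.127.$$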
Recent advances in artificial intelligence and robotics have generated a robust debate about the future of work. An analogous debate occurred in the late nineteenth century, when mechanization first transformed manufacturing. We analyze an extraordinary dataset from that period, the Hand and Machine Labor study carried out by the US Department of Labor in the mid-1890s. We focus on transitions at the task level from hand to machine production, and on the impact of inanimate power, especially steam power, on labor productivity. Our analysis sheds light on the ability of modern task-based models to account for the effects of historical mechanization.
The “New History of Capitalism” grounds the rise of industrial capitalism in the production of raw cotton by American slaves. Recent works include Sven Beckert's Empire of Cotton, Walter Johnson's River of Dark Dreams, and Edward Baptist's The Half Has Never Been Told. All three authors mishandle historical evidence and mischaracterize important events in ways that affect their major interpretations of the nature of slavery, the workings of plantations, the importance of cotton and slavery in the broader economy, and the sources of the Industrial Revolution and world development.
The mammalian cochlear nucleus (CN) consists of a physiologically and morphologically diverse set of neurons that are involved in processing different aspects of the sound signal. One class of CN neurons, located near the entrance of the auditory nerve (AN) into the CN, has an oval soma with an eccentric nucleus and a short, bushy dendritic tree, and is called the globular/bushy cell (GBC). GBCs contact the principal cells of the medial nucleus of the trapezoid body (MNTB) through the very large calyx of Held, one of the most secure synapses in the brain. Because MNTB cells provide an inhibitory input to the lateral superior olive (LSO), a structure purported to play a role in lateralizing high-frequency sounds, GBC physiology is of great interest. Results were obtained by intracellular recording and subsequent labeling with neurobiotin of 32 GBCs, along with a number of cells characterized extracellularly as likely GBCs, in the cochlear nucleus of the cat. Their poststimulus discharge response pattern to repeated tones varies from a primarylike pattern, i.e., similar to the AN, to a primarylike pattern with a 0.5–2 ms notch after the initial spike, to an onset pattern with a low sustained rate. They can represent low-frequency tones and amplitude-modulated signals exceptionally well with a temporal code.
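The response classifications above are read off peristimulus time histograms (PSTHs) compiled over repeated tone presentations. A minimal sketch of that computation, with synthetic spike trains imitating a primarylike-with-notch response, is given below; all numbers are illustrative, not the study's data.

```python
import numpy as np

def psth(spike_trains, duration_ms, bin_ms=0.5):
    """Peristimulus time histogram: spike counts pooled over repeated
    stimulus presentations, normalized to firing rate (spikes/s)."""
    edges = np.arange(0.0, duration_ms + bin_ms, bin_ms)
    counts = np.zeros(len(edges) - 1)
    for train in spike_trains:
        counts += np.histogram(train, bins=edges)[0]
    # spikes/s = counts / (number of trials * bin width in seconds)
    rate = counts / (len(spike_trains) * bin_ms * 1e-3)
    return edges[:-1], rate

# Synthetic "primarylike with notch" response: a reliable onset spike,
# a brief pause, then sustained Poisson firing. Purely illustrative.
rng = np.random.default_rng(0)
trials = []
for _ in range(200):
    onset = [rng.normal(5.0, 0.2)]                      # onset spike ~5 ms
    sustained = rng.uniform(7.0, 50.0, rng.poisson(8))  # after a ~2 ms notch
    trials.append(np.concatenate([onset, np.sort(sustained)]))
t, rate = psth(trials, duration_ms=50.0)
print(f"peak rate {rate.max():.0f} spikes/s at {t[np.argmax(rate)]:.1f} ms")
```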