The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the very successful first running period in 2010–2013, the LHC routinely stored protons at 3.5–4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence a multistage collimation system has been installed in order to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte Carlo simulations of particle–matter interactions both in collimators and in machine elements hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complexity of the simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning over 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are also used for the design of future accelerators.
One source of experimental background in the CERN Large Hadron Collider (LHC) is particles entering the detectors from the machine. These particles are created in cascades caused by upstream interactions of beam protons with residual gas molecules or collimators. We estimate the losses on the collimators with SixTrack and simulate the showers with FLUKA and MARS to obtain the flux and distribution of particles entering the ATLAS and CMS detectors. We consider some machine configurations used in the first LHC run, with a focus on 3.5 TeV operation as in 2011. Results from FLUKA and MARS are compared and very good agreement is found. An analysis of logged LHC data provides absolute beam loss rates for the different processes, which are used together with further simulations of vacuum conditions to normalize the results to rates of particles entering the detectors. We assess the relative importance of background from elastic and inelastic beam–gas interactions and of leakage out of the LHC collimation system, and show that beam–gas interactions are the dominant source of machine-induced background for the studied machine scenarios. Our results serve as a starting point for the experiments to perform further simulations in order to estimate the resulting signals in the detectors.
•We simulate sources of machine-induced experimental background at the CERN LHC.
•We focus on the ATLAS and CMS experiments.
•The LHC machine conditions are analyzed to normalize the simulation results.
•Beam–gas interactions are found to be the dominant source of particles entering the detectors.
In this paper we report on the performance of a Hybrid Pixel Detector, consisting of an IBEX ASIC bump-bonded to a 450 μm-thick silicon sensor with a pixel size of 75 μm, used as a direct electron detector up to 300 keV. The count homogeneity was found to have a dispersion below 1%. Energy spectra were recorded for electron energies in the range 20–80 keV, showing a significant full-peak energy loss due to the entrance-contact dead layer only for impinging electrons below 30 keV. The MTF and DQE were measured at 100 keV and 200 keV in the low-flux regime for several threshold energies. Zero-frequency DQE values are up to 0.85 and 0.9 for the two electron energies, respectively, while for increasing frequencies the DQE falls more rapidly in the 200 keV case as a consequence of the greater degradation in spatial resolution. Based on the analysis of the cluster distribution of single events, we were also able to estimate the DQE(0) at 300 keV, which is close to unity for low thresholds. The system response linearity was tested at 80 keV in both paralyzable and non-paralyzable counting mode, yielding a 10% count loss at 0.8 Mcts/s/pix and 1.7 Mcts/s/pix, respectively, and a 50% count loss at 7 Mcts/s/pix and 16 Mcts/s/pix, respectively. FLUKA-based Monte Carlo simulations were used to cross-validate the experimental results and to gain a better understanding of the underlying physical processes.
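The paralyzable and non-paralyzable counting behaviours quoted above follow the standard dead-time models m = n·exp(−nτ) and m = n/(1 + nτ), where n is the true rate, m the measured rate and τ the dead time. The following is a minimal sketch of evaluating count-loss fractions under these models; the per-pixel dead time τ used here is purely hypothetical and the rates are illustrative, not the measured values from the paper.

```python
import math

def nonparalyzable_measured(n, tau):
    """Measured rate of a non-paralyzable counter: m = n / (1 + n*tau)."""
    return n / (1.0 + n * tau)

def paralyzable_measured(n, tau):
    """Measured rate of a paralyzable counter: m = n * exp(-n*tau)."""
    return n * math.exp(-n * tau)

def loss_fraction(n, tau, model):
    """Fraction of true counts lost at true rate n for the given model."""
    return 1.0 - model(n, tau) / n

tau = 100e-9  # hypothetical per-pixel dead time of 100 ns

# Non-paralyzable: 10% loss occurs exactly at n*tau = 1/9, 50% loss at n*tau = 1.
n_10pct = (1.0 / 9.0) / tau
# Paralyzable: 50% loss occurs exactly at n*tau = ln(2).
n_50pct = math.log(2.0) / tau
```

In both models the loss fraction grows monotonically with n·τ, but the paralyzable counter loses counts faster at high rates because every new hit extends the dead period.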
The First G-APD Cherenkov Telescope (FACT) is designed to detect cosmic gamma rays with energies from several hundred GeV up to about 10 TeV using the Imaging Atmospheric Cherenkov Technique. In contrast to former or existing telescopes, the camera of the FACT telescope is composed of solid-state Geiger-mode Avalanche Photodiodes (G-APDs) instead of photomultiplier tubes for photodetection. It is the first full-scale device of its kind employing this new technology. The telescope has been operated at the Observatorio del Roque de los Muchachos (La Palma, Canary Islands, Spain) since fall 2011. This paper describes in detail the design, construction and operation of the system, including hardware and software aspects. Technical experience gained after one year of operation is discussed and conclusions with regard to future projects are drawn.
•A novel method for coupled simulations of high-energy particle beam effects on structures.
•Multi-physics problem approach.
•Integration of Monte Carlo-based particle physics and explicit FE numerical models.
•Analysis of high energy deposition shockwaves.
•Numerical prediction of material phase change and damage.
The prediction of material response to interaction with successive high-energy proton bunches requires new tools and multidisciplinary approaches. The impact leads to the propagation of shock waves, which travel through the impacted component, causing a substantial density reduction and the appearance of a tunneling effect along the beam direction. To take this effect into account, an automatic procedure coupling the FLUKA Monte Carlo and LS-DYNA FE codes has been developed. The case study consists of the accidental loss of 60 bunches of one of the 7 TeV proton beams of the Large Hadron Collider (CERN) on a tungsten collimator.
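The coupling procedure is only described at a high level above; a hypothetical sketch of the data flow between the two codes might look like the loop below. The `run_fluka` and `run_lsdyna` functions are toy placeholders for the external code invocations (not real APIs), and all numbers are illustrative; only the feedback structure is the point: the density map updated by the FE code is fed back into the energy-deposition calculation so that the tunneling effect can develop.

```python
# Toy sketch of the iterative Monte Carlo / FE coupling described above.
# Placeholder physics: deposition per unit mass rises as density drops
# (the beam penetrates deeper), and high deposition reduces density.

def run_fluka(density_map, n_bunches):
    """Placeholder for a FLUKA run: energy deposition per cell,
    inversely proportional to the local density."""
    return {cell: n_bunches * 5.0e2 / rho for cell, rho in density_map.items()}

def run_lsdyna(energy_map, density_map):
    """Placeholder for an LS-DYNA run: reduce density where the
    deposited energy exceeds a toy threshold."""
    return {cell: rho * 0.9 if energy_map[cell] > 5.0e2 else rho
            for cell, rho in density_map.items()}

density = {cell: 19.3 for cell in range(5)}  # tungsten, g/cm^3, toy 5-cell mesh
for step in range(3):                        # e.g. 20 bunches per coupling step
    edep = run_fluka(density, n_bunches=20)
    density = run_lsdyna(edep, density)
```

Each pass hands the FE-updated density map back to the energy-deposition step, which is the essential mechanism an uncoupled, single-pass simulation would miss.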
The focal-plane cameras of γ-ray telescopes frequently use light concentrators in front of the light sensors. The purpose of these concentrators is to increase the effective area of the camera as well as to reject stray light arriving at large incident angles. Such light concentrators are usually based on the Winston cone design. In this contribution we present the design of a hexagonal hollow light concentrator with a lateral profile optimized using a cubic Bézier function to achieve a higher collection efficiency in the angular region of interest. The design presented here is optimized for a Davies–Cotton telescope with a primary mirror about 4 m in diameter and a focal length of 5.6 m. The described concentrators are part of an innovative camera made up of silicon-photomultiplier sensors, although a similar approach can be used for other sizes of single-mirror telescopes with different camera sensors, including photomultipliers. The challenge of our approach is to achieve a cost-effective design suitable for standard industrial production of both the plastic concentrator substrate and the reflective coating, while at the same time maximizing the optical performance. In this paper we also describe the optical set-up used to measure the absolute collection efficiency of the light concentrators and demonstrate our good understanding of the measured data using a professional ray-tracing simulation.
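A cubic Bézier lateral profile of the kind mentioned above is fully determined by four control points, which makes it a compact parametrization for optimization. The sketch below evaluates such a profile; the control-point coordinates are purely illustrative placeholders, not the optimized values from the paper.

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]:
    B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3.0 * u**2 * t * b + 3.0 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Hypothetical (z, r) control points in mm: a cone tapering from a wide
# entrance aperture to a narrow exit matched to the sensor.
P0, P3 = (0.0, 11.5), (20.0, 3.1)   # entrance and exit of the concentrator
P1, P2 = (7.0, 10.0), (14.0, 5.0)   # interior points shaping the curvature

# Sample the lateral profile; an optimizer would vary P1 and P2 to maximize
# the collection efficiency in the angular region of interest.
profile = [cubic_bezier(P0, P1, P2, P3, i / 50) for i in range(51)]
```

Because the curve always starts at P0 and ends at P3, the aperture radii stay fixed during the optimization while the two interior control points shape the wall between them.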
Predicting the consequences of highly energetic particle beams impacting protection devices, such as collimators or high-power target stations, is a fundamental issue in the design of state-of-the-art facilities for high-energy particle physics.
These complex dynamic phenomena can be successfully simulated by resorting to highly non-linear numerical tools (hydrocodes). In order to produce accurate results, however, these codes require reliable material constitutive models that, for the extreme conditions induced by a destructive beam impact, are scarce and often inaccurate.
In order to derive or validate such models, a comprehensive, first-of-its-kind experiment has recently been carried out at the CERN HiRadMat facility: the tests performed entailed the controlled impact of intense and energetic proton pulses on a number of specimens made of six different materials. Experimental data were acquired relying on embedded instrumentation (strain gauges, temperature probes and vacuum sensors) and on remote-acquisition devices (a laser Doppler vibrometer and a high-speed camera).
The method presented in this paper, combining experimental measurements with numerical simulations, may find application in the assessment of materials under very high strain rates and temperatures in domains well beyond particle physics (severe accidents in fusion and fission nuclear facilities, space debris impacts, fast and intense loadings on materials and structures, etc.).
ArDM is a new-generation WIMP detector which will simultaneously measure light and charge from scintillation and ionization of liquid argon. Our goal is to construct, characterize and operate a 1 ton liquid argon underground detector. The project relies on the possibility of extracting the electrons produced by ionization from the liquid into the gas phase of the detector, where they are amplified and read out with Large Electron Multiplier (LEM) detectors. The argon VUV scintillation light has to be converted with wavelength shifters such as tetraphenyl butadiene in order to be detected by photomultipliers with bialkali photocathodes. We describe the status of the R&D on the LEM-based charge readout and the light readout system, and the first light readout tests with warm and cold argon gas in the full-size detector.
The radiation field generated by a high-energy, high-intensity accelerator is of concern in terms of threats to element functionality, component damage, electronics reliability and material activation, but it also provides signatures that allow the actual operating conditions to be monitored. The shower initiated by an energetic hadron involves many different physical processes, down to slow-neutron interactions and fragment de-excitation, which need to be accurately described for design purposes and to interpret operational events. The experience with the transport and interaction Monte Carlo code FLUKA at the Large Hadron Collider (LHC), operating at CERN with 4 TeV proton beams (and Pb beams of equivalent magnetic rigidity) and approaching nominal luminosity and energy, is presented. Design, operation and upgrade challenges are reviewed in the context of accounting for beam–machine interactions, and relevant benchmarking examples based on radiation monitor measurements are shown.