The Large Hadron Collider (LHC) at CERN is built to collide intense proton beams with an unprecedented energy of 7 TeV. The design stored energy of 362 MJ per beam makes the LHC beams highly destructive, so that any beam loss risks quenching superconducting magnets or damaging accelerator components. Collimators are installed to protect the machine, and they define a minimum normalized aperture below which no other element is allowed. This imposes a limit on the achievable luminosity: when β* (the β-function at the collision point) is squeezed to smaller values for increased luminosity, the β-function in the final focusing system increases, leading to a smaller normalized aperture that risks falling below the allowed collimation aperture. In the first run of the LHC, this was the main limitation on β*, which was constrained to values above the design specification. In this article, we show through theoretical and experimental studies how tighter collimator openings and a new optics with specific phase-advance constraints allow a β* as small as 40 cm, a factor of 2 smaller than the β* = 80 cm used in 2015 and significantly below the design value of β* = 55 cm, in spite of a lower beam energy. The proposed configuration with β* = 40 cm has been successfully put into operation and has been used throughout 2016 as the LHC baseline. The decrease in β* compared to 2015 has been an essential contribution to reaching and surpassing, in 2016, the LHC design luminosity for the first time, and to accumulating a record-high integrated luminosity of around 40 fb−1 in one year, in spite of using fewer bunches than in the design.
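The scaling behind the factor of 2 can be checked with a minimal sketch: for round beams, and ignoring crossing-angle and hourglass reduction factors, the standard luminosity formula gives L ∝ 1/β*, so halving β* from 80 cm to 40 cm doubles the luminosity. The parameter values below are generic LHC-like numbers chosen for illustration, not the actual 2016 machine settings.

```python
# Illustrative sketch (not from the article): round-beam luminosity
# L = f_rev * n_b * N^2 / (4 * pi * sigma^2), with sigma^2 = eps_n * beta* / gamma,
# so L is proportional to 1/beta*.  All parameters are assumed,
# generic LHC-like values, not the 2016 machine configuration.
import math

def luminosity(beta_star_m, n_protons=1.15e11, n_bunches=2200,
               f_rev=11245.0, eps_n=3.5e-6, gamma=6928.0):
    """Instantaneous luminosity [cm^-2 s^-1] for round beams."""
    sigma2 = eps_n * beta_star_m / gamma          # sigma^2 at the IP [m^2]
    lumi = f_rev * n_bunches * n_protons**2 / (4 * math.pi * sigma2)
    return lumi * 1e-4                            # m^-2 -> cm^-2

ratio = luminosity(0.40) / luminosity(0.80)
print(f"L(40 cm) / L(80 cm) = {ratio:.2f}")       # -> 2.00
```

With these assumed parameters the absolute value also comes out at the 10^34 cm⁻²s⁻¹ scale of the LHC design luminosity.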
Abstract
Renewed international interest in muon colliders motivates the continued investigation of the impact of beam-induced background on detector performance. This continues the effort initiated by the Muon Accelerator Program and carried out until 2017. The beam-induced background from muon decays directly impacts detector performance and must be mitigated by optimizing the overall machine design, with particular attention paid to the machine–detector interface region. In order to produce beam-induced background events and to study their characteristics in coordination with the collider optimization, a flexible simulation approach is needed. To achieve this goal we have chosen to combine the LineBuilder and FLUKA Monte Carlo codes. We report the results of beam-induced background studies performed with these tools for a 1.5 TeV center-of-mass energy collider configuration. Good agreement with previous simulations using the MARS15 code demonstrates that our choice of tools meets the accuracy and performance requirements for future optimization studies of muon collider designs.
The CERN Large Hadron Collider (LHC) is designed to collide proton beams of unprecedented energy, in order to extend the frontiers of high-energy particle physics. During the first, very successful running period in 2010–2013, the LHC routinely stored protons at 3.5–4 TeV with a total beam energy of up to 146 MJ, and even higher stored energies are foreseen in the future. This puts extraordinary demands on the control of beam losses. An uncontrolled loss of even a tiny fraction of the beam could cause a superconducting magnet to undergo a transition into a normal-conducting state, or in the worst case cause material damage. Hence a multistage collimation system has been installed to safely intercept high-amplitude beam protons before they are lost elsewhere. To guarantee adequate protection from the collimators, a detailed theoretical understanding is needed. This article presents results of numerical simulations of the distribution of beam losses around the LHC that have leaked out of the collimation system. The studies include tracking of protons through the fields of more than 5000 magnets in the 27 km LHC ring over hundreds of revolutions, and Monte Carlo simulations of particle–matter interactions both in collimators and in machine elements hit by escaping particles. The simulation results typically agree within a factor of 2 with measurements of beam loss distributions from the previous LHC run. Considering the complexity of the simulation, which must account for a very large number of unknown imperfections, and in view of the total losses around the ring spanning 7 orders of magnitude, we consider this an excellent agreement. Our results give confidence in the simulation tools, which are also used for the design of future accelerators.
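The quoted stored energy can be reproduced with a back-of-the-envelope sketch: it is simply (protons per bunch) × (number of bunches) × (energy per proton). The filling-scheme numbers below are assumed, typical Run 1 values, not figures taken from the article.

```python
# Back-of-the-envelope sketch (illustrative): stored beam energy at 4 TeV.
# The bunch count and bunch population are assumed, 2012-like values,
# not numbers quoted in the article.
E_PROTON_J = 4e12 * 1.602176634e-19   # 4 TeV per proton, in joules
N_BUNCHES = 1380                      # assumed filling scheme
N_PER_BUNCH = 1.6e11                  # assumed bunch population

stored_energy_MJ = N_PER_BUNCH * N_BUNCHES * E_PROTON_J / 1e6
print(f"stored energy ~ {stored_energy_MJ:.0f} MJ")
```

The result lands close to the up-to-146 MJ figure quoted above.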
Abstract
The size and cost of gantries for carbon-ion particle therapy are a major obstacle to the wider adoption of this treatment, making the design of smaller and less costly gantries a key challenge. Here we present the work done on the linear beam optics of possible gantry layouts, differing in geometry, momentum acceptance, and magnet technology, which all share the use of combined-function superconducting magnets with a bending field of 4 T. We performed parallel-to-point and point-to-point optics matching at different magnification factors to provide two different beam sizes at the isocenter. Moreover, we considered the orbit distortion generated by magnet errors and introduced beam position monitors and correctors. The study, together with considerations on the criteria for comparison, is the basis for the design of a novel and compact gantry for hadrontherapy.
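As a minimal sketch of what point-to-point matching with a given magnification means (a thin-lens toy model, not the gantry optics of the paper): point-to-point imaging requires the (1,2) element of the linear transfer matrix to vanish, and the (1,1) element is then the magnification. The drift lengths and the lens below are invented for illustration.

```python
# Toy point-to-point matching with 2x2 transfer matrices (thin-lens
# sketch, not the paper's lattice).  A quad of focal length f between
# drifts L1 and L2 images point to point when 1/f = 1/L1 + 1/L2
# (the lens equation), which makes the R12 matrix element vanish;
# the magnification is then R11 = -L2/L1.
def drift(L):
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def mmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

L1, L2 = 2.0, 4.0                       # assumed drift lengths [m]
f = 1.0 / (1.0 / L1 + 1.0 / L2)         # lens equation -> R12 = 0
M = mmul(drift(L2), mmul(thin_quad(f), drift(L1)))
print(f"R12 = {M[0][1]:.1e}, magnification R11 = {M[0][0]:.2f}")
```

Changing the ratio L2/L1 changes the magnification, which is how different beam sizes at the isocenter are obtained from the same source condition.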
One source of experimental background at the CERN Large Hadron Collider (LHC) is particles entering the detectors from the machine. These particles are created in cascades caused by upstream interactions of beam protons with residual gas molecules or collimators. We estimate the losses on the collimators with SixTrack and simulate the showers with FLUKA and MARS to obtain the flux and distribution of particles entering the ATLAS and CMS detectors. We consider machine configurations used in the first LHC run, with a focus on 3.5 TeV operation as in 2011. Results from FLUKA and MARS are compared and very good agreement is found. An analysis of logged LHC data provides absolute beam loss rates for different processes, which are used together with further simulations of vacuum conditions to normalize the results to rates of particles entering the detectors. We assess the relative importance of background from elastic and inelastic beam–gas interactions and of the leakage out of the LHC collimation system, and show that beam–gas interactions are the dominant source of machine-induced background for the studied machine scenarios. Our results serve as a starting point for the experiments to perform further simulations estimating the resulting signals in the detectors.
•We simulate sources of machine-induced experimental background at the CERN LHC.
•We focus on the ATLAS and CMS experiments.
•The LHC machine conditions are analyzed to normalize the simulation results.
•Beam–gas interactions are found to be the dominant source of particles entering the detectors.
The High Luminosity (HL) upgrade of the Large Hadron Collider (LHC) will increase the peak luminosity at the experiments by more than a factor of 5 with respect to the LHC design value. To achieve this goal, along with the upgrade of several beam and machine parameters, the beam intensity will nearly double with respect to the operational LHC value, and the transverse beam emittance will decrease by 50% compared to the LHC design value. Past operational experience showed that coherent beam instabilities may occur at low, positive values of chromaticity, and that a higher tune spread than predicted by simulations is required from the dedicated octupole magnets to provide enough Landau damping. With the brighter HL-LHC beams, stability margins will become tighter, and coherent instabilities will become stronger if no dedicated mitigation measures are taken. An impedance reduction plan is therefore under way, targeting the collimation system, the main contributor to the transverse beam coupling impedance at flat-top energy. New collimators with lower-resistivity materials will replace the current LHC ones. In this work, we assess the benefits of this impedance reduction with respect to the transverse mode coupling instability threshold. The study quantifies the discrepancy between measured and predicted beam stability thresholds at low chromaticity, and probes the expected gain of the HL-LHC impedance reduction plan.
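The mechanism can be illustrated with a toy two-mode model of the transverse mode coupling instability (an illustration only, not the study's impedance model): mode 0 is shifted in frequency proportionally to intensity while mode −1 sits near −ω_s, and when the two frequencies merge the eigenvalues become complex and the beam goes unstable. The shift and coupling coefficients below are invented; the point is only that scaling the impedance-driven terms down by 2 scales the intensity threshold up by 2.

```python
# Toy two-mode TMCI model (illustrative, not the paper's calculation).
# Eigenfrequencies of a 2x2 coupled-mode system in units of omega_s:
# mode 0 at -shift*N, mode -1 at -omega_s, coupled with strength
# coupling*N.  Complex eigenvalues signal instability.
import cmath

def mode_frequencies(intensity, shift, coupling, omega_s=1.0):
    a = -shift * intensity                 # intensity-shifted mode 0
    b = -omega_s                           # mode -1
    disc = cmath.sqrt((a - b) ** 2 - 4 * (coupling * intensity) ** 2)
    return ((a + b + disc) / 2, (a + b - disc) / 2)

def threshold(shift, coupling, omega_s=1.0):
    """Lowest intensity (arbitrary units) with an unstable eigenmode."""
    n = 0.0
    while True:
        n += 0.001
        if abs(mode_frequencies(n, shift, coupling, omega_s)[0].imag) > 1e-12:
            return n

# Halving the impedance-driven terms doubles the intensity threshold:
print(threshold(shift=0.5, coupling=0.25),
      threshold(shift=0.25, coupling=0.125))
```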
The data produced at the particle physics experiments at the Large Hadron Collider (LHC) contain not only the signals from the collisions, but also a background component from proton losses around the accelerator. Understanding, identifying and possibly mitigating this machine-induced background is essential for efficient data taking, especially for some new physics searches. Among the sources of background are hadronic and electromagnetic showers from proton losses on nearby collimators due to beam-halo cleaning. In this article, the first dedicated LHC measurements of this type of background are presented. Controlled losses of a low-intensity beam on collimators were induced, while monitoring the backgrounds in the ATLAS detector. The results show a clear correlation between the experimental backgrounds and the setting of the tertiary collimators (TCTs). Furthermore, the results are used to show that during normal LHC physics operation the beam halo contributes to the total beam-induced background at the level of a percent or less. A second measurement, where the collimator positions are tightened during physics operation, confirms this finding by setting a limit of about 10% on the contribution from all losses on the TCTs, i.e. the sum of beam halo and elastic beam–gas scattering around the ring. Dedicated simulations of the halo-related background are presented and good agreement with data is demonstrated. These simulations provide information about features that are not experimentally accessible, such as correlations between backgrounds and the distributions of proton impacts on the collimators. The results provide vital information about the dependence of background on collimator settings, which is of central importance when optimizing the LHC optics for maximum peak luminosity.
Abstract
The Heavy Ion Therapy Research Integration plus (HITRIplus) is a European project that aims to integrate and propel research and technologies related to cancer treatment with heavy ion beams. Among the ambitious goals of the project, a specific work package includes the design of a gantry for carbon ions based on superconducting magnets. The first milestone to achieve is the choice of the fundamental gantry parameters, namely the beam optics layout, the superconducting magnet technology, and the main user requirements. Starting from a reference 3 T design, the collaboration widely explored dozens of possible gantry configurations at 4 T, aiming to find the best compromise in terms of footprint, capital cost, and required R&D. We present here a summary of these configurations, underlining the interplay between the beam optics, the mechanics, and the design of the main superconducting dipoles: the bending field (up to 4 T), combined-function features (integrated quadrupoles), magnet aperture (up to 90 mm), and angular length (30°–45°). The resulting main parameters are then listed, compared, and used to drive the choice of the best gantry layout to be developed in HITRIplus.
The design stored beam energy in the CERN High-Luminosity Large Hadron Collider (HL-LHC) upgrade is about 700 MJ, with about 36 MJ in the beam tails, according to estimates based on scaling considerations from measurements at the LHC. Such a large amount of stored energy in the beam tails poses serious challenges for its control and safe disposal. In particular, orbit jitter can cause significant losses on primary collimators, which can lead to accidental beam dumps, magnet quenches, or even permanent damage to collimators and other accelerator elements. Active control of the diffusion speed of halo particles is therefore necessary, and the use of hollow electron lenses (HELs) represents the most promising approach to handling overpopulated tails at the HL-LHC. The HEL is a powerful and advanced tool that can be used for controlled depletion of beam tails, thus enhancing the performance of beam halo collimation. For these reasons, HELs have recently been included in the HL-LHC baseline. In this paper, we present detailed beam dynamics calculations performed with the goal of defining HEL specifications and operational scenarios for the HL-LHC. The prospects for effective halo control in the HL-LHC are presented.
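The principle of HEL-assisted tail depletion can be sketched with a toy Monte Carlo (an illustration, not the paper's beam dynamics calculation): halo particles perform a slow random walk in betatron amplitude, the lens adds a much stronger diffusion kick only above a threshold amplitude, and particles reaching the collimator aperture are removed. All numbers (diffusion steps, threshold, aperture, turn count) are invented for illustration.

```python
# Toy halo-diffusion Monte Carlo (illustrative, not the paper's model).
# Amplitudes random-walk each turn; the HEL is modelled as extra
# diffusion applied only above hel_radius, so the tail between
# hel_radius and the collimator aperture is depleted faster.
import random

def tail_fraction(n_particles=2000, n_turns=1000, base_step=0.01,
                  hel_step=0.08, hel_radius=4.0, aperture=6.0, seed=1):
    """Fraction of particles still in the tail (amplitude > hel_radius)."""
    rng = random.Random(seed)
    in_tail = 0
    for _ in range(n_particles):
        a = rng.uniform(3.0, aperture)      # start somewhere in the halo
        for _ in range(n_turns):
            step = base_step + (hel_step if a > hel_radius else 0.0)
            a = abs(a + rng.gauss(0.0, step))
            if a >= aperture:               # intercepted by the collimator
                a = -1.0
                break
        if a > hel_radius:
            in_tail += 1
    return in_tail / n_particles

no_hel = tail_fraction(hel_step=0.0)
with_hel = tail_fraction(hel_step=0.08)
print(f"tail fraction: {no_hel:.3f} without HEL, {with_hel:.3f} with HEL")
```

With the lens on, the tail population drops well below the lens-off case, which is the controlled-depletion effect described above.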
Abstract
The FAMU experiment aims at an indirect measurement of the Zemach radius of the proton. The measurement is carried out on muonic hydrogen atoms (μH) produced with the low-momentum (50–60 MeV/c) muon beam at the RIKEN-RAL μ⁻ facility. The particle flux plays an important role in this measurement, as it is proportional to the number of μH atoms produced, which is the target of the FAMU experimental method. The beam monitor calibration technique and results presented here are meant to extract a reliable estimate of the muon flux during the FAMU data taking. These measurements were carried out at the CNAO synchrotron in Pavia, Italy, using proton beams and supported by a Geant4 Monte Carlo simulation of the detector.