The ALICE experiment at the CERN LHC will feature several upgrades for Run 3, one of which is a new Inner Tracking System (ITS). The ITS upgrade is currently under development and commissioning, and will be installed during the ongoing Long Shutdown 2.
A number of factors will affect the performance and readout efficiency of the ITS in Run 3. To quantify them, a simulation model of the readout logic in the ALPIDE pixel sensor chips for the ITS was developed using SystemC, a C++ library for system-level modeling. This simulation model is three orders of magnitude faster than a normal HDL simulation of the chip, and makes it feasible to simulate a large number of events for a large portion of the detector.
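As a rough illustration of what such a system-level readout model captures, the sketch below, in Python rather than SystemC, estimates readout efficiency with a simple event-driven loop. The trigger spacing, busy time, and buffer depth are purely illustrative parameters, not actual ALPIDE values.

```python
import random

def simulate_readout(n_events, event_spacing_ns, busy_time_ns, fifo_depth, seed=0):
    """Toy event-driven model of a buffered sensor readout.

    Events arrive at Poisson-distributed times; each occupies the readout
    for busy_time_ns. Events arriving while the buffer is full are lost.
    Returns the fraction of events accepted (readout efficiency).
    """
    rng = random.Random(seed)
    fifo = []          # departure times of events currently buffered (sorted)
    accepted = 0
    t = 0.0
    for _ in range(n_events):
        t += rng.expovariate(1.0 / event_spacing_ns)  # next arrival
        fifo = [d for d in fifo if d > t]             # drain finished events
        if len(fifo) < fifo_depth:
            start = (max(t, fifo[-1]) if fifo else t)  # wait for readout
            fifo.append(start + busy_time_ns)
            accepted += 1
    return accepted / n_events

efficiency = simulate_readout(1000, event_spacing_ns=100.0,
                              busy_time_ns=10.0, fifo_depth=8)
```

Sweeping the event spacing in such a loop gives an efficiency-versus-rate curve, which is the kind of result the full SystemC model produces for realistic chip parameters.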
In this paper, we present simulation results that quantify detector performance under different running conditions. The results inform system configuration as well as the ongoing development of the readout electronics.
A prototype of a new type of calorimeter has been designed and constructed, based on a silicon–tungsten sampling design using pixel sensors with digital readout. It makes use of the ALPIDE sensor developed for the ALICE Inner Tracking System (ITS) upgrade. The small pixel size of ≈30 × 30 μm² makes a binary readout possible. This prototype has been successfully tested with cosmic muons and with test beams at DESY and the CERN SPS. We report on performance results obtained at DESY, showing good energy resolution and linearity, and compare them to detailed MC simulations. We also show preliminary results on the high-energy performance measured at the SPS, and discuss the two-shower separation capabilities.
• First fully digital electromagnetic calorimeter with high-speed readout built.
• ALPIDE pixel sensors work well in a high particle-density environment.
• Basic calorimetric performance of the pixel calorimeter is on par with the state of the art.
• Unique capabilities in terms of position resolution and two-shower separation.
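The principle behind the binary readout is that with sufficiently small pixels, counting fired pixels is itself a good energy estimator. A minimal sketch, using a purely illustrative Gaussian charge profile and threshold rather than a real shower model:

```python
import numpy as np

def digital_readout(charge_map, threshold):
    """Binary pixel readout: a pixel fires if its collected charge exceeds
    the threshold; the digital energy estimator is simply the hit count."""
    return int(np.count_nonzero(charge_map > threshold))

# Toy shower: Gaussian charge profile on a 64x64 pixel grid (illustrative).
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
charge = 100.0 * np.exp(-(x**2 + y**2) / 0.05)
n_hits = digital_readout(charge, threshold=1.0)
```

With small enough pixels the hit count grows nearly linearly with the deposited energy, which is why the measured linearity is a key performance figure for this detector concept.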
We propose a novel technique for reconstructing charged particles in digital tracking calorimeters using reinforcement learning, aiming to benefit from the rapid progress and success of neural network architectures without depending on simulated or manually labeled data. Here we optimize, by trial and error, a behavior policy that acts as an approximation to the full combinatorial optimization problem, maximizing the physical plausibility of sampled trajectories. In modern processing pipelines used in high-energy physics and related applications, tracking plays an essential role, identifying and following the trajectories of charged particles traversing particle detectors. Due to the high multiplicity of charged particles and their physical interactions, which randomly deflect the particles, reconstruction is a challenging undertaking that requires fast, accurate, and robust algorithms. Our approach works on graph-structured data, capturing track hypotheses through edge connections between particles in the detector layers. In a comprehensive study on simulated data for a particle detector used for proton computed tomography, we demonstrate the high potential as well as the competitiveness of our approach compared to a heuristic search algorithm and a model trained on ground truth. Finally, we point out limitations of our approach, guiding towards a robust foundation for further development of reinforcement-learning-based tracking.
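The edge-selection step of such a policy can be sketched as an epsilon-greedy choice among candidate hits in the next detector layer. The scoring function below (cosine of the deflection angle as a physical-plausibility proxy) and all names are illustrative assumptions, not the actual implementation:

```python
import math
import random

def deflection_score(prev_hit, cur_hit, cand):
    """Plausibility proxy for the edge cur_hit -> cand: cosine of the
    deflection angle, maximal (1.0) for a straight continuation."""
    v1 = (cur_hit[0] - prev_hit[0], cur_hit[1] - prev_hit[1])
    v2 = (cand[0] - cur_hit[0], cand[1] - cur_hit[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return dot / norm

def choose_next_hit(cur_hit, candidates, q_value, epsilon, rng):
    """One epsilon-greedy step over edge hypotheses: with probability
    epsilon explore a random candidate, otherwise exploit the edge with
    the highest learned value q_value(cur_hit, cand)."""
    if rng.random() < epsilon:
        return rng.choice(candidates)
    return max(candidates, key=lambda c: q_value(cur_hit, c))
```

During training, epsilon is annealed and the learned value function gradually replaces hand-crafted scores like the one above; here the score stands in for it only to make the sketch self-contained.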
Gradient-based optimization using algorithmic derivatives can be a useful technique to improve engineering designs with respect to a computer-implemented objective function. Likewise, uncertainty quantification through computer simulations can be carried out by means of derivatives of the computer simulation. However, the effectiveness of these techniques depends on how 'well-linearizable' the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications.
This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques.
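A minimal version of such a perturbed-input experiment is a finite-difference scan that flags points where the numerical derivative is orders of magnitude larger than at its neighbours, indicating a jump rather than a smooth slope. The step sizes and the threshold below are illustrative choices, not those used in the study:

```python
import numpy as np

def jump_scan(f, x0, h=1e-5, span=1e-3, n=201):
    """Evaluate f with small perturbations along one input coordinate and
    return the scan points where the central finite difference is far
    larger than the typical (median) slope, i.e. candidate jump sites."""
    xs = np.linspace(x0 - span, x0 + span, n)
    fd = (np.vectorize(f)(xs + h) - np.vectorize(f)(xs - h)) / (2 * h)
    typical = np.median(np.abs(fd))
    return xs[np.abs(fd) > 100 * max(typical, 1e-12)]
```

Applied to a piecewise-smooth function, such a scan localizes the discontinuities; applied to a smooth one, it returns nothing, which is the qualitative distinction drawn between the MBIR and MC subprocedures below.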
The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. However, the observed high density and magnitude of jumps were likely to preclude most meaningful uses of the derivatives. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path, and could be reduced in magnitude by a 'fuzzy voxels' approach. The investigated jumps in the MC function arose from local changes in the control flow that affected the number of random numbers consumed. The tracking algorithm solves an inherently non-differentiable problem.
Besides the technical challenges of merely applying AD to existing software projects, the MC and MBIR codes must be adapted to compute smoother functions. For the MBIR code, we presented one possible approach; for the MC code, this will be the subject of further research. For the tracking subprocedure, further research on surrogate models is necessary.
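The 'fuzzy voxels' idea can be sketched in one dimension: the hard 0/1 indicator of whether a proton path intersects a voxel is replaced by a weight that decays smoothly across the voxel boundary. The sigmoid shape and the softness parameter below are assumptions for illustration, not the formulation used in the study:

```python
import numpy as np

def fuzzy_voxel_weights(path_point, centers, voxel_size, softness=0.25):
    """Smooth replacement for the binary 'path intersects voxel' test:
    weight ~1 when the path point is inside a voxel, decaying smoothly
    (rather than jumping to 0) as it crosses the voxel boundary."""
    d = np.abs(centers - path_point)
    return 1.0 / (1.0 + np.exp((d - voxel_size / 2) / (softness * voxel_size)))
```

Because the weights vary continuously with the path position, a small perturbation of the proton trajectory produces a small change in the reconstruction objective, removing the jumps that the hard voxel-intersection test introduces.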
First experimental results are presented on event-by-event net-proton fluctuation measurements in Pb–Pb collisions at √sNN = 2.76 TeV, recorded by the ALICE detector at the CERN LHC. The ALICE detector is well suited for such studies due to its excellent particle identification capabilities and large acceptance, which is crucial for fluctuation analysis. The studies focus on second-order cumulants, but the analysis technique is more general and will, in the near future, also be applied to higher-order cumulants.
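At second order, the cumulant analysis amounts to computing the mean and variance of the event-by-event net-proton number. A minimal sketch, using synthetic independent Poisson multiplicities as a stand-in for data (for which the Skellam baseline predicts C2 equal to the sum of the mean proton and antiproton multiplicities):

```python
import numpy as np

def net_proton_cumulants(n_protons, n_antiprotons):
    """First and second cumulants of the event-by-event net-proton
    number N = n_p - n_pbar: C1 = <N>, C2 = <(N - <N>)^2>."""
    net = np.asarray(n_protons) - np.asarray(n_antiprotons)
    c1 = net.mean()
    c2 = net.var()  # second central moment (population variance)
    return c1, c2

# Synthetic example: independent Poisson production per event.
rng = np.random.default_rng(0)
p = rng.poisson(10.0, 200_000)
pbar = rng.poisson(4.0, 200_000)
c1, c2 = net_proton_cumulants(p, pbar)  # Skellam baseline: C1 = 6, C2 = 14
```

Deviations of the measured C2 from this baseline are what carry the physics signal; efficiency corrections and centrality selection, which the real analysis requires, are omitted here.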