Abstract
We study the impact of precision timing detection systems on the LHC experiments’ long-lived particle search program during the HL-LHC era. We develop algorithms that allow us to reconstruct the mass of such charged particles and perform particle identification using the time-of-flight measurement. We investigate the reach for benchmark scenarios as a function of the timing resolution, and find a sensitivity improvement of up to a factor of ten over searches that use ionization energy loss information, depending on the particle’s mass.
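The time-of-flight mass reconstruction this abstract refers to follows from relativistic kinematics: with the momentum p measured in the tracker and the velocity β = L/(c·t) measured by the timing detector, the mass is m = p·sqrt(1/β² − 1). A minimal sketch (the function name and all numbers are illustrative, not taken from the paper):

```python
import math

C = 0.299792458  # speed of light in m/ns

def tof_mass(p_gev, path_m, t_ns):
    """Mass from momentum and time of flight (natural units, GeV).

    beta = L / (c * t);  m = p * sqrt(1/beta^2 - 1)
    """
    beta = path_m / (C * t_ns)
    if not 0.0 < beta < 1.0:
        raise ValueError("unphysical velocity: beta = %.3f" % beta)
    return p_gev * math.sqrt(1.0 / beta ** 2 - 1.0)

# A 100 GeV particle over a 3 m flight path, arriving 0.5 ns after a
# light-speed particle would, reconstructs to a mass of roughly 32 GeV.
t_arrival = 3.0 / C + 0.5
mass = tof_mass(100.0, 3.0, t_arrival)
```

The steep dependence of the delay on mass at fixed momentum is what ties the search reach directly to the timing resolution.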
Abstract
We interpret within the phenomenological MSSM (pMSSM) the results of SUSY searches published by the CMS collaboration based on the first ~1 fb⁻¹ of data taken during the 2011 LHC run at 7 TeV. The pMSSM is a 19-dimensional parametrization of the MSSM that captures most of its phenomenological features. It encompasses, and goes beyond, a broad range of more constrained SUSY models. Performing a global Bayesian analysis, we obtain posterior probability densities of parameters, masses and derived observables. In contrast to constraints derived for particular SUSY breaking schemes, such as the CMSSM, our results provide more generic conclusions on how the current data constrain the MSSM.
Abstract
The Exa.TrkX project presents a graph neural network (GNN) technique for low-level reconstruction of neutrino interactions in a Liquid Argon Time Projection Chamber (LArTPC). GNNs are still a relatively novel technique, and have shown great promise for similar reconstruction tasks in the Large Hadron Collider (LHC). Graphs describing particle interactions are formed by treating each detector hit as a node, with edges describing the relationships between hits. We utilise a multi-head attention message passing network which performs graph convolutions in order to label each node with a particle type.
We present an updated variant of our GNN architecture, with several improvements. After testing the model on more realistic simulation with regions of unresponsive wires, the target was modified from edge classification to node classification in order to increase robustness. Removing edges as a classification target opens up a broader possibility space for edge-forming techniques; we explore the model’s performance across a variety of approaches, such as Delaunay triangulation, kNN, and radius-based methods. We also extend this model to the 3D context, sharing information between detector views. By using reconstructed 3D spacepoints to map detector hits from each wire plane, the model naively constructs 2D representations that are independent yet fully consistent.
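The edge-forming approaches listed above can be compared on the same set of hits; kNN and radius-based construction are simple enough to sketch directly (a Delaunay variant would typically come from a computational-geometry library such as `scipy.spatial.Delaunay`). The toy hit coordinates below are purely illustrative, not from the detector simulation:

```python
import math

def knn_edges(points, k=2):
    """Connect each hit to its k nearest neighbours (brute force)."""
    edges = set()
    for i, p in enumerate(points):
        by_dist = sorted((math.dist(p, q), j) for j, q in enumerate(points) if j != i)
        for _, j in by_dist[:k]:
            edges.add((min(i, j), max(i, j)))
    return sorted(edges)

def radius_edges(points, r):
    """Connect every pair of hits closer than r."""
    return sorted(
        (i, j)
        for i in range(len(points))
        for j in range(i + 1, len(points))
        if math.dist(points[i], points[j]) < r
    )

# toy 2D "hits" in (wire, time) coordinates; three close hits and one outlier
hits = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.3), (5.0, 5.0)]
```

Note the qualitative difference the abstract exploits: kNN always attaches the isolated hit to something, while a radius cut can leave it disconnected, so the choice changes how robust the graph is to unresponsive regions.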
MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on a Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs or services, system control and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.
The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on their ability to distribute, process and analyze globally distributed Petascale to Exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams focuses on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in recently completed and previous projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goals of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Argonne Leadership Computing Facility (ALCF) for data-intensive applications.
Future calorimeters and shower maximum detectors at high luminosity colliders need to be highly radiation resistant and very fast. One exciting option for such a detector is a calorimeter composed of a secondary emitter as the active element. In this report we outline the study and development of a secondary emission calorimeter prototype using micro-channel plates (MCP) as the active element, which directly amplify the electromagnetic shower signal. We demonstrate the feasibility of using a bare MCP within an inexpensive and robust housing without the need for any photocathode, which is a key requirement for high radiation tolerance. Test beam measurements of the prototype were performed with 120 GeV primary protons and secondary beams at the Fermilab Test Beam Facility, demonstrating basic calorimetric measurements and precision timing capabilities. Using multiple pixel readout on the MCP, we demonstrate a transverse spatial resolution of 0.8 mm, and time resolution better than 40 ps for electromagnetic showers.
LYSO based precision timing calorimeters
Bornheim, A; Apresyan, A; Ronzhin, A ...
Journal of Physics: Conference Series, 11/2017, Volume 928, Issue 1
Journal Article · Peer reviewed · Open access
In this report we outline the study of the development of calorimeter detectors using bright scintillating crystals. We discuss how timing information with a precision of a few tens of picoseconds and below can significantly improve the reconstruction of the physics events under challenging high pileup conditions to be faced at the High-Luminosity LHC or a future hadron collider. The particular challenge in measuring the time of arrival of a high energy photon lies in the stochastic component of the distance of initial conversion and the size of the electromagnetic shower. We present studies and measurements from test beams for calorimeter based timing measurements to explore the ultimate timing precision achievable for high energy photons of 10 GeV and above. We focus on techniques to measure the timing with a high precision in association with the energy of the photon. We present test-beam studies and results on the timing performance and characterization of the time resolution of LYSO-based calorimeters. We demonstrate that a time resolution of 30 ps is achievable for a particular design.
The Next Generation Exascale Network Integrated Architecture (NGENIA-ES) is a project specifically designed to accomplish new levels of network and computing capabilities in support of global science collaborations through the development of a new class of intelligent, agile networked systems. Its path to success is built upon our ongoing developments in multiple areas, strong ties among our high energy physics, computer and network science, and engineering teams, and our close collaboration with key technology developers and providers deeply engaged in the National Strategic Computing Initiative (NSCI). This paper describes the building of a new class of distributed systems, our work with the leadership computing facilities (LCFs), the use of software-defined networking (SDN) methods, and the use of data-driven methods for the scheduling and optimization of network resources. Sections I-III present the challenges of data-intensive research and the important ingredients of this ecosystem. Sections IV-VI describe some crucial elements of the foreseen solution and some of the progress so far. Sections VII-IX go into the details of orchestration, software-defined networking, and scheduling optimization. Finally, Section X discusses engagement and partnerships, and Section XI gives a summary. References are given at the end.
The high luminosity upgrade of the Large Hadron Collider (HL-LHC) at CERN is expected to provide instantaneous luminosities of 5 × 10³⁴ cm⁻² s⁻¹. The high luminosities expected at the HL-LHC will be accompanied by a factor of 5 to 10 more pileup compared with LHC conditions in 2015, causing general confusion for particle identification and event reconstruction. Precision timing makes it possible to extend calorimetric measurements into such a high density environment by subtracting the energy deposits from pileup interactions. Calorimeters employing silicon as the active component have recently become a popular choice for the HL-LHC and future collider experiments which face very high radiation environments. We present studies of basic calorimetric and precision timing measurements using a prototype composed of a tungsten absorber and silicon sensors as the active medium. We show that for the bulk of electromagnetic showers induced by electrons in the range of 20 GeV to 30 GeV, we can achieve time resolutions better than 25 ps per single pad sensor.
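The timing-based pileup subtraction described above amounts to keeping only the deposits whose arrival time is compatible with the hard-interaction time within the detector's resolution. A minimal sketch, where the function name, window width, and all energies and times are illustrative assumptions rather than values from the paper:

```python
def select_in_time(deposits, t0, sigma_t, n_sigma=3.0):
    """Keep deposits whose time is within n_sigma * sigma_t of the
    hard-interaction time t0; reject the rest as out-of-time pileup."""
    return [d for d in deposits if abs(d["t"] - t0) < n_sigma * sigma_t]

# Toy deposits (energy in GeV, time in ns); a 25 ps resolution gives a
# +/- 75 ps window at 3 sigma, so the 0.30 ns deposit is rejected.
deposits = [
    {"e": 20.0, "t": 0.01},   # prompt
    {"e": 5.0,  "t": 0.30},   # out-of-time pileup
    {"e": 8.0,  "t": -0.02},  # prompt
]
kept = select_in_time(deposits, t0=0.0, sigma_t=0.025)
e_sum = sum(d["e"] for d in kept)
```

The cleaned energy sum then reflects the hard interaction alone, which is why the achievable per-sensor resolution (here 25 ps) directly sets how aggressively pileup can be removed.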
One possibility for making a fast and radiation resistant shower maximum (SM) detector is to use a secondary emitter as the active element. We present below test beam results obtained with different types of photodetectors based on micro-channel plates (MCP) as the secondary emitter. The SM time resolution we obtained for this new type of detector is at the level of 20-30 ps. We estimate that a significant contribution to the detector response originates from secondary emission of the MCP.