Abstract
Fixed-target experiments using primary electron beams can be powerful discovery tools for light dark matter in the sub-GeV mass range. The Light Dark Matter eXperiment (LDMX) is designed to measure missing momentum in high-rate electron fixed-target reactions with beam energies of 4 GeV to 16 GeV. A prerequisite for achieving several important sensitivity milestones is the capability to efficiently reject backgrounds associated with few-GeV bremsstrahlung, by twelve orders of magnitude, while maintaining high efficiency for signal. The primary challenge arises from events with photo-nuclear reactions faking the missing-momentum property of a dark matter signal. We present a methodology developed for the LDMX detector concept that is capable of the required rejection. By employing a detailed Geant4-based model of the detector response, we demonstrate that the sampling calorimetry proposed for LDMX can achieve better than 10⁻¹³ rejection of few-GeV photons. This suggests that the luminosity-limited sensitivity of LDMX can be realized at 4 GeV and higher beam energies.
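As a rough illustration of why a per-photon rejection of order 10⁻¹³ is the relevant scale, consider a back-of-envelope estimate in the sketch below. All input numbers (total electrons on target, hard-bremsstrahlung fraction) are illustrative assumptions, not values taken from the paper.

```python
# Illustrative back-of-envelope estimate of the photon rejection needed
# to keep the expected background below one event. All inputs are
# assumptions for the sake of the example, not values from the paper.

electrons_on_target = 4e14      # assumed total electrons on target
hard_brem_fraction  = 0.01      # assumed fraction of electrons radiating
                                # a hard (few-GeV) bremsstrahlung photon

n_hard_photons = electrons_on_target * hard_brem_fraction
max_mis_id = 1.0 / n_hard_photons   # per-photon mis-ID rate for < 1 event

print(f"hard photons: {n_hard_photons:.1e}")
print(f"required per-photon rejection: better than {max_mis_id:.1e}")
```

With these assumed inputs the estimate lands at a few times 10⁻¹³, consistent with the scale quoted in the abstract.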
Building a Distributed Computing System for LDMX
Bryngemark, Lene Kristian; Cameron, David; Dutta, Valentina ...
EPJ Web of Conferences, Vol. 251, 01/2021
Conference Proceeding, Journal Article
Peer reviewed
Open access
Particle physics experiments rely extensively on computing and data services, making e-infrastructure an integral part of the research collaboration. Constructing and operating distributed computing can, however, be challenging for a smaller-scale collaboration. The Light Dark Matter eXperiment (LDMX) is a planned small-scale accelerator-based experiment to search for dark matter in the sub-GeV mass region. Finalizing the design of the detector relies on Monte-Carlo simulation of expected physics processes. A distributed computing pilot project was proposed to better utilize available resources at the collaborating institutes, and to improve scalability and reproducibility. This paper outlines the chosen lightweight distributed solution, presenting requirements, the component integration steps, and the experiences using a pilot system for tests with large-scale simulations. The system leverages existing technologies wherever possible, minimizing the need for software development, and deploys only non-intrusive components at the participating sites. The pilot proved that integrating existing components can dramatically reduce the effort needed to build and operate a distributed e-infrastructure, making it attainable even for smaller research collaborations.
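The paper's concrete component choices are not reproduced here, but the reproducibility requirement for distributed Monte-Carlo production can be illustrated with a minimal sketch: derive the random seed and output name deterministically from a campaign name and job index, so that any site re-running job N produces the same sample. All names and the seeding scheme below are hypothetical, not the system described in the paper.

```python
# Minimal sketch of deterministic job configuration for distributed
# Monte-Carlo production. The function name and seeding scheme are
# illustrative assumptions, not the paper's actual system.
import hashlib

def job_config(campaign: str, job_index: int, events_per_job: int = 10000):
    # Derive a stable seed from campaign name + job index, so a re-run
    # of the same job at any site reproduces the same events.
    digest = hashlib.sha256(f"{campaign}:{job_index}".encode()).hexdigest()
    seed = int(digest[:8], 16)
    return {
        "seed": seed,
        "events": events_per_job,
        "output": f"{campaign}_job{job_index:06d}.root",
    }

print(job_config("ldmx_sim_v1", 42))
```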
Particles beyond the Standard Model (SM) can generically have lifetimes that are long compared to SM particles at the weak scale. When produced at experiments such as the Large Hadron Collider (LHC) at CERN, these long-lived particles (LLPs) can decay far from the interaction vertex of the primary proton-proton collision. Such LLP signatures are distinct from those of promptly decaying particles that are targeted by the majority of searches for new physics at the LHC, often requiring customized techniques to identify, for example, significantly displaced decay vertices, tracks with atypical properties, and short track segments. Given their non-standard nature, a comprehensive overview of LLP signatures at the LHC is beneficial to ensure that possible avenues for the discovery of new physics are not overlooked. Here we report on the joint work of a community of theorists and experimentalists with the ATLAS, CMS, and LHCb experiments, as well as those working on dedicated experiments such as MoEDAL, milliQan, MATHUSLA, CODEX-b, and FASER, to survey the current state of LLP searches at the LHC, and to chart a path for the development of LLP searches into the future, both in the upcoming Run 3 and at the high-luminosity LHC. The work is organized around the current and future potential capabilities of LHC experiments to generally discover new LLPs, and takes a signature-based approach to surveying classes of models that give rise to LLPs rather than emphasizing any particular theory motivation. We develop a set of simplified models; assess the coverage of current searches; document known, often unexpected backgrounds; explore the capabilities of proposed detector upgrades; provide recommendations for the presentation of search results; and look towards the newest frontiers, namely high-multiplicity 'dark showers', highlighting opportunities for expanding the LHC reach for these signals.
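The kinematic quantity behind displaced signatures is the lab-frame decay length L = βγcτ, with βγ = p/m; the probability for an LLP to decay between distances r₁ and r₂ along its flight path is P = exp(−r₁/L) − exp(−r₂/L). A short illustrative calculation of this standard formula follows; all parameter values are assumptions chosen for the example.

```python
import math

def decay_probability(p_gev, m_gev, ctau_m, r1_m, r2_m):
    # Lab-frame decay length L = beta*gamma*c*tau, with beta*gamma = p/m.
    L = (p_gev / m_gev) * ctau_m
    # Probability of decaying between r1 and r2 along the flight path.
    return math.exp(-r1_m / L) - math.exp(-r2_m / L)

# Illustrative numbers: a 50 GeV LLP with 100 GeV momentum and ctau = 1 m,
# decaying inside a tracker-like volume between 0.05 m and 1 m.
print(decay_probability(p_gev=100, m_gev=50, ctau_m=1.0, r1_m=0.05, r2_m=1.0))
```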
This report of the BOOST2012 workshop presents the results of four working groups that studied key aspects of jet substructure. We discuss the potential of first-principle QCD calculations to yield a precise description of the substructure of jets and study the accuracy of state-of-the-art Monte Carlo tools. Limitations of the experiments' ability to resolve substructure are evaluated, with a focus on the impact of additional (pile-up) proton-proton collisions on jet substructure performance in future LHC operating scenarios. A final section summarizes the lessons learnt from jet substructure analyses in searches for new physics in the production of boosted top quarks.
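To make the pile-up concern concrete, the toy calculation below (not from the report) shows how sprinkling soft, uniformly distributed contamination into a jet's catchment area inflates its invariant mass, which is why grooming and pile-up mitigation matter for substructure observables. The two-prong jet and pile-up parameters are illustrative assumptions.

```python
import math, random

def jet_mass(particles):
    # Invariant mass from summed four-vectors (E, px, py, pz).
    E, px, py, pz = (sum(c) for c in zip(*particles))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

def massless(pt, eta, phi):
    # Four-vector of a massless particle from (pt, eta, phi).
    return (pt * math.cosh(eta), pt * math.cos(phi),
            pt * math.sin(phi), pt * math.sinh(eta))

random.seed(1)
# Toy two-prong jet core (e.g. from a boosted decay); values illustrative.
jet = [massless(100, 0.0, 0.0), massless(80, 0.3, 0.25)]
print(f"jet mass, no pile-up:   {jet_mass(jet):.1f} GeV")

# Soft pile-up particles spread uniformly over the jet catchment area.
pileup = [massless(random.uniform(0.5, 2.0), random.uniform(-0.4, 0.7),
                   random.uniform(-0.4, 0.65)) for _ in range(50)]
print(f"jet mass, with pile-up: {jet_mass(jet + pileup):.1f} GeV")
```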
Abstract
The Light Dark Matter eXperiment (LDMX) is an electron-beam fixed-target experiment designed to achieve comprehensive, model-independent sensitivity to dark matter particles in the sub-GeV mass region. An upgrade to the LCLS-II accelerator will increase the beam energy available to LDMX from 4 to 8 GeV. Using detailed GEANT4-based simulations, we investigate the effect of the increased beam energy on the capabilities to separate signal and background, and demonstrate that the veto methodology developed for 4 GeV successfully rejects photon-induced backgrounds for at least 2 × 10¹⁴ electrons on target at 8 GeV.
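Demonstrating a rejection at this level from simulation is itself a statistics problem: if a (biased) simulated sample equivalent to N electrons on target yields zero surviving background events, the 95% CL Poisson upper limit on the expected background is about 3 events per N. The sketch below shows that arithmetic; it is a hedged illustration, not the paper's exact statistical treatment.

```python
# Sketch of the zero-observed-events argument: with no background
# surviving the vetoes in a simulated sample equivalent to N electrons
# on target, the 95% CL Poisson upper limit on the mean is ~3 events.
# Illustrative arithmetic only, not the paper's statistical method.

equivalent_eot = 2e14       # electrons on target represented by the sample
poisson_ul_95 = 2.996       # -ln(0.05): 95% CL upper limit, 0 observed

rate_limit = poisson_ul_95 / equivalent_eot
print(f"background rate limit: < {rate_limit:.1e} per electron on target")
```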
This Letter reports a measurement of the exclusive γγ → ℓ⁺ℓ⁻ (ℓ = e, μ) cross-section in proton-proton collisions at a centre-of-mass energy of 7 TeV by the ATLAS experiment at the LHC, based on an integrated luminosity of 4.6 fb⁻¹. For the electron or muon pairs satisfying exclusive selection criteria, a fit to the dilepton acoplanarity distribution is used to extract the fiducial cross-sections. The cross-section in the electron channel is determined to be σ_excl(γγ → e⁺e⁻) = 0.428 ± 0.035 (stat.) ± 0.018 (syst.) pb for a phase-space region with invariant mass of the electron pairs greater than 24 GeV, in which both electrons have transverse momentum pT > 12 GeV and pseudorapidity |η| < 2.4. For muon pairs with invariant mass greater than 20 GeV, muon transverse momentum pT > 10 GeV and pseudorapidity |η| < 2.4, the cross-section is determined to be σ_excl(γγ → μ⁺μ⁻) = 0.628 ± 0.032 (stat.) ± 0.021 (syst.) pb. When proton absorptive effects due to the finite size of the proton are taken into account in the theory calculation, the measured cross-sections are found to be consistent with the theory prediction.
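The acoplanarity variable used in such fits is commonly defined as A = 1 − |Δφ(ℓ⁺, ℓ⁻)|/π, which peaks near zero for the back-to-back lepton pairs of exclusive production. A minimal sketch of the computation follows, assuming this standard definition (the exact ATLAS fit setup is not reproduced here).

```python
import math

def acoplanarity(phi_plus, phi_minus):
    # A = 1 - |delta_phi| / pi, with delta_phi wrapped into [0, pi].
    dphi = abs(phi_plus - phi_minus) % (2 * math.pi)
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return 1.0 - dphi / math.pi

# An exclusive gamma-gamma pair is nearly back-to-back: A close to 0.
print(acoplanarity(0.10, 0.10 - math.pi))   # ~0.0
# Dissociative/inclusive backgrounds populate larger A values.
print(acoplanarity(0.10, -2.0))             # larger acoplanarity
```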
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy of √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb⁻¹. An uncertainty on the offline reconstructed tau energy scale of 2-4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2-8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
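Efficiency measurements of this kind ultimately reduce to a pass/fail counting problem on probe candidates. The sketch below computes an efficiency with its simple binomial uncertainty; the actual ATLAS measurement uses Z → ττ tag-and-probe with background subtraction and systematic uncertainties, which this toy deliberately omits, and the probe counts shown are invented for illustration.

```python
import math

def efficiency(n_pass, n_total):
    # Simple binomial efficiency and statistical uncertainty. A real
    # tag-and-probe measurement would also subtract backgrounds and
    # propagate systematic uncertainties, which this toy omits.
    eff = n_pass / n_total
    err = math.sqrt(eff * (1 - eff) / n_total)
    return eff, err

# Illustrative probe counts, not ATLAS data.
eff, err = efficiency(n_pass=5600, n_total=9300)
print(f"tau ID efficiency: {eff:.3f} +/- {err:.3f}")
```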