Punzi-loss
Abudinén, F.; Bertemes, M.; Bilokin, S.; ...
The European Physical Journal C, Particles and Fields, 02/2022, Volume 82, Issue 2
Journal Article
Peer reviewed
Open access
We present the novel implementation of a non-differentiable metric approximation and a corresponding loss-scheduling aimed at the search for new particles of unknown mass in high energy physics experiments. We call the loss-scheduling, based on the minimisation of a figure-of-merit related function typical of particle physics, a Punzi-loss function, and the neural network that utilises this loss function a Punzi-net. We show that the Punzi-net outperforms standard multivariate analysis techniques and generalises well to mass hypotheses for which it was not trained. This is achieved by training a single classifier that provides a coherent and optimal classification of all signal hypotheses over the whole search space. Our result constitutes a complementary approach to fully differentiable analyses in particle physics. We implemented this work using PyTorch and provide users full access to a public repository containing all the code and a training example.
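As an illustration of the idea described in this abstract, the sketch below shows how a Punzi-style loss can be written in PyTorch. It uses one common form of the Punzi figure of merit, eps / (a/2 + sqrt(B)), with eps the signal efficiency, B the expected background yield, and a the desired significance in units of standard deviations; the soft (differentiable) counts are obtained by summing the network outputs instead of applying a hard selection. All function and argument names here are illustrative only; the exact normalisation, the scheduling between a standard classification loss and the Punzi term, and the sum over mass hypotheses follow the authors' public repository rather than this sketch.

import torch

def punzi_loss(scores_sig, scores_bkg, bkg_weight, n_sig_gen, a=3.0):
    # Differentiable approximation of the Punzi figure of merit
    # eps / (a/2 + sqrt(B)), evaluated from soft counts.
    eps = scores_sig.sum() / n_sig_gen            # soft signal efficiency
    b = bkg_weight * scores_bkg.sum()             # soft expected background yield
    fom = eps / (a / 2.0 + torch.sqrt(b + 1e-6))  # small offset for numerical stability
    return -fom                                   # minimising the loss maximises the FOM

In a full training loop this term would be summed over all simulated mass hypotheses, so that a single network is optimised coherently across the whole search range, as the abstract describes.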
The CMS experiment will take data at the CERN LHC starting from 2007. The CMS muon system has been designed to identify, reconstruct and measure muons with high efficiency and accuracy. In this paper the layout of the system and the key features of the detectors are presented. The reconstruction algorithms, in the context of the High Level Trigger, and their performance in terms of resolution, rate reduction and trigger efficiency are also discussed. PACS: 25.70.Ef, 21.60.Gx, 27.30.+t
Beginning in 2009, the CMS experiment will produce several petabytes of data each year, which will be distributed over many computing centres located in different countries. The CMS computing model defines how the data are to be distributed and accessed so that physicists can efficiently run their analyses over the data. The analysis will be performed in a distributed way using Grid infrastructure. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that allows the end user to transparently access distributed data. CRAB interacts with the local user environment, the CMS data management services and the Grid middleware; it takes care of data and resource discovery; it splits the user's task into several processes (jobs) and distributes and parallelizes them over different Grid environments; and it performs process tracking and output handling. Very limited knowledge of the underlying technical details is required of the end user. The tool can be used as a direct interface to the computing system or can delegate the task to a server, which takes care of job handling and provides services such as automatic resubmission in case of failures and notification of the task status to the user. Its current implementation is able to interact with the gLite and OSG Grid middlewares. Furthermore, with the same interface, it enables access to local data and batch systems such as Load Sharing Facility (LSF). CRAB has been in production and in routine use by end users since Spring 2004. It has been extensively used in studies to prepare the Physics Technical Design Report, in the analysis of reconstructed event samples generated during the Computing Software and Analysis Challenges, and in the preliminary cosmic ray data taking. The CRAB architecture and its usage within the CMS community are described in detail, as well as the current status and future development.
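CRAB's real interfaces live in the CMS software repositories; purely as an illustration of the workflow this abstract describes (task splitting, submission, tracking and automatic resubmission), the hypothetical Python sketch below shows the generic pattern. None of the names correspond to actual CRAB code.

from dataclasses import dataclass

@dataclass
class Job:
    job_id: int
    files: list
    attempts: int = 0
    status: str = "created"

def split_task(input_files, files_per_job):
    # Split the user's task into jobs, each processing a slice of the input files.
    chunks = [input_files[i:i + files_per_job]
              for i in range(0, len(input_files), files_per_job)]
    return [Job(job_id=n, files=chunk) for n, chunk in enumerate(chunks)]

def submit(job):
    # Placeholder for handing the job to the Grid or batch middleware.
    job.status = "submitted"

def poll(job):
    # Placeholder for a status query against the middleware.
    return "done"

def run_task(input_files, files_per_job=10, max_attempts=3):
    jobs = split_task(input_files, files_per_job)
    for job in jobs:
        submit(job)
    # Track the jobs and resubmit failures, as a CRAB server instance
    # would do on behalf of the user.
    for job in jobs:
        job.status = poll(job)
        while job.status == "failed" and job.attempts < max_attempts:
            job.attempts += 1
            submit(job)
            job.status = poll(job)
    return jobs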
We present a search for the direct production of a light pseudoscalar a decaying into two photons with the Belle II detector at the SuperKEKB collider. We search for the process e^+e^- → γa, a → γγ in the mass range 0.2 < m_a < 9.7 GeV/c^2 using data corresponding to an integrated luminosity of (445 ± 3) pb^-1. Light pseudoscalars interacting predominantly with standard model gauge bosons (so-called axionlike particles or ALPs) are frequently postulated in extensions of the standard model. We find no evidence for ALPs and set 95% confidence level upper limits on the coupling strength g_aγγ of ALPs to photons at the level of 10^-3 GeV^-1. The limits are the most restrictive to date for 0.2 < m_a < 1 GeV/c^2.
Theories beyond the standard model often predict the existence of an additional neutral boson, the Z'. Using data collected by the Belle II experiment during 2018 at the SuperKEKB collider, we perform the first searches for the invisible decay of a Z' in the process e^+e^- → μ^+μ^- Z' and of a lepton-flavor-violating Z' in e^+e^- → e^± μ^∓ Z'. We do not find any excess of events and set 90% credibility level upper limits on the cross sections of these processes. We translate the former, in the framework of an L_μ - L_τ theory, into upper limits on the Z' coupling constant at the level of 5×10^-2 to 1 for M_Z' ≤ 6 GeV/c^2.
CMS has a distributed computing model, based on a hierarchy of tiered regional computing centres. However, the end physicist is not interested in the details of the computing model or the complexity of the underlying infrastructure, but only in accessing and using the remote services easily and efficiently. The CMS Remote Analysis Builder (CRAB) is the official CMS tool that allows transparent access to the distributed data. We present the current development direction, which is focused on improving the interface presented to the user and adding intelligence to CRAB so that it can automate more and more of the work done on behalf of the user. We also present the status of deployment of the CRAB system and the lessons learnt in deploying this tool to the CMS collaboration.
The CMS experiment at the LHC started using the Resource Broker (from the EDG and LCG projects) to submit Monte Carlo production and analysis jobs to distributed computing resources of the WLCG infrastructure over six years ago. Since 2006 the gLite Workload Management System (WMS) and Logging & Bookkeeping (LB) service have been used. The interaction with the gLite-WMS/LB happens through the CMS production and analysis frameworks, ProdAgent and CRAB respectively, via a common component, BOSSLite. The important improvements recently made in the gLite-WMS/LB, as well as in the CMS tools, and the intrinsic independence of different WMS/LB instances allow CMS to reach the stability and scalability needed for LHC operations. In particular, the use of a multi-threaded approach in BOSSLite significantly increased the scalability of the system. In this work we present the operational setup of CMS production and analysis based on the gLite-WMS, and the performance obtained in past data challenges, in daily Monte Carlo production, and in user analysis within the experiment.
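As a rough illustration of why a multi-threaded client helps here: submission to a remote WMS is dominated by network and middleware latency, so issuing several submissions concurrently raises the overall throughput. The Python sketch below shows the generic thread-pool pattern; it is not BOSSLite code, and submit_one is a placeholder for the real call to the middleware.

from concurrent.futures import ThreadPoolExecutor, as_completed

def submit_one(job_spec):
    # Placeholder for a blocking submission call to the WMS; returns a job handle.
    return {"job": job_spec, "handle": "wms-%d" % job_spec["id"]}

def submit_all(job_specs, workers=8):
    # Submit jobs concurrently; each worker thread spends most of its time
    # waiting on the remote service, so throughput scales with the pool size.
    handles = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(submit_one, spec) for spec in job_specs]
        for fut in as_completed(futures):
            handles.append(fut.result())
    return handles

For example, submit_all([{"id": i} for i in range(100)]) would fan the 100 submissions out over 8 worker threads.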
The CMS analysis chain in a distributed environment
Barrass, T.; Bonacorsi, D.; Ciraolo, G.; ...
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 04/2006, Volume 559, Issue 1
Journal Article
Peer reviewed
Open access
The CMS collaboration is undertaking a major effort to define the analysis model and to develop software tools for analysing several million simulated and real data events by a large number of people at many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed to move data, make them available to the full international community, and validate them for subsequent analysis. Batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are evaluated to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the analysis jobs. An overview of the current implementation and of the interactions between these components of the CMS analysis system is presented in this work.