We propose a method to organize experimental data from particle collision experiments in a general format that enables simple visualization and effective classification of collision data using machine learning techniques. The method is based on sparse fixed-size matrices with single- and two-particle variables containing information on identified particles and jets. We illustrate this method using an example of searches for new physics at the LHC experiments.
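The core idea, packing events of varying multiplicity into fixed-size, mostly-zero matrices, can be sketched as follows. This is a minimal illustration, not the paper's actual layout; the function name, the choice of variables (pT, eta, phi), and the row limit are assumptions for the example.

```python
import numpy as np

def event_matrix(particles, max_rows=10):
    """Pack one event into a fixed-size matrix of single-particle
    variables (pT, eta, phi), zero-padded so every event has the same
    shape regardless of multiplicity. `particles` is a list of
    (pT, eta, phi) tuples, assumed sorted by decreasing pT."""
    m = np.zeros((max_rows, 3))
    for i, p in enumerate(particles[:max_rows]):
        m[i] = p
    return m

# Hypothetical event with three particles (pT in GeV, eta, phi):
evt = [(120.0, 0.5, 1.2), (45.0, -1.1, -2.0), (12.0, 2.3, 0.4)]
M = event_matrix(evt)
print(M.shape)  # (10, 3); rows beyond the event multiplicity stay zero
```

Because every event maps to the same matrix shape, such data can be fed directly to image-style classifiers or visualized as a sparse heat map.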
A file repository for calculations of cross sections and kinematic distributions using Monte Carlo generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations and for comparisons with experimental data. The HepSim data library is publicly accessible and includes a number of Monte Carlo event samples with Standard Model predictions for current and future experiments. The HepSim project includes a software package to automate the process of downloading and viewing online Monte Carlo event samples. Data streaming over a network for end-user analysis is discussed.
The physics potential of timing layers with a resolution of a few tens of picoseconds in the calorimeters of future collider detectors is explored. These studies show how such layers can be used for particle identification and illustrate the potential for detecting new event signatures originating from physics beyond the Standard Model.
This paper explores the physics reach of the High-Luminosity Large Hadron Collider (HL-LHC) for searches for new particles decaying to two jets. We discuss inclusive searches in dijets and b-jets, as well as searches in semi-inclusive events that require an additional lepton, which increases sensitivity to different aspects of the underlying processes. We discuss the expected exclusion limits for generic models predicting new massive particles that result in resonant structures in the dijet mass distribution. Prospects for the Higher-Energy LHC (HE-LHC) collider are also discussed. The study is based on the Pythia8 Monte Carlo generator using representative event statistics for the HL-LHC and HE-LHC running conditions. The event samples were created using supercomputers at NERSC.
Congruence-Permutable S-Acts
Stepanova, A. A.; Chekanov, S. G.
Siberian Mathematical Journal, 2022/1, Volume 63, Issue 1
Journal Article
Peer-reviewed
An algebra is congruence-permutable if its congruences commute under composition. Many familiar varieties of algebras, such as the variety of groups, consist of congruence-permutable algebras, but the variety of S-acts contains members that are not congruence-permutable. We describe the congruence-permutable S-acts in the cases where S is a commutative monoid or a group.
A new data format for Monte Carlo (MC) events, or any structured data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the ProMC library, which produces smaller file sizes for MC records compared to the existing input–output libraries used in high-energy physics (HEP). Other important features of the proposed format are the separation of abstract data layouts from concrete programming implementations, self-description, and random access. Data stored in ProMC files can be written, read, and manipulated in a number of programming languages, such as C++, Java, Fortran, and Python.
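The space saving comes from the base-128 variable-size integer ("varint") encoding used by Protocol Buffers: each byte carries 7 payload bits, with the high bit flagging continuation, so small values occupy one byte rather than a fixed 4 or 8. A minimal Python sketch of the encoding (illustrative only; ProMC itself uses the Protocol Buffers library):

```python
def encode_varint(n):
    """Base-128 varint encoding as in Protocol Buffers: 7 payload bits
    per byte, high bit set on every byte except the last."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def decode_varint(data):
    """Inverse of encode_varint: accumulate 7-bit groups, little-endian."""
    n, shift = 0, 0
    for b in data:
        n |= (b & 0x7F) << shift
        if not (b & 0x80):
            return n
        shift += 7

print(len(encode_varint(1)))    # 1 byte instead of a fixed-width 4 or 8
print(len(encode_varint(300)))  # 2 bytes
```

For MC records, where many quantities (particle IDs, status codes, quantized momenta) are small integers, this encoding compresses the bulk of the payload without any separate compression pass.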
A jet algorithm based on the k-means clustering procedure is proposed which can be used for the invariant-mass reconstruction of heavy states decaying to hadronic jets. The proposed algorithm was tested by reconstructing the \(e^+e^-\to t\bar{t}\to 6\) jets and \(e^+e^-\to W^+W^-\to 4\) jets processes at \(\sqrt{s}=500\,\mathrm{GeV}\) using a Monte Carlo simulation. It was shown that the algorithm has a reconstruction efficiency similar to that of traditional jet-finding algorithms and leads to a 25% and 40% improvement in the top-quark and W mass resolution, respectively, compared to the kT (Durham) algorithm. In addition, the peak positions measured with the new algorithm are expected to have a smaller systematic uncertainty.
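The clustering step can be sketched with plain k-means in the (eta, phi) plane. This is a simplified stand-in for the paper's algorithm, whose actual distance measure, seeding, and recombination scheme may differ; the pT-weighted centroid update used here is an assumption analogous to an E-scheme jet axis.

```python
import numpy as np

def kmeans_jets(particles, k, iters=20, seed=0):
    """Cluster particles into k jets with plain k-means in (eta, phi).
    `particles` is an array with rows (pT, eta, phi); returns per-particle
    jet labels. Fixing k suits exclusive final states (e.g. 4 or 6 jets)."""
    rng = np.random.default_rng(seed)
    pts = particles[:, 1:3]
    centers = pts[rng.choice(len(pts), k, replace=False)]
    for _ in range(iters):
        # assign each particle to the nearest jet axis
        d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            sel = labels == j
            if sel.any():
                # pT-weighted centroid as the new jet axis (assumed scheme)
                w = particles[sel, 0]
                centers[j] = (pts[sel] * w[:, None]).sum(0) / w.sum()
    return labels

# Toy event: two collimated groups of particles, expect two jets.
evt = np.array([[50., 0.0, 0.0], [30., 0.1, 0.05],
                [40., 2.0, 2.0], [20., 2.1, 1.95]])
labels = kmeans_jets(evt, k=2)
print(labels[0] == labels[1], labels[2] == labels[3])  # True True
```

With the labels in hand, each jet's four-momentum (and hence the invariant mass of jet pairs or triplets) follows by summing the constituents.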
Rapidly applying the effects of detector response to physics objects (e.g. electrons, muons, showers of particles) is essential in high energy physics. Currently available tools for the transformation from truth-level physics objects to reconstructed detector-level physics objects involve manually defining resolution functions. These resolution functions are typically derived in bins of variables that are correlated with the resolution (e.g. pseudorapidity and transverse momentum). This process is time-consuming, requires manual updates when detector conditions change, and can miss important correlations. Machine learning offers a way to automate the process of building these truth-to-reconstructed object transformations and can capture complex correlations for any given set of input variables. Such machine learning algorithms, with sufficient optimization, could have a wide range of applications: improving phenomenological studies by using a better detector representation, allowing for more efficient production of Geant4 simulations by only simulating events within an interesting part of phase space, and studies on future experimental sensitivity to new physics.
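The truth-to-reco idea can be illustrated with a deliberately simple learned stand-in: fit the mean response and the pT-dependent resolution directly from (truth, reco) pairs, with no manually chosen bins. The toy detector response (a 2% scale shift, resolution growing linearly with pT) and the linear fits are assumptions for the example; a real application would use a trained neural network or similar model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "truth" transverse momenta and a hypothetical detector response:
# a small scale shift plus a resolution that grows with pT.
pt_true = rng.uniform(20.0, 200.0, 5000)
pt_reco = 0.98 * pt_true + rng.normal(0.0, 0.05 * pt_true)

# Learn the mean response from the pairs (no manual binning).
scale = np.polyfit(pt_true, pt_reco, 1)

# Learn the pT-dependent width: for Gaussian residuals,
# E[|resid|] = sigma * sqrt(2/pi), so rescale before fitting.
resid = pt_reco - np.polyval(scale, pt_true)
width = np.polyfit(pt_true, np.abs(resid) * np.sqrt(np.pi / 2), 1)

def smear(pt):
    """Apply the learned truth-to-reco transformation to new truth pT."""
    return np.polyval(scale, pt) + rng.normal(0.0, np.polyval(width, pt))

print(round(scale[0], 2))  # recovers a slope close to the injected 0.98
```

A neural-network version replaces the two polynomial fits with a model conditioned on all input variables at once, which is what lets it capture correlations that binned resolution functions miss.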
Jet substructure variables for hadronic jets with transverse momenta in the range from 2.5 TeV to 20 TeV were studied using several designs for the spatial size of calorimeter cells. The studies used the full Geant4 simulation of the calorimeter response combined with a realistic reconstruction of calorimeter clusters. In most cases, the results indicate that the performance of jet-substructure reconstruction improves when the cell size of a hadronic calorimeter is reduced by a factor of four, from Δη×Δφ=0.087×0.087, similar to the cell sizes of the calorimeters of the LHC experiments, to 0.022×0.022.