Among the physics goals of LHC experiments, precision tests of the Standard Model in the Strong and Electroweak sectors play an important role. Because of the nature of proton–proton processes, observables based on the measurement of the direction and energy of leptons provide the most precise signatures. In the present paper, we concentrate on the angular distribution of Drell–Yan process leptons in the lepton-pair rest frame. The vector nature of the intermediate state imposes that the distributions are, to good precision, described by spherical polynomials of at most second order. We show that with a proper choice of the coordinate frames, only one coefficient in this polynomial decomposition remains sizable, even in the presence of one or two high-pT jets. The necessary stochastic choice of the frames relies on probabilities independent of any coupling constants. This remains true when one or two partons accompany the lepton pairs. In this way, electroweak effects can be better separated from strong-interaction ones, for the benefit of the interpretation of the measurements. Our study exploits properties of single-gluon-emission matrix elements, which are clearly visible if a conveniently chosen form of their representation is used. We also rely on distributions obtained from matrix-element-based Monte Carlo samples of events with two leptons and up to two additional partons. The partons of the incoming colliding protons are distributed according to PDFs and are strictly collinear with the corresponding beams.
Precision tests of the Standard Model in the Strong and Electroweak sectors play a crucial role in the physics program of LHC experiments. Because of the nature of proton–proton processes, observables based on the measurement of the direction and energy of final-state leptons provide the most precise probes of such processes. In the present paper, we concentrate on the angular distribution of leptons from W → ℓν decays in the lepton-pair rest frame. The vector nature of the intermediate state imposes that the distributions are, to good precision, described by spherical harmonics of at most second order. We argue that, contrary to a belief often expressed in the literature, the full set of angular coefficients can be measured experimentally, despite the neutrino in the final state escaping detection. There is thus no difference in principle with respect to the phenomenology of the Z/γ* → ℓ+ℓ− Drell–Yan process. We show also that with a proper choice of the reference frames, only one coefficient in this polynomial decomposition remains sizable, even in the presence of one or more high-pT jets. The necessary stochastic choice of the frames relies on probabilities independent of any coupling constants. In this way, electroweak effects (dominated by the V−A nature of the W couplings to fermions) can be better separated from those of strong interactions. This separation is convenient for the interpretation of the measurements.
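The second-order decomposition referred to above is conventionally written, in the standard notation with the lepton polar and azimuthal angles (θ, φ) in the chosen rest frame and angular coefficients A0–A7 (the textbook form, quoted here for orientation rather than from this paper):

```latex
\frac{d\sigma}{d\Omega} \;\propto\; (1+\cos^2\theta)
 + \tfrac{1}{2}A_0\,(1-3\cos^2\theta)
 + A_1\,\sin 2\theta\,\cos\phi
 + \tfrac{1}{2}A_2\,\sin^2\theta\,\cos 2\phi
 + A_3\,\sin\theta\,\cos\phi
 + A_4\,\cos\theta
 + A_5\,\sin^2\theta\,\sin 2\phi
 + A_6\,\sin 2\theta\,\sin\phi
 + A_7\,\sin\theta\,\sin\phi
```

For W → ℓν the same eight coefficients appear; the claim above is that they remain measurable despite the unobserved neutrino.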
We investigate the potential for measuring the CP state of the Higgs boson in H→ττ decays with the consecutive τ-lepton decays in the channels τ±→ρ±ντ and τ±→a1±ντ combined. The subsequent decays ρ±→π±π0, a1±→ρ0π± and ρ0→π+π− are taken into account. We explore extensions of the method in which the acoplanarity angle of the planes built on the visible decay products, π±π0 of τ±→π±π0ντ, was used. The angle is sensitive to transverse spin correlations, and thus to parity. We show that in the case of the cascade decays τ→a1ν, information on the CP state of the Higgs boson can be extracted from the acoplanarity angles as well. Because in the cascade decay a1±→ρ0π±, ρ0→π+π− up to four planes can be defined, up to 16 distinct acoplanarity angles are available for H→ττ→a1+a1−νν decays. These acoplanarities carry in part supplementary, but also correlated, information, so it is cumbersome to evaluate an overall sensitivity. We investigate the sensitivity of such an analysis by developing and applying machine learning techniques. We quantify the possible improvements when the multidimensional phase space of the outgoing decay products' directions is used instead of one-dimensional projections, i.e. the acoplanarity angles. We do not take into account ambiguities resulting from detector uncertainties or background contamination; we concentrate on the usefulness of machine learning methods and τ→3πν decays for the Higgs boson parity measurement.
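As an illustration of the basic building block of such analyses, below is a minimal sketch of the acoplanarity angle between two decay planes. The function name, the omission of the Lorentz boost to the pair rest frame, and the sign convention are our assumptions for illustration, not taken from the paper.

```python
import numpy as np

def acoplanarity(p1a, p1b, p2a, p2b):
    """Acoplanarity angle between the decay plane spanned by (p1a, p1b)
    and the one spanned by (p2a, p2b).  The 3-momenta are assumed to be
    given already in the relevant rest frame (e.g. the rho+rho- frame);
    the boost itself is omitted from this sketch."""
    n1 = np.cross(p1a, p1b)
    n2 = np.cross(p2a, p2b)
    n1 = n1 / np.linalg.norm(n1)
    n2 = n2 / np.linalg.norm(n2)
    phi = np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0))
    # Extend to [0, 2*pi): one of several possible sign conventions,
    # chosen here for illustration only.
    if np.dot(p1a, np.cross(n1, n2)) < 0.0:
        phi = 2.0 * np.pi - phi
    return phi
```

With four planes available per event in the a1–a1 channel, applying this to every plane pairing yields the up-to-16 angles mentioned above.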
Matching and comparing the measurements of past and future experiments calls for consistency checks of the electroweak (EW) calculations used for their interpretation. On the other hand, new calculation schemes of the field theory can be beneficial for precision, even if they may obscure comparisons with earlier results. Over the years, the concepts of Improved Born and Effective Born, as well as of effective couplings, in particular of the sin²θ_W^eff mixing angle for EW interactions, have evolved. In our discussion, we use four versions of the DIZET EW library, used for the phenomenology of practically all HEP accelerator experiments over the last 30 years. We rely on the codes published and archived with the KKMC Monte Carlo program for e+e− → f f̄ n(γ) and available for TauSpinner as well. TauSpinner reweights generated events to introduce EW effects. To this end, DIZET is first invoked, and its results are stored in a data file and used later. Documentation of the TauSpinner upgrade to version 2.1.0, and of its new arrangement for semi-automated benchmark plots, is provided. In our paper, focus is placed on the numerical results and on the different approximations introduced in the Improved Born to obtain the Effective Born, which is simpler for applications to strong or QED processes at pp or e+e− colliders. The τ lepton polarization P_τ, the forward–backward asymmetry A_FB and the parton-level total cross section σ_tot are used to monitor the size of EW effects and the limitations of the effective sin²θ_W^eff picture for precision physics. Collected results include: (i) Effective Born approximations and sin²θ_W^eff, (ii) differences between versions of the EW libraries and (iii) parametric uncertainties due to, for example, m_t or Δα_h^(5)(s). These results can be considered as benchmarks and also allow one to evaluate the adequacy of the Effective Born with respect to the Improved Born. Definitions are addressed too.
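For orientation, the monitored observables admit compact schematic definitions (conventions may differ in detail from those used by DIZET; the form-factor notation κ_f is an assumption of this sketch):

```latex
A_{FB} = \frac{\sigma_F - \sigma_B}{\sigma_F + \sigma_B}, \qquad
P_\tau = \frac{\sigma_{h=+1} - \sigma_{h=-1}}{\sigma_{h=+1} + \sigma_{h=-1}}, \qquad
\sin^2\theta_W^{\mathrm{eff}} = \mathrm{Re}\,\kappa_f(s)\,\sin^2\theta_W
```

with σ_F, σ_B the cross sections in the forward and backward hemispheres and σ_{h=±1} those for the two τ helicity states.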
τ-leptons produced in pp collisions allow one to measure Standard Model parameters and offer probes for New Physics. The TauSpinner program can be used to modify spin (or production matrix element) effects in any τ sample. It relies on the kinematics of the outgoing particles: the τ lepton(s) (also the ντ in the case of W-mediated processes, and optionally the four-momenta of accompanying hard jets) and the τ decay products. No other information is required from the event record. With calculated spin (or production/decay matrix element) weights, attributed on an event-by-event basis, modification of the spin/decay/production features is possible without the need to regenerate events. With the TauSpinner algorithms, the experimental techniques developed over the years since LEP 1 times are already used and extended for LHC applications. The purpose of the present publication is to systematically document the physics basis of the program, and to overview its application domain and systematic errors.
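The reweighting strategy described above can be illustrated with a toy example: an unpolarized sample of decay angles is given per-event weights of a schematic 1 + P cosθ form, so a polarized distribution is obtained without regenerating any events. The polarization value and the weight formula are illustrative assumptions, not the actual TauSpinner matrix elements.

```python
import numpy as np

rng = np.random.default_rng(0)
cos_theta = rng.uniform(-1.0, 1.0, 200_000)  # unpolarized "generated" sample
pol = 0.5                                    # assumed tau polarization (toy value)

# Per-event weight: ratio of the polarized to the unpolarized angular
# distribution, here of the schematic (1 + P*cos_theta) form.
wt = 1.0 + pol * cos_theta

mean_unweighted = cos_theta.mean()                 # ~ 0 for the flat sample
mean_weighted = np.average(cos_theta, weights=wt)  # ~ P/3 after reweighting
```

The weighted mean of cosθ approaches P/3, the value a directly generated polarized sample would give; this is the sense in which event-by-event weights replace regeneration.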
The consecutive steps of the cascade decay initiated by H→ττ can be useful for the measurement of Higgs couplings, and in particular of the Higgs boson parity. In previous papers we have found that multidimensional signatures of the τ±→π±π0ν and τ±→3π±ν decays can be used to distinguish between scalar and pseudoscalar Higgs states. Machine learning (ML) techniques for binary classification offered breakthrough opportunities to manage such complex multidimensional signatures. The classification between the two possible CP states, scalar and pseudoscalar, is now extended to the measurement of a hypothetical mixing angle of the Higgs boson parity states. The functional dependence of the H→ττ matrix element on the mixing angle is predicted by theory. The potential to determine the preferred mixing angle from a sample of Higgs boson events including τ decays is studied using deep neural networks. The problem is addressed as classification or regression, with the aim to determine per event: (a) the probability distribution (spin weight) of the mixing angle; (b) the parameters of the functional form of the spin weight; (c) the most preferred mixing angle. The performance of the proposed methods is evaluated and compared.
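A minimal sketch of step (b), fitting per-event spin weights to a trigonometric functional form in the mixing angle, is given below. The specific form c0 + c1 cos 2α + c2 sin 2α and both function names are assumptions made for illustration; the exact dependence is fixed by the H→ττ matrix element.

```python
import numpy as np

def fit_spin_weight(alphas, weights):
    """Least-squares fit of sampled per-event weights wt(alpha) to
    c0 + c1*cos(2*alpha) + c2*sin(2*alpha); returns (c0, c1, c2)."""
    design = np.column_stack(
        [np.ones_like(alphas), np.cos(2.0 * alphas), np.sin(2.0 * alphas)]
    )
    coeffs, *_ = np.linalg.lstsq(design, weights, rcond=None)
    return coeffs

def preferred_angle(coeffs, npoints=1000):
    """Mixing angle in [0, pi) maximizing the fitted weight (grid scan)."""
    grid = np.linspace(0.0, np.pi, npoints, endpoint=False)
    values = coeffs[0] + coeffs[1] * np.cos(2.0 * grid) + coeffs[2] * np.sin(2.0 * grid)
    return grid[np.argmax(values)]
```

Step (c) then reduces to the grid scan above, while step (a) corresponds to normalizing the fitted function into a per-event probability distribution.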
The H→ττ decays form a promising channel for tests of the CP invariance of Higgs boson couplings. A previous study has shown the viability of deep learning techniques for such a measurement. In this paper, the study is extended to consider potential sources of uncertainty. Effects due to the limited experimental resolution are discussed. Furthermore, systematics due to the modeling of the complex cascade decays τ±→a1±ντ, a1±→ρ0π±, ρ0→π+π− (i.e. τ±→3π±ντ) are also addressed. Various parametrizations using low-energy collision data are considered.
Because of their narrow width, τ decays can be well separated from their production process. Only spin degrees of freedom connect these two parts of the physics process of interest for high-energy collision experiments. In the following, we present a Monte Carlo algorithm which is based on that property. The interface supplements events generated by other programs with τ decays. Effects of spin, including transverse degrees of freedom, genuine weak corrections, or new physics may be taken into account at the time when a τ decay is generated and written into the event record. The physics content of the C++ interface is already richer than that of its FORTRAN predecessor.
Program title: TAUOLA++, versions 1.0.2, 1.0.3, 1.0.4, 1.0.5, 1.0.6
Catalogue identifier: AELH_v1_0
Program summary URL:
http://cpc.cs.qub.ac.uk/summaries/AELH_v1_0.html
Program obtainable from: CPC Program Library, Queenʼs University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence,
http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 649 068
No. of bytes in distributed program, including test data, etc.: 6 544 479
Distribution format: tar.gz
Programming language: C++, FORTRAN77
Computer: PCs, workstations
Operating system: Linux, MacOS
RAM: < 10 MB
Classification: 11.2
External routines: HepMC (http://lcgapp.cern.ch/project/simu/HepMC/), optional; PYTHIA8 (http://home.thep.lu.se/~torbjorn/Pythia.html)
Subprograms used: ADSM_v2_0 (MC-TESTER), Comput. Phys. Commun. 182 (2011) 779
Nature of problem: The code of Monte Carlo generators often has to be tuned to the needs of large HEP Collaborations and experiments. In particular, τ lepton decays need to be added (or modified) in previously generated (or measured) events encapsulated in an event record.
Solution method: The new algorithm, the universal interface of TAUOLA, which works with the HepMC event record of C++ applications, is documented. It uses the τ decay generator described in [2], with the updates explained in [1]. For the new interface, the spin treatment was improved; for example, it features complete spin effects in processes mediated by Z/γ* interactions. The effects of electroweak corrections can be taken into account in this case as well. In general, the program supersedes its FORTRAN predecessor [1]. The event record analysis as well as the initialization are also modernized.
Restrictions: The input event record must meet the requirements described in Section 2.3.1 of the documentation.
Unusual features: Two sets of installation scripts; an additional tool for calculating tables for electroweak corrections.
Running time: Depends on the size and complexity of the events. Small events (<50 particles) require 1 to 7 minutes for processing 1 M events on a PC/Linux with a 2.4 GHz processor.
References:
[1] P. Golonka, B. Kersevan, T. Pierzchala, E. Richter-Was, Z. Was, M. Worek, Comput. Phys. Commun. 174 (2006) 818.
[2] S. Jadach, Z. Was, R. Decker, J.H. Kühn, Comput. Phys. Commun. 76 (1993) 361.
Machine learning (ML) techniques are rapidly finding a place among the methods of high-energy physics data analysis. Different approaches are explored concerning how much effort should be put into building high-level variables based on physics insight into the problem, and when it is enough to rely on low-level ones, allowing ML methods to find patterns without an explicit physics model. In this paper we continue the discussion of previous publications on the measurement of the CP state of the Higgs boson in the H→ττ decay channel, with the consecutive τ±→ρ±ν; ρ±→π±π0 and τ±→a1±ν; a1±→ρ0π±→3π± cascade decays. The discrimination of the Higgs boson CP state is studied as a binary classification problem between the CP-even (scalar) and CP-odd (pseudoscalar) states using a deep neural network (DNN). Improvements in the classification from constraints on the directly non-measurable outgoing neutrinos are discussed. We find that, once added, they enhance the sensitivity sizably, even if only imperfect information is provided. In addition to DNNs, we also evaluate and compare other ML methods: boosted trees, random forests, and support vector machines.
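The binary-classification setup can be sketched schematically with a zero-hidden-layer logistic classifier separating two shifted hypotheses. This is not the actual DNN of the paper, and the one-dimensional feature, Gaussian shapes, and all numbers below are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Toy stand-in for a CP-sensitive feature: the two hypotheses give
# shifted distributions (illustrative shapes and numbers only).
x = np.concatenate([rng.normal(1.2, 0.6, n), rng.normal(2.0, 0.6, n)])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = scalar, 1 = pseudoscalar

# Logistic regression trained by full-batch gradient descent on the
# cross-entropy loss; a DNN would add hidden layers on top of the same loss.
w, b = 0.0, 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= 0.5 * np.mean((p - y) * x)
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(w * x + b)))) > 0.5) == y)
```

The achievable accuracy is bounded by the overlap of the two distributions; the gain from neutrino constraints discussed above corresponds to reducing that overlap.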