The High Luminosity LHC (HL-LHC) will integrate 10 times more luminosity than the LHC, posing significant challenges to radiation tolerance and event pileup for detectors, especially for forward calorimetry, and foreshadows the issues to be faced at future colliders. As part of its HL-LHC upgrade program, the CMS Collaboration is designing a High Granularity Calorimeter (HGCAL) to replace the existing endcap calorimeters. It features unprecedented transverse and longitudinal segmentation for both the electromagnetic (CE-E) and hadronic (CE-H) compartments. This will facilitate particle-flow (PF) calorimetry, where the fine structure of showers can be measured and used to enhance pileup rejection and particle identification, whilst still achieving good energy resolution. The CE-E and a large fraction of the CE-H will be based on hexagonal silicon sensors with 0.5-1 cm^2 cell size, with the remainder of the CE-H based on highly segmented scintillators with SiPM readout. The intrinsic high-precision timing capabilities of the silicon sensors will add an extra dimension to event reconstruction, especially in terms of pileup rejection. An overview of the HGCAL project is presented in this paper.
The CMS experiment implements a sophisticated two-level online selection system that achieves a rejection factor of nearly 10^5. The first level (L1) is based on coarse information from the calorimeters and the muon detectors, while the High Level Trigger combines fine-grain information from all sub-detectors. To guarantee a successful and ambitious physics program despite the very large backgrounds and proton-proton collision rates, the CMS Trigger and Data Acquisition system must be consolidated. In particular, the L1 Calorimeter Trigger hardware and architecture will be upgraded, benefiting from recent microTCA technology, which allows the calorimeter granularity to be better exploited in more advanced algorithms. Exploiting the enhanced granularity provided by the new system, an innovative dynamic clustering technique has been developed to obtain an optimized tau selection algorithm.
Considerable effort has been devoted by the ATLAS and CMS teams to improving the quality of LHC event analysis with the Matrix Element Method (MEM). Up to now, very few implementations have tried to face up to the huge computing resources required by this method. We propose here a highly parallel version, combining MPI and OpenCL, which makes MEM exploitation feasible for the whole CMS dataset at a moderate cost. In this article, we describe the status of two software projects under development, one focused on physics and one focused on computing. We also showcase their preliminary performance obtained with classical multi-core processors, CUDA accelerators and MIC co-processors. This lets us extrapolate that, with the help of six high-end accelerators, we should be able to reprocess the whole LHC Run 1 within 10 days, and that we have a satisfying metric for the upcoming Run 2. Future work will consist of finalizing a single merged system including all the physics and all the parallelism infrastructure, thus optimizing the implementation for the best hardware platforms.
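The MEM is embarrassingly parallel across events: each event's weight is an independent phase-space integral of |M|^2, so the work maps naturally onto pools of workers. The sketch below is purely illustrative, not the projects' code: it uses Python's multiprocessing in place of MPI/OpenCL, and a placeholder one-dimensional Monte Carlo integral in place of the true matrix-element integrand; all names and numbers are assumptions.

```python
import math
import random
from multiprocessing import Pool

def mem_weight(event_seed, n_samples=20_000):
    """Toy stand-in for one event's MEM weight.

    In the real method this is a multi-dimensional integral of |M|^2 over
    parton-level phase space; here we Monte Carlo integrate a placeholder
    integrand exp(-x^2) on [0, 1] just to give each task real CPU work.
    """
    rng = random.Random(event_seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.random()
        total += math.exp(-x * x)
    return total / n_samples

if __name__ == "__main__":
    events = range(32)  # stand-ins for reconstructed events
    # One worker per core; in the full system MPI distributes events across
    # nodes and OpenCL/CUDA kernels evaluate the integrand on accelerators.
    with Pool() as pool:
        weights = pool.map(mem_weight, events)
    print(len(weights), "event weights computed")
```

The per-event independence is what makes the 6-accelerator extrapolation plausible: throughput scales with worker count until I/O or integration-quality requirements dominate.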
This document summarises the current theoretical and experimental status of di-Higgs boson production searches, and of the direct and indirect constraints on the Higgs boson self-coupling, with the aim of serving as a useful guide for the coming years. The document discusses the theoretical status, including state-of-the-art predictions for di-Higgs cross sections, developments in the effective field theory approach, and studies of specific new physics scenarios that can show up in the di-Higgs final state. The status of di-Higgs searches and the direct and indirect constraints on the Higgs self-coupling at the LHC are presented, with an overview of the relevant experimental techniques, covering the full variety of relevant signatures. Finally, the capabilities of future colliders in determining the Higgs self-coupling are addressed, comparing the projected precision that can be obtained at such facilities. The work started as the proceedings of the Di-Higgs Workshop at Colliders, held at Fermilab from the 4th to the 9th of September 2018, but it went beyond the topics discussed at that workshop and included further developments. FERMILAB-CONF-19-468-E-T, LHCHXSWG-2019-005
The acute effects of a major ozonized autohaemotransfusion on blood fibrinolytic capacity were evaluated in 20 subjects affected by peripheral arterial occlusive disease (PAOD). The parameters examined were tissue-type plasminogen activator (t-PA) and plasminogen activator inhibitor type-1 (PAI-1). In subjects not previously submitted to autohaemotransfusion ('unaccustomed' subjects), whether they were PAOD patients or healthy volunteers, the PAI-1/t-PA ratio in the blood samples taken 15 min before the autohaemotransfusion was higher (P ≤ 0.05) than at baseline. These changes were independent of the presence of ozone in the autohaemotransfusion blood. Values in both healthy and PAOD-affected individuals were again at baseline 120 min after the end of autohaemotransfusion. In PAOD patients and in healthy subjects previously submitted to several autohaemotransfusions ('accustomed' subjects), the PAI-1/t-PA ratio did not significantly change before, during or after an additional autohaemotransfusion. The results (the increased heart rate and epinephrine and norepinephrine urinary excretion only in non-accustomed subjects) suggest that the acute fibrinolytic imbalance is caused by the apprehensive state produced by the procedure in unaccustomed subjects. Autohaemotransfusion with ozonized blood per se does not significantly influence the fibrinolytic balance.
Two-particle correlations in pPb collisions at a nucleon-nucleon center-of-mass energy of 5.02 TeV are studied as a function of the pseudorapidity separation (Delta eta) of the particle pair at small relative azimuthal angle (abs(Delta phi) < pi/3). The correlations are decomposed into a jet component that dominates the short-range correlations (abs(Delta eta) < 1), and a component that persists at large Delta eta and may originate from collective behavior of the produced system. The events are classified in terms of the multiplicity of the produced particles. Finite azimuthal anisotropies are observed in high-multiplicity events. The second and third Fourier components of the particle-pair azimuthal correlations, V2 and V3, are extracted after subtraction of the jet component. The single-particle anisotropy parameters v2 and v3 are normalized by their lab-frame mid-rapidity value and are studied as a function of eta_cm. The normalized v2 distribution is found to be asymmetric about eta_cm = 0, with smaller values observed at forward pseudorapidity, corresponding to the direction of the proton beam, while no significant pseudorapidity dependence is observed for the normalized v3 distribution within the statistical uncertainties.
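The relation between the pair harmonics V_n and the single-particle anisotropies v_n can be illustrated with a toy Monte Carlo. This is a hedged sketch, not the analysis code: the injected v2 and v3 values, the sample size, and the single fixed event plane are all illustrative assumptions. It uses V_n = <cos(n Delta phi)> and the factorization v_n = sqrt(V_n), which holds here because the sine moments vanish by symmetry about the event plane.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed single-particle anisotropies (illustrative, not the measured values)
v2_true, v3_true = 0.10, 0.05

def sample_phi(n):
    """Accept-reject sampling from dN/dphi ~ 1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi),
    with the event plane fixed at phi = 0."""
    phi = rng.uniform(-np.pi, np.pi, 3 * n)
    w = 1 + 2 * v2_true * np.cos(2 * phi) + 2 * v3_true * np.cos(3 * phi)
    w_max = 1 + 2 * (v2_true + v3_true)  # analytic bound on the weight
    keep = rng.uniform(0, w_max, phi.size) < w
    return phi[keep][:n]

phi = sample_phi(1_000_000)
# Random pairing stands in for same-event pairs sharing a common event plane
dphi = phi - rng.permutation(phi)

# Pair Fourier harmonics V_n = <cos(n dphi)>; factorization gives v_n = sqrt(V_n)
V2 = np.cos(2 * dphi).mean()
V3 = np.cos(3 * dphi).mean()
v2 = np.sqrt(max(V2, 0.0))  # clip guards against downward statistical fluctuations
v3 = np.sqrt(max(V3, 0.0))
print(v2, v3)  # close to the injected 0.10 and 0.05
```

The measurement itself is more involved: the jet component dominating abs(Delta eta) < 1 must first be subtracted, and pairs are formed within events rather than by random pairing.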
A measurement of the top quark pair production (t-tbar) cross section in proton-proton collisions at the centre-of-mass energy of 8 TeV is presented using data collected with the CMS detector at the LHC, corresponding to an integrated luminosity of 19.6 inverse-femtobarns. This analysis is performed in the t-tbar decay channels with one isolated, high transverse momentum electron or muon and at least four jets, at least one of which is required to be identified as originating from hadronization of a b quark. The calibration of the jet energy scale and the efficiency of b jet identification are determined from data. The measured t-tbar cross section is 228.5 +/- 3.8 (stat) +/- 13.7 (syst) +/- 6.0 (lumi) pb. This measurement is compared with an analysis of 7 TeV data, corresponding to an integrated luminosity of 5.0 inverse-femtobarns, to determine the ratio of 8 TeV to 7 TeV cross sections, which is found to be 1.43 +/- 0.04 (stat) +/- 0.07 (syst) +/- 0.05 (lumi). The measurements are in agreement with QCD predictions up to next-to-next-to-leading order.
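The structure of the cross-section ratio can be sketched with naive uncorrelated error propagation. Only the 8 TeV numbers below come from the abstract; the 7 TeV value and its uncertainties are assumed for illustration, and the real analysis propagates correlated systematics (e.g. the shared jet energy scale) between the two energies, which partly cancel in the ratio and explain why the quoted systematic uncertainty on the ratio is smaller than this naive sketch yields.

```python
import math

# 8 TeV measurement from the abstract: (value, stat, syst, lumi) in pb
xsec_8 = (228.5, 3.8, 13.7, 6.0)
# Illustrative 7 TeV input (assumed, NOT quoted in the abstract)
xsec_7 = (160.0, 2.9, 9.1, 3.6)

def ratio_naive(num, den):
    """Ratio of two measurements, treating every uncertainty as uncorrelated.

    Relative uncertainties add in quadrature: dR/R = hypot(da/a, db/b),
    applied separately to the stat, syst and lumi components.
    """
    r = num[0] / den[0]
    errs = tuple(r * math.hypot(dn / num[0], dd / den[0])
                 for dn, dd in zip(num[1:], den[1:]))
    return r, errs

r, (stat, syst, lumi) = ratio_naive(xsec_8, xsec_7)
print(f"R = {r:.2f} +/- {stat:.2f} (stat) +/- {syst:.2f} (syst) +/- {lumi:.2f} (lumi)")
```

With these assumed inputs the central value and the stat and lumi components land close to the quoted ratio, while the uncorrelated syst term overshoots the published one, as expected when correlated systematics are ignored.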