The Higgs Machine Learning Challenge
Adam-Bourdarios, C.; Cowan, G.; Germain-Renaud, C.; et al.
Journal of Physics: Conference Series, 12/2015, Volume 664, Issue 7
Journal Article
Peer-reviewed
Open access
The Higgs Machine Learning Challenge was an open data analysis competition that took place between May and September 2014. Samples of simulated data from the ATLAS Experiment at the LHC corresponding to signal events with Higgs bosons decaying to $\tau^+\tau^-$ together with background events were made available to the public through the website of the data science organization Kaggle (kaggle.com). Participants attempted to identify the search region in a space of 30 kinematic variables that would maximize the expected discovery significance of the signal process. One of the primary goals of the Challenge was to promote communication of new ideas between the Machine Learning (ML) and HEP communities. In this regard it was a resounding success, with almost 2,000 participants from HEP, ML and other areas. The process of understanding and integrating the new ideas, particularly from ML into HEP, is currently underway.
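The objective of "maximizing the expected discovery significance" can be made concrete with the Approximate Median Significance (AMS), the figure of merit commonly associated with the Challenge. The sketch below assumes the standard form with a regularization term `b_reg = 10`; the event counts are illustrative, not taken from the Challenge data.

```python
import math

def ams(s: float, b: float, b_reg: float = 10.0) -> float:
    """Approximate Median Significance for expected signal count s and
    expected background count b in the selected search region.
    b_reg regularizes the low-background limit."""
    return math.sqrt(2.0 * ((s + b + b_reg) * math.log(1.0 + s / (b + b_reg)) - s))

# A tighter selection trades signal efficiency for purity; the best
# search region maximizes AMS rather than raw classification accuracy.
loose = ams(s=700.0, b=40000.0)  # many events, low purity
tight = ams(s=250.0, b=2000.0)   # fewer events, higher purity
print(f"loose selection: AMS = {loose:.2f}")
print(f"tight selection: AMS = {tight:.2f}")
```

This is why the Challenge rewarded classifiers whose output threshold could be tuned: the ranking metric depends on the chosen selection, not only on the classifier's per-event discrimination.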
A recent common theme in HEP computing is the exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans.
ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizeable fraction of the overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease deployment on, for example, university clusters; using multiple cores inside one task to reduce the memory requirements; and running different types of workload, such as event generation. In addition to technical details, the success of ATLAS@Home as an outreach tool is evaluated.
The production of a $W$ boson in association with a single charm quark is studied using 140 fb$^{-1}$ of $\sqrt{s} = 13$ TeV proton-proton collision data collected with the ATLAS detector at the Large Hadron Collider. The charm quark is tagged by the presence of a charmed hadron reconstructed with a secondary-vertex fit. The $W$ boson is reconstructed from the decay to either an electron or a muon and the missing transverse momentum present in the event. The charmed mesons reconstructed are $D^+ \to K^-\pi^+\pi^+$ and $D^{*+} \to D^0\pi^+ \to (K^-\pi^+)\pi^+$, together with the charge-conjugate decays, in the fiducial regions where $p_T(e,\mu) > 30$ GeV, $|\eta(e,\mu)| < 2.5$, $p_T(D^{(*)}) > 8$ GeV, and $|\eta(D^{(*)})| < 2.2$. The integrated and normalized differential cross sections as a function of the pseudorapidity of the lepton from the $W$ boson decay, and of the transverse momentum of the charmed hadron, are extracted from the data using a profile likelihood fit. The measured total fiducial cross sections are $\sigma_{\mathrm{fid}}^{OS-SS}(W^-{+}D^+) = 50.2 \pm 0.2\,\mathrm{(stat)}\,^{+2.4}_{-2.3}\,\mathrm{(syst)}$ pb, $\sigma_{\mathrm{fid}}^{OS-SS}(W^+{+}D^-) = 48.5 \pm 0.2\,\mathrm{(stat)}\,^{+2.3}_{-2.2}\,\mathrm{(syst)}$ pb, $\sigma_{\mathrm{fid}}^{OS-SS}(W^-{+}D^{*+}) = 51.1 \pm 0.4\,\mathrm{(stat)}\,^{+1.9}_{-1.8}\,\mathrm{(syst)}$ pb, and $\sigma_{\mathrm{fid}}^{OS-SS}(W^+{+}D^{*-}) = 50.0 \pm 0.4\,\mathrm{(stat)}\,^{+1.9}_{-1.8}\,\mathrm{(syst)}$ pb. Results are compared with the predictions of next-to-leading-order quantum chromodynamics calculations performed using state-of-the-art parton distribution functions. Additionally, the ratio of charm to anticharm production cross sections is studied to probe the $s$-$\bar{s}$ quark asymmetry. The ratio is found to be $R_c^{\pm} = 0.971 \pm 0.006\,\mathrm{(stat)} \pm 0.011\,\mathrm{(syst)}$. The ratio and cross-section measurements are consistent with the predictions obtained with parton distribution function sets that have a symmetric $s$-$\bar{s}$ sea, indicating that any $s$-$\bar{s}$ asymmetry in the Bjorken-$x$ region relevant for this measurement is small.
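As a rough numerical cross-check, the central value of the quoted ratio can be approximately reproduced from the quoted fiducial cross sections. This is only a sketch: summing the $D$ and $D^*$ channels and dividing is an assumption on my part, and the paper's profile-likelihood fit handles correlated uncertainties that this naive calculation ignores.

```python
# Total fiducial cross sections (pb) quoted in the abstract.
sigma_wm_dp  = 50.2   # W- + D+
sigma_wp_dm  = 48.5   # W+ + D-
sigma_wm_dst = 51.1   # W- + D*+
sigma_wp_dst = 50.0   # W+ + D*-

# Naive combination: W+ (anticharm) channels over W- (charm) channels.
r = (sigma_wp_dm + sigma_wp_dst) / (sigma_wm_dp + sigma_wm_dst)
print(f"naive R_c ~ {r:.3f}")  # close to the quoted 0.971
```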
The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run 2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts' workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run 1, this task was accomplished by a person of the expert team called the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run 2. The CRC position was proposed to cover some of the AMOD's former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help with the training of future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing ADC in relevant meetings. The CRC also facilitates communication between the ADC expert team and the other ADC shifters. These include the Distributed Analysis Support Team (DAST), which is the first point of contact for addressing all distributed analysis questions, and the ATLAS Distributed Computing Shifters (ADCoS), who check and report problems in central services, sites, Tier-0 export, data transfers and production tasks. Finally, the CRC looks at the level of ADC activities on a weekly or monthly timescale to ensure that ADC resources are used efficiently.
A measurement of the top quark pair-production cross section in the lepton+jets decay channel is presented. It is based on 4.6 fb$^{-1}$ of $\sqrt{s} = 7$ TeV $pp$ collision data collected during 2011 by the ATLAS experiment at the CERN Large Hadron Collider. A three-class, multidimensional event classifier based on support vector machines is used to differentiate $t\bar{t}$ events from backgrounds. The $t\bar{t}$ production cross section is found to be $\sigma_{t\bar{t}} = 168.5 \pm 0.7\,\mathrm{(stat)}\,^{+6.2}_{-5.9}\,\mathrm{(syst)}\,^{+3.4}_{-3.2}\,\mathrm{(lumi)}$ pb. The result is consistent with the Standard Model prediction based on QCD calculations at next-to-next-to-leading order.
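A three-class event classifier of this kind can be sketched with scikit-learn, whose `SVC` handles multiclass problems via one-vs-one voting. Everything below is illustrative: the class labels, feature dimensions, and Gaussian toy data stand in for real kinematic variables and are not the analysis's actual inputs.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-ins for three event classes, e.g. ttbar signal and two
# background categories. Each "event" is a vector of kinematic-like
# features (jet momenta, missing transverse momentum, ...).
n_per_class, n_features = 300, 5
X = np.vstack([
    rng.normal(loc=0.0,  scale=1.0, size=(n_per_class, n_features)),  # class 0
    rng.normal(loc=2.0,  scale=1.0, size=(n_per_class, n_features)),  # class 1
    rng.normal(loc=-2.0, scale=1.0, size=(n_per_class, n_features)),  # class 2
])
y = np.repeat([0, 1, 2], n_per_class)

# Feature scaling matters for the RBF kernel; SVC does one-vs-one
# multiclass classification internally.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In a real analysis the classifier output feeds a template fit for the cross section rather than being used as a hard accept/reject cut.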