Dijet events are studied in the proton-proton collision data set recorded at √s = 13 TeV with the ATLAS detector at the Large Hadron Collider in 2015 and 2016, corresponding to integrated luminosities of 3.5 fb⁻¹ and 33.5 fb⁻¹, respectively. Invariant mass and angular distributions are compared to background predictions, and no significant deviation is observed. For resonance searches, a new method for fitting the background component of the invariant mass distribution is employed. The data set is then used to set upper limits at the 95% confidence level on a range of new-physics scenarios. Excited quarks with masses below 6.0 TeV are excluded, and limits are set on quantum black holes, heavy W′ bosons, W* bosons, and a range of masses and couplings in a Z′ dark-matter mediator model. Model-independent limits on signals with a Gaussian shape are also set, using a new approach that allows factorization of physics and detector effects. From the angular distributions, a scale of new physics in contact-interaction models is excluded for scenarios with either constructive or destructive interference. These results represent a substantial improvement over those obtained previously with lower integrated luminosity.
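The smoothly falling background in dijet invariant-mass spectra is commonly parameterized by a function of x = m_jj/√s. As an illustration only (the parameter values below are hypothetical, and the abstract's "new method" is a refinement of such fitting whose details are not given here), a minimal sketch of fitting this standard functional form:

```python
import numpy as np
from scipy.optimize import curve_fit

SQRT_S = 13000.0  # collision energy in GeV (sqrt(s) = 13 TeV)

def dijet_background(mjj, p0, p1, p2, p3):
    """A smoothly falling parameterization commonly used in dijet searches:
    f(x) = p0 * (1 - x)^p1 * x^-(p2 + p3*ln x), with x = mjj / sqrt(s)."""
    x = mjj / SQRT_S
    return p0 * (1.0 - x) ** p1 * x ** -(p2 + p3 * np.log(x))

# Synthetic "data": evaluate the shape at some masses and fit it back.
mjj = np.linspace(1500.0, 6000.0, 40)   # dijet masses in GeV
truth = (1e-3, 10.0, 5.0, 0.1)          # illustrative parameters only
y = dijet_background(mjj, *truth)

popt, _ = curve_fit(dijet_background, mjj, y, p0=(1e-3, 9.0, 4.5, 0.05))
print(popt)
```

In a real search the fitted shape serves as the background estimate against which localized excesses (resonances) are tested.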
The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions from beams crossing at 40 MHz. At the design luminosity there are roughly 23 collisions per bunch crossing. ATLAS has designed a three-level trigger system to select potentially interesting events. The first-level trigger, implemented in custom-built electronics, reduces the incoming rate to less than 100 kHz with a total latency of less than 2.5 μs. The next two trigger levels run in software on commercial PC farms and reduce the output rate to 100-200 Hz. In preparation for collision data-taking, which is scheduled to commence in May 2008, several cosmic-ray commissioning runs have been performed. Among the first sub-detectors available for commissioning runs are parts of the barrel muon detector, including the RPC detectors that are used in the first-level trigger. Data have been taken with a full slice of the muon trigger and readout chain, from the detectors in one sector of the RPC system to the second-level trigger algorithms and the data-acquisition system. The system is being prepared to include the inner-tracking detector in the readout and second-level trigger. We will present the status and results of these cosmic-ray commissioning activities. This work will prove invaluable not only during the commissioning phase but also for cosmic-ray data-taking during normal running for detector performance studies.
The ATLAS experiment is one of the two general-purpose experiments due to start operation soon at the Large Hadron Collider (LHC). The LHC will collide protons at a centre-of-mass energy of 14 TeV, with a bunch-crossing rate of 40 MHz. The ATLAS three-level trigger will reduce this input rate to match the foreseen offline storage capability of 100-200 Hz. This paper gives an overview of the ATLAS High Level Trigger, focusing on the system design and its innovative features. We then present the ATLAS trigger strategy for the initial phase of LHC exploitation. Finally, we report on the valuable experience acquired through in-situ commissioning of the system, where simulated events were used to exercise the trigger chain. In particular we show critical quantities such as event processing times, measured in a large-scale HLT farm using a complex trigger menu.
Calorimetry triggering in ATLAS
O. Igonkina; I. Aracena; S. Backlund ...
Journal of Physics: Conference Series, 01/2009, Volume 160, Issue 1
Journal Article
Peer-reviewed
Open access
The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes, which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2 × 10⁵ to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid-argon electromagnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. Wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets, or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic-ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.
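The quoted acceptance of one event in 2 × 10⁵ follows directly from the rates given in these abstracts; a quick arithmetic check (using the upper end of the quoted 100-200 Hz storage range):

```python
# Back-of-the-envelope check of the ATLAS trigger rejection factors
# implied by the rates quoted in the abstracts above.
bunch_crossing_rate_hz = 40e6   # 40 MHz beam-crossing rate
level1_output_rate_hz = 100e3   # < 100 kHz after the first-level trigger
storage_rate_hz = 200.0         # upper end of the quoted 100-200 Hz

level1_rejection = bunch_crossing_rate_hz / level1_output_rate_hz
overall_rejection = bunch_crossing_rate_hz / storage_rate_hz

print(level1_rejection)   # 400.0
print(overall_rejection)  # 200000.0, i.e. one event in 2 x 10^5
```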
During 2006 and the first half of 2007, the installation, integration and commissioning of trigger and data acquisition (TDAQ) equipment in the ATLAS experimental area have progressed. There have been a series of technical runs using the final components of the system already installed in the experimental area. Various tests have been run, including ones where level-1 preselected simulated proton-proton events have been processed in a loop mode through the trigger and dataflow chains. The system included the readout buffers containing the events, event building, level-2 and event filter trigger algorithms. The scalability of the system with respect to the number of event building nodes used has been studied, and quantities critical for the final system, such as trigger rates and event processing times, have been measured using different trigger algorithms as well as different TDAQ components. This paper presents the TDAQ architecture, the current status of the installation and commissioning, and highlights the main test results that validate the system.
The ATLAS experiment under construction at CERN is due to begin operation at the end of 2007. The detector will record the results of proton-proton collisions at a center-of-mass energy of 14 TeV. The trigger is a three-tier system designed to identify in real time potentially interesting events that are then saved for detailed offline analysis. The trigger system will select approximately 200 Hz of potentially interesting events out of the 40 MHz bunch-crossing rate (with 10⁹ interactions per second at the nominal luminosity). Algorithms used in the trigger system to identify different event features of interest will be described, as well as their expected performance in terms of selection efficiency, background rejection and computation time per event. The talk will concentrate on recent improvements and on performance studies, using a very detailed simulation of the ATLAS detector and electronics chain that emulates the raw data as it will appear at the input to the trigger system.
Flows with moving interfaces appear in a wide range of real-world problems. This report, accompanying the video "Two fluids level set: High performance simulation and post processing", presents the implementation of a Level Set method for two-fluid flows in the parallel finite element code Alya that can scale up to thousands of processors. To give an idea of the versatility of the implementation, examples extending from the flushing of a toilet to the simulation of free-surface flows around ship hulls are presented. The spatial discretization is based on unstructured linear finite elements, tetrahedra and prisms, which allow a great degree of flexibility for complex geometries, as will be shown in the examples. The time discretization uses a standard trapezoidal rule. The position of the moving interface is captured with the Level Set technique, which is better suited to complex flows than interface-tracking schemes. The jump in the fluid properties is smoothed in a region close to the interface. For ship hydrodynamics simulations the model has been coupled with the SST k-ω turbulence model.
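The smoothing of the property jump near the interface is typically done with a regularized Heaviside function of the signed-distance field. As a sketch under stated assumptions (the smoothing half-width and the exact blending used in Alya are not given in the abstract; the form below is a standard one from the level-set literature):

```python
import numpy as np

def smoothed_heaviside(phi, eps):
    """Regularized Heaviside common in level-set methods: transitions
    smoothly from 0 to 1 across a band of half-width eps around the
    interface, where phi is the signed distance to the interface."""
    h = 0.5 * (1.0 + phi / eps + np.sin(np.pi * phi / eps) / np.pi)
    return np.where(phi < -eps, 0.0, np.where(phi > eps, 1.0, h))

def fluid_property(phi, val_minus, val_plus, eps):
    """Blend a material property (e.g. density or viscosity) across the
    interface: val_minus where phi << 0, val_plus where phi >> 0."""
    return val_minus + (val_plus - val_minus) * smoothed_heaviside(phi, eps)

# Illustrative values: water (1000 kg/m^3) on one side of the interface,
# air (1.2 kg/m^3) on the other, smoothing half-width eps = 0.015 m.
phi = np.linspace(-0.05, 0.05, 11)   # signed distances in metres
rho = fluid_property(phi, 1000.0, 1.2, eps=0.015)
print(rho)   # pure water far below the interface, pure air far above
```

Smearing the jump over a few grid cells avoids the numerical instabilities a sharp discontinuity would cause in the finite element discretization.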
The 2011 Les Houches workshop was the first to confront LHC data. In the two years since the previous workshop there have been significant advances in both soft and hard QCD, particularly in the areas of multi-leg NLO calculations, the inclusion of those NLO calculations into parton shower Monte Carlos, and the tuning of the non-perturbative parameters of those Monte Carlos. These proceedings describe the theoretical advances that have taken place, the impact of the early LHC data, and the areas for future development.