We discuss the physics potential and the experimental challenges of an upgraded LHC running at an instantaneous luminosity of 10³⁵ cm⁻² s⁻¹. The detector R&D needed to operate ATLAS and CMS in a very high radiation environment and the expected detector performance are discussed. A few examples of the increased physics potential are given, ranging from precise measurements within the Standard Model (in particular in the Higgs sector) to the discovery reach for several New Physics processes.
The first collisions of the Large Hadron Collider (LHC) were recorded by ATLAS at the end of 2009 at a centre-of-mass energy of 900 GeV, and data are now being collected at √s = 7 TeV. This paper gives an overview of the performance of the ATLAS high-level trigger for the selection of electrons, photons, taus, jets and missing transverse energy. Comparisons of the selection variables based on the calorimeter and tracking information calculated by the different trigger levels and by the offline reconstruction are shown. This has been an important step in the commissioning of these triggers to ensure their correct functioning, and the results from the first data are very encouraging. Furthermore, examples of comparisons between data and Monte Carlo simulations of some of the selection variables are presented. Finally, a brief outlook is given on the steps to be taken to fully commission these triggers with 7 TeV collision data.
The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV center-of-mass energy, whilst rejecting the enormous number of background events. The High-Level Trigger (HLT, comprising the second-level trigger and the Event Filter), which is a software-based trigger, will need to reduce the level-1 output rate of ≈75 kHz to ≈200 Hz written out to mass storage. In this talk an overview of the current physics and system performance of the HLT selection for electrons and photons is given. The performance has been evaluated using Monte Carlo simulations and has been partly demonstrated in the ATLAS test beam in 2004. The efficiency for the signal channels, the rate expected for the selection, the global data preparation and the execution times are highlighted. Furthermore, some physics examples are discussed to demonstrate that the triggers are well adapted to the physics programme envisaged at the LHC.
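As a back-of-the-envelope check (not taken from the paper itself), the quoted rates fix the overall rejection the HLT must deliver; a minimal Python sketch:

# Rejection factor implied by the quoted rates (illustrative arithmetic only).
l1_output_hz = 75_000  # approximate level-1 output rate
storage_hz = 200       # approximate rate written to mass storage
print(f"required overall HLT rejection: ~{l1_output_hz / storage_hz:.0f}")  # ~375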
The Event Filter (EF) selection stage is a fundamental component of the ATLAS Trigger and Data Acquisition architecture. Its primary function is the reduction of the data flow and rate to values acceptable to the mass storage operations and to the subsequent offline data reconstruction and analysis steps. The computing instrument of the EF is organized as a set of independent subfarms, each connected to one output of the Event Builder (EB) switch fabric. Each subfarm comprises a number of processors analyzing several complete events in parallel. This paper describes the design of the ATLAS EF system and its deployment in the 2004 ATLAS combined test beam, together with some examples of integrating selection and monitoring algorithms. Since the processing algorithms are not explicitly designed for the EF but are adapted from the offline ones, special emphasis is placed on system reliability and data security, in particular for the case of failures in the processing algorithms. Other key design elements have been system modularity and scalability: the EF must be able to follow technology evolution and should allow the use of additional, possibly remotely located, processing resources.
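To make the data-security point concrete, here is a minimal sketch (hypothetical names, not the ATLAS EF code) of a subfarm-style worker pool in which an event whose selection algorithm fails is diverted to an error stream rather than lost:

# Toy "subfarm": workers process complete events in parallel; an algorithm
# failure never loses the event -- it is routed to an error stream instead.
from concurrent.futures import ProcessPoolExecutor

def select_event(event):
    """Stand-in for an offline-derived selection algorithm; may raise."""
    if event.get("corrupt"):
        raise RuntimeError("algorithm failure")
    return event["et"] > 20.0  # toy selection cut

def process_event(event):
    try:
        accepted = select_event(event)
        return ("accept" if accepted else "reject", event)
    except Exception:
        # Data security: on algorithm failure, keep the event for offline
        # inspection rather than dropping it.
        return ("error_stream", event)

if __name__ == "__main__":
    events = [{"et": 35.0}, {"et": 5.0}, {"et": 50.0, "corrupt": True}]
    with ProcessPoolExecutor(max_workers=2) as farm:
        for verdict, ev in farm.map(process_event, events):
            print(verdict, ev)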
The ATLAS High Level Trigger's (HLT) primary function of event selection will be accomplished with a Level-2 trigger farm and an Event Filter (EF) farm, both running software components developed in the ATLAS offline reconstruction framework. While this approach provides a unified software framework for event selection, it imposes strict requirements on the offline components critical for the Level-2 trigger. A Level-2 decision in ATLAS must typically be reached within 10 ms, with multiple events processed in concurrent threads. To address these constraints, prototypes have been developed that incorporate elements of the ATLAS data flow, high level trigger, and offline framework software. To realize a homogeneous software environment for offline components in the HLT, the Level-2 Steering Controller was developed. With electron/gamma and muon selection slices it has been shown that the required performance can be reached if the offline components used are carefully designed and optimized for application in the HLT.
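A toy illustration (assumed names and timings, not the Level-2 Steering Controller itself) of the constraint described above: several events processed in concurrent threads, each decision timed against a ~10 ms budget:

# Process events in concurrent threads and check each against a latency budget.
import time
from concurrent.futures import ThreadPoolExecutor

BUDGET_S = 0.010  # ~10 ms Level-2 decision budget

def level2_decision(event_id):
    start = time.perf_counter()
    time.sleep(0.002)  # stand-in for feature extraction and hypothesis testing
    elapsed = time.perf_counter() - start
    return event_id, elapsed <= BUDGET_S, elapsed * 1e3

with ThreadPoolExecutor(max_workers=4) as pool:
    for eid, within_budget, ms in pool.map(level2_decision, range(8)):
        print(f"event {eid}: {ms:.1f} ms, within budget: {within_budget}")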
ATLAS is one of the four major Large Hadron Collider (LHC) experiments that will start data taking in 2007. It is designed to cover a wide range of physics topics. The ATLAS trigger system has to be able to reduce an initial 40 MHz event rate, corresponding to an average of 23 inelastic proton-proton interactions per 25 ns bunch crossing, to the 200 Hz admissible by the Data Acquisition System. The ATLAS trigger is divided into three levels. The first provides a signal describing an event signature using dedicated custom hardware. This signature must be confirmed by the High Level Trigger (HLT), which, using commercial computing farms, performs an event reconstruction by running a sequence of algorithms; the validity of the signature is checked after every algorithm execution. A main characteristic of the ATLAS HLT is that only the data in a certain window around the position flagged by the first level trigger are analyzed. In this work, the performance of one sequence that runs at the Event Filter level (the third level) is demonstrated. The goal of this sequence is to reconstruct and identify high transverse momentum electrons by performing cluster reconstruction in the electromagnetic calorimeter, track reconstruction in the Inner Detector, and cluster-track matching.
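The stepwise pattern described above can be sketched as follows (all names and cut values are hypothetical placeholders, not ATLAS code): each algorithm validates the signature before the next, more expensive, step runs, giving early rejection:

# Toy Event Filter electron sequence: calorimeter step, tracking step, matching.
from dataclasses import dataclass

@dataclass
class Cluster:
    et: float   # transverse energy, GeV
    eta: float
    phi: float

@dataclass
class Track:
    pt: float   # transverse momentum, GeV
    eta: float
    phi: float

def ef_electron_sequence(cluster, track, et_cut=25.0):
    if cluster is None or cluster.et < et_cut:  # validate after calorimeter step
        return False                            # early rejection
    if track is None:                           # validate after tracking step
        return False
    # Cluster-track matching in eta/phi (toy: no phi wrap-around handling).
    return abs(cluster.eta - track.eta) < 0.05 and abs(cluster.phi - track.phi) < 0.05

print(ef_electron_sequence(Cluster(40.0, 0.10, 1.00), Track(38.0, 0.11, 1.01)))  # True
print(ef_electron_sequence(Cluster(10.0, 0.10, 1.00), Track(9.0, 0.10, 1.00)))   # False: fails Et cut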
The DELPHI Silicon Tracker at LEP2
Chochula, P.; Rosinský, P.; Andreazza, A.; et al.
Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 08/1998, Volume 412, Issue 2-3
Journal Article, Peer-reviewed, Open Access
Information available in bibliographic databases is often considered dated, validated through a long process that leaves little room for innovation. Furthermore, database searching is normally performed using Boolean operators: the result of a query is a sum of expected information which, in itself, does not deliver any novelty. Don Swanson's work demonstrates the unsuspected potential of bibliographic databases for revealing and discovering knowledge. The interest of his approach lies less in the available information itself than in the methodology used to disclose new knowledge. This general methodology fits perfectly well within an environment of validated and structured information, as is the case for bibliographic data. The expression Knowledge Discovery in Databases (KDD) denotes a methodology which creates new knowledge from bibliographic data. In this article, we cover the principles of KDD based on Don Swanson's work, as well as the method used to disclose knowledge within biomedical bibliographic databases.
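The core of Swanson's methodology is the "ABC" inference: if term A co-occurs with B in one literature and B with C in another, while A and C never co-occur directly, an A-C hypothesis is generated. A minimal Python sketch with toy data (the fish-oil/Raynaud's pair is Swanson's own celebrated case):

# Literature-based discovery via transitive co-occurrence (toy data).
from collections import defaultdict

papers = [
    {"fish oil", "blood viscosity"},          # A-B literature
    {"blood viscosity", "raynaud disease"},   # B-C literature
    {"fish oil", "platelet aggregation"},
    {"platelet aggregation", "raynaud disease"},
]

def ab_links(papers):
    links = defaultdict(set)
    for terms in papers:
        for t in terms:
            links[t] |= terms - {t}
    return links

links = ab_links(papers)
a, c = "fish oil", "raynaud disease"
if c not in links[a]:                  # A and C never co-occur directly...
    bridges = links[a] & links[c]      # ...but share intermediate B terms
    print(f"Hypothesized link {a!r} -> {c!r} via {sorted(bridges)}")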
The DELPHI pixels
Becks, K.H.; Borghi, P.; Brunet, J.M.; et al.
Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 02/1997, Volume 386, Issue 1
Journal Article, Conference Proceeding, Peer-reviewed
To improve tracking in the very forward direction for running at LEP200, the angular acceptance of the DELPHI Vertex detector has been extended from 45° to 11° with respect to the beam axis. Pixel detector crowns cover the region between 25° and 13°. Due to very tight space and material thickness constraints it was necessary to develop new techniques (integrated busses in the detector substrate, high density layout on Kapton, etc.). About 1000 cm² of pixels are already installed and working in DELPHI. Techniques, tests and production of these detectors will be described, as well as the main problems encountered during this work.
Pollution of liquid argon after neutron irradiation
Andrieux, M.L.; Belymam, A.; Collot, J.; et al.
Nuclear Instruments and Methods in Physics Research, Section B: Beam Interactions with Materials and Atoms, 2001, Volume 183, Issue 3
Journal Article, Peer-reviewed
The purpose of the neutron facility installed at SARA is to investigate the behavior of various materials to be used in the ATLAS liquid argon calorimeter when subjected to fast neutron irradiation. The samples are placed in a liquid argon cryostat a few cm away from the neutron source. Various pieces of the electromagnetic calorimeter have been tested in order to evaluate the rate of pollution of the liquid and consequently the possible signal loss in energy measurements. The average fluence was equivalent to the maximum expected in the calorimeter over about 10 years. The most striking feature of the results is that most of the pollution is not due to oxygen. Using a value of the absorption length derived from these data, a simulation was carried out and the energy signal loss in the calorimeter could be predicted. Within the limits of our present knowledge, the conclusion is that damage due to this pollution will not be a problem.
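To illustrate how an absorption length translates into signal loss, here is a sketch of a generic parallel-plate attachment model with uniform ionization and electron-only readout (not the actual ATLAS simulation; the gap width below is a placeholder):

# Relative collected charge vs. electron absorption length in a LAr gap.
import math

def relative_signal(lam_mm, gap_mm=2.0):
    """Collected charge relative to pure liquid argon (lam -> infinity)."""
    u = lam_mm / gap_mm
    return 2.0 * u * (1.0 - u * (1.0 - math.exp(-1.0 / u)))

for lam in (1.0, 5.0, 20.0, 100.0):
    print(f"lambda = {lam:6.1f} mm -> signal fraction {relative_signal(lam):.3f}")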
A full azimuthal φ-wedge of the ATLAS liquid argon end-cap calorimeter has been exposed to beams of electrons, muons and pions in the energy range 6 GeV ⩽ E ⩽ 200 GeV at the CERN SPS. The angular region studied corresponds to the ATLAS impact position around the pseudorapidity interval 1.6 < |η| < 1.8. The beam test setup is described. A detailed study of the performance is given as well as the related intercalibration constants obtained. Following the ATLAS hadronic calibration proposal, a first study of the hadron calibration using a weighting ansatz is presented. The results are compared to predictions from Monte Carlo simulations, based on GEANT 3 and GEANT 4 models.
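The weighting ansatz referred to above is typically of the H1 type, where low-density (hadron-like) cell deposits are weighted up to compensate for the invisible energy of hadronic showers; a minimal sketch with placeholder constants and units (not the fitted ATLAS values):

# H1-style cell weighting: weight depends on cell energy density.
import math

def cell_weight(rho):
    """Weight as a function of cell energy density rho (toy units)."""
    C1, C2, C3 = 1.0, 1.5, 2.0  # placeholder calibration constants
    return C1 + C2 * math.exp(-C3 * rho)

def weighted_energy(cells):
    """cells: iterable of (energy, density) pairs; returns sum of w(rho) * E."""
    return sum(e * cell_weight(rho) for e, rho in cells)

em_like = [(50.0, 5.0)]               # dense deposit -> weight close to 1
had_like = [(10.0, 0.2), (8.0, 0.1)]  # sparse deposits -> weights above 1
print(weighted_energy(em_like), weighted_energy(had_like))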