Computers are no longer getting faster: instead, they are growing more and more CPUs, each of which is no faster than the previous generation. This increase in the number of cores evidently calls for more parallelism in HENP software. While end-users' stand-alone analysis applications are relatively easy to modify, the LHC experiments' frameworks, mostly written with a single thread of execution in mind and with correspondingly large code bases, are more challenging to parallelize. Widespread and inconsiderate changes so close to data taking are out of the question: we need clear strategies and guidelines to reap the benefits of the multicore/manycore era while minimizing code changes.
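A minimal sketch of the event-level parallelism such frameworks could adopt with few code changes: whole events are farmed out to worker processes while the per-event code itself stays single-threaded. The `process_event` function here is a hypothetical stand-in, not actual experiment code.

```python
# Sketch: event-level parallelism with a process pool.
# `process_event` is a hypothetical stand-in for a framework's
# per-event processing step; real frameworks would dispatch far
# richer event objects.
from multiprocessing import Pool


def process_event(event):
    # Toy "reconstruction": average the values in one event.
    return sum(event) / len(event)


def run(events, workers=4):
    # Each worker handles whole events independently, so the
    # single-threaded per-event code needs no internal changes.
    with Pool(workers) as pool:
        return pool.map(process_event, events)


if __name__ == "__main__":
    events = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
    print(run(events))  # [2.0, 5.0]
```

The design choice matches the constraint in the abstract: parallelism is introduced at the event boundary, leaving the legacy single-threaded code paths untouched.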
The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a rate of 40 MHz. To reduce the data rate, only potentially interesting events are selected by a three-level trigger system. The first level is implemented in custom-made electronics, with an output rate of less than 100 kHz. The second and third levels are software triggers with a final output rate of 100 to 200 Hz. A system has been designed and implemented that holds and records the full configuration information of all three trigger levels at a centrally maintained location. This system provides fast access to consistent configuration information of the online trigger system for the purpose of data taking, as well as to all parts of the offline trigger simulation. The use of relational database technology provides a means of reliably recording the trigger configuration history over the lifetime of the experiment. In addition to the online system, tools for flexible browsing and manipulation of trigger configurations, and for their distribution across the ATLAS reconstruction sites, have been developed. The usability of this design has been demonstrated in dedicated configuration tests of the ATLAS level-1 Central Trigger and of a 600-node software trigger computing farm. Further tests on a computing cluster that is part of the final high-level trigger system were also successful.
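The core idea of recording configuration history relationally can be sketched with an illustrative schema; the table layout, column names, and payload strings below are assumptions for illustration, not the actual ATLAS trigger database.

```python
# Sketch: versioned trigger-configuration records in a relational
# database, keyed by a configuration ID. Schema and payloads are
# illustrative, not the real ATLAS trigger database layout.
import sqlite3


def make_db():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE trigger_config (
        config_id INTEGER,
        level     TEXT,
        payload   TEXT,
        PRIMARY KEY (config_id, level))""")
    return db


def record(db, config_id, level, payload):
    # Each (configuration, trigger level) pair is stored once,
    # preserving the history over the experiment's lifetime.
    db.execute("INSERT INTO trigger_config VALUES (?, ?, ?)",
               (config_id, level, payload))


def lookup(db, config_id):
    # Online data taking and offline simulation both retrieve the
    # same consistent configuration by its ID.
    cur = db.execute(
        "SELECT level, payload FROM trigger_config "
        "WHERE config_id = ? ORDER BY level",
        (config_id,))
    return cur.fetchall()
```

The composite primary key is what guarantees that a given configuration ID always resolves to one consistent set of level-1 and high-level trigger settings.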
The ATLAS Event Builder. Vandelli, W.; Abolins, M.; Battaglia, A. ...
IEEE Transactions on Nuclear Science, 12/2008, Volume 55, Issue 6
Journal Article; Peer-reviewed; Open access
Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which, at its first two trigger levels (LVL1+LVL2), reduces the initial bunch crossing rate of 40 MHz to ~3 kHz. At this rate, the Event Builder collects the data from the readout system PCs (ROSs) and provides fully assembled events to the Event Filter (EF). The EF is the third trigger level and aims to achieve a further rate reduction to ~200 Hz to permanent storage. The Event Builder is based on a farm of O(100) PCs, interconnected via gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs, and substantial fractions of the Event Builder and EF PCs, have been installed and commissioned. We report on performance tests of this initial system, which exceeds the data rates and bandwidths required for event building in the ATLAS experiment.
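The quoted figures allow a back-of-envelope bandwidth check per builder node. The ~1.5 MB average event size is an assumption not stated in the abstract; only the rates and node counts are.

```python
# Back-of-envelope check of the event-building load per node.
# Event size of 1.5 MB is an assumed nominal value; the abstract
# gives only the ~3 kHz rate and the O(100) builder nodes.
def per_node_bandwidth(rate_hz, event_size_mb, n_nodes):
    # Total assembled-data throughput divided evenly over the farm.
    return rate_hz * event_size_mb / n_nodes  # MB/s per builder node


bw = per_node_bandwidth(3000, 1.5, 100)
# 45.0 MB/s per node, comfortably below a gigabit Ethernet
# link's ~125 MB/s ceiling.
```

Under these assumptions the aggregate is ~4.5 GB/s across the farm, which is consistent with the abstract's claim that the installed fraction already exceeds the required bandwidth.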
Configuration of the ATLAS trigger. dos Anjos, A.; Ellis, N.; Haller, J. ...
IEEE Transactions on Nuclear Science, 06/2006, Volume 53, Issue 3
Journal Article; Peer-reviewed; Open access
The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a nominal rate of 1 GHz from beams crossing at 40 MHz. In order to reduce the data rate to about 200 Hz, only potentially interesting events are selected by a three-level trigger system. Its first level is implemented in electronics and firmware, whereas the higher trigger levels are based on software. To prepare the full trigger chain for online event selection according to a certain strategy, a system is being set up that provides the relevant configuration information, e.g., values for hardware registers in level-1 or parameters of high-level trigger algorithms, and stores the corresponding history. The same information is used to configure the offline trigger simulation. In this presentation an overview of the ATLAS trigger system is given, concentrating on the event selection strategy and its description. The technical implementation of the configuration system is summarized.
Several scenarios, both present and future, require re-simulation of the trigger response in the ATLAS experiment at the LHC. While the software for detector response simulation and event reconstruction is allowed to change and improve, the trigger response simulation has to reflect the conditions under which the data were taken. This poses a maintenance and data preservation problem. Several strategies have been considered, and a proof-of-concept model using virtualization has been developed. While virtualization with CernVM elegantly solves several aspects of the data preservation problem, the limitations of current methods for contextualizing the virtual machine, as well as incompatibilities in the currently used data format, introduce new challenges. In these proceedings, these challenges, their current solutions, and the proof-of-concept model for precise trigger simulation are discussed.
Analyses at the LHC which search for rare physics processes or determine Standard Model parameters with high precision require accurate simulations of the detector response and the event selection processes. The accurate determination of the trigger response is crucial for determining overall selection efficiencies and signal sensitivities. For the generation and reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. Having a strategy for running old software in a modern environment thus becomes essential once data simulated for past years represent a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format that promises long-term compatibility between old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may become an issue, e.g., when support for the underlying operating system ends. In this paper we present the problems encountered and the solutions developed, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS, as they also touch on more general aspects of data preservation.
To cope with the 40 MHz event production rate of the LHC, the trigger of the ATLAS experiment selects events in three sequential steps of increasing complexity and accuracy, whose final results are close to the offline reconstruction. Level-1, implemented in custom hardware, identifies physics objects within Regions of Interest and performs a first reduction of the event rate to 75 kHz. The higher trigger levels, Level-2 and Level-3, provide a software-based event selection which further reduces the event rate to about 100 Hz. This paper presents the algorithm (μFast) employed at Level-2 to confirm the muon candidates flagged by Level-1. μFast identifies hits of muon tracks inside the barrel region of the Muon Spectrometer and provides a precise measurement of the muon momentum at the production vertex. The algorithm must process the Level-1 muon output rate (~20 kHz), so particular care has been taken over its optimization. The result is a very fast track reconstruction algorithm with good physics performance which, in some cases, approaches that of the offline reconstruction: it finds muon tracks with an efficiency of about 95% and computes the p_T of prompt muons with a resolution of 5.5% at 6 GeV and 4.0% at 20 GeV. The algorithm requires an overall execution time of ~1 ms on a 100 SpecInt95 machine and has been tested in the online environment of the ATLAS detector test beam.
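What a relative momentum resolution of 5.5% at 6 GeV means can be illustrated with a toy Gaussian smearing of a true p_T; the Gaussian model is an assumption made here for illustration, not a statement about μFast's actual error distribution.

```python
# Toy illustration of the quoted momentum resolution: smear a true
# pT of 6 GeV with a Gaussian of relative width 5.5% (numbers from
# the abstract; the Gaussian shape itself is an assumption).
import random


def smear_pt(pt_true, resolution, rng):
    # Measured pT drawn around the true value with width
    # resolution * pt_true.
    return rng.gauss(pt_true, resolution * pt_true)


rng = random.Random(42)
measured = [smear_pt(6.0, 0.055, rng) for _ in range(10000)]
mean = sum(measured) / len(measured)
spread = (sum((x - mean) ** 2 for x in measured) / len(measured)) ** 0.5
# The mean stays close to 6 GeV, with a spread of about
# 0.33 GeV, i.e. 5.5% of 6 GeV.
```

A 4.0% resolution at 20 GeV would correspond to a spread of about 0.8 GeV in the same picture.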
Algorithms for the ATLAS high-level trigger. Armstrong, S.; Baines, J.T.; Bee, C.P. ...
IEEE Transactions on Nuclear Science, 06/2004, Volume 51, Issue 3
Journal Article; Peer-reviewed; Open access
Following rigorous software design and analysis methods, an object-based architecture has been developed to derive the second- and third-level trigger decisions for the future ATLAS detector at the LHC. The functional components within this system responsible for generating elements of the trigger decisions are algorithms running within the software architecture. Relevant aspects of the architecture are reviewed, along with concrete examples of specific algorithms and their performance in "vertical" slices of various physics selection strategies.