This paper will describe the hardware and the software developed to build a random trigger simulator used to test the detectors of the ALICE experiment. It will also describe the tests performed in our laboratory at CERN on the random trigger generator to confirm its correct behavior, and the details of its installation in one of the counting rooms of ALICE, where it provides the triggers for all the sub-detectors.
The ALICE online data storage system — Divià, R; Fuchs, U; Makhlyueva, I, et al.
Journal of Physics: Conference Series, 04/2010, Vol. 219, No. 5
Journal Article · Peer reviewed · Open access
The ALICE (A Large Ion Collider Experiment) Data Acquisition (DAQ) system has the unprecedented requirement to ensure a very high volume, sustained data stream between the ALICE detector and the Permanent Data Storage (PDS) system, which is used as the main data repository for event processing and offline computing. The key component to accomplish this task is the Transient Data Storage (TDS) system, a set of data storage elements with the associated hardware and software components, which supports raw data collection, its conversion into a format suitable for subsequent high-level analysis, the storage of the result using highly parallelized architectures, its access via a cluster file system capable of creating high-speed partitions via its affinity feature, and its transfer to the final destination via dedicated data links. We describe the methods and the components used to validate, test, implement, operate, and monitor the ALICE online data storage system and the way it has been used in the early days of commissioning and operation of the ALICE detector. We will also introduce the future developments needed from next year, when the ALICE Data Acquisition System will shift its requirements from those associated with the test and commissioning phase to those imposed by long-duration data-taking periods alternating with the shorter validation and maintenance tasks needed to adequately operate the ALICE experiment.
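The staged data path described in this abstract (raw data collection, format conversion, buffering in transient storage, migration to permanent storage) can be sketched as a toy producer/consumer pipeline. This is purely an illustration of the staging idea; the function and record layout below are invented for the example and bear no relation to the actual TDS/PDS implementation.

```python
import queue
import threading

def run_pipeline(events):
    """Toy model of a staged online-storage chain: events flow through a
    format-conversion stage into a transient store, then migrate to a
    permanent store at end of run. Names mirror the TDS/PDS roles loosely."""
    to_format = queue.Queue()
    to_store = queue.Queue()
    transient = []   # stands in for the Transient Data Storage
    permanent = []   # stands in for the Permanent Data Storage

    def formatter():
        # convert raw payloads into analysis-friendly records
        while True:
            ev = to_format.get()
            if ev is None:
                to_store.put(None)   # propagate end-of-run marker
                break
            to_store.put({"id": ev[0], "size": len(ev[1])})

    def writer():
        # parallel storage stage: append records to the transient store
        while True:
            rec = to_store.get()
            if rec is None:
                break
            transient.append(rec)

    t1 = threading.Thread(target=formatter)
    t2 = threading.Thread(target=writer)
    t1.start(); t2.start()
    for ev in events:
        to_format.put(ev)        # raw data collection
    to_format.put(None)          # end-of-run marker
    t1.join(); t2.join()

    # migration step: drain the transient store to the permanent one
    permanent.extend(transient)
    transient.clear()
    return permanent

records = run_pipeline([(i, b"\x00" * 100) for i in range(5)])
```

In the real system each stage runs on dedicated hardware connected by high-speed links; the queues here only stand in for those hand-offs.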
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth, flexible Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate the very different requirements originating from the 18 sub-detectors. This paper will present the large-scale tests conducted to assess the standalone DAQ performance, the interfaces with the other online systems, and the extensive commissioning performed in order to be fully prepared for physics data taking. It will review the experience accumulated since May 2007 during the standalone commissioning of the main detectors and the global cosmic runs, and the lessons learned from this exposure on the "battle field". It will also discuss the test protocol followed to integrate and validate each sub-detector with the online systems, and it will conclude with the first results of the LHC injection tests and startup in September 2008. Several papers of the same conference present in more detail some elements of the ALICE DAQ system.
Preparing the ALICE DAQ upgrade — Carena, F; Carena, W; Chapeland, S, et al.
Journal of Physics: Conference Series, 01/2012, Vol. 396, No. 1
Journal Article · Peer reviewed · Open access
In November 2009, after 15 years of design and installation, the ALICE experiment started to detect and record the first collisions produced by the LHC. It has been collecting hundreds of millions of events ever since, with both proton and heavy-ion collisions. The future scientific programme of ALICE has been refined following the first year of data taking. The physics targeted beyond 2018 will be the study of rare signals. Several detectors will be upgraded, modified, or replaced to prepare ALICE for future physics challenges. An upgrade of the triggering and readout systems is also required to accommodate the needs of the upgraded ALICE and to better select the data of the rare physics channels. The ALICE upgrade will have major implications for the detector electronics and controls, data acquisition, event triggering, and offline computing and storage systems. Moreover, the experience accumulated during more than two years of operation has also led to new requirements for the control software. We will review all these new needs and the current R&D activities to address them. Several papers of the same conference present in more detail some elements of the ALICE online system.
ALICE moves into warp drive — Carena, F; Carena, W; Chapeland, S, et al.
Journal of Physics: Conference Series, 01/2012, Vol. 396, No. 1
Journal Article · Peer reviewed · Open access
A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). Since its successful start-up in 2010, the LHC has been performing outstandingly, providing the experiments with long periods of stable collisions and an integrated luminosity that greatly exceeds the planned targets. To fully exploit these privileged conditions, we aim at maximizing the experiment's data-taking productivity during stable collisions. We present in this paper the evolution of the online systems towards helping us understand the causes of inefficiency and address new requirements. This paper describes the features added to the ALICE Electronic Logbook (eLogbook) to allow the Run Coordination team to identify, prioritize, fix, and follow up on causes of inefficiency in the experiment. Thorough monitoring of the data-taking efficiency provides reports for the collaboration to portray its evolution and evaluate the measures (fixes and new features) taken to increase it. In particular, the eLogbook helps decision making by providing quantitative input, which can be used to better balance the risks of changes in the production environment against the potential gains in quantity and quality of physics data. The paper will also present the evolution of the Experiment Control System (ECS) to allow on-the-fly error recovery actions of the detector apparatus while limiting as much as possible the loss of integrated luminosity. The paper will conclude with a review of the ALICE efficiency so far and the future plans to improve its monitoring.
ALICE (A Large Ion Collider Experiment) is the detector system at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions. Its main aim is to study the behavior of strongly interacting matter and the quark-gluon plasma. Currently, all the information sent by the 18 sub-detectors composing ALICE is read out by DATE (Data Acquisition and Test Environment), the ALICE data acquisition software, using several optical links called DDLs (Detector Data Link), each one with a maximum throughput of 200 MB/s. In the last year a commercial transmission link with a throughput of 10 Gb/s has become a reality, at a price affordable for everyone. The DATE system has been upgraded to also support this technology in addition to the DDL. This contribution will describe the VHDL firmware of a detector readout board sending data using the UDP protocol, and the changes made to the readout part of the DATE software to receive information coming from the 1 or 10 Gb/s Ethernet link. It will also describe the relevant details of the test firmware and software and will conclude with the results of the performance tests done at CERN using the new setup.
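As an illustration of the UDP-based readout path this abstract describes, the following sketch pairs a sender (playing the role of the readout board) with a receiver collecting event fragments over a datagram socket on the loopback interface. The port selection, payload format, and fragment count are invented for the example; this is not the DATE readout code or its protocol.

```python
import socket
import threading

# The receiver socket is bound first so no fragments are lost on loopback;
# port 0 lets the OS pick a free ephemeral port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
port = recv_sock.getsockname()[1]

def send_fragments(payloads):
    """Stands in for the detector readout board's firmware pushing
    event fragments as UDP datagrams."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for p in payloads:
        s.sendto(p, ("127.0.0.1", port))
    s.close()

payloads = [b"evt-%d" % i for i in range(3)]
sender = threading.Thread(target=send_fragments, args=(payloads,))
sender.start()

# Readout side: collect a fixed number of event fragments. A 9000-byte
# buffer matches a typical jumbo-frame payload on a 10 GbE link.
fragments = []
while len(fragments) < len(payloads):
    data, _addr = recv_sock.recvfrom(9000)
    fragments.append(data)

sender.join()
recv_sock.close()
```

Since UDP offers no delivery guarantees, a real readout chain layers its own sequence numbering and loss detection on top; that bookkeeping is omitted here.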
ALICE (A Large Ion Collider Experiment) is an experiment at the LHC (Large Hadron Collider) optimized for the study of heavy-ion collisions. The main aim of the experiment is to study the behavior of strongly interacting matter and the quark-gluon plasma. The ALICE DAQ (Data Acquisition) system has been deployed and used intensively during the commissioning of the experiment. This paper will present the evolution of one particular area of the system: the detector readout.
The data produced by each detector are received by DATE (the ALICE data acquisition program) using a PCI (Peripheral Component Interconnect) based card called the D-RORC (DAQ Readout Receiver Card). Of the order of 400 of these cards are installed in the PCs of the DAQ farm, and they are connected by optical links called DDLs (Detector Data Link) to the detector readout electronics. The D-RORC is controlled by the readout software, the part of the DATE program that reads the events coming from these cards. We will present the results obtained during the performance tests of the new release of the D-RORC, based on PCI Express, in development at CERN. The paper will review the working principles of the D-RORC, its use by the readout software, and the benefits of using PCI Express instead of PCI-X. It will also introduce the work in progress on the new release of the readout software towards the next hardware platform based on 64-bit computer architecture and DDLs based on 10G Ethernet.
References (accessed May 10, 2009):
- ALICE experiment web site: http://public.web.cern.ch/public/en/LHC/ALICEen.html
- ALICE data acquisition web site: http://phdepaid.web.cern.ch/phdepaid/
- D-RORC web site: http://alice-proj-ddl.web.cern.ch/alice-proj-ddl/rorc_intro.html
- DDL web site: http://alice-proj-ddl.web.cern.ch/alice-proj-ddl/ddl_intro.html
ALICE (A Large Ion Collider Experiment) is a detector dedicated to the study of heavy-ion collisions, exploring the physics of strongly interacting nuclear matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shutdown of the LHC, the ALICE experiment will be upgraded to make high-precision measurements of rare probes at low pT, which cannot be selected with a trigger and therefore require a very large sample of events recorded on tape. The online computing system will be completely redesigned to address the major challenge of sampling the full 50 kHz Pb-Pb interaction rate, increasing the present limit by a factor of 100. This upgrade will also include the continuous, un-triggered read-out of two detectors, the ITS (Inner Tracking System) and the TPC (Time Projection Chamber), producing a sustained throughput of 1 TB/s. This unprecedented data rate will be reduced by adopting an entirely new strategy where calibration and reconstruction are performed online, and only the reconstruction results are stored while the raw data are discarded. This system, already demonstrated in production on the TPC data since 2011, will be optimized for the online usage of reconstruction algorithms. This implies a much tighter coupling between the online and offline computing systems. An R&D programme has been set up to meet this huge challenge. The object of this paper is to present this programme and its first results.
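The reduction strategy described above (reconstruct online, keep only the reconstruction output, discard the raw samples) can be illustrated with a toy cluster finder: consecutive above-threshold samples are summarized as compact clusters and the raw stream is dropped. The threshold and data layout are invented for this example and bear no relation to the real TPC calibration or reconstruction.

```python
def reduce_raw(samples, threshold=5):
    """Toy online reduction: group consecutive above-threshold samples
    into clusters and keep only (start_index, length, total_charge).
    The raw sample list can then be discarded."""
    clusters = []
    start = None
    charge = 0
    for i, s in enumerate(samples):
        if s >= threshold:
            if start is None:     # a new cluster opens
                start = i
                charge = 0
            charge += s
        elif start is not None:   # the current cluster closes
            clusters.append((start, i - start, charge))
            start = None
    if start is not None:         # cluster running to the end of the stream
        clusters.append((start, len(samples) - start, charge))
    return clusters

raw = [0, 0, 7, 9, 0, 0, 6, 0, 1]
clusters = reduce_raw(raw)   # -> [(2, 2, 16), (6, 1, 6)]
```

Storing three small integers per cluster instead of every sample is the essence of the factor-of-100 reduction the abstract mentions, although the real gain comes from full track reconstruction rather than simple thresholding.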
The yield of charged particles associated with high-p_T trigger particles (8 < p_T < 15 GeV/c) is measured with the ALICE detector in Pb-Pb collisions at √s_NN = 2.76 TeV relative to proton-proton collisions at the same energy. The conditional per-trigger yields are extracted from the narrow jet-like correlation peaks in azimuthal dihadron correlations. In the 5% most central collisions, we observe that the yield of associated charged particles with transverse momenta p_T > 3 GeV/c on the away side drops to about 60% of that observed in pp collisions, while on the near side a moderate enhancement of 20%-30% is found.
This article presents the first results of a project in underwater modular robotics, called Neubots. The goal of the project is to explore, following Von Neumann's ideas, potential mechanisms underlying self-organization and self-replication. We briefly explain the design features of the module units. We then present simulation results of the artificial co-evolution of body structures and neural controllers for locomotion. The neural controllers are inspired by the central pattern generators underlying locomotion in vertebrate animals. They are composed of multiple neural oscillators connected together by a specific type of coupling called synaptic spreading. The co-evolution of body and controller leads to interesting robots capable of efficient swimming. Interesting features of the neural controllers include the possibility to modulate the speed of locomotion by varying simple input signals, the robustness against perturbations, and the distributed nature of the controllers, which makes them well suited for modular robotics.