ALICE, the general-purpose heavy-ion collision detector at the CERN LHC, is designed to study the physics of strongly interacting matter using proton-proton, nucleus-nucleus and proton-nucleus collisions at high energies. The ALICE experiment will be upgraded during the Long Shutdown 2 in order to exploit the full scientific potential of the future LHC. The requirements will then be significantly different from the original design of the experiment and will require major changes to the detector read-out. The main physics topics addressed by the ALICE upgrade are characterised by rare processes with a very small signal-to-background ratio, requiring very large statistics of fully reconstructed events. In order to keep up with the 50 kHz interaction rate, the upgraded detectors will be read out continuously. However, triggered read-out will be used by some detectors and for commissioning and some calibration runs. The total data volume collected from the detectors will increase significantly, reaching a sustained data throughput of up to 3 TB/s, with the zero-suppression of the TPC data performed after the data transfer to the detector read-out system. A flexible mechanism of bandwidth throttling will allow the system to gracefully degrade the effective rate of recorded interactions in case of saturation of the computing system. This paper includes a summary of these updated requirements and presents a refined design of the detector read-out and of the interface with the detectors and the online systems. It also elaborates on the system behaviour in continuous and triggered read-out and defines ways to throttle the data read-out in both cases.
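The throttling mechanism itself is not detailed in the abstract. As a purely illustrative sketch (not ALICE code), one way to degrade the recorded interaction rate gracefully is to drop whole data frames whenever a downstream buffer is full, while counting the losses; all type names and thresholds below are assumptions.

```cpp
// Illustrative sketch of graceful bandwidth throttling: when the
// downstream computing system cannot keep up, whole frames are dropped
// and counted, so the effective rate of recorded interactions degrades
// gracefully. Names and sizes are assumptions, not ALICE code.
#include <cstddef>
#include <cstdint>
#include <vector>

struct HeartbeatFrame {
    uint64_t orbitId;               // time interval covered by this frame
    std::vector<uint8_t> payload;   // detector data for that interval
};

class ThrottlingBuffer {
public:
    explicit ThrottlingBuffer(std::size_t capacityBytes)
        : capacity_(capacityBytes) {}

    // Accept a frame only if there is room downstream; otherwise discard
    // it and record the loss, keeping the time structure of the stream.
    bool push(const HeartbeatFrame& frame) {
        if (used_ + frame.payload.size() > capacity_) {
            ++droppedFrames_;            // graceful degradation
            return false;
        }
        used_ += frame.payload.size();
        queue_.push_back(frame);
        return true;
    }

    uint64_t droppedFrames() const { return droppedFrames_; }

private:
    std::size_t capacity_;
    std::size_t used_ = 0;
    uint64_t droppedFrames_ = 0;
    std::vector<HeartbeatFrame> queue_;
};

int main() {
    ThrottlingBuffer buffer(512 * 1024 * 1024);   // illustrative 512 MB budget
    HeartbeatFrame frame{0, std::vector<uint8_t>(1024, 0)};
    buffer.push(frame);
    return 0;
}
```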
The ALICE data acquisition system. Carena, F.; Carena, W.; Chapeland, S. et al.
Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 03/2014, Volume 741
Journal Article, Peer-reviewed, Open Access
In this paper we describe the design, the construction, the commissioning and the operation of the Data Acquisition (DAQ) and Experiment Control System (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC).
The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundred high-speed optical links.
We describe in detail the design considerations for these systems, which must handle the extreme data throughput resulting from central lead-ion collisions at LHC energy. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room) and software led to many innovative solutions, which are described together with a presentation of all the major components of the systems as currently realized. We also report on the performance achieved during the first period of data taking (from 2009 to 2013), which often exceeded the values specified in the DAQ Technical Design Report.
ALFA: The new ALICE-FAIR software framework. Al-Turany, M.; Buncic, P.; Hristov, P. et al.
Journal of Physics: Conference Series, 12/2015, Volume 664, Issue 7
Journal Article, Peer-reviewed, Open Access
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and of extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system which balances reliability and ease of development with performance, using multi-processing and multi-threading. A message-based approach has been adopted; such an approach supports the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication with, and reliance on, other processes. Such a design adds horizontal scaling (multiple processes) to the vertical scaling provided by multiple threads in order to meet computing and throughput demands. ALFA does not dictate any application protocols: potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
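The abstract describes the message-based, multi-process design but does not name a transport. As a minimal sketch of that pattern, assuming a ZeroMQ-style PUSH/PULL pipeline (ZeroMQ is one transport commonly used underneath FairMQ-based systems), the following stands alone and is not the ALFA/FairMQ API:

```cpp
// Minimal sketch of a message-based pipeline using the plain ZeroMQ C API.
// Real ALFA/FairMQ devices wrap the transport behind their own classes;
// nothing here is ALFA API.
#include <zmq.h>
#include <cstdio>

int main() {
    void* ctx = zmq_ctx_new();

    // A "producer" process binds a PUSH socket and sends data chunks;
    // independent "consumer" processes connect PULL sockets and scale
    // horizontally simply by starting more of them.
    void* producer = zmq_socket(ctx, ZMQ_PUSH);
    zmq_bind(producer, "inproc://data");      // tcp://... between real processes

    void* consumer = zmq_socket(ctx, ZMQ_PULL);
    zmq_connect(consumer, "inproc://data");

    const char payload[] = "sub-timeframe 42";
    zmq_send(producer, payload, sizeof(payload), 0);

    char buf[64];
    int n = zmq_recv(consumer, buf, sizeof(buf), 0);
    if (n > 0) std::printf("received %d bytes: %s\n", n, buf);

    zmq_close(consumer);
    zmq_close(producer);
    zmq_ctx_destroy(ctx);
    return 0;
}
```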
ALICE (A Large Ion Collider Experiment) is a heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE Data-AcQuisition (DAQ) system handles the data flow from the sub-detector electronics to the permanent data storage in the CERN computing center. The DAQ farm consists of about 1000 devices of many different types, ranging from directly accessible machines to storage arrays and custom optical links. The system-performance monitoring tool used during LHC Run 1 will be replaced by a new tool for Run 2. This paper shows the results of an evaluation that has been conducted on six publicly available monitoring tools. The evaluation has been carried out by taking into account selection criteria such as scalability, flexibility and reliability, as well as data collection methods and display. All the tools have been prototyped and evaluated according to those criteria. We will describe the considerations that have led to the selection of the Zabbix monitoring tool for the DAQ farm. The results of the tests conducted in the ALICE DAQ laboratory will be presented. In addition, the deployment of the software on the DAQ machines will be described in terms of the metrics collected and the data collection methods. We will illustrate how remote nodes are monitored with Zabbix by using SNMP-based agents and how DAQ-specific metrics are retrieved and displayed. We will also show how the monitoring information is accessed and made available via the graphical user interface, and how Zabbix communicates with the other DAQ online systems for notification and reporting.
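As an illustration of the SNMP-based polling mentioned above, the sketch below performs a generic SNMP GET with the net-snmp library against a standard system OID; the host name, community string and choice of metric are placeholders, not the actual ALICE Zabbix setup.

```cpp
// Generic SNMP GET with the net-snmp library, illustrating how a
// monitoring system can poll a remote node. Host, community and OID are
// placeholders; this is not the ALICE Zabbix configuration itself.
#include <net-snmp/net-snmp-config.h>
#include <net-snmp/net-snmp-includes.h>
#include <cstring>

int main() {
    init_snmp("daq-monitor-sketch");

    netsnmp_session session, *ss;
    snmp_sess_init(&session);
    session.peername      = strdup("daq-node.example.org");  // placeholder host
    session.version       = SNMP_VERSION_2c;
    session.community     = (u_char*)"public";               // placeholder community
    session.community_len = strlen("public");
    ss = snmp_open(&session);
    if (!ss) return 1;

    // sysUpTime.0: a standard OID, standing in for a DAQ-specific metric.
    oid name[MAX_OID_LEN];
    size_t name_len = MAX_OID_LEN;
    read_objid("1.3.6.1.2.1.1.3.0", name, &name_len);

    netsnmp_pdu* pdu = snmp_pdu_create(SNMP_MSG_GET);
    snmp_add_null_var(pdu, name, name_len);

    netsnmp_pdu* response = nullptr;
    if (snmp_synch_response(ss, pdu, &response) == STAT_SUCCESS &&
        response && response->errstat == SNMP_ERR_NOERROR) {
        for (netsnmp_variable_list* v = response->variables; v; v = v->next_variable)
            print_variable(v->name, v->name_length, v);
    }
    if (response) snmp_free_pdu(response);
    snmp_close(ss);
    return 0;
}
```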
The ALICE data quality monitoring system. Haller, B. von; Telesca, A.; Chapeland, S. et al.
Journal of Physics: Conference Series, 12/2011, Volume 331, Issue 2
Journal Article, Peer-reviewed, Open Access
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) is a key element of the Data Acquisition software chain. It provides shifters with precise and complete information to quickly identify and overcome problems, and consequently to ensure the acquisition of high-quality data. DQM typically involves the online gathering of monitored data, its analysis by user-defined algorithms and its visualization. This paper describes the final design of ALICE's DQM framework, called AMORE (Automatic MOnitoRing Environment), as well as its latest and upcoming features, such as the integration with the offline analysis and reconstruction framework, better use of multi-core processors through a parallelization effort, and its interface with the eLogBook. The concurrent collection and analysis of data in an online environment requires the framework to be highly efficient, robust and scalable. We describe what has been implemented to achieve these goals and the procedures we follow to ensure appropriate robustness and performance. We then review the wide range of uses of this framework, from the basic monitoring of a single sub-detector to the most complex ones within the High Level Trigger farm or using the Prompt Reconstruction, and we describe the various ways of accessing the monitoring results. We conclude with our experience, before and after the LHC startup, of monitoring the data quality in a challenging environment.
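The AMORE class names are not given in the abstract; the following sketch only illustrates the general DQM pattern of an agent filling a ROOT histogram from sampled events and publishing it periodically, with publishObject() as a hypothetical stand-in for the framework's publication mechanism.

```cpp
// Illustration of the general DQM pattern: a monitoring agent fills a
// ROOT histogram from sampled events and periodically publishes it for
// visualisation. publishObject() is a hypothetical stand-in, not the
// AMORE API.
#include <TH1F.h>
#include <cstdio>

// Hypothetical hook to whatever transport/database the framework uses.
void publishObject(const TH1& histogram) {
    std::printf("publishing '%s' with %.0f entries\n",
                histogram.GetName(), histogram.GetEntries());
}

int main() {
    TH1F clusterCharge("clusterCharge", "Cluster charge;ADC counts;Entries",
                       100, 0., 500.);

    // Stand-in for the event loop that samples data from the DAQ stream.
    for (int event = 0; event < 10000; ++event) {
        double charge = 250. + 50. * ((event % 100) - 50) / 50.;  // fake data
        clusterCharge.Fill(charge);

        if (event % 1000 == 0)        // periodic publication cycle
            publishObject(clusterCharge);
    }
    return 0;
}
```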
The ALICE DAQ infoLogger. Chapeland, S.; Carena, F.; Carena, W. et al.
Journal of Physics: Conference Series, 01/2014, Volume 513, Issue 1
Journal Article, Peer-reviewed, Open Access
ALICE (A Large Ion Collider Experiment) is a heavy-ion experiment studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE DAQ (Data Acquisition System) is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches). The DAQ reads the data transferred from the detectors through 500 dedicated optical links at an aggregated and sustained rate of up to 10 gigabytes per second and stores them at up to 2.5 gigabytes per second. The infoLogger is the log system which centrally collects the messages issued by the thousands of processes running on the DAQ machines. It makes it possible to report errors on the fly and to keep a trace of runtime execution for later investigation. More than 500,000 messages are stored every day in a MySQL database, in a structured table keeping track, for each message, of 16 indexing fields (e.g. time, host, user, ...). The total amount of logs for 2012 exceeds 75 GB of data and 150 million rows. We present in this paper the architecture and implementation of this distributed logging system, consisting of a client programming API, local data-collector processes, a central server, and interactive human interfaces. We review the operational experience during the 2012 run, in particular the actions taken to ensure that shifters receive manageable and relevant content from the main log stream. Finally, we present the performance of this log system and future evolutions.
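To make the idea of a structured, indexed log message concrete, here is a minimal sketch assuming a handful of the fields mentioned above (time, host, user, ...); the struct layout and the logMessage() call are illustrative and do not reproduce the actual infoLogger schema or client API.

```cpp
// Sketch of a structured log message and a client-side call, following
// the idea of indexed fields described above. The field set is partial
// and illustrative; it is not the actual infoLogger schema or API.
#include <cstdint>
#include <iostream>
#include <string>

struct LogMessage {
    uint64_t    timestamp;   // e.g. microseconds since epoch
    std::string hostname;    // machine that emitted the message
    std::string username;    // process owner
    std::string facility;    // DAQ component name
    std::string severity;    // Info, Warning, Error, Fatal
    std::string message;     // free-text payload
};

// Hypothetical client API: in a real system this would hand the message
// to a local collector process rather than print it.
void logMessage(const LogMessage& m) {
    std::cout << m.timestamp << ' ' << m.hostname << ' ' << m.severity
              << ": " << m.message << '\n';
}

int main() {
    logMessage({1350000000000000ULL, "daq-node-042", "daq", "readout",
                "Error", "link 7: CRC mismatch, resynchronising"});
    return 0;
}
```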
A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the operation of the experiment by providing shifters with immediate feedback on the data being recorded, in order to quickly identify and overcome problems. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which have been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.
The ALICE and ATLAS DAQ systems read out detector data via point-to-point serial links into custom hardware modules, the ALICE RORC and the ATLAS ROBIN. To meet the increase in operational requirements, both experiments are replacing their respective modules with a new common module, the C-RORC. This card, developed by ALICE, implements a PCIe Gen 2 x8 interface and connects to twelve optical links via three QSFP transceivers. This paper presents the design of the C-RORC, its performance and its application in the ALICE and ATLAS experiments.
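As a rough cross-check not stated in the abstract: PCIe Gen 2 signals at 5 GT/s per lane with 8b/10b encoding, i.e. roughly 500 MB/s of usable bandwidth per lane, so an x8 interface provides about 4 GB/s towards host memory, while the twelve detector links arrive four per QSFP transceiver on the optical side.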
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the Quark-Gluon Plasma at the CERN Large Hadron Collider (LHC). A large-bandwidth and flexible Data-Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time available per year for heavy ions and to accommodate the very different requirements originating from the 18 sub-detectors. After several months of data taking with beam, a great deal of experience has been accumulated and some important developments have been initiated in order to evolve towards a more automated and reliable experiment. We will present the experience accumulated so far and the new developments. Several upgrades of existing ALICE detectors, as well as the addition of new ones, have also been proposed, with a significant impact on the DAQ. We will review these proposals, their implications for the DAQ and the way they will be addressed.