ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shutdown of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. These data will be processed on the fly so that the stream to permanent storage does not exceed a peak of 90 GB/s, the raw data being discarded afterwards. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
The ALICE data acquisition system. Carena, F.; Carena, W.; Chapeland, S.; et al.
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 03/2014, Volume 741
Journal Article
Peer-reviewed
Open access
In this paper we describe the design, the construction, the commissioning and the operation of the Data Acquisition (DAQ) and Experiment Control Systems (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC).
The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundred high-speed optical links.
We describe in detail the design considerations for these systems handling the extreme data throughput resulting from central lead-ion collisions at LHC energy. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room), and software led to many innovative solutions, which are described together with a presentation of all the major components of the systems as currently realized. We also report on the performance achieved during the first period of data taking (from 2009 to 2013), which often exceeded the values specified in the DAQ Technical Design Report.
ALICE is one of the experiments under installation at the CERN Large Hadron Collider (LHC), dedicated to the study of heavy-ion collisions. The final ALICE data acquisition system has been installed and is being used for the testing and commissioning of detectors. Online data quality monitoring is an important part of the DAQ software framework, DATE. In this paper, we overview the implementation and usage experience of the interactive tool MOOD used during the commissioning period of ALICE, and we present the architecture of the automatic data quality monitoring framework, a distributed application aimed at producing, collecting, analyzing, visualizing, and storing monitoring data on a large, experiment-wide scale.
ALICE (A Large Ion Collider Experiment) is a heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE Data-AcQuisition (DAQ) system handles the data flow from the sub-detector electronics to the permanent data storage in the CERN computing center. The DAQ farm consists of about 1000 devices of many different types, ranging from directly accessible machines to storage arrays and custom optical links. The system performance monitoring tool used during the LHC run 1 will be replaced by a new tool for run 2. This paper shows the results of an evaluation that has been conducted on six publicly available monitoring tools. The evaluation has been carried out by taking into account selection criteria such as scalability, flexibility, and reliability, as well as data collection methods and display. All the tools have been prototyped and evaluated according to those criteria. We will describe the considerations that have led to the selection of the Zabbix monitoring tool for the DAQ farm. The results of the tests conducted in the ALICE DAQ laboratory will be presented. In addition, the deployment of the software on the DAQ machines, in terms of metrics collected and data collection methods, will be described. We will illustrate how remote nodes are monitored with Zabbix by using SNMP-based agents and how DAQ-specific metrics are retrieved and displayed. We will also show how the monitoring information is accessed and made available via the graphical user interface and how Zabbix communicates with the other DAQ online systems for notification and reporting.
The ALICE DAQ infoLogger. Chapeland, S.; Carena, F.; Carena, W.; et al.
Journal of Physics: Conference Series, 01/2014, Volume 513, Issue 1
Journal Article
Peer-reviewed
Open access
ALICE (A Large Ion Collider Experiment) is a heavy-ion experiment studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE DAQ (Data Acquisition System) is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches). The DAQ reads the data transferred from the detectors through 500 dedicated optical links at an aggregated and sustained rate of up to 10 gigabytes per second and stores them at up to 2.5 gigabytes per second. The infoLogger is the log system which centrally collects the messages issued by the thousands of processes running on the DAQ machines. It allows errors to be reported on the fly and keeps a trace of runtime execution for later investigation. More than 500000 messages are stored every day in a MySQL database, in a structured table keeping track of 16 indexing fields for each message (e.g. time, host, user, ...). The total amount of logs for 2012 exceeds 75 GB of data and 150 million rows. We present in this paper the architecture and implementation of this distributed logging system, consisting of a client programming API, local data collector processes, a central server, and interactive human interfaces. We review the operational experience during the 2012 run, in particular the actions taken to ensure shifters receive manageable and relevant content from the main log stream. Finally, we present the performance of this log system and future evolutions.
A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment operation by providing shifters with immediate feedback on the data being recorded, in order to quickly identify and overcome problems. Immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user-friendly interface. The architecture and design of the tool, its main features, and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We also describe the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which has been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.
Midrapidity production of π±, K±, and p(p̄) measured by the ALICE experiment at the CERN Large Hadron Collider, in Pb-Pb and inelastic pp collisions at √sNN = 5.02 TeV, is presented. The invariant yields are measured over a wide transverse momentum (pT) range from hundreds of MeV/c up to 20 GeV/c. The results in Pb-Pb collisions are presented as a function of the collision centrality, in the range 0–90%. The comparison of the pT-integrated particle ratios, i.e., proton-to-pion (p/π) and kaon-to-pion (K/π) ratios, with similar measurements in Pb-Pb collisions at √sNN = 2.76 TeV shows no significant energy dependence. Blast-wave fits of the pT spectra indicate that in the most central collisions radial flow is slightly larger at 5.02 TeV with respect to 2.76 TeV. Particle ratios (p/π, K/π) as a function of pT show pronounced maxima at pT ≈ 3 GeV/c in central Pb-Pb collisions. At high pT, particle ratios at 5.02 TeV are similar to those measured in pp collisions at the same energy and in Pb-Pb collisions at √sNN = 2.76 TeV. Using the pp reference spectra measured at the same collision energy of 5.02 TeV, the nuclear modification factors for the different particle species are derived. Within uncertainties, the nuclear modification factor is particle-species independent at high pT and compatible with measurements at √sNN = 2.76 TeV. The results are compared to state-of-the-art model calculations, which are found to describe the observed trends satisfactorily.
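The nuclear modification factor mentioned in this abstract follows the standard heavy-ion definition, comparing the per-collision yield in Pb-Pb to the pp reference spectrum measured at the same energy (⟨T_AA⟩ is the average nuclear overlap function for the given centrality class):

```latex
R_{AA}(p_T) \;=\; \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}
                       {\langle T_{AA} \rangle \,\mathrm{d}\sigma_{pp}/\mathrm{d}p_T}
```

R_AA = 1 would mean the Pb-Pb yield is an incoherent superposition of independent nucleon-nucleon collisions; suppression at high pT is the classic signature of parton energy loss in the medium.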
One of the key challenges for nuclear physics today is to understand from first principles the effective interaction between hadrons with different quark content. First successes have been achieved using techniques that solve the dynamics of quarks and gluons on discrete space-time lattices. Experimentally, the dynamics of the strong interaction have been studied by scattering hadrons off each other. Such scattering experiments are difficult or impossible for unstable hadrons, and so high-quality measurements exist only for hadrons containing up and down quarks. Here we demonstrate that measuring correlations in momentum space between hadron pairs produced in ultrarelativistic proton-proton collisions at the CERN Large Hadron Collider (LHC) provides a precise method with which to obtain the missing information on the interaction dynamics between any pair of unstable hadrons. Specifically, we discuss the case of the interaction of baryons containing strange quarks (hyperons). We demonstrate how, using precision measurements of proton-omega baryon correlations, the effect of the strong interaction for this hadron-hadron pair can be studied with precision similar to, and compared with, predictions from lattice calculations. The large number of hyperons identified in proton-proton collisions at the LHC, together with accurate modelling of the small (approximately one femtometre) inter-particle distance and exact predictions for the correlation functions, enables a detailed determination of the short-range part of the nucleon-hyperon interaction.
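The correlation functions referred to in this abstract are conventionally related to the emission source and the pair interaction through the standard femtoscopy (Koonin-Pratt) relation, where S(r*) is the source function modelling the inter-particle distance and ψ the relative pair wave function, which encodes the strong final-state interaction:

```latex
C(k^{*}) \;=\; \int \mathrm{d}^{3}r^{*}\, S(r^{*})\, \bigl|\psi(k^{*}, r^{*})\bigr|^{2}
```

Deviations of C(k*) from unity at small relative momentum k* carry the information on the hadron-hadron interaction that scattering experiments cannot provide for unstable hadrons.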
Comprehensive results on the production of unidentified charged particles and of π±, K±, K0S, K*(892)0, p, p̄, φ(1020), Λ, Λ̄, Ξ⁻, Ξ̄⁺, Ω⁻, and Ω̄⁺ hadrons in proton-proton (pp) collisions at √s = 7 TeV at midrapidity (|y| < 0.5) as a function of charged-particle multiplicity density are presented. In order to avoid autocorrelation biases, the actual transverse momentum (pT) spectra of the particles under study and the event activity are measured in different rapidity windows. In the highest multiplicity class, the charged-particle density reaches about 3.5 times the value measured in inelastic collisions. While the yield of protons normalized to pions remains approximately constant as a function of multiplicity, the corresponding ratios of strange hadrons to pions show a significant enhancement that increases with increasing strangeness content. Furthermore, all identified particle-to-pion ratios are shown to depend solely on charged-particle multiplicity density, regardless of system type and collision energy. The evolution of the spectral shapes with multiplicity and hadron mass shows patterns that are similar to those observed in p-Pb and Pb-Pb collisions at Large Hadron Collider energies. The obtained pT distributions and yields are compared to expectations from QCD-based pp event generators as well as to predictions from thermal and hydrodynamic models. These comparisons indicate that traces of a collective, equilibrated system are already present in high-multiplicity pp collisions.