ALICE (A Large Ion Collider Experiment) at the LHC plans to use a PROOF cluster at CERN (CAF, the CERN Analysis Facility) for analysis. The system is especially aimed at the prototyping phase of analyses that need a high number of development iterations and thus require a short response time. Typical examples are the tuning of cuts during the development of an analysis, as well as calibration and alignment. Furthermore, the use of an interactive system with very fast response will allow ALICE to extract physics observables out of first data quickly. An additional use case is fast event simulation and reconstruction. A test setup consisting of 40 machines has been used for evaluation since May 2006. PROOF enables the parallel processing, and xrootd the access to files distributed on the test cluster. An automatic staging system has been developed for files either catalogued in the ALICE file catalog or stored in the CASTOR mass storage system. The current setup and the ongoing development towards disk quotas and CPU fair-share scheduling are described. Furthermore, the integration of PROOF into ALICE's software framework (AliRoot) is discussed.
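The automatic staging logic described above can be sketched as follows: files requested for a dataset are checked against the local cluster disks and, if missing, queued for staging from mass storage. This is a minimal illustrative sketch only; the path layout and all function names (`resolve_local_path`, `stage_request`, `prepare_dataset`) are assumptions, not the actual CAF implementation.

```python
import os

STAGE_AREA = "/pool/stage"  # assumed location of the local disk cache


def resolve_local_path(lfn: str) -> str:
    """Map a logical file name to its would-be location on the cluster disks."""
    return os.path.join(STAGE_AREA, lfn.lstrip("/"))


def stage_request(lfn: str) -> None:
    """Placeholder for a staging request to the mass-storage back end (e.g. CASTOR)."""
    print(f"staging requested: {lfn}")


def prepare_dataset(lfns: list[str]) -> list[str]:
    """Return the files already on local disk; queue the rest for staging."""
    ready = []
    for lfn in lfns:
        if os.path.exists(resolve_local_path(lfn)):
            ready.append(lfn)
        else:
            stage_request(lfn)
    return ready
```

In such a scheme the user's query can start on the already-staged subset while the remaining files are fetched in the background.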
The charged particle multiplicity distribution is one of the first measurements that ALICE will be able to perform. Knowledge of this basic property at a new energy is needed to configure Monte Carlo generators correctly, with the aim of understanding the background of other, especially rare, processes including new physics. It also allows the scaling behaviour to be studied and model predictions to be verified. The unfolding of the measurement is a non-trivial task due to the finite precision and acceptance of the detector. Solutions are based on χ² minimization or on the iterative application of Bayes' theorem. Both approaches to unfolding the spectrum are presented. Furthermore, the capabilities of the SPD fast-OR trigger are shown, which enable physics at very high multiplicities.
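The iterative Bayesian approach mentioned above (often called d'Agostini unfolding) can be sketched in a few lines: starting from a prior for the true spectrum, Bayes' theorem is used to build a smearing-inverse from the detector response matrix, and the prior is updated until it converges. This is a generic textbook sketch with assumed conventions (columns of the response matrix give the smearing of one true bin), not the ALICE analysis code.

```python
import numpy as np


def bayes_unfold(response, measured, iterations=4):
    """Iterative Bayesian (d'Agostini) unfolding.

    response[i, j] = P(measured in bin i | true value in bin j);
    column j sums to the detection efficiency of true bin j.
    measured: the observed spectrum. Returns an estimate of the true spectrum.
    """
    n_true = response.shape[1]
    prior = np.full(n_true, measured.sum() / n_true)  # flat starting prior
    eff = response.sum(axis=0)                        # per-bin efficiency
    for _ in range(iterations):
        folded = response @ prior                     # expected measured spectrum
        # Bayes' theorem: posterior[j, i] = P(true bin j | measured bin i)
        posterior = (response * prior).T / folded
        # redistribute the measured counts and correct for efficiency
        prior = (posterior @ measured) / eff
    return prior
```

With a well-conditioned response matrix the estimate converges toward the spectrum whose smeared image matches the measurement; in practice the number of iterations acts as a regularisation parameter.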
A Large Ion Collider Experiment (ALICE) is the dedicated heavy-ion experiment at the CERN LHC and will take data with a bandwidth of up to 1.25 GB/s. It consists of 18 subdetectors that interact with five online systems (CTP, DAQ, DCS, ECS, and HLT). The recorded data are read out by the DAQ as a raw data stream produced by the subdetectors. In addition, the subdetectors produce conditions data derived from the raw data, i.e. calibration and alignment information, which have to be available from the beginning of the reconstruction and therefore cannot be included in the raw data stream. The extraction of the conditions data is steered by a system called Shuttle. It provides the link between the data produced by the subdetectors in the online systems and a dedicated procedure per subdetector, called a preprocessor, that runs within the Shuttle system. The preprocessor performs merging, consolidation, and reformatting of the data, and finally stores them in the Grid Offline Conditions DataBase (OCDB) so that they are available for the Offline reconstruction. The reconstruction of a given run is initiated automatically once the raw data have been successfully exported to Grid storage and the run has been processed in the Shuttle framework. These proceedings introduce the Shuttle system and describe its performance during the ALICE cosmics commissioning and the LHC startup.
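The control flow described above, one preprocessor per subdetector consolidating its online conditions data before storage in the OCDB, can be sketched as follows. All class and method names here are hypothetical illustrations; the real Shuttle and preprocessors are implemented in AliRoot (C++).

```python
class Preprocessor:
    """Base class: one subclass per subdetector."""

    def process(self, online_data: dict) -> dict:
        raise NotImplementedError


class TPCPreprocessor(Preprocessor):
    """Hypothetical example: consolidate a repeated drift-velocity measurement."""

    def process(self, online_data: dict) -> dict:
        values = online_data.get("drift_velocity", [])
        return {"drift_velocity": sum(values) / len(values)}


class Shuttle:
    def __init__(self, ocdb: dict):
        self.ocdb = ocdb  # stand-in for the Grid OCDB
        self.preprocessors: dict[str, Preprocessor] = {}

    def register(self, detector: str, pp: Preprocessor) -> None:
        self.preprocessors[detector] = pp

    def process_run(self, run: int, online_data: dict) -> None:
        """Run every registered preprocessor for one run and store the results."""
        for det, pp in self.preprocessors.items():
            conditions = pp.process(online_data.get(det, {}))
            self.ocdb[(run, det)] = conditions
```

The key design point this illustrates is the separation of concerns: the framework handles scheduling and storage per run, while each detector group only implements its own `process` step.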
First released in 2010, the Rivet library forms an important repository for analysis code, facilitating comparisons between measurements of the final state in particle collisions and theoretical ...calculations of those final states. We give an overview of Rivet's current design and implementation, its uptake for analysis preservation and physics results, and summarise recent developments including propagation of MC systematic-uncertainty weights, heavy-ion and \(ep\) physics, and systems for detector emulation. In addition, we provide a short user guide that supplements and updates the Rivet user manual.
The Future Circular Collider (FCC) Study aims to assess the physics potential and the technical feasibility of a new collider with centre-of-mass energies, in the hadron-hadron collision mode, seven times larger than the nominal LHC energies. Operating such a machine with heavy ions is an option under consideration in the accelerator design studies. It would provide, for example, Pb-Pb and p-Pb collisions at sqrt{s_NN} = 39 and 63 TeV per nucleon-nucleon collision, respectively, with integrated luminosities above 30 nb^-1 per month for Pb-Pb. This is a report by the working group on heavy-ion physics of the FCC Study. First ideas on the physics opportunities with heavy ions at the FCC are presented, covering the physics of the Quark-Gluon Plasma, of gluon saturation, of photon-induced collisions, as well as connections with other fields of high-energy physics.
The goal of the ALICE experiment at the LHC is to study strongly interacting matter at high energy densities, as well as the signatures and properties of the quark-gluon plasma. This goal manifests itself in a rich physics program. Although ALICE will mainly study heavy-ion collisions, a dedicated program will concentrate on proton-proton physics. The first part introduces the ALICE experiment from the point of view of pp measurements. Two of its unique properties are the low pT cut-off and the excellent PID capabilities. The various topics of the proton-proton physics program, which will allow close scrutiny of existing theoretical models, are described. Furthermore, the interpretation of measurements of heavy-ion collisions necessitates comparison to measurements of pp collisions. The second part concentrates on the day-1 physics program of ALICE. At startup, neither the LHC luminosity nor its energy will have reached their nominal values, and the ALICE detector will still be in the process of being aligned and calibrated. Nevertheless, several physics topics can be studied from the very beginning. These are presented, together with the effort that is already ongoing to be ready for the first collisions. The statistics needed for each topic are given with respect to the foreseen LHC startup scenario.
The future opportunities for high-density QCD studies with ion and proton beams at the LHC are presented. Four major scientific goals are identified: the characterisation of the macroscopic long-wavelength Quark-Gluon Plasma (QGP) properties with unprecedented precision; the investigation of the microscopic parton dynamics underlying QGP properties; the development of a unified picture of particle production and QCD dynamics from small (pp) to large (nucleus-nucleus) systems; and the exploration of parton densities in nuclei in a broad (\(x\), \(Q^2\)) kinematic range, together with the search for the possible onset of parton saturation. In order to address these scientific goals, high-luminosity Pb-Pb and p-Pb programmes are considered as priorities for Runs 3 and 4, complemented by high-multiplicity studies in pp collisions and a short run with oxygen ions. High-luminosity runs with intermediate-mass nuclei, for example Ar or Kr, are considered an appealing case for extending the heavy-ion programme at the LHC beyond Run 4. The potential of the High-Energy LHC to probe QCD matter with newly available observables, at centre-of-mass energies twice as large as those of the LHC, is also investigated.
The present document discusses plans for a compact, next-generation multi-purpose detector at the LHC as a follow-up to the present ALICE experiment. The aim is to build a nearly massless barrel detector consisting of truly cylindrical layers based on curved wafer-scale ultra-thin silicon sensors with MAPS technology, featuring an unprecedentedly low material budget of 0.05% X\(_0\) per layer, with the innermost layers possibly positioned inside the beam pipe. In addition to superior tracking and vertexing capabilities over a wide momentum range down to a few tens of MeV/\(c\), the detector will provide particle identification via time-of-flight determination with about 20 ps resolution. In addition, electron and photon identification will be performed in a separate shower detector. The proposed detector is conceived for studies of pp, pA and AA collisions at luminosities a factor of 20 to 50 higher than possible with the upgraded ALICE detector, enabling a rich physics program ranging from measurements with electromagnetic probes at ultra-low transverse momenta to precision physics in the charm and beauty sector.
The objective of this first workshop on Multiple Partonic Interactions (MPI) at the LHC is to raise the profile of MPI studies, summarizing the legacy of the older phenomenology at hadronic colliders and fostering further contacts between the theory and experimental communities. MPI are experiencing growing popularity and are currently widely invoked to account for observations that could not be explained otherwise: the activity of the underlying event, the cross sections for multiple heavy-flavour production, the survival probability of large rapidity gaps in hard diffraction, etc. At the same time, the implementation of MPI effects in Monte Carlo models is proceeding quickly, with an increasing level of sophistication and complexity that promises deep and general implications for LHC physics. The ultimate ambition of this workshop is to promote MPI as a unifying concept between seemingly heterogeneous research lines and to profit from the complete experimental picture in order to constrain their implementation in the models, evaluating the spin-offs for the LHC physics program.