DIRAC, the LHCb community Grid solution, was considerably reengineered in order to meet all the requirements for processing the data coming from the LHCb experiment. It covers all the tasks, starting with raw data transportation from the experiment area to the grid storage, through data processing, up to the final user analysis. The reengineered DIRAC3 version of the system includes a fully grid-security-compliant framework for building service-oriented distributed systems; a complete Pilot Job framework for creating efficient workload management systems; and several subsystems to manage high-level operations such as data production and distribution management. The user interfaces of the DIRAC3 system, providing rich command-line and scripting tools, are complemented by a full-featured Web portal giving users secure access to all the details of the system status and ongoing activities. We will present an overview of the DIRAC3 architecture, its new innovative features and the achieved performance. Extending DIRAC3 to manage computing resources beyond the WLCG grid will be discussed. Experience with using DIRAC3 by user communities other than LHCb and in application domains other than High Energy Physics will be shown to demonstrate the general-purpose nature of the system.
In the LHCb experiment a wide variety of Monte Carlo simulated samples need to be produced for the experiment's physics program. Monte Carlo productions are handled centrally, like all massive data processing in the experiment. In order to cope with the large set of different types of simulation samples, the necessary procedures, based on common infrastructures, have been set up, with a numerical event type identification code used throughout. The various elements in the procedure will be described: from writing a configuration for an event type to deploying it in the production environment, and from submitting and processing a request to retrieving the produced sample, as well as the conventions established to allow their interplay. The choices made have allowed a high level of automation of Monte Carlo productions, which are handled centrally in a transparent way, with experts concentrating on their specific tasks. As a result the massive Monte Carlo production of the experiment is efficiently processed on a world-wide distributed system with minimal manpower.
LHCbDirac: distributed computing in LHCb Stagni, F; Charpentier, P; Graciani, R ...
Journal of Physics: Conference Series, 01/2012, Volume 396, Issue 3
Journal Article
Peer-reviewed
Open access
We present LHCbDirac, an extension of the DIRAC community Grid solution that handles LHCb specificities. The DIRAC software was developed for many years within LHCb only. Nowadays it is a generic software package, used by many scientific communities worldwide. Each community wanting to take advantage of DIRAC has to develop an extension containing all the code necessary for handling its specific cases. LHCbDirac is an actively developed extension, implementing the LHCb computing model and the workflows handling all the distributed computing activities of LHCb. Such activities include real data processing (reconstruction, stripping and streaming), Monte Carlo simulation and data replication. Other activities are group and user analysis, data management, resource management and monitoring, data provenance, and accounting for user and production jobs. LHCbDirac also provides extensions of the DIRAC interfaces, including a secure web client, Python APIs and CLIs. Before a new release is put into production, a number of certification tests are run in a dedicated setup. This contribution highlights the versatility of the system, also presenting the experience with real data processing, data and resource management, and monitoring of activities and resources.
The increase of the luminosity of the LHC in 2011 also brought an increase in the computing requirements for data processing. This paper describes the data processing operations during 2011 prompt reconstruction as well as the end-of-year re-processing of the full data sample. It further gives an outlook on the next evolutionary steps in the LHCb computing model for 2012 data processing and beyond.
HERA-B data acquisition system Dam, M.; Egorytchev, V.; Essenov, S. ...
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 06/2004, Volume 525, Issue 3
Journal Article
Peer-reviewed
The HERA-B Data Acquisition System implements a 50 kHz dead-timeless readout of 500 kB events, requiring unprecedented speed of data storing and processing. The system is based on Digital Signal Processors (DSPs), minimizing the number of components. A high-bandwidth, low-latency DSP switching network provides full connectivity between the readout buffers and a PC farm which runs the higher-level trigger. The design of the system and the achieved performance are described in this paper.
The HERA-B experiment was dedicated to the measurement of charge-parity (CP) violation in decays of neutral B-mesons and to investigating the physics of charmed particles. One of the experimental requirements is a highly selective on-line filtering of data, owing to high interaction rates and a low signal-to-background ratio. This demands a hierarchical trigger and a high-bandwidth data acquisition system. The challenge for the data acquisition (DAQ) system is a readout free of deadtime, which requires an unprecedented speed of storing and processing the data. In this paper, we will outline the general architecture and hardware implementation of the HERA-B DAQ.
Observation of and evidence for decays Affolder, A; Albrecht, J; Amerio, S ...
New Journal of Physics, 12/2014, Volume 16, Issue 12
Journal Article
Peer-reviewed
Open access
Measurements of the branching fractions of and decays are performed using a data sample corresponding to of proton-proton collision data collected with the LHCb detector at a centre-of-mass energy of , where the mesons are reconstructed in the final state. The first observation of the decay and the first evidence for the decay are reported, with branching fractions where the first uncertainties are statistical and the second are systematic. In addition, an upper limit of is set at confidence level.
The article considers visualization methods for three-dimensional objects based on an original projection system. The methods allow the reconstruction, in real time and without mathematical processing, of intermediate azimuthal images of a real or virtual object from its reference images.