The CMS experiment is taking high-energy collision data at CERN. The computing infrastructure used to analyse the data is distributed around the world in a tiered structure. In order to use the 7 Tier-1 sites, the 50 Tier-2 sites and a still-growing number of about 30 Tier-3 sites, the CMS software has to be available at those sites. Except for a very few sites, the deployment and removal of CMS software is managed centrally. Since the deployment team has no local accounts at the remote sites, all installation tasks have to be submitted as Grid jobs. Via a VOMS role, such a job receives high priority in the batch system and gains write privileges to the software area. Because there is no interactive access, the installation jobs must be very robust against possible failures, so as not to leave behind a broken software installation. The CMS software is packaged in RPMs that are installed in the software area independently of the host OS; the apt-get tool is used to resolve package dependencies. This paper reports on recent deployment experience and the achieved performance.
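The robustness requirement in the abstract above, installation jobs that must never leave a broken software tree behind, can be sketched as an install-verify-rollback pattern. This is an illustrative sketch only; the function names and the callable-based interface are hypothetical, not part of the actual CMS deployment tooling:

```python
def robust_install(install, verify, rollback):
    """Run one non-interactive installation step.

    install, verify and rollback are callables supplied by the caller
    (e.g. wrappers around apt-get/rpm invocations in the software area).
    If the install step raises or its verification fails, roll back so
    that no broken installation is left behind.  Returns True on a
    verified install, False after a clean rollback.
    """
    try:
        install()
        if not verify():
            raise RuntimeError("post-install verification failed")
        return True
    except Exception:
        rollback()  # restore the previous state of the software area
        return False
```

Since the job runs without interactive access, every failure path must end in a state a later job can retry from, which is why the rollback runs on any error.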
CMS analysis operations
Andreeva, J; Calloni, M; Colling, D et al.
Journal of Physics: Conference Series, 04/2010, Volume 219, Issue 7. Journal article, peer reviewed, open access.
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008, more than 800 individuals have submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users, with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of applications exiting with non-zero exit codes when submitted through the grid interfaces, and worked with the CMS experiment dashboard developers to obtain the information necessary to quickly and proactively identify issues with user jobs and with data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force are discussed, as well as the CMS Analysis Operations plans for the start of data taking.
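The exit-code monitoring described above can be illustrated with a minimal sketch: aggregate job records by site and flag sites whose failure fraction is anomalous. The record layout and the threshold are hypothetical placeholders, not the actual CMS dashboard schema or criteria:

```python
from collections import defaultdict

def flag_problem_sites(job_records, max_failure_fraction=0.2):
    """job_records: iterable of (site, exit_code) pairs.

    Return the set of sites whose fraction of jobs with a non-zero
    exit code exceeds max_failure_fraction, so operators can look at
    those sites (or the data sets hosted there) proactively.
    """
    totals = defaultdict(int)
    failures = defaultdict(int)
    for site, exit_code in job_records:
        totals[site] += 1
        if exit_code != 0:
            failures[site] += 1
    return {s for s in totals if failures[s] / totals[s] > max_failure_fraction}
```

Grouping by site rather than by user is the key design point: a cluster of failures at one site usually indicates a site or data-set problem rather than many independent user errors.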
Debugging data transfers in CMS
Bagliesi, G; Belforte, S; Bloom, K et al.
Journal of Physics: Conference Series, 04/2010, Volume 219, Issue 6. Journal article, peer reviewed, open access.
The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests was designed and deployed to equip the WLCG tiers which support the CMS virtual organization with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure to debug problematic links. This procedure moves a link from a debugging phase, in a separate and independent environment, to the production environment once a set of agreed conditions is achieved for that link. The goal was to deliver working transfer routes, one by one, to the CMS data operations team. The preparation, activities and experience of the DDT task force within the CMS experiment are discussed. Common technical problems and challenges encountered in debugging data transfer links during the lifetime of the task force are explained and summarized.
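The commissioning procedure described above, in which a link graduates from the debugging environment to production once agreed conditions hold, can be sketched as a simple rule on daily transfer rates. The threshold and streak length below are illustrative placeholders, not the actual DDT criteria:

```python
def link_commissioned(daily_rates_mbs, min_rate=20.0, required_days=3):
    """Return True if the link sustained at least min_rate MB/s on
    required_days consecutive days of debug-phase transfer tests.

    daily_rates_mbs: sequence of average daily transfer rates, oldest first.
    """
    streak = 0
    for rate in daily_rates_mbs:
        # A single bad day resets the streak: conditions must hold
        # stably, not just momentarily.
        streak = streak + 1 if rate >= min_rate else 0
        if streak >= required_days:
            return True
    return False
```

A link passing such a check would then be handed over, one by one, to the data operations team, matching the delivery model the abstract describes.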
We have searched for the C-violating decay η→γγγ in a sample of ∼18 million η mesons produced in φ→ηγ decays, collected with the KLOE detector at the Frascati φ-factory DAΦNE. No signal is observed, and we obtain the upper limit BR(η→γγγ) ⩽ 1.6×10⁻⁵ at 90% CL.
Monte Carlo production in CMS has received a major boost in performance and scale since the CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG). Operational experience and integration aspects of the new CMS Monte Carlo production system are presented, together with an analysis of production statistics. The new system automatically handles job submission, resource monitoring, job queuing, job distribution according to the available resources, data merging, and registration of data into the data bookkeeping, data location, data transfer and placement systems. Compared to the previous production system, automation, reliability and performance have been considerably improved. A more efficient use of computing resources and better handling of the inherent Grid unreliability have resulted in an increase of production scale by about an order of magnitude: the system is capable of running on the order of ten thousand jobs in parallel and yields more than two million events per day.
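As a quick consistency check of the quoted scale (the per-job figure below is derived from the abstract's order-of-magnitude numbers, not stated in it), ten thousand parallel jobs yielding two million events per day average out to about 200 events per job per day:

```python
# Figures quoted in the abstract (order-of-magnitude values).
parallel_jobs = 10_000        # ~ten thousand jobs running in parallel
events_per_day = 2_000_000    # more than two million events per day

# Derived average throughput per concurrent job.
events_per_job_per_day = events_per_day / parallel_jobs
print(events_per_job_per_day)  # 200.0
```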
We have measured the cross section σ(e⁺e⁻→π⁺π⁻γ) at an energy W = m_φ = 1.02 GeV with the KLOE detector at the electron–positron collider DAΦNE. From the dependence of the cross section on the invariant mass of the two-pion system, we extract σ(e⁺e⁻→π⁺π⁻) for the mass range 0.35 < m²(π⁺π⁻) < 0.95 GeV². From this result, we calculate the pion form factor and the hadronic contribution to the muon anomaly, aμ.
We have measured the cross section σ(e⁺e⁻→π⁺π⁻γ) with the KLOE detector at DAΦNE, at an energy W = M_φ = 1.02 GeV. From the dependence of the cross section on \(m(\pi^+\pi^{-})=\sqrt{W^2-2WE_\gamma}\), where Eγ is the energy of the photon radiated from the initial state, we extract σ(e⁺e⁻→π⁺π⁻) for the mass range 0.35 < m²(π⁺π⁻) < 0.95 GeV². From our result we extract the pion form factor and the hadronic contribution to the muon anomaly, aμ.
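Both KLOE records above extract the pion form factor and the two-pion contribution to aμ from the measured cross section. The two standard relations involved (general textbook formulas, not specific to these papers) are:

```latex
% Pion form factor from the e+e- -> pi+pi- cross section,
% with beta_pi the pion velocity:
\sigma(e^+e^-\to\pi^+\pi^-)
  = \frac{\pi\alpha^2}{3s}\,\beta_\pi^3\,\bigl|F_\pi(s)\bigr|^2,
\qquad
\beta_\pi = \sqrt{1-\frac{4m_\pi^2}{s}}

% Leading-order two-pion contribution to the muon anomaly:
% a dispersion integral over the measured cross section,
% with K(s) the standard QED kernel function.
a_\mu^{\pi\pi,\mathrm{LO}}
  = \frac{1}{4\pi^3}\int_{4m_\pi^2}^{s_{\max}} \mathrm{d}s\; K(s)\,
    \sigma(e^+e^-\to\pi^+\pi^-)(s)
```

For the measurements above, s_max corresponds to the 0.95 GeV² upper edge of the covered mass range.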
The normalized differential cross section for top quark pair (tt̄) production is measured in pp collisions at a centre-of-mass energy of 8 TeV at the CERN LHC using the CMS detector in data corresponding to an integrated luminosity of 19.7 fb⁻¹. The measurements are performed in the lepton+jets (e/μ+jets) and in the dilepton (ee, μμ, and eμ) decay channels. The tt̄ cross section is measured as a function of the kinematic properties of the charged leptons, the jets associated to b quarks, the top quarks, and the tt̄ system. The data are compared with several predictions from perturbative quantum chromodynamics up to approximate next-to-next-to-leading-order precision. No significant deviations are observed relative to the standard model predictions.
Data handling, reconstruction, and simulation for the KLOE experiment
Ambrosino, F.; Antonelli, A.; Antonelli, M. et al.
Nuclear Instruments and Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 12/2004, Volume 534, Issue 3. Journal article, peer reviewed, open access.
The broad physics program of the KLOE experiment is based on the high event rate at the Frascati φ-factory, and calls for an up-to-date system for data acquisition and processing. In this review of the KLOE offline environment, the architecture of the data-processing system and the programs developed for data reconstruction and Monte Carlo simulation are described, as well as the various procedures used for data handling and transfer between the different components of the system.
Measurements of the W⁺W⁻ production cross sections in proton-proton collisions at center-of-mass energies of 7 and 8 TeV are presented. Candidate events for the leptonic decay mode W⁺W⁻ → ℓνℓν, where ℓ denotes an electron or a muon, are reconstructed and selected from data corresponding to an integrated luminosity of 5.1 (19.6) fb⁻¹ at 7 (8) TeV collected with the CMS experiment. The measured cross sections at 7 and 8 TeV are in good agreement with the standard model predictions with next-to-leading-order accuracy. The selected data are analyzed to search for anomalous triple gauge couplings involving the W⁺W⁻ final state. In the absence of any deviation from the standard model predictions, limits are set on the relevant parameters. These limits are then combined with previously published CMS results, yielding the most stringent constraints on the anomalous couplings.