• The ALEPH detector at LEP is used to measure cosmic ray muons.
• The momentum spectrum and charge ratio of vertical muons at surface level are determined.
• The results are compared with, and interpreted through, Monte Carlo models.
• Information about the energy spectrum and chemical composition of cosmic ray primaries is inferred.
The ALEPH detector at LEP has been used to measure the momentum spectrum and charge ratio of vertical cosmic ray muons underground. The sea-level cosmic ray muon spectrum for momenta up to 2.5 TeV/c has been obtained by correcting for the overburden of 320 m water equivalent (mwe). The results are compared with Monte Carlo models of air shower development in the atmosphere. From the analysis of the spectrum, the total flux and the spectral index of the cosmic ray primaries are inferred. The charge ratio suggests a dominantly light composition of cosmic ray primaries with energies between 10³ and 10⁵ GeV.
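The unfolding from underground to sea-level momenta can be illustrated with the standard parametrization of average muon energy loss, dE/dX ≈ a + bE. The sketch below is a minimal illustration using generic standard-rock parameter values, which are assumptions and not the parameters used in the ALEPH analysis:

```python
import math

# Illustrative average muon energy-loss parameters, dE/dX ~= a + b*E
# (a: ionization, b: radiative losses). These generic standard-rock
# values are assumptions, not the ones used in the ALEPH analysis.
A = 0.217      # GeV per mwe
B = 4.3e-4     # per mwe
DEPTH = 320.0  # overburden in metres water equivalent (from the abstract)

def surface_energy(e_underground):
    """Propagate a muon energy measured underground back to the surface
    by integrating dE/dX = a + b*E over the overburden."""
    return (e_underground + A / B) * math.exp(B * DEPTH) - A / B

# Under these parameter values, a muon seen with 100 GeV underground
# started with roughly 190 GeV at the surface.
```

The real analysis also unfolds detector resolution and range fluctuations of individual muons, which this average-loss sketch ignores.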
We report the first results of DarkSide-50, a direct dark matter search operating in the underground Laboratori Nazionali del Gran Sasso (LNGS) and looking for the rare nuclear recoils possibly induced by weakly interacting massive particles (WIMPs). The dark matter detector is a liquid argon time projection chamber with a (46.4 ± 0.7) kg active mass, operated inside a 30 t organic liquid scintillator neutron veto, which is in turn installed at the center of a 1 kt water Cherenkov veto against the residual flux of cosmic rays. We report here the null results of a dark matter search with a (1422 ± 67) kg·d exposure using an atmospheric argon fill. This is the most sensitive dark matter search performed with an argon target, corresponding to a 90% CL upper limit on the WIMP-nucleon spin-independent cross section of 6.1×10⁻⁴⁴ cm² for a WIMP mass of 100 GeV/c².
Liquid argon is a bright scintillator with potent particle identification properties, making it an attractive target for direct-detection dark matter searches. The DarkSide-50 dark matter search here reports the first WIMP search results obtained using a target of low-radioactivity argon. DarkSide-50 is a dark matter detector, based on a two-phase liquid argon time projection chamber, located at the Laboratori Nazionali del Gran Sasso. The underground argon is shown to contain ³⁹Ar at a level reduced by a factor of (1.4 ± 0.2)×10³ relative to atmospheric argon. We report a background-free null result from (2616 ± 43) kg·d of data, accumulated over 70.9 live days. When combined with our previous search using atmospheric argon, the 90% C.L. upper limit on the WIMP-nucleon spin-independent cross section, based on zero events found in the WIMP search regions, is 2.0×10⁻⁴⁴ cm² (8.6×10⁻⁴⁴ cm², 8.0×10⁻⁴³ cm²) for a WIMP mass of 100 GeV/c² (1 TeV/c², 10 TeV/c²).
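The step from zero observed events to a quoted cross-section limit can be sketched with a classical Poisson upper limit. In the sketch below, the detector sensitivity factor is a purely illustrative assumption, chosen only so the arithmetic lands near the quoted 100 GeV/c² limit; it is not a number taken from the DarkSide-50 analysis:

```python
import math

def poisson_upper_limit(n_observed, cl=0.90):
    """Classical (Neyman) upper limit on a Poisson mean given n observed
    events: the smallest mu with P(N <= n_observed | mu) <= 1 - cl.
    Solved here by simple bisection."""
    def cdf(mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n_observed + 1))
    lo, hi = 0.0, 50.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if cdf(mid) > 1 - cl:
            lo = mid      # mu still too small: data would be too likely
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Zero observed events -> mu_up = ln(10) ~= 2.30 expected events at 90% CL.
mu_up = poisson_upper_limit(0)

# Hypothetical sensitivity: expected events per cm^2 of WIMP-nucleon cross
# section for the exposure. This factor is an assumption bundling the WIMP
# velocity integral, nuclear form factor, and efficiencies.
events_per_cm2 = 1.15e44
sigma_up = mu_up / events_per_cm2   # ~2.0e-44 cm^2 with this assumed factor
```

The published analyses compute the sensitivity factor from the detailed detector response and halo model; only the Poisson step above is generic.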
The SuperB asymmetric-energy e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavour sector of the Standard Model. The SuperB distributed computing group performed a detailed evaluation of the DIRAC distributed infrastructure for use in the SuperB experiment, based on two use cases: end-user analysis and Monte Carlo production. The tests aim to evaluate DIRAC's capability to manage both gLite and OSG sites, its File Catalog management, and its job and data management features in realistic SuperB use cases.
The SuperB asymmetric-energy e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab⁻¹ and a luminosity target of 10³⁶ cm⁻² s⁻¹. This luminosity translates into a requirement to store more than 50 PB of additional data each year, making SuperB an interesting challenge to the data management infrastructure, both at the site level and at the Wide Area Network level. A new Tier1, distributed among 3 or 4 sites in the south of Italy, is planned as part of the SuperB computing infrastructure. Data storage is a key topic whose evolution affects how storage infrastructure is configured and set up, both in a local computing cluster and in a distributed paradigm. In this work we report on tests of software for data distribution and data replication, focusing on our experiences with Hadoop and GlusterFS.
SuperB Simulation Production System. Tomassetti, L; Bianchi, F; Ciaschini, V; et al.
Journal of Physics: Conference Series, 01/2012, Volume 396, Issue 2. Journal article, peer-reviewed, open access.
The SuperB asymmetric e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab⁻¹ and a peak luminosity of 10³⁶ cm⁻² s⁻¹. The SuperB Computing group is working on a simulation production framework capable of satisfying the experiment's needs. It provides access to distributed resources in order to support both the detector design definition and its performance evaluation studies. During the last year the framework has evolved in its job workflow, its Grid service interfaces, and the technologies it adopts. A complete code refactoring and sub-component language porting now allow the framework to sustain distributed production involving resources from two continents and multiple Grid flavors. In this paper we give a complete description of the current state of the production system, its evolution, and its integration with Grid services; in particular, we focus on the use of new Grid component features such as those in LB and WMS version 3. Results from the last official SuperB production cycle are reported.
The SuperB asymmetric-energy e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavour sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab⁻¹ and a luminosity target of 10³⁶ cm⁻² s⁻¹. These parameters require substantial growth in computing requirements and performance. The SuperB collaboration is thus investigating the advantages of new CPU architectures (multi- and many-core) and how to exploit their task-parallelization capabilities in the simulation and analysis software framework. In this work we present the underlying architecture we intend to use and some preliminary performance results from the first framework prototype.
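The event-level task parallelism such a framework can exploit is easy to sketch with Python's multiprocessing module. This toy example shows only the pattern of farming independent events out to worker processes; the function names and workload are invented for illustration and are not SuperB framework code:

```python
import random
from multiprocessing import Pool

def simulate_event(seed):
    """Toy stand-in for simulating one event; real event processing is
    far heavier. Deterministic for a given seed, so results are
    reproducible regardless of worker scheduling."""
    rng = random.Random(seed)
    return sum(rng.gauss(1.0, 0.1) for _ in range(100))

def run_parallel(n_events, n_workers=4):
    """Event-level task parallelism: independent events are distributed
    over a pool of worker processes, the simplest way to use multi-core
    and many-core CPUs when events share no state."""
    with Pool(n_workers) as pool:
        return pool.map(simulate_event, range(n_events))

if __name__ == "__main__":
    deposits = run_parallel(100)
    print(len(deposits))
```

Because each event carries its own seed, the parallel run gives the same results as a serial loop, which is the property that makes this decomposition safe.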
The SuperB asymmetric-energy e⁺e⁻ collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab⁻¹ and a luminosity target of 10³⁶ cm⁻² s⁻¹. Increasing network performance, including in the Wide Area Network environment, and the capability to read data remotely with good efficiency are opening new possibilities and new scenarios in the data access field. Data access and data availability in a distributed environment are key points in the definition of the computing model for an HEP experiment like SuperB. R&D efforts in this field have been carried out during the last year in order to release the Computing Technical Design Report by 2013. WAN direct access to data has been identified as one of the most interesting viable options; robust and reliable protocols such as HTTP/WebDAV and xrootd are the subject of a specific R&D line in a mid-term scenario. In this work we present the R&D results obtained in the study of new data access technologies for typical HEP use cases, focusing on the HTTP and WebDAV protocols in Wide Area Network scenarios. Efficiency, performance, and reliability tests performed in a data-analysis context are described. Future R&D plans include comparison tests of the HTTP and xrootd protocols in terms of performance, efficiency, security, and available features.
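The sparse remote reads that make WAN direct access attractive are built on standard HTTP range requests, which WebDAV servers also honour. A minimal Python sketch of the pattern follows; the URL in the usage comment is hypothetical, not a real SuperB endpoint:

```python
import urllib.request

def range_header(offset, length):
    """Standard HTTP Range header (RFC 7233) for a partial-object read,
    understood by both plain HTTP and WebDAV servers."""
    return {"Range": f"bytes={offset}-{offset + length - 1}"}

def read_remote_bytes(url, offset, length):
    """Read one byte range from a remote file over HTTP/WebDAV, the kind
    of sparse access an analysis job performs when reading event data
    directly over the WAN instead of copying whole files."""
    req = urllib.request.Request(url, headers=range_header(offset, length))
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content means the server honoured the range;
        # 200 means the whole file is being returned instead.
        if resp.status not in (200, 206):
            raise IOError(f"unexpected status {resp.status}")
        return resp.read()

# Hypothetical usage (assumed URL, not a real endpoint):
# header = read_remote_bytes("https://storage.example.org/run1.root", 0, 1024)
```

Latency, not bandwidth, usually dominates such access patterns over the WAN, which is why the efficiency and reliability tests described above matter.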
The dual-phase xenon time projection chamber (TPC) is a powerful tool for direct-detection experiments searching for WIMP dark matter, other dark matter models, and neutrinoless double-beta decay. ...Successful operation of such a TPC is critically dependent on the ability to hold high electric fields in the bulk liquid, across the liquid surface, and in the gas. Careful design and construction of the electrodes used to establish these fields is therefore required. We present the design and production of the LUX-ZEPLIN (LZ) experiment’s high-voltage electrodes, a set of four woven mesh wire grids. Grid design drivers are discussed, with emphasis placed on design of the electron extraction region. We follow this with a description of the grid production process and a discussion of steps taken to validate the LZ grids prior to integration into the TPC.
Data from a dedicated cosmic ray run of the ALEPH detector were used in a study of muon trident production, i.e., muon pairs produced by muons. Here the overburden and the calorimeters are the target ...materials while the ALEPH time projection chamber provides the momentum measurements. A theoretical estimate of the muon trident cross section is obtained by developing a Monte Carlo simulation for muon propagation in the overburden and the detector. Two muon trident candidates were found to match the expected theoretical pattern. The observed production rate implies that the nuclear form factor cannot be neglected for muon tridents.