Summary
Since the 1970s, satellite communications have been continuously evolving to provide services of increasing complexity and quality. This evolution has been supported by the constant increase in operating frequency required to achieve high data rates. This contribution focuses on the long-term key role of the Italian Space Agency in supporting research on, and the development of, high-frequency satellite communication systems. The Alphasat experiment is the most recent effort of the Italian Space Agency, in collaboration with the European Space Agency, to thoroughly investigate the severe detrimental atmospheric effects impairing radio waves at high frequencies (specifically, Ka and Q bands) and the associated fade mitigation techniques (e.g., uplink power control, site diversity, and adaptive coding and modulation) required to achieve the target quality and availability typical of modern satellite communication systems.
The computing infrastructures serving the LHC experiments have been designed to cope at most with the average amount of data recorded. Usage peaks, as already observed in Run-I, may however generate large backlogs, delaying the completion of data reconstruction and ultimately the availability of data for physics analysis. In order to cope with production peaks, the LHC experiments are exploring the opportunity to access Cloud resources provided by external partners or commercial providers. In this work we present a proof of concept of the elastic extension of a local analysis facility, specifically the Bologna Tier-3 Grid site, for the LHC experiments hosted at the site, onto an external OpenStack infrastructure. We focus on the Cloud bursting of the Grid site using DynFarm, a newly designed tool that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on an OpenStack infrastructure are transparently accessed by the LHC Grid tools while at the same time serving as an extension of the farm for local usage.
After the successful LHC data taking in Run-I, and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of their computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. Usage peaks, as already observed in Run-I, may however generate large backlogs, delaying the completion of data reconstruction and ultimately the availability of data for physics analysis. In order to cope with production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present a proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the "Cloud Bursting" of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools while at the same time serving as an extension of the farm for local usage. The amount of allocated resources can thus be elastically adjusted to cope with the needs of the CMS experiment and of local users. Moreover, a direct integration of OpenStack resources into the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.
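The elastic scaling idea described above can be illustrated with a minimal sketch. This is a hypothetical decision function, not the actual DynFarm or LSF configuration logic: it compares the batch-queue backlog with the currently allocated cloud capacity and decides how many OpenStack worker nodes to instantiate or release. All names, slot counts, and limits below are illustrative assumptions.

```python
# Hypothetical sketch of the "cloud bursting" scaling decision: size the pool
# of dynamically registered cloud worker nodes to track the pending backlog,
# up to a budget cap. Not the DynFarm implementation; values are illustrative.

def workers_to_adjust(pending_jobs, running_cloud_workers,
                      slots_per_worker=8, max_cloud_workers=50):
    """Return a positive number of workers to instantiate, or a negative
    number to release, so cloud capacity matches the pending backlog."""
    needed = -(-pending_jobs // slots_per_worker)  # ceiling division
    target = min(needed, max_cloud_workers)
    return target - running_cloud_workers

print(workers_to_adjust(pending_jobs=100, running_cloud_workers=2))  # grow the pool
print(workers_to_adjust(pending_jobs=0, running_cloud_workers=5))    # shrink back
```

In a real deployment this decision would drive VM instantiation through the cloud API, after which the new nodes register themselves with the batch system and become transparently usable by both Grid and local jobs, as the abstract describes.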
Beginning in 2009, the CMS experiment will produce several petabytes of data each year, to be distributed over many computing centres located in different countries. The CMS computing model defines how the data are to be distributed and accessed so that physicists can efficiently run their analyses over the data. The analysis will be performed in a distributed way using Grid infrastructure. CRAB (CMS Remote Analysis Builder) is a tool, designed and developed by the CMS collaboration, that allows the end user to transparently access distributed data. CRAB interacts with the local user environment, the CMS data management services and the Grid middleware; it takes care of data and resource discovery; it splits the user's task into several processes (jobs) and distributes and parallelizes them over different Grid environments; and it performs process tracking and output handling. Very limited knowledge of the underlying technical details is required of the end user. The tool can be used as a direct interface to the computing system, or it can delegate the task to a server, which takes care of job handling and provides services such as automatic resubmission in case of failures and notification of the task status to the user. Its current implementation is able to interact with the gLite and OSG Grid middlewares. Furthermore, with the same interface, it enables access to local data and batch systems such as the Load Sharing Facility (LSF). CRAB has been in production and in routine use by end users since Spring 2004. It has been extensively used in studies to prepare the Physics Technical Design Report, in the analysis of reconstructed event samples generated during the Computing Software and Analysis Challenges, and in preliminary cosmic ray data taking. The CRAB architecture and its usage within the CMS community are described in detail, as well as the current status and future developments.
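The task-splitting step mentioned above can be sketched in a few lines. This is an illustrative toy, not CRAB's actual code: it divides a user task over its input events and produces one (first event, number of events) slice per job, which is the general shape of the splitting CRAB performs before submission.

```python
# Illustrative sketch of splitting a user's analysis task into Grid jobs,
# in the spirit of CRAB's splitting step. Not the actual CRAB implementation.

def split_task(total_events, events_per_job):
    """Split a task of total_events into (first_event, n_events) job slices."""
    jobs = []
    first = 0
    while first < total_events:
        n = min(events_per_job, total_events - first)
        jobs.append((first, n))
        first += n
    return jobs

# A 1000-event task with 300 events per job yields four jobs,
# the last one taking the 100-event remainder.
print(split_task(1000, 300))
```

Each slice would then be wrapped into an independent Grid job, with the tool tracking the processes and collecting their outputs as described in the abstract.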
Evidence for the light-by-light scattering process, γγ→γγ, in ultraperipheral PbPb collisions at a centre-of-mass energy per nucleon pair of 5.02 TeV is reported. The analysis is conducted using a data sample corresponding to an integrated luminosity of 390 μb⁻¹ recorded by the CMS experiment at the LHC. Light-by-light scattering processes are selected in events with two photons exclusively produced, each with transverse energy E_T^γ > 2 GeV, pseudorapidity |η^γ| < 2.4, diphoton invariant mass m_γγ > 5 GeV, diphoton transverse momentum p_T^γγ < 1 GeV, and diphoton acoplanarity below 0.01. After all selection criteria are applied, 14 events are observed, compared to expectations of 9.0 ± 0.9 (theo) events for the signal and 4.0 ± 1.2 (stat) for the background processes. The excess observed in data relative to the background-only expectation corresponds to a significance of 3.7 standard deviations, and has properties consistent with those expected for the light-by-light scattering signal. The measured fiducial light-by-light scattering cross section, σ_fid(γγ→γγ) = 120 ± 46 (stat) ± 28 (syst) ± 12 (theo) nb, is consistent with the standard model prediction. The m_γγ distribution is used to set new exclusion limits on the production of pseudoscalar axion-like particles, via the ▪ process, in the mass range ▪.
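The acoplanarity cut quoted above can be made concrete with a small helper. This assumes the common convention A = 1 − |Δφ(γ₁, γ₂)|/π (the paper defines its exact convention); exclusive back-to-back photon pairs give A ≈ 0 and pass the A < 0.01 requirement.

```python
import math

# Illustrative diphoton acoplanarity, assuming the common definition
# A = 1 - |Δφ| / π. Back-to-back photons (Δφ = π) give A = 0.

def acoplanarity(phi1, phi2):
    """Acoplanarity of two particles from their azimuthal angles (radians)."""
    dphi = abs(phi1 - phi2) % (2 * math.pi)
    if dphi > math.pi:                 # fold into [0, π]
        dphi = 2 * math.pi - dphi
    return 1.0 - dphi / math.pi

# Exactly back-to-back photons pass the exclusivity cut A < 0.01.
print(acoplanarity(0.0, math.pi) < 0.01)
```

The cut suppresses backgrounds (e.g. from misreconstructed or non-exclusive events) in which the two photons are not balanced back-to-back in azimuth.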
Broadband satellite communication systems, with their global access and broadcasting capabilities, are assuming increasing relevance in the framework of the modern Information Society. These systems could greatly benefit from the use of Extremely High Frequency (EHF); in particular, "beyond Ka-band" frequencies offer large bandwidth availability as well as smaller antenna size for a fixed gain or, conversely, higher antenna gain for a fixed size. One of the main drawbacks that limits the use of these frequencies is the strong impairment caused by the lower part of the atmosphere; research on propagation impairment mitigation techniques is therefore needed in order to dynamically adapt the system to the channel conditions. In particular, Adaptive Coding and Modulation (ACM), Data Rate Adaptation (DRA), uplink power control (ULPC), spatial diversity (using either the classical site diversity approach or the smart gateways approach) and on-board adaptive power allocation can be efficiently adopted to improve the performance of EHF satellite systems. At present the Ka band is the benchmark for commercial broadband satellite communications, while higher bands are under scientific investigation; in particular, the Italian Space Agency has started an experimental campaign in the Q/V band based on the Alphasat "Aldo Paraboni" P/L. In this paper, the authors report the latest results of these experiments, with a specific focus on the optimization of ACM techniques.
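The core of the ACM technique mentioned above is a lookup from the estimated link SNR to the most spectrally efficient modulation-and-coding scheme (MODCOD) whose decoding threshold is still met. The sketch below illustrates this selection logic; the MODCOD names, threshold values, and efficiencies are illustrative assumptions, not the ones used in the Alphasat experiments.

```python
# Sketch of the Adaptive Coding and Modulation (ACM) selection step:
# pick the highest-efficiency MODCOD whose SNR threshold the link still meets.
# Thresholds and efficiencies below are illustrative values only.

MODCODS = [  # (name, required SNR [dB], spectral efficiency [bit/s/Hz])
    ("QPSK 1/2",   1.0, 1.0),
    ("QPSK 3/4",   4.0, 1.5),
    ("8PSK 2/3",   6.6, 2.0),
    ("16APSK 3/4", 10.2, 3.0),
]

def select_modcod(snr_db):
    """Return the highest-efficiency usable MODCOD, or None if the link
    cannot close even with the most robust scheme."""
    usable = [m for m in MODCODS if m[1] <= snr_db]
    return max(usable, key=lambda m: m[2]) if usable else None

print(select_modcod(7.0)[0])   # clear sky: "8PSK 2/3"
print(select_modcod(2.0)[0])   # rain fade: falls back to "QPSK 1/2"
```

As channel conditions degrade (e.g. under rain fade at Q/V band), the system falls back to more robust, lower-efficiency MODCODs, trading throughput for availability; this is the dynamic adaptation the text refers to.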
A search for direct production of the supersymmetric (SUSY) partners of electrons or muons is presented in final states with two opposite-charge, same-flavour leptons (electrons and muons), no jets, and large missing transverse momentum. The data sample corresponds to an integrated luminosity of 35.9 fb⁻¹ of proton-proton collisions at √s = 13 TeV, collected with the CMS detector at the LHC in 2016. The search uses the M_T2 variable, which generalises the transverse mass for systems with two invisible objects and provides discrimination against standard model backgrounds containing W bosons. The observed yields are consistent with the expectations from the standard model. The search is interpreted in the context of simplified SUSY models and probes slepton masses up to approximately 290, 400, and 450 GeV, assuming right-handed only, left-handed only, and both right- and left-handed sleptons (mass-degenerate selectrons and smuons), respectively, and a massless lightest supersymmetric particle. Limits are also set on selectrons and smuons separately. These results improve on the existing limits by approximately 150 GeV.
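The M_T2 variable used above has a standard definition (the "stransverse mass" of Lester and Summers): the missing transverse momentum is split in all possible ways between the two hypothesised invisible particles, and the larger of the two lepton-invisible transverse masses is minimised over those splittings.

```latex
% Standard definition of the stransverse mass M_T2:
% minimise, over all partitions of the missing transverse momentum between
% the two invisible particles, the larger of the two transverse masses.
M_{T2} \;=\;
\min_{\vec{p}_{T}^{\,\chi_1} + \vec{p}_{T}^{\,\chi_2} \,=\, \vec{p}_{T}^{\,\mathrm{miss}}}
\left[ \max\!\left( m_{T}\!\left(\vec{p}_{T}^{\,\ell_1}, \vec{p}_{T}^{\,\chi_1}\right),\;
                    m_{T}\!\left(\vec{p}_{T}^{\,\ell_2}, \vec{p}_{T}^{\,\chi_2}\right) \right) \right]
```

For backgrounds where the missing momentum comes from W boson decays, M_T2 has a kinematic endpoint near the W mass, which is what gives the variable its discriminating power against those backgrounds.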
The results of two searches for pair production of vectorlike T or B quarks in fully hadronic final states are presented, using data from the CMS experiment at a center-of-mass energy of 13 TeV. The data were collected at the LHC during 2016 and correspond to an integrated luminosity of 35.9 fb⁻¹. A cut-based analysis specifically targets the bW decay mode of the T quark and allows for the reconstruction of the T quark candidates. In a second analysis, a multiclassification algorithm, the "boosted event shape tagger," is deployed to label candidate jets as originating from top quarks or from W, Z, and Higgs bosons. Candidate events are categorized according to the multiplicities of identified jets, and the scalar sum of all observed jet momenta is used to discriminate signal events from the quantum chromodynamics multijet background. Both analyses probe all possible branching fraction combinations of the T and B quarks and set limits at 95% confidence level on their masses, ranging from 740 to 1370 GeV. These results represent a significant improvement relative to existing searches in the fully hadronic final state.
The Q/V band program of ASI involves two main elements: the "Space Segment" and the "Mission Segment". The Space Segment is represented by the Technology Demonstration Payload TDP#5, the "Aldo Paraboni" P/L. Financed by Italy through the ARTES-8 Program, it was developed by the European Space Agency and implemented by Italian space industries. It was embarked as a hosted payload on the Alphasat satellite, successfully launched on July 25th, 2013. Alphasat is an Inmarsat commercial telecommunications geosynchronous satellite, which uses the ESA-developed Alphabus platform and carries four Technology Demonstration Payloads (TDPs). Among these, TDP#5 is devoted to the exploitation and investigation of Q/V-band communications. In conjunction with the Mission Segment (MS), TDP#5 makes it possible to carry out communications experiments (Propagation Impairment Mitigation Techniques, PIMT) at 40/50 GHz (Q/V band) and propagation experiments at both 20 GHz (Ka band) and 40 GHz (Q band). The MS has been developed by ASI and consists of two transmitting/receiving Ground Stations (GS) and three control centres, for the execution of experiments in accordance with the requirements defined by the two Principal Investigators appointed by ASI for the communication and propagation experiments. After the integration of all the elements composing the MS, the system test campaign started, with the aim of demonstrating correct operation through functional and performance tests. In particular, in addition to the propagation experiment verification, the campaign allowed testing of the Q/V-band communication functions in the two payload configurations (i.e. in loopback mode and in cross mode).
Differential Higgs boson (H) production cross sections are sensitive probes for physics beyond the standard model. New physics may contribute in the gluon-gluon fusion loop, the dominant Higgs boson production mechanism at the LHC, and manifest itself through deviations from the distributions predicted by the standard model. Combined spectra for the H→γγ, H→ZZ, and H→bb̄ decay channels and the inclusive Higgs boson production cross section are presented, based on proton-proton collision data recorded with the CMS detector at √s = 13 TeV corresponding to an integrated luminosity of 35.9 fb⁻¹. The transverse momentum spectrum is used to place limits on the Higgs boson couplings to the top, bottom, and charm quarks, as well as on its direct coupling to the gluon field. No significant deviations from the standard model are observed in any differential distribution. The measured total cross section is 61.1 ± 6.0 (stat) ± 3.7 (syst) pb, and the precision of the measurement of the differential cross section as a function of the Higgs boson transverse momentum is improved by about 15% with respect to the H→γγ channel alone.