The second phase of the LHC, the High-Luminosity LHC (HL-LHC), is scheduled to start in 2029, after a shutdown during which the beam intensity and focusing will be significantly upgraded. For the HL-LHC era, the CMS detector will also receive an extensive upgrade, primarily to maintain its physics performance under the increased pileup. The Phase-2 CMS Level-1 trigger rate will increase to 750 kHz, for an estimated data rate in excess of 50 Tbit/s. The Phase-2 CMS off-detector electronics will be based on the ATCA standard, with back-end boards receiving the detector data from the on-detector front-ends via custom, radiation-tolerant optical links. The CMS Phase-2 data acquisition design tightens the integration between trigger control and data flow, extending the synchronous regime of the DAQ system. At the core of the design is the DAQ and Timing Hub, a custom ATCA hub card forming the bridge between the different detector-specific control and readout electronics and the common timing, trigger, and control systems. The overall synchronisation and data flow of the experiment is handled by the Trigger and Timing Control and Distribution System. For increased flexibility during commissioning and calibration runs, the design of the Phase-2 trigger and timing distribution system breaks with the traditional distribution tree in favour of a configurable network connecting multiple independent control units to all off-detector endpoints. To reduce the number of custom hardware designs required, the DAQ hardware is designed so that it can also be used to implement the Trigger and Timing Control and Distribution System.
The upgraded High-Luminosity LHC, after the third Long Shutdown (LS3), will provide an instantaneous luminosity of 7.5 × 10³⁴ cm⁻²s⁻¹ (levelled), at the price of extreme pileup of up to 200 interactions per crossing. In LS3, the CMS detector will also undergo a major upgrade to prepare for Phase-2 of the LHC physics program, starting around 2025. The upgraded detector will be read out at an unprecedented data rate of up to 50 Tb/s and an event rate of 750 kHz. Complete events will be analysed by software algorithms running on standard processing nodes, and selected events will be stored permanently at a rate of up to 10 kHz for offline processing and analysis. In this paper we discuss the baseline design of the DAQ and HLT systems for Phase-2, taking into account the projected evolution of high-speed network fabrics for event building and distribution, and the anticipated performance of general-purpose CPUs. Implications for the hardware and infrastructure requirements of the DAQ "data center" are analysed. Emerging technologies for data reduction are considered. Novel possible approaches to event building and online processing, inspired by trending developments in other areas of computing dealing with large masses of data, are also examined. We conclude by discussing the opportunities offered by reading out and processing parts of the detector, wherever the front-end electronics allows, at the machine clock rate (40 MHz). This idea presents interesting challenges and its physics potential should be studied.
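The quoted rates imply a rough event-size and selection budget. As an illustrative back-of-envelope sketch (the derived numbers are assumptions inferred from the abstract's figures, not values stated in the paper):

```python
# Illustrative arithmetic from the rates quoted in the abstract.
READOUT_RATE_TBPS = 50     # aggregate detector readout, Tb/s
L1_ACCEPT_RATE_HZ = 750e3  # Level-1 accept rate, 750 kHz
STORAGE_RATE_HZ = 10e3     # permanent storage rate, 10 kHz

# Average event size implied by the aggregate readout rate.
event_size_bytes = (READOUT_RATE_TBPS * 1e12 / 8) / L1_ACCEPT_RATE_HZ
print(f"average event size ~ {event_size_bytes / 1e6:.1f} MB")  # ~8.3 MB

# Rejection factor the HLT must deliver to reach the storage rate.
hlt_rejection = L1_ACCEPT_RATE_HZ / STORAGE_RATE_HZ
print(f"HLT rejection factor ~ {hlt_rejection:.0f}x")  # ~75x
```

These figures set the scale for the network fabric and CPU budget discussed in the paper: each built event is of order 8 MB, and the HLT must reject roughly 74 of every 75 events.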
Performance of the CMS Event Builder
Andre, J-M; Behrens, U; Branson, J; et al.
Journal of Physics: Conference Series, 10/2017, Volume 898, Issue 3
Journal Article · Peer reviewed · Open access
The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of O(100 GB/s) to the high-level trigger farm. The DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbit/s Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for reliable transport between custom electronics and commercial computing hardware. A 56 Gbit/s InfiniBand FDR Clos network has been chosen for the event builder. This paper presents the implementation and performance of the event-building system.
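The two headline figures are mutually consistent: at 100 kHz, an aggregate throughput of O(100 GB/s) corresponds to an average event size of about 1 MB. A quick illustrative check (values taken from the abstract; the derived event size is an inference, not a number from the paper):

```python
# Consistency check of the quoted event-builder figures.
builder_throughput_bytes = 100e9  # aggregate throughput, ~100 GB/s
l1_rate_hz = 100e3                # Level-1 accept rate, 100 kHz

event_size_mb = builder_throughput_bytes / l1_rate_hz / 1e6
print(f"implied average event size ~ {event_size_mb:.0f} MB")  # ~1 MB
```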
During Run-1 of the LHC, many operational procedures were automated in the run control system of the Compact Muon Solenoid (CMS) experiment. When detector high voltages are ramped up or down, or upon certain beam mode changes of the LHC, the DAQ system is automatically partially reconfigured with new parameters. Certain types of errors, such as those caused by single-event upsets, may trigger an automatic recovery procedure. Furthermore, the top-level control node continuously performs cross-checks to detect sub-system actions becoming necessary because of changes in configuration keys, changes in the set of included front-end drivers, or because of potential clock instabilities. The operator is guided to perform the necessary actions through graphical indicators displayed next to the relevant command buttons in the user interface. Through these indicators, consistent configuration of CMS is ensured. However, manually following the indicators can still be inefficient at times. A new assistant to the operator has therefore been developed that can automatically perform all the necessary actions in a streamlined order. If additional problems arise, the new assistant tries to recover from them automatically. With the new assistant, a run can be started from any state of the sub-systems with a single click. An ongoing run may be recovered with a single click, once the appropriate recovery action has been selected. We review the automation features of CMS Run Control and discuss the new assistant in detail, including first operational experience.
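The single-click behaviour described above can be pictured as state-driven recovery: from whatever state a sub-system is in, the assistant looks up the ordered actions needed to reach the running state and applies them. A minimal sketch, with hypothetical state and action names (not the actual CMS Run Control API):

```python
# Hypothetical sketch of state-driven recovery: map each possible
# sub-system state to the ordered actions that bring it to RUNNING.
RECOVERY_PATHS = {
    "HALTED":     ["configure", "start"],
    "ERROR":      ["recover", "configure", "start"],
    "CONFIGURED": ["start"],
    "PAUSED":     ["resume"],
    "RUNNING":    [],  # already there: nothing to do
}

def actions_to_running(state: str) -> list[str]:
    """Return the streamlined action sequence for a sub-system in `state`."""
    if state not in RECOVERY_PATHS:
        raise ValueError(f"unknown state: {state}")
    return RECOVERY_PATHS[state]

print(actions_to_running("ERROR"))  # ['recover', 'configure', 'start']
```

The point of such a table-driven design is that "start from any state with a single click" reduces to a lookup plus sequential execution, rather than a different operator procedure per starting state.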
Tumor-derived lactic acid inhibits T and natural killer (NK) cell function and, thereby, tumor immunosurveillance. Here, we report that melanoma patients with high expression of glycolysis-related genes show a worse progression-free survival upon anti-PD1 treatment. The non-steroidal anti-inflammatory drug (NSAID) diclofenac lowers lactate secretion of tumor cells and improves anti-PD1-induced T cell killing in vitro. Surprisingly, diclofenac, but not other NSAIDs, turns out to be a potent inhibitor of the lactate transporters monocarboxylate transporter 1 and 4 and diminishes lactate efflux. Notably, T cell activation, viability, and effector functions are preserved under diclofenac treatment and in a low-glucose environment in vitro. Diclofenac, but not aspirin, delays tumor growth and improves the efficacy of checkpoint therapy in vivo. Moreover, genetic suppression of glycolysis in tumor cells strongly improves checkpoint therapy. These findings support the rationale for targeting glycolysis in patients with highly glycolytic tumors together with checkpoint inhibitors in clinical trials.
• Glycolytic index in melanoma negatively correlates with response to anti-PD1 therapy
• Blocking lactate transport or knockout of glycolytic genes improves checkpoint therapy
• Diclofenac blocks the lactate transporters MCT1 and MCT4 in a COX-independent manner
• Inhibition of glycolysis by MCT blockade does not impede T cell function
Renner et al. demonstrate a negative correlation between glycolytic activity in tumors and response to checkpoint therapy. Genetic blockade of glycolysis or pharmacological inhibition of the main lactate transporters MCT1 and MCT4 preserves T cell function, reverses tumor acidification, and augments response to checkpoint therapy.
Tritium concentrations in Japanese precipitation samples collected after the March 2011 accident at the Fukushima Dai-ichi Nuclear Power Plant (FNPP1) were measured. Values exceeding the pre-accident background were detected at three out of seven localities (Tsukuba, Kashiwa and Hongo) southwest of the FNPP1, at distances varying between 170 and 220 km from the source. The highest tritium content was found in the first rainfall in Tsukuba after the accident; however, concentrations were 500 times less than the regulatory limit for tritium in drinking water. Tritium concentrations decreased steadily and rapidly with time, becoming indistinguishable from the pre-accident values within five weeks. The atmospheric tritium activity in the vicinity of the FNPP1 during the earliest stage of the accident was estimated to be 1.5 × 10³ Bq/m³, which is potentially capable of producing rainwater exceeding the regulatory limit, but only in the immediate vicinity of the source.
► We measured the ³H content of Japanese rain collected after the Fukushima accident.
► The ³H level became 30 times higher than the pre-accident level in the first rain at Tsukuba.
► Some localities within 220 km of the source showed elevated ³H levels.
► These high ³H signals disappeared within a few weeks.
► The atmospheric ³H level at the source during the earliest stage was estimated to be 1500 Bq/m³.