A fundamental understanding of how near-IR light propagates through sound and carious dental hard tissues is essential for the development of clinically useful optical diagnostic systems, since image contrast is based on changes in the optical properties of these tissues on demineralization. During the caries (decay) process, micropores form in the lesion due to partial dissolution of the individual mineral crystals. These small pores act as scattering centers, strongly scattering visible and near-IR light. The optical properties of enamel can be quantitatively described by the absorption and scattering coefficients and the scattering phase function. Our aim is to measure the optical scattering behavior of natural and artificial enamel caries. Near-IR attenuation measurements and angle-resolved goniometer measurements, coupled with Monte Carlo simulations, are used to determine changes in the scattering coefficient and the scattering anisotropy on demineralization at 1310 nm. An ultra-high-resolution digital microradiography system is used to quantify lesion severity by measuring the relative mineral loss for comparison with the optical scattering measurements. The scattering coefficient increases exponentially with increasing mineral loss. Natural and artificial demineralization increases the scattering coefficient by more than two orders of magnitude at 1310 nm, and the scattering is highly forward directed.
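Highly forward-directed scattering of the kind described above is commonly modeled in Monte Carlo light-transport codes with the Henyey-Greenstein phase function, whose single parameter is the scattering anisotropy g. The sketch below samples scattering angles from it; the value g = 0.96 is illustrative only and is not a measured result from this study.

```python
import random

def sample_hg_cos_theta(g, rng=random):
    """Sample cos(theta) from the Henyey-Greenstein phase function.

    g is the scattering anisotropy: g -> 1 means highly
    forward-directed scattering, g = 0 is isotropic."""
    xi = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0  # isotropic limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

# Illustrative anisotropy only; not a value reported in the text.
g = 0.96
samples = [sample_hg_cos_theta(g) for _ in range(100_000)]
mean_cos = sum(samples) / len(samples)
# For Henyey-Greenstein, the mean of cos(theta) equals g.
print(f"mean cos(theta) = {mean_cos:.3f} (expected ~{g})")
```

A useful property for checking such a sampler is that the sample mean of cos(theta) converges to g.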
The CMS detector will undergo a significant upgrade to cope with the HL-LHC instantaneous luminosity and average number of proton–proton collisions per bunch crossing (BX). The Phase-2 CMS detector will be equipped with a new Level-1 (L1) trigger system that will have access to an unprecedented level of information. Advanced reconstruction algorithms will be deployed directly on the L1 FPGA-based processors, producing reconstructed physics primitives of quasi-offline quality. The latter will be collected and processed by the Level-1 trigger Data Scouting (L1DS) system at the full bunch crossing rate. Besides providing vast amounts of data for L1 and detector monitoring, the L1DS will perform quasi-online analysis in a heterogeneous computing farm. The study of signatures too common to fit within the L1 acceptance budget, or orthogonal to the standard physics trigger selection strategies, is expected to benefit greatly from this approach. An L1DS prototype system has been set up to operate in the current LHC Run-3, with the main goals of demonstrating the basic principle and shaping the development of the Phase-2 system. The Run-3 L1DS receives trigger primitives from the Global Muon and Calorimeter Trigger, the Global Trigger decision bits, and the muon segments from the Barrel Muon Track Finder. FPGA boards acquire and aggregate the synchronous trigger data streams and perform basic data reduction before sending the trigger primitives to a set of computing nodes through 100 Gbps Ethernet connections running a simplified firmware implementation of the TCP/IP protocol. An Intel TBB-based DAQ software receives the TCP/IP streams and applies further processing before the data are ingested into a cluster of servers running the CMS reconstruction framework. The output of the computing farm is data sets in the standard CMS data analysis format.
This contribution presents the Run-3 L1DS demonstrator architecture and recent physics results extracted from the collected data.
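The data path above, with FPGA boards streaming trigger primitives to DAQ software over a simplified TCP/IP stack, implies a receiver that deframes a byte stream into per-bunch-crossing records. The sketch below assumes a hypothetical length-prefixed record layout (4-byte length, 4-byte BX number, then payload); the actual L1DS payload format is not described in the text.

```python
import struct

def deframe(buf):
    """Split a byte stream into (bx, payload) records.

    Assumed record layout (hypothetical): 4-byte little-endian payload
    length, 4-byte little-endian bunch-crossing number, then the payload.
    Returns the parsed records and any trailing incomplete bytes."""
    records, off = [], 0
    while off + 8 <= len(buf):
        length, bx = struct.unpack_from("<II", buf, off)
        if off + 8 + length > len(buf):
            break  # incomplete record: keep the tail for the next read
        records.append((bx, buf[off + 8 : off + 8 + length]))
        off += 8 + length
    return records, buf[off:]

# Two complete records followed by a partial one, as a TCP read might return.
stream = (struct.pack("<II", 3, 101) + b"abc"
          + struct.pack("<II", 2, 102) + b"xy"
          + struct.pack("<II", 5, 103) + b"par")
recs, rest = deframe(stream)
print(recs)  # [(101, b'abc'), (102, b'xy')]
```

Keeping the unparsed tail and prepending it to the next TCP read is the standard way to handle records that straddle read boundaries.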
New imaging technologies are needed for the early detection of dental caries (decay) in the interproximal contact sites between teeth. Previous measurements have demonstrated that dental enamel is highly transparent in the near-IR at 1300 nm. In this study, a near-IR imaging system operating at 1300 nm was used to acquire images through tooth sections of varying thickness and through whole teeth in order to demonstrate the utility of a near-IR dental transillumination system for the imaging of early dental caries. Simulated lesions, which model the optical scattering of natural dental caries, were placed in plano-parallel dental enamel sections. The contrast ratio between the simulated lesions and the surrounding sound enamel was calculated from analysis of the acquired projection images. The results show significant contrast between the lesion and the enamel (>0.35) and a spatial line profile that clearly resolves the lesion in samples as thick as 6.75 mm. This study clearly demonstrates that a near-IR transillumination system has considerable potential for the imaging of early dental decay.
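The lesion contrast quoted above (>0.35) can be computed from a line profile across a projection image. A minimal sketch, assuming the common definition C = (I_sound - I_lesion) / I_sound; the exact definition used in the study is not given in the text, and the profile values are synthetic.

```python
import numpy as np

def lesion_contrast(profile, lesion_slice):
    """Contrast of a lesion in a transillumination line profile.

    Assumes C = (I_bg - I_lesion) / I_bg, where the lesion appears
    darker than the surrounding sound enamel."""
    profile = np.asarray(profile, dtype=float)
    mask = np.ones(profile.size, dtype=bool)
    mask[lesion_slice] = False        # mask out the lesion region
    i_bg = profile[mask].mean()       # mean intensity of sound enamel
    i_lesion = profile[lesion_slice].mean()
    return (i_bg - i_lesion) / i_bg

# Synthetic profile: bright sound enamel with a darker lesion region.
profile = [100, 101, 99, 100, 60, 58, 62, 100, 99, 101]
c = lesion_contrast(profile, slice(4, 7))
print(f"contrast = {c:.2f}")  # contrast = 0.40
```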
Precision oncology trials for pediatric cancers require rapid and accurate detection of genetic alterations. Tumor variant identification should interrogate the distinctive driver genes and the more frequent copy number variants and gene fusions that are characteristic of pediatric tumors. Here, we evaluate tumor variant identification using whole genome sequencing (n = 12 samples) and two amplification-based next-generation sequencing assays (n = 28 samples), including one assay designed to rapidly assess common diagnostic, prognostic, and therapeutic biomarkers found in pediatric tumors. Variant identification by the three modalities was comparable when filtered for 151 pediatric driver genes. Across the 28 samples, the pediatric cancer-focused assay detected more tumor variants per sample (two-sided, p < .05), which improved the identification of potentially druggable events and matched pathway inhibitors. Overall, our data indicate that an assay designed to evaluate pediatric cancer-specific variants, including gene fusions, may improve the detection of target-agent pairs for precision oncology.
File-based data flow in the CMS Filter Farm. Andre, J-M; Andronidis, A; Bawej, T ...
Journal of Physics: Conference Series, 12/2015, Volume 664, Issue 8
Journal Article, Conference Proceeding; Peer reviewed; Open access
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure that provides input, executes the High Level Trigger (HLT) algorithms, and handles output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" using JSON encoding, by either services in the flow of the HLT execution (for rates, etc.) or watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g., for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for LHC Run 2.
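The small JSON bookkeeping "documents" described above can be illustrated with a short sketch. All field names here are hypothetical, since the real CMS DAQ schema is not shown in the text; the atomic write-then-rename pattern is one common way to ensure an aggregating watchdog never reads a half-written file.

```python
import json
import os
import tempfile

# Hypothetical bookkeeping document; field names are illustrative only.
doc = {
    "run": 250000,
    "lumisection": 42,
    "process": "hlt",
    "events_in": 105000,
    "events_accepted": 512,
    "output_files": ["run250000_ls0042_streamA.dat"],
}

# Write to a temporary name, then rename: the rename is atomic, so a
# watchdog scanning the directory sees either no file or a complete one.
fd, tmp = tempfile.mkstemp(dir=".", suffix=".tmp")
with os.fdopen(fd, "w") as f:
    json.dump(doc, f)
final = "run250000_ls0042_hlt.jsn"
os.replace(tmp, final)

with open(final) as f:
    print(json.load(f)["events_accepted"])
```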
The data-acquisition system of the CMS experiment at the LHC performs the read-out and assembly of events accepted by the first-level hardware trigger. Assembled events are made available to the high-level trigger, which selects interesting events for offline storage and analysis. The system is designed to handle a maximum input rate of 100 kHz and an aggregated throughput of 100 GB/s originating from approximately 500 sources. An overview of the architecture and design of the hardware and software of the DAQ system is given. We discuss the performance and operational experience from the first months of LHC physics data taking.
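The stated design numbers fix the average event and fragment sizes by simple arithmetic, which can be made explicit:

```python
# Back-of-the-envelope figures implied by the stated design numbers.
input_rate_hz = 100e3    # Level-1 accept rate: 100 kHz
throughput_Bps = 100e9   # aggregated throughput: 100 GB/s
n_sources = 500          # approximate number of read-out sources

avg_event_size = throughput_Bps / input_rate_hz   # bytes per event
avg_fragment_size = avg_event_size / n_sources    # bytes per source fragment
per_source_bw = throughput_Bps / n_sources        # bytes/s per source

print(f"average event size:    {avg_event_size / 1e6:.0f} MB")    # 1 MB
print(f"average fragment size: {avg_fragment_size / 1e3:.0f} kB") # 2 kB
print(f"per-source bandwidth:  {per_source_bw / 1e6:.0f} MB/s")   # 200 MB/s
```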
The CMS data acquisition system software. Bauer, G; Behrens, U; Biery, K ...
Journal of Physics: Conference Series, 04/2010, Volume 219, Issue 2
Journal Article; Peer reviewed; Open access
The CMS data acquisition system is made of two major subsystems: event building and event filter. This paper describes the architecture and design of the software that processes the data flow in the currently operating experiment. The central DAQ system relies on industry-standard networks and processing equipment. Adopting a single software infrastructure in all subsystems of the experiment imposes, however, a number of different requirements. High efficiency and configuration flexibility are among the most important ones. The XDAQ software infrastructure has matured over an eight-year development and testing period and has shown itself able to cope well with the requirements of the CMS experiment.
The CMS experiment at the LHC features a two-level trigger system. Events accepted by the first-level trigger, at a maximum rate of 100 kHz, are read out by the Data Acquisition system (DAQ) and subsequently assembled in memory in a farm of computers running a software high-level trigger (HLT), which selects interesting events for offline storage and analysis at a rate of order a few hundred Hz. The HLT algorithms consist of sequences of offline-style reconstruction and filtering modules, executed on a farm of O(10000) CPU cores built from commodity hardware. Experience from the operation of the HLT system in the 2010/2011 collider run is reported. The current architecture of the CMS HLT and its integration with the CMS reconstruction framework and the CMS DAQ are discussed in the light of future development. The possible short- and medium-term evolution of the HLT software infrastructure to support extensions of the HLT computing power, and to address remaining performance and maintenance issues, is discussed.
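The quoted rates imply the HLT's rejection factor and its average per-event CPU budget. In the sketch below, the 300 Hz output rate is an illustrative stand-in for "order a few hundred Hz", not a figure from the text:

```python
# Rough HLT budget implied by the quoted numbers.
l1_rate_hz = 100e3   # maximum Level-1 accept rate
hlt_rate_hz = 300.0  # assumed HLT output rate (illustrative)
n_cores = 10_000     # O(10000) CPU cores in the filter farm

rejection = l1_rate_hz / hlt_rate_hz   # events rejected per event kept
budget_s = n_cores / l1_rate_hz        # average CPU time available per event

print(f"rejection factor:     ~{rejection:.0f}:1")   # ~333:1
print(f"per-event CPU budget: {budget_s * 1e3:.0f} ms")  # 100 ms
```

The second number explains why the HLT can afford offline-style reconstruction: with 10,000 cores absorbing a 100 kHz input, each event gets on average 100 ms of CPU time.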
The error and alarm system for the data acquisition of the Compact Muon Solenoid (CMS) at CERN was successfully used for the physics runs at the Large Hadron Collider (LHC) during the first three years of activity. Error and alarm processing entails the notification, collection, storing, and visualization of all exceptional conditions occurring in the highly distributed CMS online system using a uniform scheme. Alerts and reports are shown online by web application facilities that map them to graphical models of the system as defined by the user. A persistency service keeps a history of all exceptions that occurred, allowing subsequent retrieval of user-defined time windows of events for later playback or analysis. This paper describes the architecture and the technologies used and deals with operational aspects during the first years of LHC operation. In particular, we focus on performance, stability, and integration with the CMS sub-detectors.