Abstract
The performance of I/O-intensive applications is largely determined by the organization of data and the associated insertion/extraction techniques. In this paper we present the design and implementation of an application that manages data received (up to ~150 Gb/s payload throughput) into host DRAM, buffering data for several seconds, matched with the DRAM size, before it is dropped. All data are validated, processed and indexed. The features extracted from the processing are streamed out to subscribers over the network; in addition, while data reside in the buffer, about 0.1 ‰ of them are served to remote clients upon request. Last but not least, the application must be able to locally persist data at full input speed when instructed to do so. The characteristics of the incoming data stream (fixed or variable rate, fixed or variable payload size) heavily influence the choice of implementation of the buffer management system. The application design promotes the separation of interfaces (concepts) and application-oriented specializations (models), which makes it possible to generalize most of the workflows and requires only minimal effort to integrate new data sources. After the description of the application design, we present the hardware platform used for validation and benchmarking of the software, and the performance results obtained.
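The buffering scheme described above (hold a few seconds of data in DRAM, drop the oldest payloads as new ones arrive, serve a small fraction to clients on request) can be sketched with a drop-oldest ring buffer plus an index. This is a minimal illustration, not the paper's implementation; the `RingBuffer` class and its method names are hypothetical.

```python
from collections import deque

class RingBuffer:
    """Drop-oldest buffer sized to hold a fixed window of payloads in memory.

    Hypothetical sketch of the buffering scheme described in the abstract:
    new payloads evict the oldest once capacity (i.e. the DRAM budget) is
    reached, while an index allows serving individual payloads to clients
    as long as they still reside in the buffer.
    """

    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)  # deque drops the oldest entry itself
        self.index = {}                    # payload id -> payload, for requests

    def insert(self, payload_id, payload):
        # Keep the index consistent with the entry the deque is about to drop.
        if len(self.buf) == self.buf.maxlen:
            evicted_id, _ = self.buf[0]
            self.index.pop(evicted_id, None)
        self.buf.append((payload_id, payload))
        self.index[payload_id] = payload

    def request(self, payload_id):
        # Returns None if the payload has already been dropped from the buffer.
        return self.index.get(payload_id)
```

In practice the capacity would be derived from the DRAM size and the (fixed or variable) payload size, which is exactly the design choice the abstract says drives the buffer management implementation.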
The CMS Condition Database System
Guida, S. Di; Govi, G.; Ojeda, M. ...
Journal of Physics: Conference Series, 12/2015, Volume 664, Issue 4
Journal Article · Peer-reviewed · Open access
The Condition Database plays a key role in the CMS computing infrastructure. The complexity of the detector and the variety of the sub-systems involved set tight requirements for handling the Conditions. In the last two years the collaboration has put substantial effort into the re-design of the Condition Database system, with the aim of improving the scalability and the operability for the data taking starting in 2015. The re-design has focused on simplifying the architecture, using the lessons learned during the operation of the Run I data-taking period (2009–2013). In the new system the relational features of the database schema are mainly exploited to handle the metadata (Tag and Interval of Validity), allowing for a limited and controlled set of queries. The bulk condition data (Payloads) are stored as unstructured binary data, allowing storage in a single table with a common layout for all of the condition data types. In this paper, we describe the full architecture of the system, including the services implemented for uploading payloads and the tools for browsing the database. Furthermore, the implementation choices for the core software are discussed.
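The split the abstract describes, relational tables for the metadata (Tag and Interval of Validity) versus a single table of opaque binary Payloads, can be sketched with two tables and an IOV lookup. This is a hedged illustration of the general pattern, not the CMS schema; the table and column names (`IOV`, `PAYLOAD`, `payload_hash`) and the tag name used below are invented for the example.

```python
import hashlib
import sqlite3

# Hypothetical two-table layout: IOV carries the queryable metadata
# (tag, validity start), PAYLOAD stores every condition type as an
# unstructured blob keyed by its content hash.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE IOV (tag TEXT, since INTEGER, payload_hash TEXT);
CREATE TABLE PAYLOAD (hash TEXT PRIMARY KEY, data BLOB);
""")

def store(tag, since, blob):
    """Register a payload blob and open its interval of validity at `since`."""
    h = hashlib.sha1(blob).hexdigest()
    conn.execute("INSERT OR IGNORE INTO PAYLOAD VALUES (?, ?)", (h, blob))
    conn.execute("INSERT INTO IOV VALUES (?, ?, ?)", (tag, since, h))

def fetch(tag, time):
    """Return the payload valid at `time`: the latest IOV with since <= time."""
    row = conn.execute(
        "SELECT data FROM PAYLOAD JOIN IOV ON hash = payload_hash "
        "WHERE tag = ? AND since <= ? ORDER BY since DESC LIMIT 1",
        (tag, time)).fetchone()
    return row[0] if row else None
```

The point of the pattern is that only the small metadata tables need relational queries (a limited, controlled set, as the abstract notes), while the payload store stays a single common layout regardless of condition type.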
The conditions data infrastructures of both ATLAS and CMS have to deal with the management of several terabytes of data [1, 2]. Distributed computing access to these data requires particular care and attention to manage request rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management with the aim of using this design for data taking in Run 3. In the meantime other experiments, including NA62, have expressed an interest in this cross-experiment initiative. For experiments with a smaller payload volume and complexity, there is particular interest in having simple payload storage. The conditions data management model is implemented in a small set of relational database tables. A prototype access toolkit consisting of an intermediate web server has been implemented, using standard technologies available in the Java community. Access is provided through a set of REST services whose API has been described in a generic way using the standard OpenAPI specification, implemented in Swagger. Such a solution allows the automatic generation of client code and server stubs, and further allows changing the backend technology transparently. An important advantage of using a REST API for conditions access is the possibility of caching identical URLs, addressing one of the biggest challenges that large distributed computing solutions impose on conditions data access: avoiding direct DB access by means of standard web proxy solutions.
Changes in confidence in implementing smoking cessation support for pregnant women were assessed among Romanian General Practitioners (GPs) before and after a training program of evidence-based clinical practices to promote quitting. A total of 69 physicians participated in the study. Before training, 51% of GPs felt somewhat/very confident asking pregnant women about tobacco use, 39% assisted smokers with a quit plan, and 38% arranged follow-up for patients. After training, 85–90% found the training informative/very informative on: how to ask patients if they smoke (89%), advising patients to quit (88%), talking about the benefits of quitting (85%), assessing patients' readiness to quit (87%), and assisting patients in setting a quit date (87%).
A rib-functionalized iron(II) tris-dioximate clathrochelate bearing an annulated phenylimidazole fragment was prepared using nucleophilic substitution and electrophilic addition at the chelating α-dioximate fragment of the macrobicyclic framework. The resultant cage complex was identified by single-crystal XRD, analytical data, and ¹H, ¹³C, ¹⁹F and ¹¹B NMR spectroscopy, and examined with UV-vis spectroscopy and CVA. Two approaches to modification of the clathrochelate framework are compared.
Three-phase microcosm experiments were set up to investigate the enhancement of trichloroethene (TCE) biodegradation and to identify the most promising electron donor for in situ bioremediation. Acetate as carbon and energy source and hydrogen as electron donor were tested in microcosm experiments. Previous studies showed only partial dechlorination of TCE in a two-phase system due to the absence of an adhesion surface and the difficulty of biofilm formation. Therefore, soil was used to ensure adequate surface for the settlement of bacteria. The dynamics of biodegradation were monitored by gas chromatography. Microbial community structure and function were characterized by molecular biological methods (terminal restriction fragment length polymorphism (T-RFLP), clone library) and through PCR-based group-specific detection of key taxa as well as key metabolic genes at both the DNA and RNA level. TCE was degraded to vinyl chloride (VC) with added acetate, while in the case of the biotic control ethene production was detected by day 220. T-RFLP revealed that TCE enrichment resulted in different dechlorinating communities, with the dominance of Sulfurospirillum halorespirans in the microcosms where VC was the end product, and with the dominance of Dehalococcoides sp. where the dechlorination ended in ethene. Using PCR-based techniques, key community members and dechlorinating bacteria were detected in the effectively dechlorinating microcosms. Scanning electron microscopy results provided evidence of biofilm formation on the surface of soil particles.