The vast majority of high energy physics experiments rely on data acquisition and hardware-based trigger systems performing a number of stringent selections before storing data for offline analysis. The online reconstruction and selection performed at the trigger level are bound to the synchronous nature of the data acquisition system, resulting in a trade-off between the amount of data collected and the complexity of the online reconstruction performed. Exotic physics processes, such as long-lived and slow-moving particles, are rarely targeted by online triggers as they require complex and nonstandard online reconstruction, usually incompatible with the time constraints of most data acquisition systems. The online trigger selection can thus act as one of the main limiting factors on the experimental reach for exotic signatures. Alternative data acquisition solutions based on the continuous and asynchronous processing of the stream of data from the detectors are therefore foreseeable as a way to extend the experimental physics reach. Trigger-less data readout systems, paired with efficient streaming data processing solutions, can provide a viable alternative. In this document, an end-to-end implementation of a fully trigger-less data acquisition and online data processing system is discussed. An easily scalable and deployable implementation of such an architecture is proposed, based on open-source distributed computing frameworks capable of performing asynchronous online processing of streaming data. The proposed scheme is suitable for deployment as a fully integrated data acquisition system for small-scale experimental apparatus, or to complement the trigger-based data acquisition systems of larger experiments. A muon telescope setup consisting of a set of gaseous detectors is used as the experimental development testbed in this work, and a fully integrated online processing pipeline deployed on cloud computing resources is implemented and described.
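The trigger-less model this abstract describes (every hit streamed out and selected asynchronously, rather than gated by a synchronous hardware trigger) can be illustrated with a minimal producer/consumer sketch. The hit format and the charge cut below are hypothetical, and the real system uses distributed streaming frameworks rather than Python threads; this only shows the decoupling of readout from processing through a bounded buffer:

```python
import queue
import threading

def detector_stream(out_q, n_events):
    # Emit every raw "hit" continuously; no hardware trigger decision is taken.
    for t in range(n_events):
        out_q.put({"timestamp": t, "charge": 10 + (t % 5)})
    out_q.put(None)  # end-of-stream sentinel

def online_processor(in_q, results):
    # Asynchronous consumer: reconstruction and selection run decoupled
    # from the readout clock, at their own pace.
    while True:
        hit = in_q.get()
        if hit is None:
            break
        # Example selection: keep hits above a (hypothetical) charge cut.
        if hit["charge"] >= 12:
            results.append(hit)

q = queue.Queue(maxsize=64)   # bounded buffer between readout and processing
results = []
producer = threading.Thread(target=detector_stream, args=(q, 100))
consumer = threading.Thread(target=online_processor, args=(q, results))
producer.start()
consumer.start()
producer.join()
consumer.join()
```

Because no selection happens upstream, the full stream reaches the processing stage and the selection logic can be arbitrarily complex without affecting the readout.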
A 40 MHz Level-1 trigger scouting system for the CMS Phase-2 upgrade Ardino, Rocco; Deldicque, Christian; Dobson, Marc ...
Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 02/2023, Volume 1047
Journal Article
Peer reviewed
Open access
The CMS Phase-2 upgrade for the HL-LHC aims at preserving and expanding the current physics capability of the experiment under extreme pileup conditions. A new tracking system incorporates a track finder processor, providing tracks to the Level-1 (L1) trigger. A new high-granularity calorimeter provides fine-grained energy deposition information in the endcap region. New front-end and back-end electronics feed the L1 trigger with high-resolution information from the barrel calorimeter and the muon systems. The upgraded L1 will be based primarily on the Xilinx UltraScale+ series of FPGAs, capable of sophisticated feature searches with resolution often similar to the offline reconstruction. The L1 Data Scouting system (L1DS) will capture L1 intermediate data produced by the trigger processors at the beam-crossing rate of 40 MHz, and carry out online analyses based on these limited-resolution data. The L1DS will provide fast and virtually unlimited statistics for detector diagnostics, alternative luminosity measurements, and, in some cases, calibrations. It also has the potential to enable the study of otherwise inaccessible signatures, either too common to fit in the L1 trigger accept budget or with requirements that are orthogonal to “mainstream” physics. The requirements and architecture of the L1DS system are presented, as well as some of the potential physics opportunities under study. The first results from the assembly and commissioning of a demonstrator currently being installed for LHC Run-3 are also presented. The demonstrator collects data from the Global Muon Trigger, the Layer-2 Calorimeter Trigger, the Barrel Muon Track Finder, and the Global Trigger systems of the current CMS L1.
This demonstrator, as a data acquisition (DAQ) system operating at the LHC bunch-crossing rate, faces many of the challenges of the Phase-2 system, albeit with scaled-down connectivity, reduced data throughput and physics capabilities, providing a testing ground for new techniques of online data reduction and processing.
Heavy data loads and wide coverage ranges have always been crucial problems for big data processing in the Internet of Things (IoT). Recently, mobile-edge computing (MEC) and unmanned aerial vehicle base stations (UAV-BSs) have emerged as promising techniques in IoT. In this article, we propose a three-layer online data processing network based on the MEC technique. On the bottom layer, raw data are generated by distributed sensors with local information. Above them, UAV-BSs are deployed as moving MEC servers, which collect data and conduct the initial steps of data processing. On top of them, a center cloud receives processed results and conducts further evaluation. To meet online processing requirements, the edge nodes should stabilize delay to ensure data freshness. Furthermore, limited onboard energy poses constraints on edge processing capability. In this article, we propose an online edge processing scheduling algorithm based on Lyapunov optimization. In cases of low data rate, it tends to reduce edge processor frequency to save energy. In the presence of a high data rate, it will smartly allocate bandwidth for edge data offloading. Meanwhile, hovering UAV-BSs provide large and flexible service coverage, which results in a path planning issue. In this article, we also consider this problem and apply deep reinforcement learning to develop an online path planning algorithm. Taking observations of the surrounding environment as input, a convolutional neural network (CNN) is trained to predict action rewards. By simulations, we validate its effectiveness in enhancing service coverage. The results will contribute to big data processing in future IoT.
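The scheduling behavior the abstract describes (low frequency at low data rate, scaling up under load) is characteristic of Lyapunov drift-plus-penalty control. The following toy sketch is not the paper's algorithm; the frequency levels, power model, and trade-off weight are all hypothetical, and it only illustrates how minimizing a weighted sum of energy cost and queue-backlog drift yields that behavior:

```python
# Toy drift-plus-penalty scheduler: each slot, pick the CPU frequency
# minimizing V * power(f) - backlog * service(f), trading energy for delay.
FREQS = [0.5, 1.0, 1.5, 2.0]       # hypothetical frequency levels
V = 2.0                            # energy/delay trade-off weight

def power(f):
    return f ** 3                  # standard cubic dynamic-power model

def service(f):
    return 3.0 * f                 # work processed per slot at frequency f

def schedule_slot(backlog):
    # Lyapunov drift-plus-penalty objective for this slot.
    return min(FREQS, key=lambda f: V * power(f) - backlog * service(f))

Q = 0.0                            # queue backlog (virtual queue)
history = []
for slot in range(50):
    arrivals = 2.0 if slot < 25 else 6.0   # low data rate, then high
    f = schedule_slot(Q)
    Q = max(Q + arrivals - service(f), 0.0)
    history.append(f)

# With a small backlog the cheapest frequency wins; as the backlog grows
# under high load, the backlog term dominates and the frequency ramps up.
```

The backlog acts as the state that automatically balances energy saving against delay stability, which is the core of the Lyapunov approach.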
Data Mining: Concepts and Techniques provides the concepts and techniques for processing gathered data or information, which will be used in various applications. Specifically, it explains data mining and the tools used in discovering knowledge from the collected data. This book is also referred to as knowledge discovery from data (KDD). It focuses on the feasibility, usefulness, effectiveness, and scalability of techniques for large data sets. After describing data mining, this edition explains the methods of getting to know, preprocessing, processing, and warehousing data. It then presents information about data warehouses, online analytical processing (OLAP), and data cube technology. Then, the methods involved in mining frequent patterns, associations, and correlations for large data sets are described. The book details the methods for data classification and introduces the concepts and methods for data clustering. The remaining chapters discuss outlier detection and the trends, applications, and research frontiers in data mining. This book is intended for computer science students, application developers, business professionals, and researchers who seek information on data mining. * Presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects * Addresses advanced topics such as mining object-relational databases, spatial databases, multimedia databases, time-series databases, text databases, the World Wide Web, and applications in several fields * Provides a comprehensive, practical look at the concepts and techniques you need to get the most out of your data
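The frequent pattern mining the book covers is classically done with the Apriori level-wise search: count candidate itemsets, keep those meeting a minimum support, and build larger candidates only from survivors. A compact sketch (the transactions and support threshold are illustrative, not from the book):

```python
def frequent_itemsets(transactions, min_support):
    """Tiny Apriori-style sketch: level-wise search for frequent itemsets."""
    items = sorted({i for t in transactions for i in t})
    freq, k = {}, 1
    current = {frozenset([i]) for i in items}
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        freq.update(survivors)
        # Candidate generation: unions of surviving itemsets of size k+1
        # (the Apriori property guarantees no frequent set is missed).
        k += 1
        current = {a | b for a in survivors for b in survivors
                   if len(a | b) == k}
    return freq

txns = [frozenset(t) for t in
        [{"milk", "bread"}, {"milk", "bread", "eggs"},
         {"bread", "eggs"}, {"milk", "eggs"}]]
result = frequent_itemsets(txns, min_support=2)
```

Here all three items and all three pairs are frequent at support 2, while the triple appears only once and is pruned.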
The high pulse intensity and repetition rate of the European X-ray Free-Electron Laser (EuXFEL) provide superior temporal resolution compared with other X-ray sources. In combination with MHz X-ray microscopy techniques, it offers a unique opportunity to achieve superior contrast and spatial resolution in applications demanding high temporal resolution. In both live visualization and offline data analysis for microscopy experiments, baseline normalization is essential for further processing steps such as phase retrieval and modal decomposition. In addition, access to normalized projections during data acquisition can play an important role in decision-making and improve the quality of the data. However, the stochastic nature of X-ray free-electron laser sources hinders the use of standard flat-field normalization methods during MHz X-ray microscopy experiments. Here, an online (i.e. near real-time) dynamic flat-field correction method based on principal component analysis of dynamically evolving flat-field images is presented. The method is used for the normalization of individual X-ray projections and has been implemented as a near real-time analysis tool at the Single Particles, Clusters, and Biomolecules and Serial Femtosecond Crystallography (SPB/SFX) instrument of EuXFEL.
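The core idea of PCA-based dynamic flat-field correction can be sketched in a few lines: learn the principal drift mode of a set of flat-field images, then fit each projection's per-shot flat field as the mean plus a coefficient along that mode before dividing it out. This is a rank-1, pure-Python illustration of the principle, not the instrument's implementation (which works on full images and multiple components):

```python
def mean_vec(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def first_pc(rows, iters=50):
    """Leading principal component of mean-centered rows via power iteration."""
    mu = mean_vec(rows)
    X = [[x - m for x, m in zip(r, mu)] for r in rows]
    v = [1.0] * len(mu)
    for _ in range(iters):
        # Multiply v by X^T X without forming the covariance matrix.
        s = [sum(x * w for x, w in zip(r, v)) for r in X]            # X v
        v = [sum(X[i][j] * s[i] for i in range(len(X)))              # X^T (X v)
             for j in range(len(mu))]
        norm = sum(c * c for c in v) ** 0.5
        v = [c / norm for c in v]
    return mu, v

def dynamic_flat_field(proj, mu, pc):
    """Fit the per-shot flat field as mu + a*pc and divide it out."""
    resid = [p - m for p, m in zip(proj, mu)]
    a = sum(r * c for r, c in zip(resid, pc))    # least-squares coefficient
    flat = [m + a * c for m, c in zip(mu, pc)]
    return [p / f for p, f in zip(proj, flat)]

# Synthetic flats: a mean field plus a varying amount of one drift mode.
base, mode = [10.0, 10.0, 10.0, 10.0], [1.0, 2.0, 3.0, 4.0]
flats = [[b + a * m for b, m in zip(base, mode)]
         for a in (-1.0, -0.5, 0.0, 0.5, 1.0)]
mu, pc = first_pc(flats)
proj = [b + 0.7 * m for b, m in zip(base, mode)]   # flat shot, no sample
corrected = dynamic_flat_field(proj, mu, pc)        # ~1.0 everywhere
```

Because the per-shot coefficient is re-fitted for every projection, the normalization tracks the stochastic pulse-to-pulse variation that defeats a static flat field.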
Composition in Convergence: The Impact of New Media on Writing Assessment considers how technological forms--such as computers and online courses--transform the assessment of writing, in addition to classroom text activity. Much has been written on how technology has affected writing, but assessment has received little attention. In this book, author Diane Penrod examines how, on the one hand, computer technology and interactive material create a disruption of conventional literacy practices (reading, writing, interpreting, and critique), while, on the other hand, the influence of computers allows teachers to propose and develop new models for thinking and writing to engage students in real-world settings. This text is intended for scholars and educators in writing and composition, educational assessment, writing and technology, computers and composition, and electronic literacy. In addition, it is appropriate for graduate students planning to teach and assess electronic writing or teach in online environments.
Real-time monitoring of gas-liquid pipe flow is highly demanded in industrial processes in the chemical and power engineering sectors. Therefore, the present contribution describes the novel design of a robust wire-mesh sensor with an integrated data processing unit. The developed device features a sensor body for industrial conditions of up to 400 °C and 135 bar as well as real-time processing of measured data, including phase fraction calculation, temperature compensation and flow pattern identification. Furthermore, user interfaces are included via a display and 4…20 mA connectivity for the integration into industrial process control systems. In the second part of the contribution, we describe the experimental verification of the main functionalities of the developed system. Firstly, the calculation of cross-sectionally averaged phase fractions along with temperature compensation was tested. Considering temperature drifts of up to 55 K, an average deviation of 3.9% across the full range of the phase fraction was found by comparison against image references from camera recordings. Secondly, the automatic flow pattern identification was tested in an air-water two-phase flow loop. The results reveal reasonable agreement with well-established flow pattern maps for both horizontal and vertical pipe orientations. The present results indicate that all prerequisites for an application in industrial environments in the near future are fulfilled.
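The cross-sectionally averaged phase fraction mentioned above is commonly obtained from a wire-mesh frame by normalizing each crossing point's conductivity against a liquid-only calibration frame and assuming a linear mixture model. The following sketch shows that computation under that assumption; the frame layout and the linear model are illustrative, not details from the paper:

```python
def void_fraction(frame, liquid_ref):
    """Cross-sectionally averaged gas fraction from one wire-mesh frame.

    frame, liquid_ref: 2D lists of measured conductivities. Each crossing's
    local gas fraction is taken as 1 - V/V_liquid (linear mixture assumption);
    temperature compensation would rescale liquid_ref before this step.
    """
    total, n = 0.0, 0
    for row, ref_row in zip(frame, liquid_ref):
        for v, ref in zip(row, ref_row):
            local = 1.0 - v / ref
            total += min(max(local, 0.0), 1.0)   # clamp noise into [0, 1]
            n += 1
    return total / n

ref = [[1.0, 1.0], [1.0, 1.0]]       # calibration: pipe full of liquid
frame = [[1.0, 0.0], [0.5, 0.5]]     # mixed gas/liquid measurement
avg = void_fraction(frame, ref)      # -> 0.5
```

Time series of this averaged fraction are then the typical input for the flow pattern identification step.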
Literacy intervention programs are a common approach to improving children’s literacy achievement. A previous study (Rohl, Milton and Brady, 2000; Rohl and Milton, 2002) identified a range of literacy intervention programs offered across Australia, including Victoria. Contemporary Victorian education policies have shifted towards greater school choice in literacy intervention provision, suggesting that up-to-date research about schools’ use of these programs is timely. This article outlines and discusses an online data collection protocol for gathering information about literacy intervention use in Victorian primary education settings in 2014. Data on 150 schools’ intervention provision, together with their demographic and average reading achievement information, were gathered from schools’ websites, annual reports, and the My School website (ACARA, n.d., a). Descriptive statistics and Pearson’s chi-square tests were used to explore differences in reported literacy intervention offerings between schools from different sectors, of differing enrolment sizes, and with differing levels of socio-educational advantage and reading achievement. The results showed that literacy interventions were commonly offered across schools, with a range of programs identified. School sector showed a highly significant association, and enrolment size a moderate association, with whether or not schools offered one or more literacy interventions; no significant associations were identified for either schools’ socio-educational status or mean reading achievement. Implications of these findings are discussed with reference to sector policies and research literature. The potential and challenges of utilising online data in educational research are also explored. This paper contributes recent empirical data on literacy intervention provision in Victoria and explores the utility of online data methodologies to answer questions about schools’ programs.
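The Pearson chi-square test used in this study compares observed cell counts of a contingency table against the counts expected under independence. A minimal sketch of the statistic (the 2x2 table below is hypothetical, standing in for a sector-by-provision cross-tabulation, and the study's actual counts are not reproduced here):

```python
def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for an R x C table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            # Expected count under independence of rows and columns.
            exp = row_tot[i] * col_tot[j] / grand
            stat += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# Hypothetical table: school sector (rows) vs intervention offered (columns).
stat, dof = chi_square_independence([[40, 10], [25, 25]])
```

The statistic is then compared against the chi-square distribution with the given degrees of freedom to obtain a p-value.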
The essence of online data processing Dexter, Philip; Liu, Yu David; Chiu, Kenneth
Proceedings of the ACM on Programming Languages, 10/2022, Volume 6, Issue OOPSLA2
Journal Article
Peer reviewed
Open access
Data processing systems are a fundamental component of the modern computing stack. These systems are routinely deployed online: they continuously receive requests for data processing operations, and continuously return the results to end users or client applications. Online data processing systems have unique features beyond conventional data processing, and the optimizations designed for them are complex, especially when the data themselves are structured and dynamic. This paper describes DON Calculus, the first rigorous foundation for online data processing. It captures the essential behavior of both the backend data processing engine and the frontend application, focusing on two design dimensions essential yet unique to online data processing systems: incremental operation processing (IOP) and temporal locality optimization (TLO). A novel design insight is that the operations continuously applied to the data can be defined as an operation stream flowing through the data structure, and this abstraction unifies diverse designs of IOP and TLO in one calculus. DON Calculus is endowed with a mechanized metatheory centering around a key observable equivalence property: despite the significant non-determinism in execution introduced by IOP and TLO, the observable result of DON Calculus data processing is identical to that of conventional data processing without IOP and TLO. Broadly, DON Calculus is a novel instance in the active pursuit of providing rigorous guarantees for the software system stack. The specification and mechanization of DON Calculus provide a sound base for the designers of future data processing systems to build upon, helping them embrace rigorous semantic engineering without the need to develop from scratch.
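The observable equivalence property at the heart of this abstract can be illustrated with a deliberately simplified sketch: operations arrive as a stream, some are applied eagerly and the rest deferred, yet the observed result must equal batch processing. This is an informal analogy to IOP, not the DON Calculus semantics (which is a mechanized formal calculus over structured data):

```python
def batch_process(data, ops):
    """Reference semantics: apply the whole operation stream, then observe."""
    for op in ops:
        data = op(data)
    return data

def incremental_process(data, ops, eager_steps):
    """IOP-flavored sketch: part of the operation stream is applied eagerly;
    the rest is deferred and flushed only when the result is observed."""
    stream = iter(ops)
    for _ in range(eager_steps):
        op = next(stream, None)
        if op is None:
            break
        data = op(data)
    for op in stream:          # flush deferred operations before the read
        data = op(data)
    return data

ops = [lambda d: d + [7],
       lambda d: [x * 2 for x in d],
       lambda d: d[1:]]

# Observable equivalence: however the stream is split between eager and
# deferred application, the observed result matches batch processing.
assert all(incremental_process([1, 2, 3], ops, k) == batch_process([1, 2, 3], ops)
           for k in range(4))
```

The formal contribution is proving this kind of equivalence holds despite genuinely non-deterministic interleavings, which the toy sketch sidesteps by staying sequential.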
Issue addressed: Online systems offer opportunities to provide effective, ongoing support to childcare services to implement dietary guidelines. The study aimed to assess the effectiveness of a dissemination strategy on childcare services' (i) adoption and (ii) use of an online menu planning program designed to increase compliance with dietary guidelines.
Methods: A nonrandomised controlled trial was conducted with long day care services across Australia. All services received an email invitation to access an online evidence-based menu planning program. Services in the intervention also received training, telephone contact and provision of a portable computer tablet to encourage program adoption and use. Outcomes were assessed at the 6-month follow-up using analytics data recorded by the online program. Outcomes included the proportion of services having accessed the program (adoption) and the proportion of services with a current menu entered in the program (use as intended).
Results: Twenty-seven intervention and 19 control services took part. At the 6-month follow-up, 100% vs 58% of services had adopted the online menu planning program (OR: 14.67, 95% CI: 2.43-infinity; P < 0.01) and 41% vs 5% of services had a current menu entered in the program (OR: 9.99, 95% CI: 1.01-534.57; P < 0.01) in the intervention and control arms respectively.
Conclusions: This study highlights the need for strategies to support adoption and use of an online menu planning program in childcare services if the potential benefits of such a program are to be achieved. Future research should explore the effectiveness of differing strategies to increase adoption and use of online programs at scale. So what? Strategies to support childcare service uptake and use of online programs are required in order for the potential public health benefits of such technologies to be realised.
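The infinite upper confidence bound in the adoption result arises because the intervention arm has a zero cell (no non-adopters among its 27 services), which calls for an exact method. A common approximate alternative is the Haldane-Anscombe 0.5 correction with a Woolf-type interval, sketched below; the cell counts are hypothetical, chosen only to be consistent with the reported percentages, and the resulting numbers will not match the paper's exact-method estimates:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with Haldane-Anscombe 0.5 correction
    when any cell is zero, plus a Woolf-type 95% CI on the log scale."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical cells: intervention adopted/not = 27/0, control = 11/8
# (11 of 19 is roughly the reported 58%).
or_, lo, hi = odds_ratio(27, 0, 11, 8)
```

Exact (conditional) methods, as apparently used in the paper, instead report an unbounded upper limit for such tables, which is the honest reflection of the zero cell.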