High Energy and Nuclear Physics (HENP) libraries are now required to be increasingly thread-safe, if not thread-friendly and fully multi-threaded, usually by relying on the new constructs and library components offered by the C++11 and C++14 standards. These components are, however, quite low-level and hard to use and compose. Go, in contrast, provides a better set of building blocks for tackling concurrency: goroutines and channels. The language is now used throughout the cloud industry: Docker/Moby, rkt and Kubernetes are obvious Go flagships. But to perform any meaningful physics analysis, one also needs a set of basic libraries (matrix operations, linear algebra, plotting, I/O, ...). We present Go-HEP, a set of packages for easily writing concurrent software that interfaces with legacy HENP C++ physics libraries.
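As an illustration of the building blocks mentioned above, a minimal Go sketch of goroutines communicating over channels might look as follows. This is a toy example, not code taken from Go-HEP itself; `sumSquares` is a hypothetical function used only to show the pattern.

```go
package main

import "fmt"

// sumSquares fans values out to a worker goroutine over channels and
// sums the squared results -- a toy illustration of goroutines+channels,
// not code from Go-HEP itself.
func sumSquares(vs []int) int {
	in := make(chan int)
	out := make(chan int)

	// worker goroutine: squares every value it receives.
	go func() {
		for v := range in {
			out <- v * v
		}
		close(out)
	}()

	// feeder goroutine: sends the inputs, then closes the channel.
	go func() {
		for _, v := range vs {
			in <- v
		}
		close(in)
	}()

	sum := 0
	for v := range out {
		sum += v
	}
	return sum
}

func main() {
	fmt.Println(sumSquares([]int{1, 2, 3})) // prints 14
}
```

The channels carry both the data and the synchronization: no explicit mutexes or condition variables are needed, which is the contrast with the C++11/C++14 primitives drawn above.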
Aims
The study aimed to understand the depuration of Cryptosporidium parvum and Toxoplasma gondii oocysts by the zebra mussel (Dreissena polymorpha), in order to consider the use of the zebra mussel as a bioremediation tool.
Materials and methods
Two experiments were performed: (i) individual exposure of mussels to investigate oocyst transfer between bivalves and water, and (ii) in vivo exposure to assess the ability of the zebra mussel to degrade oocysts.
Results
(i) Our results highlighted a transfer of oocysts from the mussels to the water after 3 and 7 days of depuration; however, some oocysts remained bioaccumulated in mussel tissue. (ii) Between 7 days of exposure to 1000 or 10 000 oocysts/mussel/day and 7 days of depuration, the number of bioaccumulated oocysts did not vary, but the number of infectious oocysts decreased.
Conclusion
The results show that D. polymorpha can release oocysts into the water via (pseudo)faeces during the depuration period. Some oocysts remain bioaccumulated in zebra mussel tissues, while the number of infectious oocysts decreases over the depuration period. These results suggest a degradation of bioaccumulated C. parvum and T. gondii oocysts.
Significance and Impact of the Study
This study highlights the potential use of D. polymorpha as a bioremediation tool to mitigate protozoan contamination of water resources.
ProIO is a new event-oriented streaming data format which utilizes Google's Protocol Buffers (protobuf) to be flexible and highly language-neutral. The ProIO concept is described here along with its software implementations. The performance of ProIO for a dataset of Monte Carlo event records used in high-energy physics was benchmarked and compared with ROOT I/O. Various combinations of general-purpose compression and the variable-length integer encoding available in protobuf were used to investigate the relationship between I/O performance and size on disk in a few key scenarios.
Program Title: ProIO
Program Files doi:http://dx.doi.org/10.17632/mfxsg2d5x5.1
Licensing provisions: BSD 3-clause
Programming language: Python, Go, C++, Java
Nature of problem: In high-energy and nuclear physics (HEP and NP), Google's Protocol Buffers (protobufs) can be a useful tool for the persistence of data. However, protobufs are not well suited for describing large, rich datasets. Additionally, direct event access, lazy event decoding, general-purpose compression, and self-description are features that are important to HEP and NP but missing from protobuf.
Solution method: The solution adopted here is to describe and implement a streaming format for wrapping protobufs in an event structure. This solution requires small (typically less than 1000 lines of code) implementations of the format in the desired programming languages. With this approach, most of the I/O heavy lifting is done by the protobufs, and ProIO adds the necessary physics-oriented features.
The CCube reconstruction algorithm for the SoLid experiment
Abreu, Y.; Amhis, Y.; Arnold, L. ...
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, Volume 1066, September 2024.
Journal article, peer-reviewed.
The SoLid experiment is a very-short-baseline experiment aimed at searching for nuclear-reactor-produced active-to-sterile antineutrino oscillations. The detection principle is based on the pairing of two types of solid scintillators, polyvinyl toluene and 6LiF:ZnS(Ag), a new technology in this field of physics. In addition to good neutron-gamma discrimination, this setup allows the detector to be highly segmented (the basic detection unit is a cube 5 cm on a side). High segmentation provides numerous advantages, including the precise location of inverse beta decay (IBD) products, an antineutrino energy estimator based on the isolated positron energy, and a powerful background-reduction tool based on the topological signature of the signal. Finally, the system is read out by a network of wavelength-shifting (WLS) fibres coupled to photodetectors. This paper describes the design of the reconstruction algorithm that makes maximum use of the granularity of the detector. The goal of the algorithm is to convert the output of the optical-fibre readout into the list of detection units from which it originated. The paper provides a performance comparison of three methods and concludes with the choice of a baseline approach for the experiment.
The nature of occupational risks and hazards in industries that produce or use synthetic amorphous silica (SAS) nanoparticles is still under discussion. Manufactured SAS occurs in amorphous form and can be divided into two main types according to the production process, namely pyrogenic silica (powder) and precipitated silica (powder, gel or colloid). The physical and chemical properties of SAS may vary in terms of particle size, surface area, agglomeration state or purity, and differences in their toxicity potential might therefore be expected. The aim of this study was to compare the cytotoxicity and genotoxicity of representative manufactured SAS samples in Chinese hamster lung fibroblasts (V79 cells). Five samples from industrial SAS producers were evaluated: two pyrogenic SAS powders (with primary particle sizes of 20 nm and 25/70 nm), one precipitated SAS powder (20 nm) and two precipitated SAS colloids (15 and 40/80 nm). V79 cell cultures were treated with different concentrations of SAS pre-dispersed in a bovine serum albumin-water medium. Pyr (pyrogenic) 20, Pre (precipitated) 20 and Col (colloid) 15 significantly decreased cell viability after 24 h of exposure, whilst Pyr 25/70 and Col 40/80 had negligible effects. The cytotoxicity of Pyr 20, Pre 20 and Col 15 was revealed by the induction of apoptosis, and Pyr 20 and Col 15 also produced DNA damage. However, none of the SAS samples generated intracellular reactive oxygen species, micronuclei or genomic mutations in V79 cells after 24 h of exposure. Overall, the results of this study show that pyrogenic, precipitated and colloidal manufactured SAS of around 20 nm primary particle size can produce significant cytotoxic and genotoxic effects in V79 cells. In contrast, the coarser-grained pyrogenic and colloidal SAS (approximately 50 nm) yielded negligible toxicity, despite having been manufactured by the same processes as their finer-grained equivalents.
To explain these differences, the influence of particle agglomeration and oxidative species formation is discussed.
HEP software stacks are not shallow. Indeed, HEP experiments' software is usually many applications in one (reconstruction, simulation, analysis, ...) and thus requires many libraries, developed in-house or by third parties, to be properly compiled and installed. Moreover, because of resource constraints, experiments' software is usually installed, tested, validated and deployed on a very narrow set of platforms, architectures, toolchains and operating systems. As a consequence, bootstrapping a software environment on a developer machine, or deploying the software on production or user machines, is usually perceived as tedious and iterative work, especially when one wants the native performance of bare metal. Docker containers provide an interesting avenue for packaging applications and development environments: they rely on the Linux kernel's capabilities for process isolation, add git-like capabilities to the filesystem layer, and provide (close to) native CPU, memory and I/O performance.
Core Ideas
SNO KARST is dedicated to the study of karst functioning.
Hydrodynamics and geochemistry are measured at springs and in karst compartments.
Process sampling was set up at nine sites in various climatic contexts.
Continuous monitoring concerns timescales from 10 to >50 yr.
New tools and findings are due to the complementarity of gathered data.
Karst aquifers and watersheds represent a major source of drinking water around the world. They are also known as complex and often highly vulnerable hydrosystems due to strong surface–groundwater interactions. Improving the understanding of karst functioning is thus a major issue for the efficient management of karst groundwater resources. A comprehensive understanding of the various processes can be achieved only by studying karst systems across a wide range of spatiotemporal scales under different geological, geomorphological, climatic, and soil cover settings. The objective of the French Karst National Observatory Service (SNO KARST) is to supply the international scientific community with appropriate data and tools, with the ambition of (i) facilitating the collection of long‐term observations of hydrogeochemical variables in karst, and (ii) promoting knowledge sharing and developing cross‐disciplinary research on karst. This paper provides an overview of the monitoring sites and collective achievements, such as the KarstMod modular modeling platform and the PaPRIKa toolbox, of SNO KARST. It also presents the research questions addressed within the framework of this network, along with major research results regarding (i) the hydrological response of karst to climate and anthropogenic changes, (ii) the influence of karst on geochemical balance of watersheds in the critical zone, and (iii) the relationships between the structure and hydrological functioning of karst aquifers and watersheds.
Geophysical surveys were conducted on the very unstable front part of the La Clapière landslide in the French Alps (Alpes-Maritimes). An electrical resistivity survey was carried out to obtain, for the first time on this deep-seated landslide, 3D information on the slip surface and the vertical drained faults. Moreover, we planned to follow over time (6 months) the evolution of the saturated zones (presence of gravitational water) and their percolation into the shearing zones. Our 4D results showed the importance of the complex water channelization within the slope and its relation to geological discontinuities.
Specific vulnerability estimations for groundwater resources usually rely on geographic information system (GIS)-based methods that establish spatial qualitative indexes determining the sensitivity to infiltration of surface contaminants, but with little validation of the working hypotheses. On the other hand, lumped-parameter models, such as the Residence Time Distribution (RTD), are used to predict temporal water-quality changes in drinking-water supplies, but the lumped parameters do not incorporate the spatial variability of land cover and use. At the interface between these two approaches, a GIS tool was developed to estimate the lumped parameters from the vulnerability-mapping dataset. In this method, the temporal evolution of groundwater quality is linked to the vulnerability concept through equivalent lumped parameters that account for the spatially distributed hydrodynamic characteristics of the overall unsaturated and saturated flow nets feeding the drinking-water supply. This vulnerability-mapping method can be validated against field observations of water concentrations. A test on the atrazine-specific vulnerability of the Val d'Orléans karstic aquifer demonstrates the reliability of this approach for groundwater contamination assessment.