The Isotope and Muon Production using Advanced-Cyclotron and Target Technology Project (IMPACT) foresees the introduction of two new target stations and three new beamlines: one for radionuclide production and two for surface muon production. The latter two form the High-Intensity Muon Beams (HIMB) part of the project, which plans to increase the muon rate from the current world record of 10⁸ µ⁺/s up to 10¹⁰ µ⁺/s. This work presents an overview of the future HIMB beamlines, focusing on the magnet designs that have been developed to ensure increased muon production and transmission. Specific radiation-hard resistive coils, based on mineral insulation, are required in this case due to the proximity to the target station. High muon capture and transmission efficiency requires solenoid-like magnets, as well as dipole magnets and crossed-field separators to select the desired particles while suppressing unwanted background particles. The radiation-hard capture solenoid plays the most important role in the whole beamline, since it must provide a high capture efficiency. Beam optics studies provided the on-axis field profile needed to optimize the size and shape of the capture solenoid. The article therefore also details the solenoid design strategies for achieving the desired capture efficiency.
Commissioning of liquid xenon gamma-ray detector for MEG II experiment Matsushita, Ayaka; Ban, Sei; Benmansour, Hicham ...
Nuclear Instruments & Methods in Physics Research, Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 02/2023, Volume: 1047
Journal Article, Peer reviewed
The liquid xenon (LXe) gamma-ray detector in the MEG II experiment measures the energy, position and timing of the gamma-ray from μ+→e+γ, and it is the key to the unprecedented sensitivity of the experiment. All the photo sensors, 4092 VUV MPPCs and 668 PMTs, were read out for the first time, and physics data collection started in 2021. The detector response was monitored throughout the beam time, and the LXe detector operated stably. The timing and energy resolutions were measured using the gamma-rays from π0 decays following the charge-exchange reaction of charged pions in a liquid hydrogen target. The detector has been successfully commissioned and is ready for the long physics run.
Currently, PSI delivers the most intense continuous muon beam in the world, with up to a few 10⁸ µ⁺/s. The High-Intensity Muon Beams (HIMB) project is developing a new target station and muon beamlines able to deliver 10¹⁰ µ⁺/s, with a huge impact on low-energy, high-precision muon experiments. While the next generation of proton drivers with beam powers in excess of the currently achieved 1.4 MW still requires significant research and development, the focus of HIMB is to improve the surface muon yield with a new target geometry and to increase capture and transmission with a solenoid-based beamline, in order to reach a total efficiency of approximately 10%. We present the current status of the HIMB project.
The MEG experiment took data at the Paul Scherrer Institute in the years 2009–2013 to test the violation of the lepton flavor conservation law, which originates from an accidental symmetry of the Standard Model of elementary particle physics, and published the most stringent limit on the charged-lepton-flavor-violating decay μ+→e+γ: BR(μ+→e+γ) < 4.2×10⁻¹³ at 90% confidence level. The MEG detector has been upgraded in order to reach a sensitivity of 6×10⁻¹⁴. The basic principle of MEG II is to achieve the highest possible sensitivity using the full muon beam intensity at the Paul Scherrer Institute (7×10⁷ muons/s) with an upgraded detector. The main improvements are a better rate capability of all sub-detectors and improved resolutions, while keeping the same detector concept. In this paper, we present the current status of the preparation, integration and commissioning of the MEG II detector in the recent engineering runs.
Current discrete manufacturing systems are characterized by ever-increasing complexity, demanding innovative solutions capable of optimizing performance while increasing resilience and the ability to adapt to production changes. Against this background, shifting towards distributed, modular architectures of Cyber-Physical Systems (CPS) is mandatory, and the IEC 61499 standard, with its object-oriented and event-based approaches, promotes this paradigm shift. The multi-disciplinary nature of CPS entities and the possibility to exploit their digital counterparts pave the way for enhanced decision-support systems such as those dedicated to Virtual Commissioning (VC). VC supports the automation developer in evaluating the impact of different management strategies, increasing the reliability of the final control applications while reducing the time needed to carry out physical tests on the real mechatronic system. However, creating a virtual commissioning model is still a complex and potentially expensive process that must be carried out by different professionals who cooperate tightly to produce an effective playground for automation testing. We propose a new approach to the design and development of virtual commissioning models that, leveraging the synergies between modular simulation and IEC 61499 automation technologies, aims at improving the efficiency of the overall process of implementing 3D-simulation digital twins for complex automated discrete manufacturing systems. The paper describes an open architecture, composed of reference data models and software APIs, and presents a proof-of-concept implementation of an integrated engineering platform for VC models.
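The event-based, object-oriented style of IEC 61499 that the abstract refers to can be illustrated with a minimal sketch of a basic function block: an event input triggers an encapsulated algorithm that reads the block's data inputs and fires an event output to downstream subscribers. The names REQ/CNF follow the standard's terminology, but the implementation below is purely illustrative and not taken from the paper's platform.

```python
# Illustrative sketch of an IEC 61499-style basic function block.
# REQ (request) event input -> run algorithm -> CNF (confirm) event output.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class FunctionBlock:
    name: str
    data_in: Dict[str, float] = field(default_factory=dict)
    data_out: Dict[str, float] = field(default_factory=dict)
    # Subscribers notified when this block fires its CNF event output.
    listeners: List[Callable[[str, Dict[str, float]], None]] = field(
        default_factory=list)

    def on_req(self) -> None:
        """REQ event input: run the encapsulated algorithm, then fire CNF."""
        self.algorithm()
        for notify in self.listeners:
            notify(self.name, dict(self.data_out))

    def algorithm(self) -> None:
        # Example algorithm: a conveyor advancing a part by one time step.
        self.data_out["position"] = (self.data_in.get("position", 0.0)
                                     + self.data_in.get("speed", 0.0))


# Wire the block to a virtual-commissioning observer that logs its outputs.
events = []
conveyor = FunctionBlock("Conveyor", data_in={"position": 0.0, "speed": 1.5})
conveyor.listeners.append(lambda src, out: events.append((src, out["position"])))
conveyor.on_req()
```

In a VC setting, the observer would be the 3D simulation model reacting to the control application's events rather than a plain list.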
The High Intensity Muon Beams (HIMB) project at the Paul Scherrer Institute (PSI) will deliver muon beams with unprecedented intensities of up to 10¹⁰ muons/s for next-generation particle physics and material science experiments. This represents a hundredfold increase over the current state-of-the-art muon intensities, also provided by PSI. We performed beam dynamics optimisations and studies for the design of the HIMB beamlines MUH2 and MUH3 using Graphics Transport, Graphics Turtle, and G4beamline, the latter incorporating PSI's own measured π+ cross-sections and variance reduction. We initially performed large-scale beamline optimisations using asynchronous Bayesian optimisation with DeepHyper. We are now developing an island-based evolutionary optimisation code, glyfada, based on the Paradiseo framework, in which we implemented Message Passing Interface (MPI) islands with OpenMP parallelisation within each island. Furthermore, we implemented an island model that is also suitable for high-throughput computing (HTC) environments, with asynchronous communication via a Redis database. The code interfaces with COSY INFINITY and G4beamline. glyfada will provide heterogeneous island-model optimisation using evolutionary optimisation and local search methods, as well as part-wise optimisation of the beamline with automatic advancement through stages. We will use glyfada for a future large-scale optimisation of the HIMB beamlines.
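The island model described above can be sketched in a few lines. This is a generic illustration of the technique, not the actual glyfada implementation: each island evolves its own population and asynchronously publishes its current best individual to a shared store, from which other islands adopt migrants. In the HTC setting the shared store would be the Redis database; here a plain dict stands in for it, the islands run sequentially, and the objective is a toy one-dimensional function.

```python
# Sketch of island-model evolutionary optimisation with asynchronous
# migration through a shared store (a dict standing in for Redis).
import random

random.seed(42)

def fitness(x):
    # Toy objective standing in for a beamline figure of merit; optimum at 3.
    return -(x - 3.0) ** 2

def evolve_island(population, shared_store, island_id, generations=50):
    for _ in range(generations):
        # Mutation-only evolution: perturb each individual, keep improvements.
        population = [max(x, x + random.gauss(0, 0.5), key=fitness)
                      for x in population]
        # Asynchronous migration: publish our best, adopt the global best
        # migrant if it beats our worst individual.
        shared_store[island_id] = max(population, key=fitness)
        migrant = max(shared_store.values(), key=fitness)
        worst = min(population, key=fitness)
        if fitness(migrant) > fitness(worst):
            population[population.index(worst)] = migrant
    return max(population, key=fitness)

shared = {}  # stands in for the Redis database shared between islands
results = [evolve_island([random.uniform(-10, 10) for _ in range(8)], shared, i)
           for i in range(4)]
best = max(results, key=fitness)
```

In glyfada the islands run concurrently (MPI ranks or HTC jobs with OpenMP inside each island) and migration is genuinely asynchronous; the sequential loop here only shows the data flow.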
Cyber-Physical System (CPS) nameplate values change over time due to operating conditions and strain. This chapter describes the future CPS as equipped with special assets named Functional Models, to be uploaded to a Centralized Support Infrastructure (CSI) for synchronization and data analysis. Functional Models are essentially software routines that are run against data sent by the CPS. In a nutshell, the microservice architecture is the evolution of the classical Service-Oriented Architecture (SOA), in which the application is seen as a suite of small services, each devoted to a single activity. Big Data technologies are becoming innovation drivers in industry. The CSI is required to handle unprecedented volumes of data generated by the digital representation of the factory in order to keep the CPS nameplate information up to date. The speed layer is in charge of processing infinite streams of information; its purpose is to offer low-latency, real-time data processing.
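The speed layer's job can be made concrete with a small sketch: it consumes an unbounded stream of CPS telemetry events one at a time and incrementally maintains a low-latency view, with no batch recomputation. The event fields and the per-machine rolling mean below are illustrative assumptions, not the chapter's actual data model.

```python
# Hedged sketch of a lambda-architecture "speed layer": incremental,
# per-event updates to a real-time view (here, a rolling mean per machine).
from collections import defaultdict

class SpeedLayer:
    def __init__(self):
        # Real-time view: running (count, sum) per machine, updated per event.
        self.view = defaultdict(lambda: [0, 0.0])

    def process(self, event):
        """Update the real-time view as each event arrives (no batching)."""
        count_sum = self.view[event["machine"]]
        count_sum[0] += 1
        count_sum[1] += event["load"]

    def rolling_mean(self, machine):
        count, total = self.view[machine]
        return total / count if count else 0.0

layer = SpeedLayer()
for ev in [{"machine": "M1", "load": 0.4},
           {"machine": "M1", "load": 0.6},
           {"machine": "M2", "load": 0.9}]:
    layer.process(ev)
```

A batch layer would periodically recompute the same view from the full history; the speed layer's value is that each event is reflected in the view as soon as it is processed.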
Virtual Factory Manager for semantic data handling Ghielmini, Giorgio; Pedrazzoli, Paolo; Rovere, Diego ...
CIRP Journal of Manufacturing Science and Technology, 2013, Volume: 6, Issue: 4
Journal Article, Peer reviewed
The growing importance of manufacturing SMEs within the European economy, in terms of Gross Domestic Product and number of jobs, emphasizes the need for proper ICT tools to support their competitiveness. Major ICT players already offer one-does-all Product Lifecycle Management suites; however, these show significant shortcomings in terms of SME accessibility and degree of personalization, and they often lack an acceptable level of interoperability. These problems are being addressed by the development of a Virtual Factory Framework (VFF). The approach is based on four pillars: (1) Semantic Shared Data Model, (2) Virtual Factory Manager (VFM), (3) Decoupled Software Tools and (4) Integration of Knowledge. This paper focuses on the Virtual Factory Manager, which acts as a server supporting the I/O communications within the framework and its stored knowledge for the Decoupled Software Tools that need to access its repository.
This chapter describes the definition of a semantic meta-model meant to describe the functional characteristics of a Cyber-Physical System (CPS) that are relevant to its design and simulation, and to its integration and coordination in an industrial production environment. CPS are engineered systems that offer close interaction between cyber and physical components. Engineering information is stored following the object-oriented paradigm, and physical and logical plant components are modelled as data objects encapsulating different aspects. Property is an abstract class derived from Identified Element and represents the runtime properties of every resource and prototype. An important feature that the CPS data model should support is the possibility to create links between runtime properties and properties defined inside assets, and between properties defined by two different assets. A prototype is a Resource model that is complete from a digital point of view but is not yet applied in any plant model.
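The class relationships described above (Property derived from Identified Element, resources holding runtime properties, links between properties of different assets, and prototypes as not-yet-deployed resources) can be sketched as follows. Class and attribute names paraphrase the chapter; the details are assumptions, and Property is kept instantiable here for brevity even though the chapter describes it as abstract.

```python
# Minimal sketch of the CPS semantic meta-model concepts described above.

class IdentifiedElement:
    """Base class: every model element carries a unique identifier."""
    def __init__(self, element_id):
        self.element_id = element_id

class Property(IdentifiedElement):
    """Runtime property of a resource or prototype (abstract in the chapter;
    instantiable here for brevity)."""
    def __init__(self, element_id, value=None):
        super().__init__(element_id)
        self.value = value
        self.linked_to = []  # links to properties of other assets

    def link(self, other):
        self.linked_to.append(other)

class Resource(IdentifiedElement):
    """Physical or logical plant component holding runtime properties."""
    def __init__(self, element_id):
        super().__init__(element_id)
        self.properties = {}

    def add_property(self, prop):
        self.properties[prop.element_id] = prop

class Prototype(Resource):
    """Resource model complete digitally but not yet placed in a plant model."""
    pass

# Link a runtime property of a plant resource to a prototype's property.
robot = Resource("robot-01")
speed = Property("axis-speed", value=1.2)
robot.add_property(speed)
proto = Prototype("robot-proto")
max_speed = Property("max-axis-speed", value=2.0)
proto.add_property(max_speed)
speed.link(max_speed)  # cross-asset property link
```

The cross-asset link is the feature the chapter highlights: a simulation tool can follow it from a runtime value on the deployed resource to the corresponding constraint defined on the prototype.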