The interactions of cosmic rays with the solar atmosphere produce secondary particles which can reach the Earth. In this work, we present a comprehensive calculation of the yields of secondary particles such as gamma rays, electrons, positrons, neutrons, and neutrinos, performed with the FLUKA code. We also estimate the intensity at the Sun and the fluxes at the Earth of these secondary particles by folding their yields with the intensities of cosmic rays impinging on the solar surface. The results are sensitive to the assumptions about the magnetic field near the Sun and to the cosmic-ray transport in the magnetic field of the inner Solar System.
The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations, and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (the full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. These medical applications impose further stringent requirements in terms of reliability and predictive power, which demand constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.
Clinical Monte Carlo (MC) calculations for carbon-ion therapy have to provide both absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum für Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superposition of two opposed ¹²C ion fields as applied in therapeutic irradiations. The results are compared with the available experimental data on CHO (Chinese hamster ovary) cell survival and with the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-dose distributions in water used as basic input data in TRiP98 and those recalculated with FLUKA. On the other hand, taking into account the differences in the physical beam modeling, the FLUKA-based biological calculations of the CHO cell-survival profiles are found to be in good agreement with the experimental data as well as with the TRiP98 predictions. The developed approach, which combines the MC transport/interaction capability with the same biological model as in the treatment planning system (TPS), will be used at HIT to support validation and improvement of both dose and RBE-weighted dose calculations performed by the analytical TPS.
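The RBE-weighted dose definition above is simply a voxelwise product. A minimal sketch, with made-up dose and RBE values (not LEM or TRiP98 output):

```python
import numpy as np

# RBE-weighted dose = absorbed dose x RBE, voxel by voxel.
# All numbers below are illustrative only, not LEM or TRiP98 output.
dose = np.array([1.0, 2.0, 2.5])   # absorbed dose per voxel (Gy)
rbe = np.array([1.2, 2.0, 3.0])    # relative biological effectiveness
rbe_weighted_dose = dose * rbe     # in Gy (RBE): 1.2, 4.0, 7.5
```

In a real MC coupling, the RBE per voxel would itself come from the biological model evaluated on the local mixed radiation field, which is the nontrivial part of the approach described above.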
Monte Carlo simulations of electromagnetic particle interactions and transport by FLUKA and PENELOPE were compared. Incident photon beams from 10 keV to 10 MeV impinged on a LYSO crystal and on a soft-tissue phantom. Central-axis as well as off-axis depth doses agreed within 1 s.d.; no systematic under- or over-estimate of the pulse-height spectra was observed from 100 keV to 10 MeV for either material, and agreement was within 5%. Simulation of photon and electron transport and interactions at this level of precision and reliability has significant impact, for instance, on treatment monitoring in hadrontherapy, where a code like FLUKA is needed to simulate the full suite of particles and interactions (not just electromagnetic). At the interaction-by-interaction level, apart from known differences in condensed-history techniques, two-quanta positron annihilation at rest was found to differ between the two codes: PENELOPE produced a sharp 511 keV line, whereas FLUKA produced visible acollinearity, a feature recently implemented to account for the momentum of shell electrons.
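The back-to-back versus acollinear behaviour can be illustrated with a toy sampler. The ~0.5 degree FWHM used below is a typical value quoted for annihilation in water; it is an assumption for illustration, not a number extracted from either code:

```python
import math
import random

def opening_angle_deg(fwhm_deg=0.5, rng=random):
    """Toy sample of the angle between the two 511 keV annihilation photons.

    Annihilation exactly at rest gives 180 degrees (the sharp back-to-back
    line); the residual momentum of the shell electron smears it.  The
    Gaussian deviation with ~0.5 deg FWHM is an assumed illustrative value.
    """
    sigma = fwhm_deg / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return 180.0 - abs(rng.gauss(0.0, sigma))

angles = [opening_angle_deg() for _ in range(10_000)]
mean_angle = sum(angles) / len(angles)   # slightly below 180 degrees
```

A sampler without the Gaussian smearing (always returning exactly 180 degrees) would reproduce the sharp-line behaviour for comparison.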
Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high-energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), which quantifies the energy deposition all around a point isotropic source, is often the parameter of choice.
Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV–3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. The FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been made. Maximum percentage differences within 0.8·R_CSDA and 0.9·R_CSDA for monoenergetic electrons (R_CSDA being the continuous-slowing-down-approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage differences within 0.9·R_CSDA and 0.9·X90 for electrons and isotopes, respectively.
Results: Concerning monoenergetic electrons, within 0.8·R_CSDA (where 90%–97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between FLUKA and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, which can be ascribed to the different simulation algorithms. When considering the beta spectra, the discrepancies reduce notably: within 0.9·X90, FLUKA and PENELOPE differ by less than 1% in water and less than 2% in bone for all of the isotopes considered here. Complete FLUKA DPK data are given as Supplementary Material as a tool to perform dosimetry by analytical point-kernel convolution.
Conclusions: FLUKA provides reliable results when transporting electrons in the low-energy range, proving to be an adequate tool for nuclear medicine dosimetry.
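The X90 metric used above can be read off a shell-tallied deposition profile with a simple cumulative sum. A minimal sketch with a made-up toy profile (not actual FLUKA or PENELOPE output):

```python
# Given energy deposited in concentric spherical shells around a point
# isotropic source, find X90: the radius of the sphere absorbing 90% of
# the emitted energy.  The profile below is a toy example.
def x90(shell_radii, shell_energies, fraction=0.90):
    """shell_radii[i] is the outer radius of shell i (increasing);
    shell_energies[i] is the energy tallied in that shell."""
    total = sum(shell_energies)
    cumulative = 0.0
    for r, e in zip(shell_radii, shell_energies):
        cumulative += e
        if cumulative >= fraction * total:
            return r
    return shell_radii[-1]

radii = [0.1 * (i + 1) for i in range(10)]        # outer radii (cm)
energies = [100, 80, 60, 40, 25, 15, 8, 4, 2, 1]  # arbitrary units
print(x90(radii, energies))                       # 0.5
```

The same cumulative-sum idea, with the CSDA range in place of X90, gives the 0.8·R_CSDA and 0.9·R_CSDA comparison windows used for the monoenergetic electrons.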
Measurement uncertainty of atmospheric profiles obtained by radiosoundings is crucial in climate change studies. This paper shows how understanding the geographic gaps of radiosonde networks calls for a functional approach able to handle spatio-temporal profile data, and it addresses the related complexity issues.
Satellite product validation is key to ensuring the delivery of quality products for climate and weather applications. A fundamental step in this validation is the comparison with other instruments, such as radiosondes. This is especially true for essential climate variables such as temperature and humidity.
Thanks to a functional data representation, this paper uses a likelihood-based approach that exploits the measurement uncertainties in a natural way. In particular, the comparison of temperature and humidity radiosonde measurements collected within the network of the Universal Rawinsonde Observation Program (RAOB) and the corresponding atmospheric profiles derived from the Infrared Atmospheric Sounding Interferometer aboard the MetOp-A and MetOp-B satellites is developed with the aim of understanding the vertical smoothing mismatch uncertainty.
Moreover, the conventional RAOB functional data representation is assessed by means of a comparison with radiosonde reference measurements given by the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN), which provides high-resolution, fully traceable radiosounding profiles. In this way, the uncertainty related to the coarse vertical resolution, or sparseness, of the conventional RAOB is assessed.
It has been found that the uncertainty of vertical smoothing mismatch averaged along the profile is 0.50 K for temperature and 0.16 g/kg for water-vapor mixing ratio. Moreover, the uncertainty related to RAOB sparseness, averaged along the profile, is 0.29 K for temperature and 0.13 g/kg for water-vapor mixing ratio.
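A simple way to see how measurement uncertainties enter such a comparison "naturally" is the normalized difference between collocated values. The profiles and uncertainty values below are invented for illustration, and this statistic is far simpler than the paper's functional likelihood approach:

```python
import math

def normalized_difference(x1, u1, x2, u2):
    """k = (x1 - x2) / sqrt(u1^2 + u2^2): the difference between two
    collocated measurements in units of their combined uncertainty.
    |k| <= 2 is commonly read as consistency at roughly the 95% level."""
    return (x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)

# invented three-level temperature comparison (K), not real RAOB/IASI data
raob   = [288.0, 275.2, 260.1]
u_raob = [0.5, 0.5, 0.5]
sat    = [288.4, 274.8, 259.5]
u_sat  = [1.0, 1.0, 1.0]

ks = [normalized_difference(a, ua, b, ub)
      for a, ua, b, ub in zip(raob, u_raob, sat, u_sat)]
consistent = all(abs(k) <= 2.0 for k in ks)
```

In the paper's setting, additional uncertainty terms (vertical smoothing mismatch, RAOB sparseness) would enter the denominator alongside the instrument uncertainties.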
In the recent literature there has been growing interest in the construction of covariance models for multivariate Gaussian random fields. However, effective estimation methods for these models remain somewhat unexplored. The maximum likelihood method has attractive features, but for large data sets this solution becomes impractical, so computationally efficient alternatives have to be devised. In this paper we explore the use of the covariance tapering method for the estimation of multivariate covariance models. In particular, through a simulation study, we compare the use of simple separable tapers with the more flexible multivariate tapers recently proposed in the literature, and we discuss the asymptotic properties of the method under increasing-domain asymptotics.
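As a simplified univariate illustration of tapering (the paper's setting is multivariate), one can multiply an exponential covariance elementwise by a compactly supported Wendland taper: by the Schur product theorem the result stays positive semidefinite, and the exact zeros beyond the taper range are what make sparse-matrix algorithms applicable. All parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
s = np.sort(rng.uniform(0, 10, n))          # 1-D sampling locations
h = np.abs(s[:, None] - s[None, :])         # pairwise distance matrix

def exp_cov(h, sigma2=1.0, phi=1.0):
    """Exponential (Matern 1/2) covariance."""
    return sigma2 * np.exp(-h / phi)

def wendland1(h, theta):
    """Wendland-1 taper: (1 - h/theta)_+^4 (1 + 4 h/theta),
    identically zero beyond the taper range theta."""
    t = np.clip(1.0 - h / theta, 0.0, None)
    return t ** 4 * (1.0 + 4.0 * h / theta)

def tapered_loglik(y, h, sigma2, phi, theta):
    # elementwise (Schur) product of model covariance and taper
    C = exp_cov(h, sigma2, phi) * wendland1(h, theta)
    sign, logdet = np.linalg.slogdet(C)
    alpha = np.linalg.solve(C, y)
    return -0.5 * (logdet + y @ alpha + len(y) * np.log(2 * np.pi))

# simulate from the untapered model, evaluate the tapered likelihood
y = np.linalg.cholesky(exp_cov(h)) @ rng.standard_normal(n)
ll = tapered_loglik(y, h, sigma2=1.0, phi=1.0, theta=2.0)
```

The dense linear algebra here is only for brevity; the computational gain in practice comes from exploiting the sparsity of the tapered matrix (e.g. sparse Cholesky), and a multivariate version would taper each cross-covariance block.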
Most urban areas of the Po basin in Northern Italy are persistently affected by poor air quality and by difficulty in dispersing airborne pollutants. In this context, the municipality of Milan started a multi-year progressive policy based on an extended limited traffic zone (Area B). Starting on 25 February 2019, the first phase partially restricted the circulation of some classes of highly polluting vehicles on the territory, in particular Euro 0 petrol vehicles and Euro 0 to Euro 3 diesel vehicles, excluding public transport. This is the early stage of a long-term policy that will restrict access for an increasing number of vehicles. The goal of this paper is to evaluate the early-stage impact of this policy on two specific vehicle-generated pollutants: total nitrogen oxides (NOx) and nitrogen dioxide (NO2), whose concentrations are gathered by the Lombardy Regional Agency for Environmental Protection (ARPA Lombardia). We use a statistical model for time-series intervention analysis based on unobservable components. We use data from 2014 to 2018 for pre-policy model selection and the relatively short period up to September 2019 for early-stage policy assessment. We include weather conditions, socio-economic factors, and a counterfactual given by the concentration of the same pollutant in other important neighbouring cities. Although the average concentrations decreased after the policy introduction, this paper argues that this could be due to other factors. Considering that the short time window may not be long enough for social adaptation to the new rules, our model does not provide statistical evidence of a positive policy effect for NOx and NO2. However, in one of the most central monitoring stations, a significant negative effect on concentrations is found.
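The intervention-analysis idea can be caricatured with a toy regression: a step dummy at the policy date plus a counterfactual covariate, estimated on simulated data. The paper's actual model is an unobserved-components state-space model, so everything below (numbers included) is an invented illustrative stand-in:

```python
import numpy as np

rng = np.random.default_rng(1)
n, t0 = 120, 80                              # months; policy starts at t0
step = (np.arange(n) >= t0).astype(float)    # intervention dummy

# invented counterfactual: same pollutant in a neighbouring city
# (seasonal cycle plus noise)
counterfactual = (50 + 5 * np.sin(np.arange(n) * 2 * np.pi / 12)
                  + rng.normal(0, 2, n))

true_effect = -4.0                           # built-in policy effect
y = 10 + 0.8 * counterfactual + true_effect * step + rng.normal(0, 2, n)

# OLS with intercept, counterfactual, and step dummy
X = np.column_stack([np.ones(n), counterfactual, step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
estimated_effect = beta[2]                   # should recover roughly -4
```

In the unobserved-components formulation, the trend and seasonal terms are stochastic latent states rather than fixed regressors, which is what allows the model to separate slow background drift from the policy step.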
Reliable predictions of the yields of nuclear fragments produced in electromagnetic dissociation and hadronic fragmentation of ion beams are of great practical importance for analyzing beam losses and interactions with the beam environment at the Large Hadron Collider (LHC) at CERN, as well as for estimating the radiation effects of galactic cosmic rays on spacecraft crews and electronic equipment. The model for predicting the fragmentation of relativistic heavy ions is briefly described and then applied to problems of relevance for the LHC. The results are based on the FLUKA code, which includes electromagnetic dissociation physics and DPMJET-III as hadronic event generator. We consider the interaction of fully stripped lead ions with nuclei in the energy range from about one hundred MeV to ultrarelativistic energies. The yields of fragments close in mass and charge to the initial ions are calculated. The approach under discussion provides a good overall description of Pb fragmentation data at 30 and 158 A GeV, as well as of recent LHC data for √sNN = 2.76 TeV Pb-Pb interactions. Good agreement with calculations in the framework of different models is found. This justifies the application of the developed simulation technique both at the LHC injection energy of 177 A GeV and at its collision energies of 1.38, 1.58, and 2.75 A TeV, and gives confidence in the results obtained.