In this paper, a comparative review of time- and frequency-domain methods for fatigue damage assessment is performed. The principal steps of a fatigue study are described in detail: Material Characterization, Definition of the Reference Parameter, Treatment of the Loading History, Cycle Counting Algorithm and Damage Model. For each step, the main differences between the advances made in the time and frequency domains are highlighted. As a conclusion, this comparative literature review identifies some important strengths and weaknesses in both approaches: considerable effort has gone into the development of advanced material characterization models in the S-N field for time-domain methods, both deterministic and probabilistic, whereas frequency-domain methods currently rely only on the linear Basquin model. Likewise, the ongoing discussion about the reference parameter for material characterization (stress, strain, energy, etc.) is absent from frequency-domain methods, which are based mainly on the stress range. Conversely, frequency-domain methods offer an advanced treatment of the rainflow histogram, with several proposed statistical distributions and theoretical, analytical relationships between the power spectral density and the expected fatigue damage, making them simpler and easier to apply for fatigue damage assessment than time-domain methods.
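The time-domain pipeline the review describes (S-N material characterization, a counted cycle histogram, a damage model) can be sketched with the linear Basquin curve and the Palmgren-Miner damage sum; the constants `C` and `m` below are illustrative placeholders, not values from the review.

```python
import numpy as np

def basquin_cycles_to_failure(stress_range, C=1e12, m=3.0):
    """Basquin S-N curve, N = C * S**(-m).
    C and m are illustrative material constants, not from any real dataset."""
    return C * np.asarray(stress_range, dtype=float) ** (-m)

def miner_damage(stress_ranges, counts, C=1e12, m=3.0):
    """Palmgren-Miner linear damage sum: D = sum(n_i / N_i)."""
    N = basquin_cycles_to_failure(stress_ranges, C, m)
    return float(np.sum(np.asarray(counts, dtype=float) / N))

# Toy rainflow histogram: (stress range [MPa], cycle count)
ranges = [100.0, 150.0, 200.0]
counts = [1000, 200, 50]
D = miner_damage(ranges, counts)  # failure is predicted when D reaches 1
```

In the frequency-domain methods surveyed, the same damage sum is evaluated analytically by replacing the counted histogram with a statistical distribution of rainflow ranges derived from the power spectral density.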
•A comparative review between time- and frequency-domain methods for fatigue damage assessment is performed.
•The development of material characterization models is more prolific in the time-domain.
•The selection of a suitable reference parameter is an ongoing topic in the time-domain.
•A wide variety of models for analytically defining the rainflow cycle distribution has been formulated in the frequency-domain.
•Survey of CNN-based approaches for crowd counting and density estimation.
•Discussion of recent methods based on hand-crafted representations.
•Recent datasets that pose various challenges are discussed.
•Detailed analysis and comparison of results of CNN-based and traditional methods.
•Discussion of future directions and trends for further progress.
Estimating counts and density maps from crowd images has a wide range of applications, such as video surveillance, traffic monitoring, public safety and urban planning. In addition, techniques developed for crowd counting can be applied to related tasks in other fields of study, such as cell microscopy, vehicle counting and environmental surveys. The task of crowd counting and density map estimation is riddled with challenges such as occlusions, non-uniform density, and intra-scene and inter-scene variations in scale and perspective. Nevertheless, over the last few years, crowd count analysis has evolved from earlier methods, often limited to small variations in crowd density and scale, to current state-of-the-art methods that perform successfully on a wide range of scenarios. The success of crowd counting methods in recent years can be largely attributed to deep learning and the publication of challenging datasets. In this paper, we provide a comprehensive survey of recent Convolutional Neural Network (CNN) based approaches that have demonstrated significant improvements over earlier methods relying largely on hand-crafted representations. First, we briefly review the pioneering methods that use hand-crafted representations; then we delve into the deep learning-based approaches and recently published datasets. Furthermore, we discuss the merits and drawbacks of existing CNN-based approaches and identify promising avenues of research in this rapidly evolving field.
The 226Ra and 228Ra isotopes of radium are significant contaminants in food, raising public concern because of their radiotoxicity. Several methods are available for determining 226Ra and 228Ra; however, these procedures have focused not on food but only on water and environmental matrices. In this study, a cost-effective method for the simultaneous determination of 226Ra and 228Ra radioactivity in food samples using liquid scintillation counting was developed. The overall efficiencies for 226Ra and 228Ra in the food samples are 69.4-78.4% and 30.1-35.8%, respectively. The minimum detectable activities of 226Ra and 228Ra in our food samples are 11.3 mBq/g and 33.4 mBq/g, respectively, obtained using a 1.0 g ash sample and 60 min of counting time. The method was validated using IAEA-certified reference materials and compared with data obtained using gamma spectrometry for tea, kelp, and oyster samples.
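Minimum detectable activity figures of this kind follow from counting statistics. The Currie expression below shows the form of such a calculation; the background rate and efficiency used here are purely illustrative inputs, not the study's values.

```python
import math

def currie_mda(background_cpm, count_time_min, efficiency, sample_mass_g):
    """Currie's minimum detectable activity (Bq/g), a standard
    approximation for counting measurements:
        MDA = (2.71 + 4.65 * sqrt(B)) / (eps * t * m)
    with B the background counts accumulated in t seconds."""
    t_s = count_time_min * 60.0
    B = background_cpm * count_time_min  # total background counts
    return (2.71 + 4.65 * math.sqrt(B)) / (efficiency * t_s * sample_mass_g)

# Hypothetical example: 60 min count, 1.0 g ash, 70% efficiency,
# 2 cpm background -> MDA of roughly 21 mBq/g
mda = currie_mda(2.0, 60.0, 0.70, 1.0)
```

With inputs of this order, the formula yields MDAs in the tens of mBq/g, the same order of magnitude as the values reported above.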
Recent tests of a single module of the Jagiellonian Positron Emission Tomography system (J-PET), consisting of 30 cm long plastic scintillator strips, have proven its applicability for the detection of annihilation quanta (0.511 MeV) with a coincidence resolving time (CRT) of 0.266 ns. The achieved resolution is almost a factor of two better than that of current TOF-PET detectors, and it can still be improved since, as shown in this article, the intrinsic limit on the time resolution for determining the time of interaction of 0.511 MeV gamma quanta in plastic scintillators is much lower. As the major point of the article, a method is introduced for recording the timestamps of several photons, at the two ends of the scintillator strip, by means of a matrix of silicon photomultipliers (SiPM). Simulations conducted with the number of SiPMs varying from 4 to 42 show that the improvement in timing resolution saturates as the number of photomultipliers grows, and that a matrix configuration at the two ends allowing twenty timestamps to be read constitutes an optimal solution. The simulations accounted for the emission time distribution, photon transport and absorption inside the scintillator, as well as the quantum efficiency and transit time spread of the photosensors, and were checked against experimental results. Application of this SiPM matrix readout allows a coincidence resolving time in positron emission tomography of about 0.170 ns for a 15 cm axial field-of-view (AFOV) and about 0.365 ns for a 100 cm AFOV. These results open perspectives for the construction of a cost-effective TOF-PET scanner with significantly better TOF resolution and a larger AFOV than current TOF-PET modalities.
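Why reading many timestamps helps, and why the gain eventually saturates, can be illustrated with a deliberately minimal toy model: if each SiPM timestamp were the true interaction time plus independent photosensor jitter, averaging n timestamps would shrink the spread roughly as 1/sqrt(n). In reality, emission-time and photon-transport spreads (which this toy omits and the paper's simulations include) set a floor, producing the reported saturation. All parameter values are illustrative.

```python
import random
import statistics

random.seed(7)
TTS_NS = 0.125  # illustrative single-photosensor transit-time spread (ns)

def estimate(true_t_ns, n_timestamps):
    """Average n jittered timestamps of one event (jitter-only toy model)."""
    return statistics.mean(random.gauss(true_t_ns, TTS_NS)
                           for _ in range(n_timestamps))

# Spread of the estimated interaction time over many simulated events,
# for a 1-timestamp readout versus a 20-timestamp readout
sigma_1 = statistics.stdev(estimate(0.0, 1) for _ in range(2000))
sigma_20 = statistics.stdev(estimate(0.0, 20) for _ in range(2000))
# sigma_20 is roughly sigma_1 / sqrt(20) in this jitter-only model
```

Once `sigma_20` falls below the jitter sources the toy ignores, adding further timestamps no longer improves the overall resolution, which is the saturation behaviour the simulations report.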
Time-of-flight positron emission tomography (PET) is the next frontier in improving effective sensitivity. To achieve superior timing for time-of-flight PET, combined with high detection efficiency and cost-effectiveness, we have studied the applicability of BaF2 in metascintillators driven by the timing of cross-luminescence photon production.
Based on previous simulation studies of energy sharing and of the analytic multi-exponential scintillation pulse, as well as sensitivity characteristics, we have experimentally tested a 3 × 3 × 15 mm3 pixel based on 300 μm BGO and 300 μm BaF2 layers. To harness the deep-ultraviolet cross-luminescent light component, which carries the improved timing, we use the FBK VUV SiPM. Metascintillator energy sharing is addressed through a double-integration approach.
We reach an energy resolution of 22%, comparable to the 18% resolution of simple BGO pixels using the same readout, through the optimized use of the integrals of the metascintillator pulse in the energy sharing calculation. We measure the energy sharing extent of each pulse with a resolution of 25% and demonstrate that experimental and simulation results agree well. Based on the energy sharing, a timewalk correction is applied, exhibiting significant improvements in both the coincidence time resolution (CTR) and the shape of the timing histogram. We reach 242 ps CTR for the entire photopeak, while for a subset of 13% of the most shared events, the CTR value improves to 108 ps, comparable to the 3 × 3 × 5 mm3 LYSO:Ce:Ca reference crystal.
While we are considering different ways to further improve these results, this proof of concept demonstrates the applicability of cross-luminescence to metascintillator designs through VUV-compatible SiPM coupling and easily implementable digital algorithms. This is the first test of BaF2-based metascintillators with sufficient stopping power to be included in a PET scanner, demonstrating the industrial applicability of such cross-luminescent metascintillators.
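The double-integration idea can be sketched on a synthetic two-component pulse: a short-gate charge integral is dominated by the fast BaF2-like cross-luminescence, while the full integral also collects the slow BGO-like light, so their ratio tracks the energy-sharing fraction. The decay constants, amplitudes, and gate lengths below are illustrative, not the paper's calibration.

```python
import numpy as np

def sharing_ratio(pulse, dt_ns, short_gate_ns, full_gate_ns):
    """Ratio of short-gate to full-gate charge (simple rectangular sums)."""
    t = np.arange(len(pulse)) * dt_ns
    q_short = pulse[t <= short_gate_ns].sum() * dt_ns
    q_full = pulse[t <= full_gate_ns].sum() * dt_ns
    return q_short / q_full

# Synthetic pulse: 70% of the light in a BaF2-like fast component
# (0.8 ns decay) and 30% in a BGO-like slow component (300 ns decay).
dt = 0.5
t = np.arange(0.0, 2000.0, dt)
pulse = 0.7 * np.exp(-t / 0.8) / 0.8 + 0.3 * np.exp(-t / 300.0) / 300.0
r = sharing_ratio(pulse, dt, short_gate_ns=10.0, full_gate_ns=2000.0)
# r rises toward 1 as more of the energy is deposited in the fast layer
```

An event-by-event estimate of this kind is what makes the energy-sharing-based timewalk correction described above possible.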
Scintillation materials and detectors used in many applications, such as medical imaging, security, oil-logging, high-energy physics and non-destructive inspection, are reviewed. The fundamental physics as understood today is explained, and common scintillators and scintillation detectors are introduced. The properties covered are light yield, energy non-proportionality, emission wavelength, energy resolution, decay time, effective atomic number and timing resolution. For deeper understanding, the emission mechanisms of scintillator materials are also introduced. Furthermore, unresolved problems in scintillation phenomena are considered, and my recent interpretations are discussed. These topics include positive hysteresis, the co-doping of non-luminescent ions, the introduction of an aimed impurity phase, the excitation density effect and the complementary relationship between scintillators and storage phosphors.
The Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016) provides a comprehensive assessment of risk factor exposure and the attributable burden of disease. By providing estimates over a long time series, this study can monitor risk exposure trends critical to health surveillance and inform policy debates on the importance of addressing risks in context.
We used the comparative risk assessment framework developed for previous iterations of GBD to estimate levels and trends in exposure, attributable deaths, and attributable disability-adjusted life-years (DALYs), by age group, sex, year, and location for 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2016. This study included 481 risk-outcome pairs that met the GBD study criteria for convincing or probable evidence of causation. We extracted relative risk (RR) and exposure estimates from 22 717 randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources, according to the GBD 2016 source counting methods. Using the counterfactual scenario of theoretical minimum risk exposure level (TMREL), we estimated the portion of deaths and DALYs that could be attributed to a given risk. Finally, we explored four drivers of trends in attributable burden: population growth, population ageing, trends in risk exposure, and all other factors combined.
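The counterfactual attribution step described above can be illustrated with the generic population attributable fraction (PAF) formula for categorical exposure; GBD's actual machinery (continuous exposure distributions, TMREL integration, mediation adjustment) is far more elaborate, and the numbers below are invented.

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Textbook PAF for exposure categories with prevalences p_i and
    relative risks RR_i (unexposed stratum implicit, RR = 1):
        PAF = sum(p_i * (RR_i - 1)) / (sum(p_i * (RR_i - 1)) + 1)
    """
    excess = sum(p * (rr - 1.0) for p, rr in zip(prevalence, relative_risk))
    return excess / (excess + 1.0)

# Toy example: 30% of a population exposed at RR 2.0, 10% at RR 3.5
paf = population_attributable_fraction([0.30, 0.10], [2.0, 3.5])
attributable_deaths = paf * 1000.0  # share of 1000 observed deaths
```

Scaling observed deaths or DALYs by such a fraction, risk by risk and outcome by outcome, gives the attributable burden estimates reported in the findings.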
Since 1990, exposure increased significantly for 30 risks, did not change significantly for four risks, and decreased significantly for 31 risks. Among risks that are leading causes of burden of disease, child growth failure and household air pollution showed the most significant declines, while metabolic risks, such as body-mass index and high fasting plasma glucose, showed significant increases. In 2016, at Level 3 of the hierarchy, the three leading risk factors in terms of attributable DALYs at the global level for men were smoking (124·1 million DALYs, 95% UI 111·2 million to 137·0 million), high systolic blood pressure (122·2 million DALYs, 110·3 million to 133·3 million), and low birthweight and short gestation (83·0 million DALYs, 78·3 million to 87·7 million), and for women were high systolic blood pressure (89·9 million DALYs, 80·9 million to 98·2 million), high body-mass index (64·8 million DALYs, 44·4 million to 87·6 million), and high fasting plasma glucose (63·8 million DALYs, 53·2 million to 76·3 million). In 2016, in 113 countries the leading risk factor in terms of attributable DALYs was a metabolic risk factor. Smoking remained among the leading five risk factors for DALYs in 109 countries, while low birthweight and short gestation was the leading risk factor for DALYs in 38 countries, particularly in sub-Saharan Africa and South Asia. In terms of important drivers of change in trends of burden attributable to risk factors, between 2006 and 2016 changes in exposure to risks explain a 9·3% (6·9–11·6) decline in deaths and a 10·8% (8·3–13·1) decrease in DALYs at the global level, while population ageing accounts for 14·9% (12·7–17·5) of deaths and 6·2% (3·9–8·7) of DALYs, and population growth for 12·4% (10·1–14·9) of deaths and 12·4% (10·1–14·9) of DALYs.
The largest contribution of trends in risk exposure to disease burden is seen between ages 1 year and 4 years, where a decline of 27·3% (24·9–29·7) of the change in DALYs between 2006 and 2016 can be attributed to declines in exposure to risks.
Increasingly detailed understanding of the trends in risk exposure and the RRs for each risk-outcome pair provide insights into both the magnitude of health loss attributable to risks and how modification of risk exposure has contributed to health trends. Metabolic risks warrant particular policy attention, due to their large contribution to global disease burden, increasing trends, and variable patterns across countries at the same level of development. GBD 2016 findings show that, while it has huge potential to improve health, risk modification has played a relatively small part in the past decade.
The Bill & Melinda Gates Foundation, Bloomberg Philanthropies.
Quantifying metal and nanoparticle (NP) biouptake and distribution on an individual-cell basis has previously been impossible, given that available techniques provide qualitative data that are laborious to acquire and prone to artifacts. Quantifying metal and metal NP uptake and loss processes in environmental organisms will lead to mechanistic understanding of biouptake and improved understanding of the potential hazards and risks of metals and NPs. In this work, we present a new technique, single cell inductively coupled plasma mass spectrometry (SC-ICP-MS), which allows quantification of metal concentrations on an individual-cell basis down to the attogram (ag) per cell level. We present data validating the novel method, along with the mass of metal per cell. Finally, we use SC-ICP-MS, with ancillary cell counting methods, to quantify the biouptake, strong sorption and distribution of both dissolved Au and Au NPs in a freshwater alga (Cryptomonas ovata). The data suggest differences between dissolved and NP uptake and loss processes. In the case of NPs, uptake was dose- and time-dependent but varied between individual cells; at the highest realistic exposure conditions used in this study, up to 40–50% of cells contained NPs, while 50–60% of cells did not.
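The unit conversion behind "attograms per cell" can be sketched as follows; the calibration sensitivity is a hypothetical value standing in for the instrument response, which in practice must fold in the transport efficiency determined with reference nanoparticles.

```python
def mass_per_cell_ag(event_counts, sensitivity_counts_per_fg):
    """Convert one cell's ICP-MS event signal to analyte mass.
    sensitivity_counts_per_fg is a hypothetical calibration factor
    (detector counts per femtogram of analyte); 1 fg = 1000 ag."""
    return event_counts / sensitivity_counts_per_fg * 1000.0

# Three hypothetical cell events at a 50 counts/fg calibration
masses_ag = [mass_per_cell_ag(c, 50.0) for c in (12, 250, 900)]
```

Applying such a conversion event by event, rather than to a bulk digest, is what yields a per-cell mass distribution instead of a single population average.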
Photon counting detectors (PCDs) with energy discrimination capabilities have been developed for medical x-ray computed tomography (CT) and x-ray (XR) imaging. Using detection mechanisms that are completely different from those of current energy-integrating detectors, and measuring material information of the imaged object, these PCDs have the potential not only to improve current CT and XR images (for example, through dose reduction) but also to open revolutionary novel applications such as molecular CT and XR imaging. The performance of PCDs is not flawless, however, and it seems extremely challenging to develop PCDs with close-to-ideal characteristics. In this paper, we offer our vision for the future of PCD-CT and PCD-XR, reviewing the current status and predicting (1) detector technologies, (2) imaging technologies, (3) system technologies, and (4) potential clinical benefits of PCDs.
•The residue which remains from the Rainflow algorithm is identified and discussed.
•Damaging transition cycles are missed by conventional Rainflow methods.
•Analytical proof is presented to allow extended periods to be processed accurately.
•The significance of the new approach is demonstrated with case study examples.
Most fatigue-loaded structural components are subjected to variable amplitude loads, which must be processed into a form that is compatible with design life calculations. Rainflow counting allows individual stress cycles to be identified where they form a closed stress–strain hysteresis loop within a random signal, but it inevitably leaves a residue of open data points that must be post-processed. A comparison is made between conventional methods of processing the residue data points, which may be non-conservative, and a more versatile method, presented by Amzallag et al. (1994), which allows transition cycles to be processed accurately.
This paper presents an analytical proof of the method presented by Amzallag et al. The impact of residue processing on fatigue calculations is demonstrated through the application and comparison of the different techniques in two case studies using long-term, high-resolution data sets. The greatest impact is found when the load process results in a slowly varying mean stress, which is not fully accounted for by traditional Rainflow counting methods.
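A minimal sketch of the problem being addressed: a simplified three-point rainflow pass (not the full ASTM E1049 start-handling, nor Amzallag et al.'s exact procedure) returns both the closed cycles and the leftover residue, and the duplicate-and-recount idea then extracts the cycles hidden in that residue.

```python
def rainflow_with_residue(reversals):
    """Simplified three-point rainflow: closes inner cycles, leaves a residue."""
    stack, cycle_ranges = [], []
    for rev in reversals:
        stack.append(rev)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])  # most recent range
            y = abs(stack[-2] - stack[-3])  # previous range
            if x < y:
                break
            cycle_ranges.append(y)          # inner cycle closes
            del stack[-3:-1]                # drop its two reversals
    return cycle_ranges, stack              # stack = open residue

def residue_cycles(residue):
    """Amzallag-style treatment sketch: duplicating the residue lets the
    open transitions pair up across the junction and be counted."""
    cycles, _ = rainflow_with_residue(residue + residue)
    return cycles

closed, residue = rainflow_with_residue([0, 4, 1, 3, -2, 5])
extra = residue_cycles(residue)  # cycles a conventional count would miss
```

In this toy trace the conventional pass closes the small inner ranges but leaves the largest transition (from -2 to 5) open in the residue; only the residue treatment recovers it, which is exactly the kind of large, damaging cycle the paper argues must not be dropped.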