Abstract The increased demand for molecular imaging tracers useful in assessing and monitoring diseases has stimulated research towards more efficient and flexible radiosynthetic routes, including newer technologies. The traditional vessel-based approach suffers from limitations concerning flexibility, the reagent mass needed, hardware requirements, the large number of connections and valves, repetitive cleaning procedures, and a large overall footprint that must be shielded from radiation. For these reasons, several research groups have started to investigate the application of the fast-growing field of microfluidic chemistry to radiosynthetic procedures. Since the first report in 2004, many scientific papers have been published, demonstrating the potential for increased process yields, reduced reagent use, improved flexibility and general ease of setup. This review will address definitions used in microfluidics and analyze the different approaches under two macro-categories: microvessel and microchannel. In this perspective, several works will be collected, involving the use of positron-emitting species (¹¹C, ¹⁸F, ⁶⁴Cu) and the fewer examples of gamma-emitting radionuclides (⁹⁹ᵐTc, ¹²⁵/¹³¹I). New directions in microfluidic research applied to PET radiochemistry, future developments and challenges are also discussed.
Multivariate extreme value models are a fundamental tool for assessing potentially dangerous events. The target of this paper is two-fold. On the one hand, we outline how, exploiting recent theoretical developments in the theory of copulas, new multivariate extreme value distributions can be easily constructed; in particular, we show how a suitable number of parameters can be introduced, a feature not shared by traditional extreme value models. On the other hand, we introduce a proper new definition of the multivariate return period and show the differences with (and the advantages over) the definition presently used in the literature. An illustration involving flood data is presented and discussed, and a generalization of the well-known multivariate logistic Gumbel model is also given.
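The copula-based notion of multivariate return period discussed above can be made concrete for an Archimedean family. Below is a minimal Python sketch assuming a Gumbel copula with parameter theta, using the Kendall (secondary) return period T = μ / (1 − K_C(t)) from the copula literature; the function names are illustrative, not the paper's.

```python
import math

def kendall_cdf_gumbel(t, theta):
    """Kendall distribution function K_C(t) of the Gumbel copula.

    For an Archimedean copula with generator phi, K_C(t) = t - phi(t)/phi'(t);
    with the Gumbel generator phi(t) = (-ln t)**theta this reduces to
    K_C(t) = t * (1 - ln(t)/theta).
    """
    return t * (1.0 - math.log(t) / theta)

def kendall_return_period(t, theta, mu=1.0):
    """Kendall return period T = mu / (1 - K_C(t)),
    with mu the mean inter-arrival time between successive events."""
    return mu / (1.0 - kendall_cdf_gumbel(t, theta))
```

For example, with theta = 2 and critical level t = 0.9, K_C(0.9) ≈ 0.947, so the Kendall return period is about 19 inter-arrival times; the same level interpreted as a univariate quantile would give only 10, which illustrates the difference between the two definitions.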
The aim of this paper is to address the issue of the Transition in Italy by proposing some thoughts on the possible links between economy, trade and animals: connections that could, in some way, be explained by using zooarchaeological finds as a narrative source.
More properly, these are notes towards a research agenda, arising from a heterogeneous national panorama represented by more than seven hundred and sixty thousand fragments, recovered in nearly four hundred different archaeological contexts analysed in Italy from the 1970s until today.
Analyses of different kinds of data have been applied to this national sample, trying to assess two main orders of information: 1) the diachronic changes in the proportional incidence of the main livestock taxa (cattle, caprines and pigs); 2) the frequencies of the sites, within specific temporal ranges, where particular taxa and ecological groups were found (such as salt-water fish, exotic animals, and wild species in urban layers).
The results obtained with this quantitative methodological approach allow us to propose some working hypotheses concerning breeding, fishing, hunting, the growth and decline of trade, alimentary practices, diseases, play and other anthropic behaviours.
After forty years of archaeozoological research in Italy, several lines of evidence display a transformation in human-animal interactions between the end of antiquity and the beginning of the Middle Ages.
According to the data collected so far, the Early Middle Ages appears as a long period marked by an economic system enclosed between two economic set-ups: the Late Antique and the low/late medieval. These two ages, as animal remains reveal, instead show some deep similarities.
• Relationships across intelligence, cognition and neurocognitive disorders are complex.
• The way these constructs are conceptualized leads to various measurement approaches.
• Advanced data-fitting models offered new conceptual and methodological frameworks.
The study of intelligence's role in the development of major neurocognitive disorders (MND) is influenced by the approaches used to conceptualize and measure these constructs. In the field of cognitive impairment, the use of single ‘intelligence’ tests is a common approach to estimating intelligence. Despite being a practical compromise between feasibility and constructs, the variance of these tests is only partially explained by general intelligence, and some tools (e.g., lexical tasks for premorbid intelligence) present inherent limitations. Alternatively, factorial models allow an actual measure of intelligence as a latent factor superintending all mental abilities. Royall and colleagues used structural equation modeling to decompose Spearman's general intelligence factor g into δ (shared variance across cognitive and functional measures) and g’ (shared variance across cognitive measures only). The authors defined δ as the ‘cognitive correlates of functional status’, and thus a ‘phenotype for all-cause dementia’. Compared to g’, δ explained a small share of the cognitive measures’ variance, but it demonstrated higher accuracy in dementia case-finding. From the methodological perspective, given g's ‘indifference’ to its indicators, further studies are needed to identify the minimal set of tools necessary to extract g, and to test non-cognitive variables as measures of δ. From the clinical perspective, general intelligence seems to influence MND presence and severity more than domain-specific cognitive abilities. Given δ's ‘blindness’ to etiology, its association with biomarkers and its contribution to differential diagnosis might be limited. Classical neuropsychological approaches based on patterns of performance on cognitive tests remain fundamental for differential diagnosis.
A hospital-based study showed that, of 223 patients admitted to a stroke unit, 91 (40·8%) had pre-existing cognitive decline, and only 60 (61·2%) of 98 patients who showed no cognitive impairment before the acute event were still cognitively healthy at 3-month follow-up, on the basis of a combined evaluation with the Clinical Dementia Rating scale, the Montreal Cognitive Assessment, and the Clock-Drawing Test.3 Specific factors might have a role in determining cognitive impairment after a stroke, some of which are related to the patient's characteristics and to the stroke event itself, for example age at stroke onset, epileptic activity, sepsis, stroke severity, lesion location, and lesion multiplicity. ...time to cognitive assessment after stroke was variable, which is not ideal because the prevalence of cognitive impairment increases with longer times between the stroke event and the cognitive assessment.2 Third, the potential bias caused by language deficits in the evaluation of cognitive impairment after a stroke is unclear. ...the effect of pre-stroke cognitive decline needs to be considered.
Justice has been an aspiration and, at the same time, a temptation for humans since the beginning. The tendency to resolve disputes in a simple, even brutal manner has progressively been replaced by the creation of a system of rules, useful to compress and eliminate the excesses of a rebalancing, restorative approach often governed by revenge. Revengeful action, however, has remained in common perception a more effective remedy than that provided by the norm, and has continued to influence the actions of the servants of the law themselves. The figure of the “dirty” detective has become popular in literature, bringing with it all the suffering and uncertainty associated with the coexistence of justice and revenge as means of restoring balance. A question lies behind the dilemmas of the servant of the law: does good or evil prevail in our society?
ABSTRACT
The non-detection of zero-metallicity stars in ultra-faint dwarf galaxies (UFDs) can be used to constrain the initial mass function (IMF) of the first (PopIII) stars by means of a statistical comparison between available data and predictions from chemical evolution models. To this end, we develop a model that follows the formation of isolated UFDs, calibrated to best reproduce the available data for the best-studied system: Boötes I. Our statistical approach shows that UFDs are the most suitable systems in which to study the implications of the persisting non-detection of zero-metallicity stars for the PopIII IMF, i.e. its shape, the minimum mass (mmin), and the characteristic mass (mch). We show that accounting for the incomplete sampling of the IMF is essential to compute the expected number of long-lived PopIII stars in inefficiently star-forming UFDs. By simulating the colour–magnitude diagram of Boötes I, and thus taking into account the mass range of the observed stars, we obtain even tighter constraints on mmin. By exploiting the 96 stars with measured metallicities ($\rm i \lt 19$) in the UFDs represented by our model, we demonstrate that $m_{\mathrm{ ch}} \gt 1\: \rm {M_{\odot }}$ or $m_{\mathrm{ min}} \gt 0.8 \:\rm {M_{\odot }}$ at the $99\%$ confidence level. This means that a present-day IMF for PopIII stars is excluded by our model, and a top-heavy PopIII IMF is strongly favoured. We can limit $m_{\mathrm{ min}} \gt 0.8\: \rm {M_{\odot }}$ independent of the PopIII IMF shape by targeting the four UFDs Boötes I, Hercules, Leo IV, and Eridanus II with future-generation instruments, such as ELT/MOSAIC ($\rm i \lt 25$), which can provide samples of >10 000 stars.
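The role of IMF sampling described above can be illustrated with a short Monte Carlo sketch. This is not the paper's model: it merely assumes a Larson-type IMF, dN/dm ∝ m^(−2.35) exp(−m_ch/m) (a common parametrization; masses in solar units, all names hypothetical), draws stars from it, and counts the long-lived ones (m < 0.8 M⊙) that could survive to be observed today.

```python
import numpy as np

def sample_larson_imf(n, m_ch, m_min=0.1, m_max=100.0, rng=None):
    """Rejection-sample n stellar masses from a Larson-type IMF,
    dN/dm proportional to m**-2.35 * exp(-m_ch/m).
    Masses are drawn log-uniformly and accepted against the density
    per unit ln(m); the grid-based envelope is approximate but
    adequate for a sketch."""
    rng = rng or np.random.default_rng(0)
    pdf = lambda m: m**-2.35 * np.exp(-m_ch / m)
    grid = np.geomspace(m_min, m_max, 1000)
    peak = (grid * pdf(grid)).max()        # max density w.r.t. ln(m)
    out = []
    while len(out) < n:
        m = np.exp(rng.uniform(np.log(m_min), np.log(m_max), n))
        u = rng.uniform(0.0, peak, n)
        out.extend(m[u < m * pdf(m)])       # accept with prob m*pdf(m)/peak
    return np.array(out[:n])

def surviving_fraction(m_ch, n=20000, m_long=0.8):
    """Fraction of sampled stars below m_long: the long-lived stars
    that would still be observable in a UFD today."""
    return (sample_larson_imf(n, m_ch) < m_long).mean()
```

A bottom-heavy choice such as m_ch = 0.35 yields a large surviving fraction, while a top-heavy one (e.g. m_ch = 10) yields almost none, which is why non-detections of zero-metallicity stars can discriminate between the two regimes once the sampled number of stars is properly accounted for.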
• An original multivariate multi-site analysis of drought dynamics is presented.
• The most recent multivariate copula-based procedures are adopted.
• The multivariate tool of Dynamic Return Period is used to assess the drought state.
• Multivariate previsional tools named “Hazard Trajectories & Fans” are introduced.
• Valuable indications for a multi-site real-time assessment of droughts are provided.
Droughts, like floods, are among the most dangerous and costly expressions of the water cycle, with huge impacts on society and the built environment. Droughts occur over a certain region, last several weeks or months, and involve multiple variables: thus a multivariate, multi-site approach is most appropriate for their statistical characterization. In this methodological work, hydrological droughts are considered, and a multivariate approach is proposed, taking duration and average intensity as the relevant variables. A multivariate, multi-site frequency analysis is presented, based on the Theory of Copulas and joint Survival Kendall Return Periods, investigating the historical drought episodes that occurred at five main river sections of the Po river (Northern Italy), the most important Italian basin. The tool of the Dynamic Return Period is used, and the new concepts of Hazard Trajectories and Fans are introduced, in order to provide useful indications for a valuable multi-site real-time assessment of droughts.
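As a toy illustration of the survival-Kendall construction, the return period of an event x can be estimated empirically as T = μ / P[S(X) < S(x)], where S is the joint survival function of, e.g., (duration, average intensity) and μ the mean inter-arrival time. The sketch below is a Monte Carlo approximation under these assumptions, not the authors' implementation, and the toy data are independent uniforms rather than real drought records.

```python
import numpy as np

def survival_kendall_rp(sample, x, mu=1.0):
    """Empirical survival-Kendall return period:
    T = mu / P[S(X) < S(x)], with S the joint survival function
    ("all components exceed") estimated from the sample itself."""
    sample = np.asarray(sample, dtype=float)

    def S(p):
        # empirical joint survival probability at point p
        return np.mean(np.all(sample > p, axis=1))

    t = S(np.asarray(x, dtype=float))
    svals = np.array([S(row) for row in sample])
    p_danger = max(np.mean(svals < t), 1.0 / len(sample))  # avoid T = inf
    return mu / p_danger

# Toy (duration, intensity) pseudo-observations on the unit square
rng = np.random.default_rng(1)
events = rng.uniform(size=(5000, 2))
```

As expected, a more extreme event such as (0.9, 0.9) receives a much larger return period than a median one such as (0.5, 0.5), since fewer sampled events lie deeper in its "dangerous" region.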