Fear of spiders, or arachnophobia, is one of the most common specific phobias. The gold-standard treatment, in vivo exposure therapy, is effective but has significant limitations, including restricted availability, high costs, and high refusal rates. Novel technologies such as augmented reality may help to overcome these limitations and make exposure therapy more accessible via mobile devices. This study will use a randomized controlled trial design to investigate whether ZeroPhobia: Arachnophobia, a 6-week augmented reality exposure therapy smartphone self-help application, can effectively reduce spider phobia symptoms. Additionally, we will examine the user-friendliness of the application and the effect of usage intensity and presence on treatment outcome. The study is registered in the Netherlands Trial Registry under NL70238.029.19 (Trial NL9221); ethical approval was received on October 11, 2019. One hundred twelve participants (age 18-64, score greater than or equal to 59 on the Fear of Spiders Questionnaire [FSQ]) will be recruited from the general Dutch population and randomly assigned to a treatment or waitlist control group. The ZeroPhobia application runs on users' own smartphones. Assessments will take place at baseline, post-test (i.e., at six weeks), and 3- and 12-month follow-ups, each including the FSQ as the main outcome measure, with additional measures of anxiety, depression, user-friendliness, and presence as secondary measures and covariates. The study was funded on September 25, 2018; data collection started in September 2021, and the study is expected to run until September 2022. Our study will improve understanding of the efficacy and feasibility of delivering exposure therapy for spider phobia through an augmented reality self-help application, with the intention of making mental health care more accessible.
Data Challenge 1 (DC1) is the first synthetic data set produced by the Rubin Observatory Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC). DC1 is designed to develop and validate data reduction and analysis and to study the impact of systematic effects that will affect the LSST data set. DC1 comprises r-band observations of 40 deg² to 10 yr LSST depth. We present each stage of the simulation and analysis process: (a) generation, by synthesizing sources from cosmological N-body simulations in individual sensor-visit images with different observing conditions; (b) reduction using a development version of the LSST Science Pipelines; and (c) matching to the input cosmological catalogue for validation and testing. We verify that testable LSST requirements pass within the fidelity of DC1. We establish a selection procedure that produces a sufficiently clean extragalactic sample for clustering analyses, and we discuss residual sample contamination, including contributions from inefficiency in star–galaxy separation and imperfect deblending. We compute the galaxy power spectrum on the simulated field and conclude that: (i) survey properties have an impact of 50 per cent of the statistical uncertainty for the scales and models used in DC1; (ii) a selection to eliminate artefacts in the catalogues is necessary to avoid biases in the measured clustering; and (iii) the presence of bright objects has a significant impact (2σ–6σ) on the estimated power spectra at small scales (ℓ > 1200), highlighting the impact of blending in studies at small angular scales in LSST.
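Stage (c) above, matching the reduced catalogue back to the input cosmological catalogue, is in essence a nearest-neighbour search within a small angular radius. A minimal, self-contained sketch of that idea (brute force over a toy catalogue; the function name, matching radius, and flat-sky approximation are illustrative and not the DESC pipeline's actual implementation, which uses spatial indexing):

```python
import math

def match_catalogs(detected, truth, radius_arcsec=1.0):
    """Match each detected source (ra, dec in degrees) to the nearest
    truth source within `radius_arcsec`, using a flat-sky separation
    (adequate for sub-arcsecond offsets on a small field)."""
    matches = []
    for i, (ra_d, dec_d) in enumerate(detected):
        best_j, best_sep = None, radius_arcsec
        for j, (ra_t, dec_t) in enumerate(truth):
            # separation in arcsec; RA offset scaled by cos(dec)
            dra = (ra_d - ra_t) * math.cos(math.radians(dec_d)) * 3600.0
            ddec = (dec_d - dec_t) * 3600.0
            sep = math.hypot(dra, ddec)
            if sep < best_sep:
                best_j, best_sep = j, sep
        if best_j is not None:
            matches.append((i, best_j, best_sep))
    return matches

# toy example: two truth sources, one detection ~0.5 arcsec from the first
truth = [(150.000, 2.000), (150.010, 2.010)]
detected = [(150.0001, 2.0001)]
print(match_catalogs(detected, truth))
```

Real pipelines replace the O(N·M) loop with a KD-tree or HEALPix-based spatial index, but the acceptance criterion (nearest neighbour inside a fixed angular radius) is the same.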
•Multiple exposures and their combined effects require better risk management.
•Precautionary approaches and intermediate measures could already be applied.
•A European strategy is needed to address the research and policy needs.
The number of anthropogenic chemicals (manufactured substances, by-products, metabolites, and abiotically formed transformation products) currently runs to hundreds of thousands. Thus, humans and wildlife are exposed to complex mixtures, never one chemical at a time and rarely with only one dominating effect. Hence there is an urgent need for strategies to assess exposure to multiple hazardous chemicals and their combined effects. A workshop, "Advancing the Assessment of Chemical Mixtures and their Risks for Human Health and the Environment", was organized in May 2018 together with the Joint Research Centre in Ispra, EU-funded research projects, Commission Services, and relevant EU agencies. This forum for researchers and policy-makers was created to identify gaps in the risk assessment and governance of chemical mixtures and to discuss state-of-the-art science and future research needs. Based on the presentations and discussions at this workshop, we want to bring forward the following Key Messages:
•We are at a turning point: multiple exposures and their combined effects require better management to protect public health and the environment from hazardous chemical mixtures.
•Regulatory initiatives should be launched to investigate the opportunities for all relevant regulatory frameworks to include prospective mixture risk assessment and to consider combined exposures of humans and wildlife to (real-life) chemical mixtures, across sectors.
•Precautionary approaches and intermediate measures (e.g., a Mixture Assessment Factor) can already be applied, although definitive mixture risk assessments cannot be routinely conducted due to significant knowledge and data gaps.
•A European strategy needs to be set, through stakeholder engagement, for the governance of combined exposure to multiple chemicals and mixtures.
The strategy would include research aimed at scientific advancement in mechanistic understanding and modelling techniques, as well as research to address regulatory and policy needs. Without such a clear strategy, specific objectives and common priorities, research and policies to address mixtures will likely remain scattered and insufficient.
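One intermediate measure of the kind mentioned above, combining substances by concentration addition into a hazard index, can be sketched in a few lines (the exposure and reference-dose values below are purely illustrative, not regulatory values):

```python
def hazard_index(exposures, reference_doses):
    """Hazard index under concentration addition: the sum of hazard
    quotients (exposure / reference dose) across co-occurring substances.
    HI > 1 flags a potential combined risk even when every individual
    quotient is below 1."""
    return sum(e / rd for e, rd in zip(exposures, reference_doses))

# illustrative values (mg/kg bw/day); each substance alone looks safe
exposures = [0.02, 0.05, 0.01]
reference_doses = [0.1, 0.1, 0.05]
hi = hazard_index(exposures, reference_doses)
print(round(hi, 2))  # quotients are 0.2, 0.5, 0.2 -> combined HI = 0.9
```

The point of the example is the policy-relevant asymmetry: substance-by-substance assessment would pass all three chemicals, while the combined index approaches the threshold of concern.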
A model and data toolbox is presented to assess risks from combined exposure to multiple chemicals using probabilistic methods. The Monte Carlo Risk Assessment (MCRA) toolbox, also known as the EuroMix toolbox, has more than 40 modules addressing all areas of risk assessment and includes a data repository with data collected in the EuroMix project. This paper introduces the toolbox and illustrates its use with examples from the EuroMix project. The toolbox can be used for hazard identification, hazard characterisation, exposure assessment, and risk characterisation. Examples of hazard identification are the selection of substances relevant to a specific adverse outcome based on adverse outcome pathways and QSAR models. Examples of hazard characterisation are the calculation of benchmark doses and relative potency factors with uncertainty from dose-response data, and the use of kinetic models to perform in vitro to in vivo extrapolation. Examples of exposure assessment are assessments of cumulative exposure at the external or internal level, the latter being needed when dietary and non-dietary routes have to be aggregated. Finally, risk characterisation is illustrated by the calculation and display of the margin of exposure for single substances and for the cumulative assessment, including uncertainties derived from exposure and hazard characterisation estimates.
•MCRA is a modular model and data toolbox to assess combined chemical exposure risks.
•Substances can be selected based on adverse outcome pathway networks or QSAR models.
•Benchmark doses and relative potency factors with uncertainty can be calculated.
•Kinetic models can be used for in vitro to in vivo extrapolation or for aggregation.
•Risks can be assessed through traditional or probabilistic margins of exposure.
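The probabilistic margin-of-exposure idea from the last highlight can be sketched as follows: sample a hazard benchmark and an exposure from uncertainty distributions and summarise the distribution of their ratio. The lognormal parameters below are illustrative stand-ins, not MCRA's internal models:

```python
import math
import random
import statistics

def moe_distribution(n=10000, seed=42):
    """Monte Carlo margin of exposure: MOE = benchmark dose / exposure.
    Lognormal draws stand in for the uncertainty distributions; a common
    reading is that the lower percentiles of the MOE drive the risk call."""
    rng = random.Random(seed)
    moes = []
    for _ in range(n):
        bmd = rng.lognormvariate(math.log(1.0), 0.2)        # benchmark dose, mg/kg bw/day
        exposure = rng.lognormvariate(math.log(0.001), 0.5)  # exposure, mg/kg bw/day
        moes.append(bmd / exposure)
    moes.sort()
    return {"median": statistics.median(moes),
            "p5": moes[int(0.05 * n)]}  # lower 5th percentile

res = moe_distribution()
print(res)
```

Because both inputs are lognormal, the MOE is lognormal too, with a median near the ratio of the medians (here about 1000) and a lower tail that reflects the combined uncertainty of hazard and exposure.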
Complete anatomical resection of the primary tumour is still the standard of care in patients with early-stage lung cancer. Because these patients are usually smokers who also suffer from chronic obstructive pulmonary disease, regional differences in pulmonary function due to lung tissue destruction exist. The purpose of the present article is to evaluate the currently available guidelines and to discuss novel methods for the pre-operative functional and anatomical pulmonary evaluation of lung cancer patients. Although knowledge on the pre-operative evaluation of pulmonary function has increased substantially during the past decade, the majority of studies are small, underpowered and, with the exception of one proposed algorithm, not prospectively validated in independent cohorts. Future harmonisation of guidelines is required, and novel imaging techniques should be incorporated into the pre-operative evaluation of chronic obstructive pulmonary disease patients with borderline pulmonary function.
We present the fourth Fermi Large Area Telescope catalog (4FGL) of γ-ray sources. Based on the first eight years of science data from the Fermi Gamma-ray Space Telescope mission in the energy range from 50 MeV to 1 TeV, it is the deepest catalog yet in this energy range. Relative to the 3FGL catalog, the 4FGL catalog benefits from twice as much exposure as well as a number of analysis improvements, including an updated model for the Galactic diffuse γ-ray emission and two sets of light curves (one-year and two-month intervals). The 4FGL catalog includes 5064 sources above 4σ significance, for which we provide localizations and spectral properties. Seventy-five sources are modeled explicitly as spatially extended, and overall, 358 sources are considered identified based on angular extent, periodicity, or correlated variability observed at other wavelengths. For 1336 sources, we have not found plausible counterparts at other wavelengths. More than 3130 of the identified or associated sources are active galaxies of the blazar class, and 239 are pulsars.
Background
This study was conducted to validate a pretreatment (i.e., prior to neoadjuvant chemoradiotherapy) pathological staging system in the resection specimen after neoadjuvant chemoradiotherapy for esophageal cancer. The study investigated the prognostic value of the pretreatment pathological T and N categories (prepT and prepN categories) in both an independent and a combined patient cohort.
Methods
Patients with esophageal cancer treated with neoadjuvant chemoradiotherapy and esophagectomy between 2012 and 2015 were included. PrepT and prepN categories were estimated based on the extent of tumor regression and regressional changes of lymph nodes in the resection specimen. The difference in Akaike's information criterion (ΔAIC) was used to assess prognostic performance. PrepN and ypN categories were combined to determine the effect of nodal sterilization on prognosis. A multivariable Cox regression model was used to identify combined prepN and ypN categories as independent prognostic factors.
Results
The prognostic strength of the prepT category was better than the cT and ypT categories (ΔAIC 7.7 vs. 3.0 and 2.9, respectively), and the prognostic strength of the prepN category was better than the cN category and similar to the ypN category (ΔAIC 29.2 vs. −1.0 and 27.9, respectively). PrepN+ patients who became ypN0 had significantly worse survival than prepN0 patients (2-year overall survival 69% vs. 86% in 137 patients; p = 0.044). Similar results were found in a combined cohort of 317 patients (2-year overall survival 62% vs. 85%; p = 0.002). Combined prepN/ypN stage was independently associated with overall survival.
Conclusions
These results independently confirm the prognostic value of prepTNM staging. PrepTNM staging is of additional prognostic value to cTNM and ypTNM staging. PrepN0/ypN0 patients have better survival than prepN+/ypN0 patients.
We aimed to determine pretreatment pathological tumor extent in the resection specimen after neoadjuvant chemoradiotherapy (nCRT) and to assess its prognostic value in patients with esophageal cancer.
Patients with esophageal cancer, treated with nCRT plus surgery were included (2003-2011). Pretreatment pathological T-stage (prepT-stage) and N-stage (prepN-stage) were estimated based on the extent of regressional changes and residual tumor cells in the resection specimen. Interobserver agreement was determined between 3 pathologists. The prognostic performance of prepT-stage and prepN-stage was scored using the difference in Akaike information criterion (ΔAIC). PrepN-stage and posttreatment pathological N-stage (ypN-stage) were combined to determine the effect of nodal sterilization on prognosis.
Overall concordance for prepT-stage and prepN-stage was 0.69 and 0.84, respectively. Prognostic strength of prepT-stage was similar to clinical T-stage and worse than ypT-stage (ΔAIC 1.3 versus 2.0 and 8.9, respectively). In contrast, prognostic strength of prepN-stage was better than cN-stage and similar to ypN-stage (ΔAIC 17.9 versus 6.2 and 17.2, respectively). PrepN+ patients who became ypN0 after nCRT had worse survival than prepN0 patients (five-year overall survival 51% versus 68%; P = 0.019).
PrepT-stage and prepN-stage can be estimated reproducibly. Prognostic strength of prepT-stage is comparable with clinical T-stage, whereas prepN-stage is better than cN-stage. PrepN+ patients who become ypN0 after nCRT have worse survival than prepN0 patients. Pretreatment pathological staging should be considered a useful new staging parameter for esophageal cancer and could also be of interest for other tumor types.
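The ΔAIC comparison used in both studies above rests on a simple formula: for models fit to the same data, AIC = 2k − 2 ln L, and a model's improvement over a null model is the drop in AIC (larger is better). A minimal sketch with purely illustrative log-likelihoods (the values below are hypothetical, not taken from the studies):

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

def delta_aic(null_loglik, model_loglik, null_k, model_k):
    """Improvement of a model over the null model; larger means the
    staging variable adds more prognostic information per parameter."""
    return aic(null_loglik, null_k) - aic(model_loglik, model_k)

# hypothetical partial log-likelihoods from two Cox models, each adding
# one staging covariate to a null model
null_ll = -250.0
prepN_ll = -239.0   # staging variable with strong prognostic value
cN_ll = -246.0      # staging variable with weaker prognostic value
print(delta_aic(null_ll, prepN_ll, 0, 1))  # 20.0
print(delta_aic(null_ll, cN_ll, 0, 1))     # 6.0
```

This is why the papers can rank staging systems directly: a higher ΔAIC for prepN than for cN means the pretreatment pathological N category explains survival better, after penalising both models equally for their extra parameter.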
Humans and wildlife are exposed to an intractably large number of different combinations of chemicals via food, water, air, consumer products, and other media and sources. This raises concerns about their impact on public and environmental health. The risk assessment of chemicals for regulatory purposes mainly relies on the assessment of individual chemicals. If exposure to multiple chemicals is considered within a legislative framework, it is usually limited to chemicals falling within that framework, and co-exposure to chemicals covered by a different regulatory framework is often neglected. Methodologies and guidance for assessing risks from combined exposure to multiple chemicals have been developed for different regulatory sectors; however, a harmonised, consistent approach for performing mixture risk assessment and management across different regulatory sectors is lacking. At the time of this publication, several EU research projects are running, funded by the current European Research and Innovation Programme Horizon 2020 or the Seventh Framework Programme. They aim to address knowledge gaps and develop methodologies to better assess chemical mixtures by generating and making available internal and external exposure data, developing models for exposure assessment, developing tools for in silico and in vitro effect assessment to be applied in a tiered framework and for the grouping of chemicals, and developing joint epidemiological-toxicological approaches for mixture risk assessment and for prioritising mixtures of concern. The projects EDC-MixRisk, EuroMix, EUToxRisk, HBM4EU and SOLUTIONS have started an exchange between the consortia, European Commission Services and EU Agencies in order to identify where new methodologies have become available and where remaining gaps need to be further addressed.
This paper maps how the different projects contribute to the data needs and assessment methodologies and identifies remaining challenges to be further addressed for the assessment of chemical mixtures.
•Mapping of EU-funded research projects to different aspects of mixture risk assessment.
•Overview of current status and methodological developments.
•Need to further address data and knowledge gaps overarching different chemical sectors.
Emphysema and small airway disease both contribute to chronic obstructive pulmonary disease (COPD), a disease characterised by accelerated decline in lung function. The association between the extent of emphysema in male current and former smokers and lung function decline was investigated.
Current and former heavy smokers participating in a lung cancer screening trial were recruited to the study and all underwent CT. Spirometry was performed at baseline and at 3-year follow-up. The 15th percentile (Perc15) was used to assess the severity of emphysema.
2085 men of mean age 59.8 years participated in the study. Mean (SD) baseline Perc15 was -934.9 (19.5) HU. A lower Perc15 value correlated with a lower forced expiratory volume in 1 s (FEV(1)) at baseline (r=0.12, p<0.001). Linear mixed model analysis showed that a lower Perc15 was significantly related to a greater decline in FEV(1) after follow-up (p<0.001). Participants without baseline airway obstruction who developed it after follow-up had significantly lower mean (SD) Perc15 values at baseline than those who did not develop obstruction (-934.2 (17.1) HU vs -930.2 (19.7) HU, p<0.001).
Greater baseline severity of CT-detected emphysema is related to lower baseline lung function and greater rates of lung function decline, even in those without airway obstruction. CT-detected emphysema aids in identifying non-obstructed male smokers who will develop airflow obstruction.
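The Perc15 measure used above is the attenuation value (in HU) below which 15% of lung voxels fall on the CT density histogram; more emphysema means more air-like voxels and a Perc15 closer to −1000 HU. A minimal sketch on toy voxel samples (linear-interpolation percentile; not any specific scanner or analysis package's implementation):

```python
def perc15(hu_values):
    """15th percentile of voxel attenuation values (HU), using linear
    interpolation between sorted order statistics. Lower (more negative)
    values indicate more emphysematous, air-like lung tissue."""
    xs = sorted(hu_values)
    pos = 0.15 * (len(xs) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(xs):
        return xs[lo] * (1 - frac) + xs[lo + 1] * frac
    return xs[lo]

# toy voxel samples: the emphysematous lung's histogram sits closer to
# -1000 HU (pure air), so its Perc15 is lower
healthy = [-900, -880, -860, -840, -820, -800, -780, -760, -740, -720]
emphysema = [-980, -970, -960, -950, -940, -930, -920, -910, -900, -890]
print(perc15(healthy), perc15(emphysema))
```

In practice this is computed over hundreds of millions of segmented lung voxels, but the definition is exactly this order statistic on the density histogram.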