The safe introduction of Generation IV (Gen IV) reactor concepts into operation will require extensive testing of their components. This must be performed under neutronic conditions representative of those expected to prevail inside the new reactor cores when in operation. In a thermal Material Testing Reactor (MTR) such neutronic conditions can be achieved by tailoring the prevailing neutron spectrum with a device containing appropriate materials. In this work various materials are investigated as candidate components of a device required when a thermal MTR neutron energy spectrum must be locally transformed so as to imitate that of a Sodium-cooled Fast Reactor (SFR). Many nuclides have been examined with respect to their neutronic behavior only, thus providing a pool of neutronically appropriate materials for further investigation, such as with regard to reactor safety and fabrication issues. The nuclides were studied using the neutronics code TRIPOLI-4.8, with the reflector of the Jules Horowitz Reactor (JHR) considered as the hosting environment of the transforming device. The results obtained suggest that elements with significant inelastic neutron scattering can be chosen at a first level as able to modify the prevailing neutron spectrum in the desired direction. The factors important for an effective inelastic scatterer comprise density and inelastic microscopic cross section, as well as the energy ranges in which inelastic scattering occurs. All these factors have been examined separately in order to suggest potential device materials able to locally produce an SFR neutron spectrum imitation in a thermal MTR.
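The density and microscopic cross-section factors named above combine into the macroscopic cross section, Sigma = (rho * N_A / M) * sigma, which sets how strongly a candidate material scatters per unit path length. A minimal sketch of that combination follows; all material values are hypothetical placeholders, not evaluated nuclear data from the study:

```python
# Sketch: macroscopic inelastic cross section from density and
# microscopic cross section. Values below are illustrative placeholders.
AVOGADRO = 6.02214076e23  # atoms/mol

def macroscopic_xs(density_g_cm3, molar_mass_g_mol, sigma_barn):
    """Return the macroscopic cross section in 1/cm (1 barn = 1e-24 cm^2)."""
    number_density = density_g_cm3 * AVOGADRO / molar_mass_g_mol  # atoms/cm^3
    return number_density * sigma_barn * 1e-24

# Hypothetical comparison of two candidate elements:
for name, rho, molar_mass, sigma in [("material A", 7.9, 56.0, 1.4),
                                     ("material B", 11.3, 207.0, 2.2)]:
    print(f"{name}: Sigma_inel = {macroscopic_xs(rho, molar_mass, sigma):.3f} 1/cm")
```

A denser material can thus outperform one with a larger microscopic cross section, which is why both factors (plus the energy range of the inelastic threshold) must be weighed together.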
The presence of fast neutron spectra in new reactors is expected to have a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Owing to the limited availability of fast reactors, testing of future reactor materials will mostly take place in water-cooled material test reactors (MTRs) by tailoring the neutron spectrum via neutron screens. The latter rely on materials capable of absorbing neutrons at specific energies. A large but fragmented body of experience is available on this topic. In this work a comprehensive compilation of existing neutron screen technology is attempted, focusing on neutron screens developed to locally enhance the fast-over-thermal neutron flux ratio in a reactor core.
Research reactors are used for many applications: material testing; radioisotope production; beam-line applications for material research; neutron transmutation doping; neutron activation analysis; neutron radiography experiments; fuel waste management; and other research areas related to neutrons and nuclear materials. Each application requires enhanced neutron flux in a specific region of the energy spectrum; therefore, appropriate irradiation positions in the core or an appropriate beam-line configuration must be chosen. In several cases the required flux exceeds the maximum obtainable in the existing irradiation positions of the operating reactor core, while the desired flux amplification through a reactor power upgrade would require large-scale modifications, high costs, and long shutdown periods. With the creation of a flux trap at a central core position in the open-pool Greek Research Reactor (GRR-1), a noticeable local increase of the thermal neutron flux was achieved compared to the irradiation channels at peripheral core positions. In the present technical note, calculational and measurement results concerning the original core modification are presented, and the possibility of irradiating larger samples at higher thermal neutron flux in the GRR-1 is investigated. The results presented are based on deterministic and stochastic neutronic calculations with numerical models validated against measurements conducted for the original flux trap. The work is completed with a thorough thermal-hydraulic analysis evaluating the impact of the proposed modifications on reactor operation.
The study showed that the flux trap enlargement with complete removal of a central control fuel assembly increases the maximum thermal neutron flux by ~41%, while further removal of the neighboring fuel assembly leads to an average flux increase of ~45%, thus offering capabilities for extended reactor utilization such as additional isotope production.
Background/Objectives
Overall and long-term opioid use among older adults has increased since 1999. Less is known about opioid use in older adults in nursing homes (NHs).
Design
Cross‐sectional.
Setting
U.S. NHs (N = 13,522).
Participants
Long‐stay NH resident Medicare beneficiaries with a Minimum Data Set 3.0 (MDS) assessment between April 1, 2012, and June 30, 2012, and 120 days of follow‐up (N = 315,949).
Measurements
We used Medicare Part D claims to measure the length of opioid use in the 120 days from the index assessment (short-term: ≤30 days; medium-term: >30–89 days; long-term: ≥90 days), adjuvants (e.g., anticonvulsants), and other pain medications (e.g., corticosteroids). MDS assessments in the follow-up period were used to measure use of nonpharmacological pain management. Modified Poisson models were used to estimate adjusted prevalence ratios (aPRs) and 95% confidence intervals (CIs) for the associations of age, gender, race and ethnicity, and cognitive and physical impairment with long-term opioid use.
Results
Of all long-stay residents, 32.4% were prescribed any opioid, and 15.5% were prescribed opioids long-term. Opioid users (versus nonusers) were more commonly prescribed pain adjuvants (32.9% vs 14.9%), other pain medications (25.5% vs 11.0%), and nonpharmacological pain management (24.5% vs 9.3%). Long-term opioid use was higher in women (aPR = 1.21, 95% CI = 1.18–1.23) and lower in racial and ethnic minorities (non-Hispanic blacks vs whites: aPR = 0.93, 95% CI = 0.90–0.94) and those with severe cognitive impairment (vs no or mild impairment, aPR = 0.82, 95% CI = 0.79–0.83).
Conclusion
One in seven NH residents was prescribed opioids long‐term. Recent guidelines on opioid prescribing for pain recommend reducing long‐term opioid use, but this is challenging in NHs because residents may not benefit from nonpharmacological and nonopioid interventions. Studies to address concerns about opioid safety and effectiveness (e.g., on pain and functional status) in NHs are needed.
Summarizing the impact of community-based mitigation strategies and mobility on COVID-19 infections throughout the pandemic is critical for informing responses to future infectious disease outbreaks. Here, we employed time-series analyses to empirically investigate the relationships of mitigation strategies and mobility with COVID-19 incident cases across US states during the first three waves of infections.
We linked data on daily COVID-19 incidence by US state from March to December 2020 with the stringency index, a well-known index capturing the strictness of mitigation strategies, and the trip ratio, which measures the ratio of the number of trips taken per day compared with the same day in 2019. We utilized multilevel models to determine the relative impacts of policy stringency and the trip ratio on COVID-19 cumulative incidence and the effective reproduction number. We stratified analyses by three waves of infections.
Every five-point increase in the stringency index was associated with 2.89% (95% confidence interval = 1.52, 4.26%) and 5.01% (3.02, 6.95%) reductions in COVID-19 incidence for the first and third waves, respectively. Reducing the number of trips taken by 50% compared with the same time in 2019 was associated with a 16.2% (-0.07, 35.2%) decline in COVID-19 incidence at the state level during the second wave and 19.3% (2.30, 39.0%) during the third wave.
Mitigation strategies and reductions in mobility are associated with marked health gains through the reduction of COVID-19 infections, but we estimate variable impacts depending on policy stringency and levels of adherence.
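The percent reductions reported above follow from coefficients estimated on the log-incidence scale, back-transformed as (exp(beta * delta) - 1) * 100. A small sketch of that conversion, with a hypothetical coefficient chosen only to reproduce the ~5% third-wave figure:

```python
# Sketch: translating a log-scale regression coefficient into a percent
# change in incidence. The coefficient value is hypothetical.
import math

def pct_change(beta_per_unit, delta):
    """Percent change in incidence for a delta-unit exposure change,
    given a coefficient on the log(incidence) scale."""
    return (math.exp(beta_per_unit * delta) - 1.0) * 100.0

beta_stringency = -0.01028  # hypothetical, per 1-point stringency index
print(f"per 5-point increase: {pct_change(beta_stringency, 5):+.2f}%")
# A negative value corresponds to the reported reduction in incidence.
```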
Heterogeneity in national SARS-CoV-2 infection surveillance capabilities may compromise global enumeration and tracking of COVID-19 cases and deaths and bias analyses of the pandemic's tolls. Taking account of heterogeneity in data completeness may thus help clarify analyses of the relationship between COVID-19 outcomes and standard preparedness measures. We examined country-level associations of pandemic preparedness capacity inventories, from the Global Health Security (GHS) Index and Joint External Evaluation (JEE), with SARS-CoV-2 infection and COVID-19 death data completion rates, adjusted for income. Analyses were stratified by period: up to 100, 100-300, 300-500, and 500-700 days after the first reported case in each country. We subsequently reevaluated the relationship of pandemic preparedness with SARS-CoV-2 infection and age-standardized COVID-19 death rates, adjusted for cross-country differentials in data completeness during the pre-vaccine era. Every 10% increase in the GHS Index was associated with a 14.9% (95% confidence interval 8.34-21.8%) increase in the SARS-CoV-2 infection completion rate and a 10.6% (5.91-15.4%) increase in the death completion rate during the entire observation period. Disease prevention (infections: beta = 1.08, 1.05-1.10; deaths: beta = 1.05, 1.04-1.07), detection (infections: beta = 1.04, 1.01-1.06; deaths: beta = 1.03, 1.01-1.05), response (infections: beta = 1.06, 1.00-1.13; deaths: beta = 1.05, 1.00-1.10), health system (infections: beta = 1.06, 1.03-1.10; deaths: beta = 1.05, 1.03-1.07), and risk environment (infections: beta = 1.27, 1.15-1.41; deaths: beta = 1.15, 1.08-1.23) capacities were associated with both data completeness outcomes. Effect sizes of the GHS Index on infection completion (low income: beta = 1.18, 1.04-1.34; lower-middle income: beta = 1.41, 1.16-1.71) and death completion rates (low income: beta = 1.19, 1.09-1.31; lower-middle income: beta = 1.25, 1.10-1.43) were largest in LMICs.
After adjustment for cross-country differences in data completeness, each 10% increase in the GHS Index was associated with a 13.5% (4.80-21.4%) decrease in the SARS-CoV-2 infection rate at 100 days and a 9.10% (1.07-16.5%) decrease at 300 days. For age-standardized COVID-19 death rates, each 10% increase in the GHS Index was associated with a 15.7% (5.19-25.0%) decrease at 100 days and a 10.3% (-0.00 to 19.5%) decrease at 300 days.
Several ecologic studies have suggested that the bacillus Calmette-Guérin (BCG) vaccine may be protective against SARS-CoV-2 infection, including a highly cited preprint by Miller et al. finding that middle/high- and high-income countries that never had a universal BCG policy experienced higher COVID-19 burden than countries that currently have universal BCG vaccination policies. We provide a case study of the limitations of ecologic analyses by evaluating whether these early ecologic findings persisted as the pandemic progressed. Like Miller et al., we employed Wilcoxon rank-sum tests to compare population medians of COVID-19 mortality, incidence, and the mortality-to-incidence ratio between countries with universal BCG policies and those that never had such policies. We then computed Pearson's r correlations to evaluate the association between the year of BCG vaccination policy implementation and COVID-19 outcomes. We repeated these analyses for every month in 2020 subsequent to Miller et al.'s March 2020 analysis. We found that the differences in COVID-19 burden associated with BCG vaccination policies in March 2020 generally diminished in magnitude and usually lost statistical significance as the pandemic progressed. While six of nine analyses were statistically significant in March, only two remained significant by the end of 2020. These results underscore the need for caution in interpreting ecologic studies, given their inherent methodological limitations, which can be magnified in the context of a rapidly evolving pandemic with measurement error in both exposure and outcome status.
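The month-by-month comparison can be sketched with SciPy's rank-sum test on country-level outcomes; the data below are synthetic placeholders, not the study's:

```python
# Sketch: Wilcoxon rank-sum test comparing country-level COVID-19
# mortality between BCG-policy groups. All numbers are synthetic.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Hypothetical deaths-per-million for countries with / never with a
# universal BCG policy (lognormal spread mimics skewed burden data)
with_bcg = rng.lognormal(mean=2.0, sigma=1.0, size=60)
never_bcg = rng.lognormal(mean=2.5, sigma=1.0, size=25)

stat, p = ranksums(with_bcg, never_bcg)
print(f"rank-sum statistic = {stat:.2f}, p = {p:.3f}")
# Repeating such a test for each month shows how an early "significant"
# ecologic difference can attenuate as the pandemic progresses.
```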
The use of marginal structural models (MSMs) to adjust for time-varying confounding has increased in epidemiologic studies. However, in the setting of MSMs, recommendations for how best to handle missing data are contradictory. We present a plasmode simulation study to compare the validity and precision of MSM estimates using complete case analysis (CC), multiple imputation (MI), and inverse probability weighting (IPW) in the presence of missing data on time-independent and time-varying confounders.
Simulations were based on a cohort substudy using data from the Osteoarthritis Initiative, which estimated the marginal causal effect of intra-articular injection use on yearly changes in knee pain. We simulated 81 scenarios with parameter values varied across missingness mechanisms (MCAR, MAR, and MNAR), percentages of missing data (10%, 20%, and 30%), types of confounders with missingness (time-independent, time-varying, either, or both), and analytical approaches (CC, IPW, and MI). The performance of the CC, IPW, and MI methods was compared using relative bias, mean squared error of the estimates of interest, and empirical power.
Across scenarios defined by missing data mechanism, extent of missing data, and confounder type, MI generally produced less biased estimates (range: 1.2%-6.7%) with better precision (range: 0.17-0.18) compared with IPW (relative bias: -5.3% to 8.0%; precision: 0.19-0.53). Empirical power was constant across the scenarios using MI.
Under simple yet realistically constructed scenarios, MI seems to confer an advantage over IPW in MSM applications.