Abstract
Background
To date, no study has examined influenza vaccine effectiveness (IVE) against laboratory-confirmed influenza-associated hospitalizations during pregnancy.
Methods
The Pregnancy Influenza Vaccine Effectiveness Network (PREVENT) consisted of public health or healthcare systems with integrated laboratory, medical, and vaccination records in Australia, Canada (Alberta and Ontario), Israel, and the United States (California, Oregon, and Washington). Sites identified pregnant women aged 18 through 50 years whose pregnancies overlapped with local influenza seasons from 2010 through 2016. Administrative data were used to identify hospitalizations with acute respiratory or febrile illness (ARFI) and clinician-ordered real-time reverse transcription polymerase chain reaction (rRT-PCR) testing for influenza viruses. Overall IVE was estimated using the test-negative design and adjusting for site, season, season timing, and high-risk medical conditions.
Results
Among 19450 hospitalizations with an ARFI discharge diagnosis (across 25 site-specific study seasons), only 1030 (6%) of the pregnant women were tested for influenza viruses by rRT-PCR. Approximately half of these women had pneumonia or influenza discharge diagnoses (54%). Influenza A or B virus infections were detected in 598/1030 (58%) of the ARFI hospitalizations with influenza testing. Across sites and seasons, 13% of rRT-PCR-confirmed influenza-positive pregnant women were vaccinated compared with 22% of influenza-negative pregnant women; the adjusted overall IVE was 40% (95% confidence interval = 12%–59%) against influenza-associated hospitalization during pregnancy.
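Under the test-negative design, IVE is estimated as one minus the odds ratio of vaccination comparing influenza-positive with influenza-negative patients. A minimal sketch using the crude proportions reported above (13% vs 22%); the crude figure differs from the adjusted 40% estimate, which additionally accounts for site, season, season timing, and high-risk medical conditions:

```python
# Crude IVE under the test-negative design: IVE = (1 - OR) * 100, where OR
# compares vaccination odds in test-positive vs test-negative patients.
def vaccine_effectiveness(p_vacc_pos: float, p_vacc_neg: float) -> float:
    """Crude VE (%) from vaccination proportions among cases and test-negative controls."""
    odds_pos = p_vacc_pos / (1 - p_vacc_pos)  # vaccination odds, influenza-positive
    odds_neg = p_vacc_neg / (1 - p_vacc_neg)  # vaccination odds, influenza-negative
    return (1 - odds_pos / odds_neg) * 100

# Proportions from the abstract: 13% of positives vs 22% of negatives vaccinated.
crude_ve = vaccine_effectiveness(0.13, 0.22)
print(f"crude IVE ~ {crude_ve:.0f}%")  # ~47% before covariate adjustment
```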
Conclusion
Between 2010 and 2016, influenza vaccines offered moderate protection against laboratory-confirmed influenza-associated hospitalizations during pregnancy, a finding that may further inform the benefits of maternal influenza vaccination programs.
In this retrospective study of hospitals in Australia, Canada, Israel, and the United States from 2010 to 2016, influenza vaccines were 40% effective in preventing laboratory-confirmed influenza-associated hospitalizations during pregnancy.
Induction of DNA double-strand breaks (DSBs) in ribosomal DNA (rDNA) repeats is associated with ATM-dependent repression of ribosomal RNA synthesis and large-scale reorganization of nucleolar architecture, but the signaling events that regulate these responses are largely elusive. Here we show that the nucleolar response to rDNA breaks is dependent on both ATM and ATR activity. We further demonstrate that ATM- and NBS1-dependent recruitment of TOPBP1 in the nucleoli is required for inhibition of ribosomal RNA synthesis and nucleolar segregation in response to rDNA breaks. Mechanistically, TOPBP1 recruitment is mediated by phosphorylation-dependent interactions between three of its BRCT domains and conserved phosphorylated Ser/Thr residues at the C-terminus of the nucleolar phosphoprotein Treacle. Our data thus reveal an important cooperation between TOPBP1 and Treacle in the signaling cascade that triggers transcriptional inhibition and nucleolar segregation in response to rDNA breaks.
Glioblastoma is the most lethal primary brain cancer. Clinical outcomes for glioblastoma remain poor, and new treatments are needed.
To investigate whether adding autologous tumor lysate-loaded dendritic cell vaccine (DCVax-L) to standard of care (SOC) extends survival among patients with glioblastoma.
This phase 3, prospective, externally controlled nonrandomized trial compared overall survival (OS) in patients with newly diagnosed glioblastoma (nGBM) and recurrent glioblastoma (rGBM) treated with DCVax-L plus SOC vs contemporaneous matched external control patients treated with SOC. This international, multicenter trial was conducted at 94 sites in 4 countries from August 2007 to November 2015. Data analysis was conducted from October 2020 to September 2021.
The active treatment was DCVax-L plus SOC temozolomide. The nGBM external control patients received SOC temozolomide and placebo; the rGBM external controls received approved rGBM therapies.
The primary and secondary end points compared overall survival (OS) in nGBM and rGBM, respectively, with contemporaneous matched external control populations from the control groups of other formal randomized clinical trials.
A total of 331 patients were enrolled in the trial, with 232 randomized to the DCVax-L group and 99 to the placebo group. Median OS (mOS) for the 232 patients with nGBM receiving DCVax-L was 19.3 (95% CI, 17.5-21.3) months from randomization (22.4 months from surgery) vs 16.5 (95% CI, 16.0-17.5) months from randomization in control patients (HR = 0.80; 98% CI, 0.00-0.94; P = .002). Survival at 48 months from randomization was 15.7% vs 9.9%, and at 60 months, it was 13.0% vs 5.7%. For 64 patients with rGBM receiving DCVax-L, mOS was 13.2 (95% CI, 9.7-16.8) months from relapse vs 7.8 (95% CI, 7.2-8.2) months among control patients (HR, 0.58; 98% CI, 0.00-0.76; P < .001). Survival at 24 and 30 months after recurrence was 20.7% vs 9.6% and 11.1% vs 5.1%, respectively. Survival was improved in patients with nGBM with methylated MGMT receiving DCVax-L compared with external control patients (HR, 0.74; 98% CI, 0.55-1.00; P = .03).
In this study, adding DCVax-L to SOC resulted in clinically meaningful and statistically significant extension of survival for patients with both nGBM and rGBM compared with contemporaneous, matched external controls who received SOC alone.
ClinicalTrials.gov Identifier: NCT00045968.
Background:
Field-expedient screening tools that can identify individuals at an elevated risk for injury are needed to minimize time loss in American football players. Previous research has suggested that poor dynamic balance may be associated with an elevated risk for injury in athletes; however, this has yet to be examined in college football players.
Hypothesis:
This study aimed to determine whether dynamic balance deficits are associated with an elevated risk of injury in collegiate football players. It was hypothesized that football players with lower performance and increased asymmetry in dynamic balance would be at an elevated risk for sustaining a noncontact lower extremity injury.
Study Design:
Prospective cohort study.
Methods:
Fifty-nine collegiate American football players volunteered for this study. Demographic information, injury history, and dynamic balance testing performance were collected, and noncontact lower extremity injuries were recorded over the course of the season. Receiver operating characteristic curves were calculated based on performance on the Star Excursion Balance Test (SEBT), including composite score and asymmetry, to determine the population-specific risk cut-off point. Relative risk was then calculated based on these variables, as well as previous injury.
Results:
A cut-off point of 89.6% composite score on the SEBT optimized the sensitivity (100%) and specificity (71.7%). A college football player who scored below 89.6% was 3.5 times more likely to sustain a noncontact lower extremity injury.
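Relative risk here is the ratio of injury incidence in the below-cutoff group to that in the at/above-cutoff group. A sketch with hypothetical counts (the abstract does not report the underlying 2×2 table, so these cell values are placeholders for illustration only):

```python
def relative_risk(inj_exposed: int, n_exposed: int,
                  inj_unexposed: int, n_unexposed: int) -> float:
    """Relative risk of injury: incidence below the cutoff / incidence at or above it."""
    return (inj_exposed / n_exposed) / (inj_unexposed / n_unexposed)

# Hypothetical counts (NOT from the study): 7 of 20 below-cutoff players injured
# vs 4 of 40 players at/above the cutoff.
rr = relative_risk(7, 20, 4, 40)
print(f"RR = {rr:.1f}")  # 3.5 with these illustrative counts
```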
Conclusion:
Poor performance on the SEBT may be related to an increased risk for sustaining a noncontact lower extremity injury over the course of a competitive American football season.
Clinical Relevance:
College football players should be screened preseason using the SEBT to identify those at an elevated risk for injury based upon dynamic balance performance to implement injury mitigation strategies to this specific subgroup of athletes.
Literature consistently identifies two key examination components when managing ankle/foot pathologies: 1) dorsiflexion range of motion (DFROM) and 2) single limb balance. Mobilizations with movement (MWM) and Instrument-Assisted Soft Tissue Mobilization (IASTM) are two emerging manual therapy (MT) options in the management of ankle/foot conditions.
In this observational cohort study, 147 subjects were randomized in a block fashion as follows: 1) Control, 2) IASTM, 3) MWM, and 4) Combination of both MT interventions. Descriptive statistics of the sample were conducted with integrity checks, followed by comparative analysis of mean change in DFROM and YBT-LQ™ performance.
Welch's ANOVA indicated significant differences between the treatment conditions (Welch's F(3, 75.669) = 4.533, p = .006). Games-Howell post hoc tests indicated significantly more change in DFROM in the IASTM (p = .043) and MWM (p = .026) conditions when they were administered as single treatments than in the Control condition or when the treatments were combined. Dynamic balance, as measured by the YBT-LQ™, did not yield a significant response based on the intervention arm.
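Welch's F accommodates unequal group variances without assuming homogeneity. A minimal generic sketch of the standard Welch statistic (illustrative only, not the study's analysis code):

```python
def welch_anova_f(groups):
    """Welch's F statistic for k independent groups with unequal variances.

    Returns (F, df1, df2) using the standard Welch formulas; a generic sketch,
    not the study's actual analysis.
    """
    k = len(groups)
    ns = [len(g) for g in groups]
    means = [sum(g) / n for g, n in zip(groups, ns)]
    variances = [sum((x - m) ** 2 for x in g) / (n - 1)
                 for g, m, n in zip(groups, means, ns)]
    w = [n / v for n, v in zip(ns, variances)]            # precision weights
    w_sum = sum(w)
    grand_mean = sum(wi * m for wi, m in zip(w, means)) / w_sum
    num = sum(wi * (m - grand_mean) ** 2 for wi, m in zip(w, means)) / (k - 1)
    lam = sum((1 - wi / w_sum) ** 2 / (n - 1) for wi, n in zip(w, ns))
    den = 1 + 2 * (k - 2) * lam / (k ** 2 - 1)
    df2 = (k ** 2 - 1) / (3 * lam)
    return num / den, k - 1, df2
```

With four treatment arms (k = 4), df1 = 3, consistent with the reported F(3, 75.669).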
Specifically, IASTM or closed kinetic chain (CKC) MWM techniques used in isolation can be considered cost-effective interventions that can be administered by a skilled MT practitioner in a “low risk-high reward” clinical scenario, with potential biomechanical and neurophysiological benefits for improving CKCDFROM.
•CKCDFROM and YBT-LQ™ are two reliable and valid tests that can be administered during ankle/foot management.
•CKCMWM and IASTM are two cost-effective, non-invasive, viable MT options aimed at improving CKCDFROM.
•Compounding MT techniques such as CKCMWM and IASTM at the ankle region in one session may minimize the benefits for improving CKCDFROM.
Chromosome breakage elicits transient silencing of ribosomal RNA synthesis, but the mechanisms involved remained elusive. Here we discover an in trans signalling mechanism that triggers pan-nuclear silencing of rRNA transcription in response to DNA damage. This is associated with transient recruitment of the Nijmegen breakage syndrome protein 1 (NBS1), a central regulator of DNA damage responses, into the nucleoli. We further identify TCOF1 (also known as Treacle), a nucleolar factor implicated in ribosome biogenesis and mutated in Treacher Collins syndrome, as an interaction partner of NBS1, and demonstrate that NBS1 translocation and accumulation in the nucleoli is Treacle dependent. Finally, we provide evidence that Treacle-mediated NBS1 recruitment into the nucleoli regulates rRNA silencing in trans in the presence of distant chromosome breaks.
•Movement-centered examination strategies for foot and ankle pathology can enhance clinical reasoning within the interprofessional team.
•Dorsiflexion range of motion can impact movement patterns throughout the lower extremity kinetic chain.
•Professionals managing foot and ankle pathology may consider regional interdependence applications of dorsiflexion range of motion limitations and the implications of movement dysfunction within the squat pattern.
An association between limited ankle closed kinetic chain dorsiflexion range of motion (CKCDFROM) and movement dysfunction in the lower quarter is often implied, but limited research links CKCDFROM to gross movement patterns, such as squatting. The purpose of this study was to investigate the association between CKCDFROM and movement patterns in collegiate athletes, as measured by the functional movement screen (FMS).
In this quasi-experimental observational analytical cohort study, 147 athletes from five Division III collegiate men’s and women’s athletic teams were included. CKCDFROM was assessed utilizing the lunge test. Movement patterns, specifically the deep squat (DS) and inline lunge (ILL), were assessed utilizing the FMS qualitative criteria. Statistical analysis examined the association between CKCDFROM limitations and a dysfunctional DS or ILL.
Seventy-nine (53.7%) and 30 (20.4%) participants scored a “1” on their FMS deep squat test and ILL, respectively. Participants who scored a “1” on the deep squat and ILL were 3.75 times as likely (odds ratio 3.75; 95% CI 1.57–9.14; p = 0.002) and 1.53 times as likely (odds ratio 1.53; 95% CI 0.65–3.60; p = 0.392), respectively, to have at least one ankle CKCDFROM limitation. The association was statistically significant for the deep squat but not for the ILL.
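Odds ratios and Wald-style confidence intervals of this kind can be computed from a 2×2 table; a generic sketch (the cell counts below are placeholders for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = dysfunctional squat with DF limitation, b = dysfunctional squat without,
    c = normal squat with DF limitation,       d = normal squat without.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative balanced table: OR of 1.0 with a CI straddling 1.
print(odds_ratio_ci(10, 10, 10, 10))
```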
Physical therapists should consider regional interdependence implications of movement dysfunction stemming from impairments within the kinetic chain.
A dysfunctional lower extremity movement pattern might be associated with a lack of CKCDFROM. Clinicians will likely benefit from assessing CKCDFROM in those exhibiting dysfunctional squatting and/or lunging.
We examined the impact of kidney transplantation on left ventricular ejection fraction (LVEF) in end-stage renal disease (ESRD) patients with congestive heart failure (CHF).
The ESRD patients with decreased LVEF and a poor New York Heart Association (NYHA) functional class are not usually referred for transplant evaluations, as they are considered to be at increased risk of cardiac and surgical complications.
Between June 1998 and November 2002, 103 recipients with LVEF ≤40% and CHF underwent kidney transplantation. The LVEF was re-assessed by radionuclide ventriculography gated-blood pool (MUGA) scan at six and 12 months and at the last follow-up during the post-transplant period.
Mean pre-transplant LVEF% increased from 31.6 ± 6.7 (95% confidence interval [CI], 30.3 to 32.9) to 52.2 ± 12.0 (95% CI, 49.9 to 54.6; p = 0.002) at 12 months after transplantation. There was no perioperative death. After transplantation, 69.9% of patients achieved LVEF ≥50% (normal LVEF). A longer duration of dialysis (in months) before transplantation decreased the likelihood of normalization of LVEF in the post-transplant period (odds ratio 0.82, 95% CI 0.74 to 0.91; p < 0.001). The NYHA functional class improved significantly in those with normalization of LVEF (p = 0.003). After transplantation, LVEF >50% was the only significant factor associated with a lower hazard for death or hospitalizations for CHF (relative risk 0.90, 95% CI 0.86 to 0.95; p < 0.0001).
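Because the odds ratio of 0.82 is per month of dialysis, the effect compounds multiplicatively on the odds scale in a logistic model. A small interpretation sketch (an illustration of how per-unit odds ratios compound, not the study's analysis):

```python
# In logistic regression, a per-unit odds ratio compounds multiplicatively:
# each additional month of pre-transplant dialysis multiplies the odds of
# post-transplant LVEF normalization by 0.82 (per the abstract's estimate).
def compounded_or(per_unit_or: float, units: float) -> float:
    """Overall odds multiplier across `units` increments of the covariate."""
    return per_unit_or ** units

one_year = compounded_or(0.82, 12)
print(f"odds multiplier after 12 extra months of dialysis: {one_year:.3f}")  # ~0.092
```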
Kidney transplantation in ESRD patients with advanced systolic heart failure results in an increase in LVEF, improves functional status of CHF, and increases survival. To abrogate the adverse effects of prolonged dialysis on myocardial function, ESRD patients should be counseled for kidney transplantation as soon as the diagnosis of systolic heart failure is established.
Abstract
Purpose
The purpose of this project was to develop a set of valid and feasible quality indicators used to track opioid stewardship efforts in hospital and emergency department settings.
Methods
Candidate quality indicators were extracted from published literature. Feasibility screening excluded quality indicators that could not be reliably extracted from the electronic health record or that were irrelevant to pain management in the hospital and emergency department settings. Validity screening used an electronic survey of key stakeholders including pharmacists, nurses, physicians, administrators, and researchers. Stakeholders used a 9-point Likert scale to rate the validity of each quality indicator based on predefined criteria. During expert panel discussions, stakeholders revised quality indicator wording, added new quality indicators, and voted to include or exclude each quality indicator. Priority ranking used a second electronic survey and a 9-point Likert scale to prioritize the included quality indicators.
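The abstract does not state the panel's exact decision rule; consensus processes of this kind often classify indicators by the median panel rating (e.g., the RAND/UCLA appropriateness method). A hedged sketch under that assumption:

```python
from statistics import median

def classify_indicator(ratings):
    """Classify a quality indicator from 9-point Likert validity ratings.

    Median-based rule (an ASSUMPTION for illustration, modeled on the
    RAND/UCLA appropriateness method; the abstract does not specify the
    panel's actual rule): median 7-9 -> valid, 4-6 -> uncertain, 1-3 -> not valid.
    """
    m = median(ratings)
    if m >= 7:
        return "valid"
    if m >= 4:
        return "uncertain"
    return "not valid"

print(classify_indicator([8, 9, 7, 8, 6]))  # valid (median 8)
```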
Results
Literature search yielded 76 unique quality indicators. Feasibility screening excluded 9 quality indicators. The validity survey was completed by 46 (20%) of 228 stakeholders. Expert panel discussions yielded 19 valid and feasible quality indicators. The top 5 quality indicators by priority were the proportion of patients with (1) naloxone administrations, (2) as-needed opioids with duplicate indications, and (3) long-acting or extended-release opioids if opioid-naïve; (4) the average dose of morphine milligram equivalents administered per day; and (5) the proportion of opioid discharge prescriptions exceeding 7 days.
Conclusion
Multi-professional stakeholders across a health system participated in this consensus process and developed a set of 19 valid and feasible quality indicators for opioid stewardship interventions in the hospital and emergency department settings.