Background & Aims Histologic analysis of liver biopsy specimens allows for grading and staging of nonalcoholic fatty liver disease (NAFLD). We performed a longitudinal study to investigate the long-term prognostic relevance of histologic features for patients with NAFLD. Methods We performed a retrospective analysis of 619 patients diagnosed with NAFLD from 1975 through 2005 at medical centers in the United States, Europe, and Thailand. Patients underwent laboratory and biopsy analyses, and were examined every 3–12 months after their diagnosis. Outcomes analyzed were overall mortality, liver transplantation, and liver-related events. Cumulative outcomes were compared by log-rank analysis. Cox proportional-hazards regression was used to estimate adjusted hazard ratios (HRs). Time at risk was determined from the date of liver biopsy to the date of outcome or last follow-up examination. Results Over a median follow-up period of 12.6 years (range, 0.3–35.1 y), 193 of the patients (33.2%) died or underwent liver transplantation. Features of liver biopsies significantly associated with death or liver transplantation included fibrosis stage 1 (HR, 1.88; 95% confidence interval [CI], 1.28–2.77), stage 2 (HR, 2.89; 95% CI, 1.93–4.33), stage 3 (HR, 3.76; 95% CI, 2.40–5.89), and stage 4 (HR, 10.9; 95% CI, 6.06–19.62) compared with stage 0, as well as age (HR, 1.07; 95% CI, 1.05–1.08), diabetes (HR, 1.61; 95% CI, 1.13–2.30), current smoking (HR, 2.62; 95% CI, 1.67–4.10), and statin use (HR, 0.32; 95% CI, 0.14–0.70). Twenty-six patients (4.2%) developed liver-related events; fibrosis stage 3 (HR, 14.2; 95% CI, 3.38–59.68) and stage 4 (HR, 51.5; 95% CI, 9.87–269.2), compared with stage 0, were significantly associated with these events. Patients with fibrosis, regardless of steatohepatitis or NAFLD activity score, had shorter survival times than patients without fibrosis.
Conclusions In a longitudinal study of patients with NAFLD, fibrosis stage, but no other histologic feature of steatohepatitis, was associated independently with long-term overall mortality, liver transplantation, and liver-related events.
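The cumulative-outcome analysis described in the Methods can be illustrated with a minimal Kaplan-Meier estimator (the study's actual comparisons additionally used log-rank tests and Cox regression, which are not sketched here). All follow-up times and event flags below are hypothetical.

```python
# Minimal Kaplan-Meier survival estimator; a sketch, not the study's code.

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = death/transplant, 0 = censored.
    Returns (time, survival-probability) pairs at each observed event time."""
    # Process events before censorings at tied times (standard convention).
    subjects = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    at_risk = len(subjects)
    surv = 1.0
    curve = []
    for t, e in subjects:
        if e:  # an observed event steps the survival estimate down
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # events and censored subjects both leave the risk set
    return curve

# Hypothetical cohort: four events, two censored observations.
times = [2.1, 3.5, 3.5, 7.0, 12.6, 20.0]
events = [1, 0, 1, 1, 0, 1]
print(kaplan_meier(times, events))
```

Two such curves (e.g. fibrosis vs. no fibrosis) are what a log-rank test would then compare.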
Sustained viral response (SVR) is the optimal outcome of hepatitis C virus (HCV) therapy, yet more detailed data are required to confirm its clinical value. Individuals receiving treatment in 1996‐2011 were identified using the Scottish HCV clinical database. We sourced data on 10 clinical events: liver, nonliver, and all‐cause mortality; first hospitalisation for severe liver morbidity (SLM); cardiovascular disease (CVD); respiratory disorders; neoplasms; alcohol intoxication; drug intoxication; and violence‐related injury (note: the latter three events were selected a priori to gauge ongoing chaotic lifestyle behaviours). We determined the association between SVR attainment and each outcome event, in terms of the relative hazard reduction and absolute risk reduction (ARR). We tested for an interaction between SVR and liver disease severity (mild vs. nonmild), defining mild disease as an aspartate aminotransferase‐to‐platelet ratio index (APRI) <0.7. Our cohort comprised 3,385 patients (mean age: 41.6 years), followed‐up for a median 5.3 years (interquartile range: 3.3‐8.2). SVR was associated with a reduced risk of liver mortality (adjusted hazard ratio [AHR]: 0.24; P < 0.001), nonliver mortality (AHR, 0.68; P = 0.026), all‐cause mortality (AHR, 0.49; P < 0.001), SLM (AHR, 0.21; P < 0.001), CVD (AHR, 0.70; P = 0.001), alcohol intoxication (AHR, 0.52; P = 0.003), and violence‐related injury (AHR, 0.51; P = 0.002). After 7.5 years, SVR was associated with significant ARRs for liver mortality, all‐cause mortality, SLM, and CVD (each 3.0%‐4.7%). However, we detected a strong interaction, in that ARRs were considerably higher for individuals with nonmild disease than for individuals with mild disease.
Conclusions: The conclusions are 3‐fold: (1) Overall, SVR is associated with reduced hazard for a range of hepatic and nonhepatic events; (2) an association between SVR and behavioral events is consistent with SVR patients leading healthier lives; and (3) the short‐term value of SVR is greatest for those with nonmild disease. (Hepatology 2015;62:355–364)
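The APRI cutoff used above to separate mild from nonmild disease can be computed directly from routine laboratory values. The standard formula is APRI = 100 × (AST / AST upper limit of normal) / platelet count (10⁹/L); the input values in this sketch are hypothetical.

```python
# APRI (aspartate aminotransferase-to-platelet ratio index); a sketch with
# hypothetical laboratory values, using the standard published formula.

def apri(ast_iu_l, ast_uln_iu_l, platelets_10e9_l):
    """AST and its upper limit of normal in IU/L; platelets in 10^9/L."""
    return 100.0 * (ast_iu_l / ast_uln_iu_l) / platelets_10e9_l

def is_mild(apri_score, cutoff=0.7):
    """The study defined mild liver disease as APRI < 0.7."""
    return apri_score < cutoff

score = apri(ast_iu_l=35, ast_uln_iu_l=40, platelets_10e9_l=250)
print(round(score, 2), is_mild(score))  # prints: 0.35 True
```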
Background & Aims Some patients with nonalcoholic fatty liver disease (NAFLD) develop liver-related complications and have higher mortality than other patients with NAFLD. We determined the accuracy of simple, noninvasive scoring systems in identification of patients at increased risk for liver-related complications or death. Methods We performed a retrospective, international, multicenter cohort study of 320 patients diagnosed with NAFLD, based on liver biopsy analysis through 2002 and followed through 2011. Patients were assigned to mild-, intermediate-, or high-risk groups based on cutoff values for 2 of the following: NAFLD fibrosis score, aspartate aminotransferase/platelet ratio index, FIB-4 score, and BARD score. Outcomes included liver-related complications and death or liver transplantation. We used multivariate Cox proportional hazard regression analysis to adjust for relevant variables and calculate adjusted hazard ratios (aHRs). Results During a median follow-up period of 104.8 months (range, 3−317 months), 14% of patients developed liver-related events and 13% died or underwent liver transplantation. The aHRs for liver-related events in the intermediate-risk and high-risk groups, compared with the low-risk group, were 7.7 (95% confidence interval [CI]: 1.4−42.7) and 34.2 (95% CI: 6.5−180.1), respectively, based on NAFLD fibrosis score; 8.8 (95% CI: 1.1−67.3) and 20.9 (95% CI: 2.6−165.3) based on the aspartate aminotransferase/platelet ratio index; and 6.2 (95% CI: 1.4−27.2) and 6.6 (95% CI: 1.4−31.1) based on the BARD score. The aHRs for death or liver transplantation in the intermediate-risk and high-risk groups compared with the low-risk group were 4.2 (95% CI: 1.3−13.8) and 9.8 (95% CI: 2.7−35.3), respectively, based on the NAFLD fibrosis scores.
Based on aspartate aminotransferase/platelet ratio index and FIB-4 score, only the high-risk group had a greater risk of death or liver transplantation (aHR = 3.1; 95% CI: 1.1−8.4 and aHR = 6.6; 95% CI: 2.3−20.4, respectively). Conclusions Simple noninvasive scoring systems help identify patients with NAFLD who are at increased risk for liver-related complications or death. NAFLD fibrosis score appears to be the best indicator of patients at risk, based on HRs. The results of this study require external validation.
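Of the noninvasive scores evaluated above, FIB-4 has a simple closed form: FIB-4 = (age [y] × AST [IU/L]) / (platelets [10⁹/L] × √ALT [IU/L]). The risk-group cutoffs used in the study are not given in the abstract, so this sketch only computes the score itself; the input values are hypothetical.

```python
import math

# FIB-4 score; a sketch using the standard published formula and
# hypothetical patient values (not data from the study above).

def fib4(age_years, ast_iu_l, alt_iu_l, platelets_10e9_l):
    """FIB-4 = (age * AST) / (platelets * sqrt(ALT))."""
    return (age_years * ast_iu_l) / (platelets_10e9_l * math.sqrt(alt_iu_l))

print(round(fib4(55, 40, 30, 200), 2))  # prints: 2.01
```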
Objective The first wave of the COVID-19 pandemic had a major impact on healthcare utilisation. The aim of this retrospective review was to quantify how utilisation of non-COVID care changed during this time so as to gain insight and inform planning of future services during potential second and subsequent waves. Methods and analysis A longitudinal design was used to analyse anonymous private UK health insurer datasets covering the period of January 2018 to August 2020. Taken as a measure of healthcare utilisation in the UK, incidence rates of claims broken down by service area and condition were calculated alongside overall monthly totals and costs. Pre-COVID-19 years were compared with the current year. Results Healthcare utilisation during the first wave of COVID-19 decreased by as much as 70% immediately after lockdown measures were implemented. After 2 months, the trend reversed and claims steadily began to increase, but did not reach rates seen in previous years by the end of August 2020. Assessment by service and diagnostic category showed that most areas, especially those highly reliant on in-person treatment, reflected the same pattern (ie, rapid drop followed by a steady recovery). The provision of mental health services differed from this observed trend, with utilisation increasing by 20% during the first wave of COVID-19 in comparison to pre-COVID-19 years. The utilisation of maternity services and the treatment of existing cancers also stayed stable, or increased slightly, during this time. Conclusions Healthcare utilisation in a UK-based privately insured population decreased dramatically during the first wave of the COVID-19 pandemic, being over 70% lower at its height. However, mental health services remained resilient during this time, possibly due to greater virtualisation of diagnostics and care.
Background
The impact of employee health on productivity in the workplace is generally evidenced through absenteeism and presenteeism. Multicomponent worksite health programmes, with significant online elements, have gained in popularity over the last two decades, due in part to their scalability and low cost of implementation. However, little is known about the impact of digital-only interventions on health-related outcomes in employee groups. The aim of this systematic review was to assess the impact of pure digital health interventions in the workplace on health-related outcomes.
Methods
Multiple databases, including MEDLINE, EMBASE, PubMed and PsycINFO, were used to review the literature using PRISMA guidelines.
Results
Of 1345 records screened, 22 randomized controlled trials were found to be eligible. Although there was a high level of heterogeneity across these studies, significant improvements were found for a broad range of outcomes such as sleep, mental health, sedentary behaviours and physical activity levels. Standardized measures were not always used to quantify intervention impact. All but one study resulted in at least one significantly improved health-related outcome, but attrition rates ranged widely, suggesting that sustaining engagement was an issue. Risk of bias was low for one-third of the studies and unclear for the remainder.
Conclusions
This review found modest evidence that digital-only interventions have a positive impact on health-related outcomes in the workplace. High heterogeneity impacted the ability to confirm what interventions might work best for which health outcomes, although less complex health outcomes appeared to be more likely to be impacted. A focus on engagement along with the use of standardized measures and reporting of active intervention components would be helpful in future evaluations.
The Microarray Innovations in Leukemia study assessed the clinical utility of gene expression profiling as a single test to subtype leukemias into conventional categories of myeloid and lymphoid malignancies.
The investigation was performed in 11 laboratories across three continents and included 3,334 patients. An exploratory retrospective stage I study was designed for biomarker discovery and generated whole-genome expression profiles from 2,143 patients with leukemias and myelodysplastic syndromes. The gene expression profiling-based diagnostic accuracy was further validated in a prospective second study stage of an independent cohort of 1,191 patients.
On the basis of 2,096 samples, the stage I study achieved 92.2% classification accuracy for all 18 distinct classes investigated (median specificity of 99.7%). In a second cohort of 1,152 prospectively collected patients, a classification scheme reached 95.6% median sensitivity and 99.8% median specificity for 14 standard subtypes of acute leukemia (eight acute lymphoblastic leukemia and six acute myeloid leukemia classes, n = 693). In 29 (57%) of 51 discrepant cases, the microarray results outperformed routine diagnostic methods.
Gene expression profiling is a robust technology for the diagnosis of hematologic malignancies with high accuracy. It may complement current diagnostic algorithms and could offer a reliable platform for patients who lack access to today's state-of-the-art diagnostic work-up. Our comprehensive gene expression data set will be submitted to the public domain to foster research focusing on the molecular understanding of leukemias.
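The per-class sensitivity and specificity figures reported above are computed one-vs-rest for each subtype. The sketch below shows this calculation for hypothetical toy labels (not study data); the class names are illustrative.

```python
# One-vs-rest sensitivity and specificity for a single class label;
# a sketch with toy labels, not the study's classifier output.

def sens_spec(true_labels, pred_labels, cls):
    pairs = list(zip(true_labels, pred_labels))
    tp = sum(1 for t, p in pairs if t == cls and p == cls)
    fn = sum(1 for t, p in pairs if t == cls and p != cls)
    tn = sum(1 for t, p in pairs if t != cls and p != cls)
    fp = sum(1 for t, p in pairs if t != cls and p == cls)
    return tp / (tp + fn), tn / (tn + fp)

true_labels = ["AML", "ALL", "AML", "CLL", "ALL", "AML"]
pred_labels = ["AML", "ALL", "ALL", "CLL", "ALL", "AML"]
print(sens_spec(true_labels, pred_labels, "AML"))  # sensitivity 2/3, specificity 1.0
```

Taking the median of these per-class values across all 14 subtypes gives summary figures like the 95.6% median sensitivity quoted above.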
Mast‐seeding plants often produce high seed crops the year after a warm spring or summer, but the warm‐temperature model has inconsistent predictive ability. Here, we show for 26 long‐term data sets from five plant families that the temperature difference between the two previous summers (ΔT) better predicts seed crops. This discovery explains how masting species tailor their flowering patterns to sites across altitudinal temperature gradients; predicts that masting will be unaffected by increasing mean temperatures under climate change; improves prediction of impacts on seed consumers; demonstrates that strongly masting species are hypersensitive to climate; explains the rarity of consecutive high‐seed years without invoking resource constraints; and generates hypotheses about physiological mechanisms in plants and insect seed predators. For plants, ΔT has many attributes of an ideal cue. This temperature‐difference model clarifies our understanding of mast seeding under environmental change, and could also be applied to other cues, such as rainfall.
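The ΔT cue described above is simple to compute: the mean summer temperature of the previous year minus that of the year before it, with large positive values predicting a high seed crop. The temperature series and the decision threshold in this sketch are illustrative assumptions, not values from the study.

```python
# Temperature-difference (delta-T) cue for mast seeding; a sketch with
# hypothetical summer temperatures and an assumed illustrative threshold.

def delta_t(summer_temps, year):
    """summer_temps: dict mapping year -> mean summer temperature (degrees C).
    Returns T(year-1) - T(year-2), the cue for seed crops in `year`."""
    return summer_temps[year - 1] - summer_temps[year - 2]

summer_temps = {2018: 14.2, 2019: 16.8, 2020: 15.1, 2021: 17.5}
for year in (2020, 2021, 2022):
    dt = delta_t(summer_temps, year)
    # The 1.0 degree C threshold is an arbitrary illustration.
    print(year, round(dt, 1), "high crop predicted" if dt > 1.0 else "low crop")
```

Note how a high-ΔT year is necessarily followed by a low-ΔT year unless temperatures keep rising, which matches the observed rarity of consecutive high-seed years.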
Occasional risk of serious liver dysfunction and autoimmune hepatitis during atorvastatin therapy has been reported. We compared the risk of hepatotoxicity with atorvastatin relative to simvastatin treatment.
The UK GPRD identified patients with a first prescription for simvastatin (n = 164,407) or atorvastatin (n = 76,411) between 1997 and 2006, but with no prior record of liver disease, alcohol-related diagnosis, or liver dysfunction. Incident liver dysfunction in the following six months was identified by biochemical values and compared between statin groups using a Cox regression model adjusting for age, sex, year treatment started, dose, alcohol consumption, smoking, body mass index, and comorbid conditions.
Moderate to severe hepatotoxicity (bilirubin >60 μmol/L, AST or ALT >200 U/L, or alkaline phosphatase >1200 U/L) developed in 71 patients on atorvastatin versus 101 on simvastatin. The adjusted hazard ratio (AHR) for all atorvastatin relative to simvastatin was 1.9 (95% confidence interval [CI], 1.4-2.6). High dose was classified as 40-80 mg daily and low dose as 10-20 mg daily. Hepatotoxicity occurred in 0.44% of 4075 patients on high dose atorvastatin (HDA), 0.07% of 72,336 on low dose atorvastatin (LDA), 0.09% of 44,675 on high dose simvastatin (HDS), and 0.05% of 119,732 on low dose simvastatin (LDS). AHRs compared to LDS were 7.3 (95% CI, 4.2-12.7) for HDA, 1.4 (95% CI, 0.9-2.0) for LDA, and 1.5 (95% CI, 1.0-2.2) for HDS.
The risk of hepatotoxicity was increased in the first six months of atorvastatin compared to simvastatin treatment, with the greatest difference between high dose atorvastatin and low dose simvastatin. The numbers of events in the analyses were small.
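The study's biochemical definition of moderate to severe hepatotoxicity (bilirubin >60 μmol/L, AST or ALT >200 U/L, or alkaline phosphatase >1200 U/L) can be expressed as a simple predicate. The example values below are hypothetical.

```python
# Case definition of moderate-to-severe hepatotoxicity, as stated above;
# a sketch applied to hypothetical laboratory values.

def hepatotoxicity(bilirubin_umol_l, ast_u_l, alt_u_l, alk_phos_u_l):
    """True if any biochemical threshold from the study definition is exceeded."""
    return (bilirubin_umol_l > 60
            or ast_u_l > 200
            or alt_u_l > 200
            or alk_phos_u_l > 1200)

print(hepatotoxicity(12, 250, 40, 300))  # True: AST exceeds 200 U/L
print(hepatotoxicity(12, 35, 40, 300))   # False: all values within thresholds
```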
We compared the differences in cerebrovascular and cognitive function between 13 aerobic exercise trained, older adults and 13 age-, height- and sex-matched sedentary, untrained controls. We determined whether other measures accounted for differences in cerebrovascular and cognitive function between these groups and examined the associations between these functions. Participants undertook anthropometric, mood, cardiovascular, exercise performance, strength, cerebrovascular, and cognitive measurements, and a blood collection. Transcranial Doppler ultrasonography determined cerebrovascular responsiveness (CVR) to hypercapnia and cognitive stimuli. The trained group had a higher CVR to hypercapnia (80.3 ± 7.2 vs 35.1 ± 6.7%, P < 0.001), CVR to cognitive stimuli (30.1 ± 2.9 vs 17.8 ± 1.4%, P = 0.001), and total composite cognitive score (117 ± 2 vs 98 ± 4, P < 0.001) than the controls. These parameters no longer remained statistically different between the groups following adjustments for covariates. There were positive correlations between the total composite cognitive score and CVR to hypercapnia (r = 0.474, P = 0.014) and CVR to cognitive stimuli (r = 0.685, P < 0.001). We observed a relationship between cerebrovascular and cognitive function in older adults and an interaction between regular lifelong aerobic exercise training and cardiometabolic factors that may directly influence these functions.
SCOPE: Resistance of proteins to gastrointestinal digestion may play a role in determining immune‐mediated adverse reactions to foods. However, digestion studies have largely been restricted to purified proteins, and the impact of food processing and food matrices on protein digestibility is poorly understood. METHODS AND RESULTS: Digestibility of a total gliadin fraction (TGF), flour (cv Hereward), and bread was assessed using in vitro batch digestion with simulated oral, gastric, and duodenal phases. Protein digestion was monitored by SDS‐PAGE and immunoblotting using monoclonal antibodies specific for celiac‐toxic sequences (QQSF, QPFP), and starch digestion by measuring undigested starch. Whereas the TGF was rapidly digested during the gastric phase, the gluten proteins in bread were virtually undigested, and were digested rapidly during the duodenal phase only if amylase was included. Duodenal starch digestion was also slower in the absence of duodenal proteases. CONCLUSION: The baking process reduces the digestibility of wheat gluten proteins, including those containing sequences active in celiac disease. Starch digestion affects the extent of protein digestion, probably because of gluten‐starch complex formation during baking. Digestion studies using purified protein fractions alone are therefore not predictive of digestion in complex food matrices.