Obesity is associated with increased mortality, and weight loss trials show rapid improvement in many mortality risk factors. Yet, observational studies typically associate weight loss with higher mortality risk. The purpose of this meta-analysis of randomized controlled trials (RCTs) of weight loss was to clarify the effects of intentional weight loss on mortality.
2,484 abstracts were identified and reviewed in PubMed, yielding 15 RCTs reporting (1) randomization to weight loss or non-weight loss arms, (2) duration of ≥18 months, and (3) deaths by intervention arm. Weight loss interventions were all lifestyle-based. Relative risks (RR) and 95% confidence intervals (95% CI) were estimated for each trial. For trials reporting at least one death (n = 12), a summary estimate was calculated using the Mantel-Haenszel method. Sensitivity analysis using sparse data methods included the remaining trials.
Trials enrolled 17,186 participants (53% female, mean age at randomization = 52 years). Mean body mass indices ranged from 30 to 46 kg/m², follow-up times ranged from 18 months to 12.6 years (mean: 27 months), and average weight loss in reported trials was 5.5 ± 4.0 kg. A total of 264 deaths were reported in weight loss groups and 310 in non-weight loss groups. The weight loss groups experienced a 15% lower all-cause mortality risk (RR = 0.85; 95% CI: 0.73-1.00). There was no evidence for heterogeneity of effect (Cochran's Q = 5.59, 11 d.f., p = 0.90; I² = 0). Results were similar in trials with a mean age at randomization ≥55 years (RR = 0.84; 95% CI: 0.71-0.99) and a follow-up time of ≥4 years (RR = 0.85; 95% CI: 0.72-1.00).
In obese adults, intentional weight loss may be associated with approximately a 15% reduction in all-cause mortality.
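The pooling approach described above, per-trial relative risks combined into a Mantel-Haenszel fixed-effect summary, can be sketched as follows. The trial counts here are hypothetical illustrations, not data from the meta-analysis.

```python
# Sketch of per-trial relative risks (RR) and a Mantel-Haenszel summary RR.
# Counts are hypothetical: (deaths_tx, n_tx, deaths_ctrl, n_ctrl) per trial.
trials = [
    (10, 500, 14, 500),
    (3, 200, 5, 210),
    (22, 1500, 30, 1480),
]

def trial_rr(a, n1, c, n0):
    """Relative risk for one trial: (a/n1) / (c/n0)."""
    return (a / n1) / (c / n0)

def mantel_haenszel_rr(trials):
    """Summary RR: sum(a_i*n0_i/N_i) / sum(c_i*n1_i/N_i), N_i = n1_i + n0_i."""
    num = sum(a * n0 / (n1 + n0) for a, n1, c, n0 in trials)
    den = sum(c * n1 / (n1 + n0) for a, n1, c, n0 in trials)
    return num / den

print([round(trial_rr(*t), 2) for t in trials])
print(round(mantel_haenszel_rr(trials), 2))
```

The Mantel-Haenszel estimator effectively weights each trial by its size, so large trials dominate the summary, which is why sparse-data methods were needed for the zero-death trials mentioned above.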
Houston explores the role of diet in lifespan and healthspan, highlighting lessons learned from 1945 to 2020. In 1946, Clive McCay and colleagues published a study in the Journal of Gerontology describing the effect of diet and coffee on lifespan in rats. At the time, nutrition research was largely focused on the role of vitamins and minerals in treating diseases of deficiency, but little was known about how improving the human diet based on these discoveries would affect lifespan. The understanding of nutrition has advanced considerably since that 1946 study, which showed that supplementing a diet commonly consumed in the northeastern US with either foods richer in vitamins and calcium or specific synthetic vitamins did not lengthen lifespan in rats. More recent research on diet patterns in humans has identified foods and food groups that are associated with decreased morbidity and mortality and increased quality of life.
Abstract
Background
Advances in computational algorithms and the availability of large datasets with clinically relevant characteristics provide an opportunity to develop machine learning prediction models to aid in the diagnosis, prognosis, and treatment of older adults. Some studies have employed machine learning methods for prediction modeling, but skepticism of these methods remains due to lack of reproducibility and difficulty in understanding the complex algorithms that underlie models. We aim to provide an overview of two common machine learning methods: decision tree and random forest. We focus on these methods because they provide a high degree of interpretability.
Method
We discuss the underlying algorithms of decision tree and random forest methods and present a tutorial for developing prediction models for serious fall injury using data from the Lifestyle Interventions and Independence for Elders (LIFE) study.
Results
A decision tree is a machine learning method that produces a model resembling a flow chart. A random forest consists of a collection of many decision trees whose results are aggregated. In the tutorial example, we discuss evaluation metrics and interpretation for these models. Illustrated using data from the LIFE study, prediction models for serious fall injury were moderate at best (area under the receiver operating characteristic curve of 0.54 for the decision tree and 0.66 for the random forest).
Conclusions
Machine learning methods offer an alternative to traditional approaches for modeling outcomes in aging, but their use should be justified and output should be carefully described. Models should be assessed by clinical experts to ensure compatibility with clinical practice.
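As a minimal illustration of the two methods the tutorial describes, the sketch below fits a single decision tree and a random forest with scikit-learn and evaluates both by area under the ROC curve. The synthetic features are hypothetical stand-ins; the LIFE study data are not reproduced here.

```python
# Decision tree vs. random forest on synthetic binary-outcome data,
# evaluated by area under the ROC curve (as in the tutorial example).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for clinical predictors of serious fall injury.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single decision tree: one flow-chart-like set of splits.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

# A random forest: many trees grown on bootstrap samples, predictions averaged.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

for name, model in [("decision tree", tree), ("random forest", forest)]:
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```

The tree remains fully inspectable (its splits can be printed as a flow chart), while the forest typically trades some of that interpretability for better discrimination, mirroring the 0.54 vs. 0.66 AUC gap reported in the tutorial.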
Background/objectives
Poor appetite in older adults leads to sub-optimal food intake and increases the risk of undernutrition, but how a poor appetite alters the composition of food intake in older adults is unknown. The aim of this study was to examine the differences in food intake among older community-dwelling adults with different reported appetite levels.
Design
Cross‐sectional analysis of data from a longitudinal prospective study.
Setting
Health, aging, and body composition study performed in the USA.
Participants
2,597 community‐dwelling adults aged 70–79.
Measurements
A semi‐quantitative, interviewer‐administered, 108‐item food frequency questionnaire designed to estimate dietary intake. Poor appetite was defined as the report of a moderate, poor, or very poor appetite in the past month and was compared with good or very good appetite.
Results
The mean age of the study sample was 74.5 ± 2.8 years; 48.2% were men, 37.7% were black, and 21.8% reported a poor appetite. After adjustment for total energy intake and potential confounders (including biting/chewing problems), participants with a poor appetite had a significantly lower consumption of protein and dietary fiber, solid foods, protein-rich foods, whole grains, fruits, and vegetables, but a higher consumption of dairy foods, fats, oils, sweets, and sodas compared with participants with a very good appetite. In addition, they were less likely to report consuming significantly larger portion sizes.
Conclusion
Older adults reporting a poor appetite showed a different dietary intake pattern compared to those with (very) good appetite. Better understanding of the specific dietary intake pattern related to a poor appetite in older adults can be used for nutrition interventions to enhance food intake, diet variety, and diet quality.
Background: In older adults, every 0.1-m/s slower gait speed is associated with a 12% higher mortality. However, little research has identified risk factors for gait-speed decline.
Objective: We assessed the association between several measures of body composition and age-related decline in gait speed.
Design: Data were from 2306 older adults who were participating in the Health, Aging, and Body Composition cohort and were followed for 4 y (50% women; 38% black). Usual walking speed (m/s) over 20 m was measured in years 2 through 6, and the baseline and changes in several measures of body composition were included in mixed-effects models.
Results: Gait speed declined by 0.06 ± 0.00 m/s over the 4-y period. Baseline thigh intermuscular fat predicted the annual gait-speed decline (±SE) in both men and women (−0.01 ± 0.00 and −0.02 ± 0.00 m/s per 0.57 cm², respectively; P < 0.01). In men, but not in women, this relation was independent of total body adiposity. In longitudinal analyses, changes in thigh intermuscular fat and total thigh muscle were the only body-composition measures that predicted gait-speed decline in men and women combined. When modeled together, every 5.75-cm² increase in thigh intermuscular fat was associated with a 0.01 ± 0.00-m/s decrease in gait speed, whereas every 16.92-cm² decrease in thigh muscle was associated with a 0.01 ± 0.00-m/s decrease in gait speed.
Conclusions: High and increasing thigh intermuscular fat are important predictors of gait-speed decline, implying that fat infiltration into muscle contributes to a loss of mobility with age. Conversely, a decreasing thigh muscle area is also predictive of a decline in gait speed.
IMPORTANCE Calcium intake has been promoted because of its proposed benefit on bone health, particularly among the older population. However, concerns have been raised about the potential adverse effect of high calcium intake on cardiovascular health. OBJECTIVE To investigate whether intake of dietary and supplemental calcium is associated with mortality from total cardiovascular disease (CVD), heart disease, and cerebrovascular diseases. DESIGN AND SETTING Prospective study from 1995 through 1996 in California, Florida, Louisiana, New Jersey, North Carolina, and Pennsylvania and the 2 metropolitan areas of Atlanta, Georgia, and Detroit, Michigan. PARTICIPANTS A total of 388,229 men and women aged 50 to 71 years from the National Institutes of Health–AARP Diet and Health Study. MAIN OUTCOME MEASURES Dietary and supplemental calcium intake was assessed at baseline (1995-1996). Supplemental calcium intake included calcium from multivitamins and individual calcium supplements. Cardiovascular disease deaths were ascertained using the National Death Index. Multivariate Cox proportional hazards regression models adjusted for demographic, lifestyle, and dietary variables were used to estimate relative risks (RRs) and 95% CIs. RESULTS During a mean of 12 years of follow-up, 7904 CVD deaths in men and 3874 in women were identified. Supplements containing calcium were used by 51% of men and 70% of women. In men, supplemental calcium intake was associated with an elevated risk of CVD death (RR for >1000 vs 0 mg/d, 1.20; 95% CI, 1.05-1.36), more specifically with heart disease death (RR, 1.19; 95% CI, 1.03-1.37) but not significantly with cerebrovascular disease death (RR, 1.14; 95% CI, 0.81-1.61). In women, supplemental calcium intake was not associated with CVD death (RR, 1.06; 95% CI, 0.96-1.18), heart disease death (RR, 1.05; 95% CI, 0.93-1.18), or cerebrovascular disease death (RR, 1.08; 95% CI, 0.87-1.33). Dietary calcium intake was unrelated to CVD death in either men or women.
CONCLUSIONS AND RELEVANCE Our findings suggest that high intake of supplemental calcium is associated with an excess risk of CVD death in men but not in women. Additional studies are needed to investigate the effect of supplemental calcium use beyond bone health.
Vitamin D is a steroid hormone precursor that is associated with a range of human traits and diseases. Previous GWAS of serum 25-hydroxyvitamin D concentrations have identified four genome-wide significant loci (GC, NADSYN1/DHCR7, CYP2R1, CYP24A1). In this study, we expand the previous SUNLIGHT Consortium GWAS discovery sample size from 16,125 to 79,366 (all of European descent). This larger GWAS yields two additional loci harboring genome-wide significant variants (P = 4.7×10 at rs8018720 in SEC23A, and P = 1.9×10 at rs10745742 in AMDHD1). The overall estimate of heritability of 25-hydroxyvitamin D serum concentrations attributable to GWAS common SNPs is 7.5%, with statistically significant loci explaining 38% of this total. Further investigation identifies signal enrichment in immune and hematopoietic tissues, and clustering with autoimmune diseases in cell-type-specific analysis. Larger studies are required to identify additional common SNPs, and to explore the role of rare or structural variants and gene-gene interactions in the heritability of circulating 25-hydroxyvitamin D levels.
Elevated markers of inflammation, such as interleukin-6 (IL-6), are associated with aging, cancer, and functional decline. We assessed the association of pre-diagnosis IL-6 levels with post-diagnosis functional trajectories among older adults with cancer. Because Black and White participants experience different social structures, we also sought to understand whether these associations differ between Black and White participants.
We conducted a secondary analysis of the Health, Aging and Body Composition (Health ABC) prospective longitudinal cohort study. Participants were recruited from 4/1997 to 6/1998. We included 179 participants with a new cancer diagnosis and an IL-6 level measured within 2 years before diagnosis. The primary endpoint was functional measures (self-reported ability to walk ¼ mile, 20-meter gait speed). Nonparametric longitudinal models were used to cluster the trajectories, and multinomial and logistic regressions to model associations.
Mean age was 74 years (SD 2.9); 36% identified as Black. For self-reported functional status, we identified 3 clusters: high stable, decline, and low stable. For gait speed, we identified 2 clusters: resilient and decline. The relationship between cluster trajectory and IL-6 differed between Black and White participants (p for interaction < 0.05). For gait speed, among White participants, a greater log IL-6 level was associated with greater odds of being in the decline vs. resilient cluster (adjusted odds ratio (AOR): 4.31; 95% CI: 1.43, 17.46). Among Black participants, greater log IL-6 levels were associated with lower odds of being in the decline vs. resilient cluster (AOR: 0.49; 95% CI: 0.10, 2.08). Directionality was similar for self-reported ability to walk ¼ mile (high stable vs. low stable). Among White participants, a higher log IL-6 level was numerically associated with greater odds of being in the low stable vs. high stable cluster (AOR: 1.99; 95% CI: 0.82, 4.85). Among Black participants, a higher log IL-6 level was numerically associated with lower odds of being in the low stable vs. high stable cluster (AOR: 0.78; 95% CI: 0.30, 2.00).
The association between IL-6 levels and functional trajectories of older adults differed by race. Future analyses exploring stressors faced by individuals from other minoritized racial backgrounds are needed to determine the association between IL-6 and functional trajectories.
Evidence before this study: Previous research has shown that aging is the greatest risk factor for cancer and older adults with cancer experience a higher burden of comorbidities, increasing their risk of functional decline. Race has also been shown to be associated with increased risk for functional decline. Black individuals are exposed to more chronic negative social determinants, compared to White individuals. Previous work has shown that chronic exposure to negative social determinants leads to elevated levels of inflammatory markers, such as IL-6, but studies investigating the relationship between inflammatory markers and functional decline are limited.
Added value of this study: The authors sought to understand the association between pre-diagnosis IL-6 levels and post-diagnosis functional trajectories in older adults with cancer, and whether these associations differed between Black and White participants with cancer. They used data from the Health, Aging and Body Composition (Health ABC) Study, a prospective longitudinal cohort study with a high representation of Black older adults that collected inflammatory cytokines and physical function data over time.
Implications of all available evidence: This work adds to the literature by providing an opportunity to study differences in the relationship between IL-6 levels and functional trajectories between older Black and White participants with cancer. Identifying factors associated with functional decline and its trajectories may inform treatment decision making and guide the development of supportive care interventions to prevent functional decline. Additionally, given the disparities in clinical outcomes for Black individuals, a better understanding of racial differences in functional decline may support the delivery of more equitable care.
• Identifying factors associated with functional trajectories may inform treatment decision making and guide interventions.
• The association between IL-6 levels and functional trajectories of older adults differed by race.
• Further studies are needed to examine reasons for these racial differences.
Advanced glycation end products (AGEs) promote adverse health effects and may contribute to the multi-system functional decline observed in aging. Diet is a major source of AGEs, and foods high in protein may increase circulating AGE concentrations. However, epidemiological evidence that high-protein diets increase AGEs is lacking.
We examined whether dietary protein intake was associated with serum concentrations of the major AGE carboxymethyl-lysine (CML) and the soluble receptor for AGEs (sRAGE) in 2439 participants from the Health, Aging, and Body Composition study (mean age, 73.6 ± 2.9 y; 52% female; 37% black).
CML and sRAGE were measured by ELISA, and the CML/sRAGE ratio was calculated. Protein intake was estimated using an interviewer-administered FFQ and categorized based on current recommendations for older adults: <0.8 g/kg/d (n = 1077), 0.8 to <1.2 g/kg/d (n = 922), and ≥1.2 g/kg/d (n = 440). Associations between protein intake and AGE-RAGE biomarkers were examined using linear regression models adjusted for demographics, height, lifestyle behaviors, prevalent disease, cognitive function, inflammation, and other dietary factors.
CML concentrations were higher in individuals with higher total protein intake (adjusted least squares mean ± SE: <0.8 g/kg/d, 829 ± 17 ng/ml; 0.8 to <1.2 g/kg/d, 860 ± 15 ng/ml; ≥1.2 g/kg/d, 919 ± 23 ng/ml; P for trend = 0.001), as were sRAGE concentrations (<0.8 g/kg/d, 1412 ± 34 pg/ml; 0.8 to <1.2 g/kg/d, 1479 ± 31 pg/ml; ≥1.2 g/kg/d, 1574 ± 47 pg/ml; P for trend < 0.0001). Every 0.1 g/kg/d increment in total protein intake was associated with a 13.3 ± 3.0 ng/ml increment in CML and a 22.1 ± 6.0 pg/ml increment in sRAGE (P < 0.0001 for both). Higher CML and sRAGE concentrations were also associated with higher intakes of both animal and vegetable protein (all P values ≤ 0.01). There were no significant associations with the CML/sRAGE ratio.
Higher dietary protein intake was associated with higher CML and sRAGE concentrations in older adults; however, the CML/sRAGE ratio remained similar across groups.
Obesity may accelerate age-related increases in aortic stiffness. Although aerobic exercise training generally has favorable effects on aortic structure and function, exercise alone may not be sufficient to improve aortic stiffness in older adults with obesity. We determined the effects of aerobic exercise training with and without moderate- to high-caloric restriction (CR) on the structure and function of the proximal aorta in 160 older (65-79 years) men and women with obesity (body mass index = 30-45 kg/m²).
Participants were randomly assigned to 1 of 3 groups: aerobic exercise training only (treadmill 4 days/week for 30 minutes at 65% to 70% of heart rate reserve; n=56), aerobic exercise training plus moderate CR (n=55), or aerobic exercise training plus more intensive CR (n=49) for 20 weeks. Aortic pulse wave velocity, aortic distensibility, and other measures of aortic structure and function were assessed by cardiovascular magnetic resonance imaging. Pearson correlation coefficients were examined to assess associations between changes in proximal aortic stiffness and changes in fitness, fatness, and other potential confounders.
Weight loss in the aerobic exercise training plus moderate CR (-8.0 kg; 95% CI, -9.17 to -6.87) and aerobic exercise training plus more intensive CR (-8.98 kg; 95% CI, -10.23 to -7.73) groups was significantly greater compared with the aerobic exercise training-only group (-1.66 kg; 95% CI, -2.94 to -0.38; P < 0.017 for both). There were significant treatment effects for descending aorta distensibility (P = 0.008) and strain (P = 0.004) and aortic arch pulse wave velocity (P = 0.01), with the aerobic exercise training plus moderate CR group having a 21% increase in distensibility (P = 0.016) and an 8% decrease in pulse wave velocity (P = 0.058). None of the aortic stiffness measures changed significantly in the aerobic exercise training-only or aerobic exercise training plus more intensive CR groups, and there were no significant changes in any other measure of aortic structure or function in these groups. Overall, increases in aortic distensibility were correlated with improvements in body weight and body fat distribution, but these associations were not statistically significant after adjustment for multiple comparisons.
In older adults with obesity, combining aerobic exercise with moderate CR leads to greater improvements in proximal aortic stiffness than exercise alone. Registration: URL: https://clinicaltrials.gov; Unique identifier: NCT01048736.