The availability of life-saving dialysis therapy has been one of the great successes of medicine in the past four decades. Over this period, despite treatment of hundreds of thousands of patients, the overall quality of life for patients with ESRD has not substantially improved. A narrow focus by clinicians and regulators on basic indicators of care, such as dialysis adequacy and anemia, has consumed time and resources without significantly improving survival; frequent hospitalizations and dissatisfaction with the care experience also persist. A new quality paradigm is needed to guide clinicians, providers, and regulators in ensuring that patients' lives are improved by the technically complex and costly therapy they receive. This paradigm can be envisioned as a quality pyramid: the foundation is the basic indicators, on which outstanding performance is necessary but not sufficient to drive the primary outcomes. Overall, these basics are currently well managed, yet they still receive excessive focus, largely because of publicly reported data and regulatory requirements. With a strong foundation in place, it is now time to focus on the more complex intermediate clinical outcomes: fluid management, infection control, diabetes management, medication management, and end-of-life care, among others. Successfully addressing these intermediate outcomes will drive improvements in the primary outcomes: better survival, fewer hospitalizations, a better patient experience with treatment, and ultimately improved quality of life. By articulating this view of quality in the ESRD program (pushing up the quality pyramid), the discussion about quality is reframed, and clinicians can better align their facilities with the direction of regulatory oversight and quality requirements.
Clinicians owe it to their patients, as the ESRD program celebrates its 40th anniversary, to rekindle the aspirations of the creators of the program, whose primary goal was to improve the lives of the patients afflicted with this devastating condition.
Background Sudden death is a leading cause of death in patients on maintenance hemodialysis therapy. During hemodialysis sessions, the gradient between serum and dialysate levels results in rapid electrolyte shifts, which may contribute to arrhythmias and sudden death. Controversies exist about the optimal electrolyte concentration in the dialysate; specifically, it is unclear whether patient outcomes differ among those treated with a dialysate potassium concentration of 3 mEq/L compared with 2 mEq/L. Study Design Prospective cohort study. Setting & Participants 55,183 patients from 20 countries in the Dialysis Outcomes and Practice Patterns Study (DOPPS) phases 1 to 5 (1996-2015). Predictor Dialysate potassium concentration at study entry. Outcomes Cox regression was used to estimate the association between dialysate potassium concentration and both all-cause mortality and an arrhythmia composite outcome (arrhythmia-related hospitalization or sudden death), adjusting for potential confounders. Results During a median follow-up of 16.5 months, 24% of patients died and 7% had an arrhythmia composite outcome. No meaningful difference in clinical outcomes was observed for patients treated with a dialysate potassium concentration of 3 versus 2 mEq/L (adjusted HRs of 0.96 [95% CI, 0.91-1.01] for mortality and 0.98 [95% CI, 0.88-1.08] for the arrhythmia composite). Results were similar across predialysis serum potassium levels. As in prior studies, higher serum potassium level was associated with adverse outcomes. However, dialysate potassium concentration had only minimal impact on serum potassium level measured predialysis (+0.09 [95% CI, 0.05-0.14] mEq/L serum potassium per 1 mEq/L greater dialysate potassium concentration). Limitations Data were not available for delivered (vs prescribed) dialysate potassium concentration and postdialysis serum potassium level; possible unmeasured confounding.
Conclusions In combination, these results suggest that approaches other than altering dialysate potassium concentration (eg, education on dietary potassium sources and prescription of potassium-binding medications) may merit further attention to reduce risks associated with high serum potassium levels.
Chronic kidney disease (CKD) is a national public health problem beset by inequities in incidence, prevalence, and complications across gender, race/ethnicity, and socioeconomic status. As health care providers, we can directly address some factors crucial for closing the disparities gap. Other factors are seemingly beyond our reach, entrenched within the fabric of our society, such as social injustice and human indifference. Paradoxically, the existence of health inequities provides unique, unrecognized opportunities for understanding biologic, environmental, sociocultural, and health care system factors that can lead to improved clinical outcomes. Several recent reports documented that structured medical care systems can reduce many CKD-related disparities and improve patient outcomes. Can the moral imperative to eliminate CKD inequities inspire the nephrology community not only to advocate for but also to demand high-quality, structured health care delivery systems for all Americans in the context of social reform that improves the ecology, health, and well-being of our communities? If so, then perhaps we can eliminate the unacceptable premature morbidity and mortality associated with CKD and the tragedy of health inequities. By so doing, we could become global leaders not only in medical technology, as we currently are, but also in health promotion and disease prevention, truly leaving no patient behind.
In individuals with chronic kidney disease, high dietary phosphorus (P) burden may worsen hyperparathyroidism and renal osteodystrophy, promote vascular calcification and cardiovascular events, and increase mortality. In addition to the absolute amount of dietary P, its type (organic versus inorganic), source (animal versus plant derived), and ratio to dietary protein may be important. Organic P in such plant foods as seeds and legumes is less bioavailable because of limited gastrointestinal absorption of phytate-based P. Inorganic P is more readily absorbed by the intestine, and its presence in processed, preserved, or enhanced foods or soft drinks that contain additives may be underreported and not distinguished from the less readily absorbed organic P in nutrient databases. Hence, P burden from food additives is disproportionately high relative to its dietary content as compared with natural sources that are derived from organic (animal and vegetable) food proteins. Observational and metabolic studies indicate nutritional and longevity benefits of higher protein intake in dialysis patients. This presents challenges to providing appropriate nutrition because protein and P intakes are closely correlated. During dietary counseling of patients with chronic kidney disease, the absolute dietary P content as well as the P-to-protein ratio in foods should be addressed. Foods with the least amount of inorganic P, low P-to-protein ratios, and adequate protein content that are consistent with acceptable palatability and enjoyment to the individual patient should be recommended along with appropriate prescription of P binders. Provision of in-center and monitored meals during hemodialysis treatment sessions in the dialysis clinic may facilitate the achievement of these goals.
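The counseling strategy above (rank foods by phosphorus-to-protein ratio, then prefer low-ratio foods with adequate protein) can be sketched in a few lines. The food names and per-serving nutrient values below are illustrative assumptions for demonstration, not data from the text.

```python
# Sketch of ranking foods by phosphorus-to-protein ratio for dietary
# counseling. Food items and nutrient values are hypothetical examples.

def p_to_protein_ratio(phosphorus_mg: float, protein_g: float) -> float:
    """Return phosphorus-to-protein ratio in mg of P per g of protein."""
    if protein_g <= 0:
        raise ValueError("protein must be positive")
    return phosphorus_mg / protein_g

# Hypothetical per-serving values: (phosphorus in mg, protein in g).
foods = {
    "egg white": (15, 7.0),
    "chicken breast": (220, 27.0),
    "processed cheese": (380, 12.0),  # inorganic P from additives
}

# Rank foods from lowest to highest ratio; lower ratios are preferable
# for patients who still need adequate protein intake.
ranked = sorted(foods, key=lambda f: p_to_protein_ratio(*foods[f]))
```

Note that the ratio alone is not the whole recommendation; the abstract also stresses adequate absolute protein content and palatability, which a real counseling tool would weigh alongside the ratio.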
Background In contrast to the general population, higher body mass index (BMI) is associated with greater survival in patients receiving hemodialysis (HD; “obesity paradox”). We hypothesized that this paradoxical association between BMI and death may be modified by age and dialysis vintage. Study Design Retrospective observational study using a large HD patient cohort. Setting & Participants 123,383 maintenance HD patients treated in DaVita dialysis clinics between July 1, 2001, and June 30, 2006, with follow-up through September 30, 2009. Predictors Age, dialysis vintage, and time-averaged BMI. Time-averaged BMI was divided into 6 subgroups: <18.5, 18.5-<23.0, 23.0-<25.0, 25.0-<30.0, 30.0-<35.0, and ≥35.0 kg/m2. The BMI category of 23-<25 kg/m2 was used as the reference category. Outcomes All-cause, cardiovascular, and infection-related mortality. Results Mean BMI of study participants was 27 ± 7 kg/m2. Time-averaged BMI was <18.5 and ≥35 kg/m2 in 5% and 11% of patients, respectively. With progressively higher time-averaged BMI, there was progressively lower all-cause, cardiovascular, and infection-related mortality in patients younger than 65 years. In those 65 years or older, even though overweight/obese patients had lower mortality compared with underweight/normal-weight patients, sequential increases in time-averaged BMI > 25 kg/m2 added no additional benefit. Based on dialysis vintage, incident HD patients had greater all-cause and cardiovascular survival benefit with a higher time-averaged BMI compared with longer-term HD patients. Limitations Causality cannot be determined, and residual confounding cannot be excluded given the observational study design. Conclusions Higher BMI is associated with lower death risk across all age and dialysis vintage groups. This benefit is more pronounced in incident HD patients and those younger than 65 years.
Given the robustness of the survival advantage of higher BMI, examining interventions to maintain or even increase dry weight in HD patients, irrespective of age and vintage, is warranted.
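The six BMI subgroups above map directly to a simple binning function. The bin edges come from the abstract; the function and label names are ours, for illustration.

```python
# Assign a BMI value (kg/m2) to the six categories used in the study
# above; 23.0-<25.0 is the reference category.

def bmi_category(bmi: float) -> str:
    """Map BMI in kg/m2 to the study's six subgroup labels."""
    edges = [
        (18.5, "<18.5"),
        (23.0, "18.5-<23.0"),
        (25.0, "23.0-<25.0"),  # reference category
        (30.0, "25.0-<30.0"),
        (35.0, "30.0-<35.0"),
    ]
    for upper, label in edges:
        if bmi < upper:
            return label
    return ">=35.0"
```

Each bin is closed on the left and open on the right, matching the abstract's "18.5-<23.0" style notation, so a BMI of exactly 25.0 falls in the 25.0-<30.0 subgroup.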
Background Hyperkalemia has been associated with higher mortality in long-term hemodialysis (HD) patients. There are few data concerning the relationship between dietary potassium intake and outcome. Study Design The mortality predictability of dietary potassium intake from reported food items, estimated using the Block Food Frequency Questionnaire (FFQ) at the start of the cohort, was examined in a 5-year (2001-2006) cohort of 224 HD patients in Southern California using Cox proportional hazards regression. Setting & Participants 224 long-term HD patients from 8 DaVita dialysis clinics. Predictors Dietary potassium intake ranking using the Block FFQ. Outcomes 5-year survival. Results HD patients with higher potassium intake had greater dietary energy, protein, and phosphorus intakes and higher predialysis serum potassium and phosphorus levels. Greater dietary potassium intake was associated with significantly increased death HRs in unadjusted models and after incremental adjustments for case-mix, nutritional factors (including 3-month averaged predialysis serum creatinine, potassium, and phosphorus levels; body mass index; normalized protein nitrogen appearance; and energy, protein, and phosphorus intake), and inflammatory marker levels. HRs for death across the 3 higher quartiles of dietary potassium intake in the fully adjusted model (compared with the lowest quartile) were 1.4 (95% CI, 0.6-3.0), 2.2 (95% CI, 0.9-5.4), and 2.4 (95% CI, 1.1-7.5), respectively (P for trend = 0.03). Restricted cubic spline analyses confirmed the incremental mortality predictability of higher potassium intake. Limitations FFQs may underestimate individual potassium intake and should be used to rank dietary intake across the population.
Conclusions Higher dietary potassium intake is associated with increased death risk in long-term HD patients, even after adjustments for serum potassium level; dietary protein, energy, and phosphorus intake; and nutritional and inflammatory marker levels. The potential role of dietary potassium in the high mortality rate of HD patients warrants clinical trials.
There are conflicting research results about the survival differences between hemodialysis and peritoneal dialysis, especially during the first 2 years of dialysis treatment. Given the challenges of conducting randomized trials, differential rates of modality switch and transplantation, and time-varying confounding in cohort data during the first years of dialysis treatment, use of novel analytical techniques in observational cohorts can help examine the peritoneal dialysis versus hemodialysis survival discrepancy.
This study examined a cohort of incident dialysis patients who initiated dialysis in DaVita dialysis facilities between July of 2001 and June of 2004 and were followed for 24 months. This study used the causal modeling technique of marginal structural models to examine the survival differences between peritoneal dialysis and hemodialysis over the first 24 months, accounting for modality change, differential transplantation rates, and detailed time-varying laboratory measurements.
On dialysis treatment day 90, there were 23,718 incident dialysis patients (22,360 hemodialysis and 1,358 peritoneal dialysis). Incident peritoneal dialysis patients were younger, had fewer comorbidities, and were nine and three times more likely, respectively, to switch dialysis modality and to receive a kidney transplant over the 2-year period compared with hemodialysis patients. In marginal structural model analyses, peritoneal dialysis was associated with persistently greater survival independent of the known confounders, including dialysis modality switch and transplant censoring (death hazard ratio, 0.52; 95% confidence interval, 0.34-0.80).
Peritoneal dialysis seems to be associated with 48% lower mortality than hemodialysis over the first 2 years of dialysis therapy independent of modality switches or differential transplantation rates.
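Marginal structural models, as used above, handle time-varying confounding by reweighting each observation with stabilized inverse-probability-of-treatment weights. A real analysis estimates the probabilities with regression models over time-varying covariates; the minimal sketch below instead uses raw stratum proportions on toy data, purely to show the weight construction.

```python
from collections import Counter

# Stabilized inverse-probability-of-treatment weights, the building
# block of a marginal structural model. Toy records: (modality,
# confounder stratum); the data below are invented for illustration.
records = [
    ("PD", "young"), ("PD", "young"), ("HD", "young"),
    ("HD", "old"), ("HD", "old"), ("PD", "old"),
]

n = len(records)
treat_counts = Counter(t for t, _ in records)    # marginal treatment counts
joint_counts = Counter(records)                  # treatment-by-stratum counts
stratum_counts = Counter(s for _, s in records)  # stratum counts

def stabilized_weight(treatment: str, stratum: str) -> float:
    """Return P(treatment) / P(treatment | stratum)."""
    marginal = treat_counts[treatment] / n
    conditional = joint_counts[(treatment, stratum)] / stratum_counts[stratum]
    return marginal / conditional

weights = [stabilized_weight(t, s) for t, s in records]
```

A useful sanity check on stabilized weights is that they average close to 1, so the weighted pseudo-population stays about the same size as the original cohort.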
For some patients with kidney failure, particularly those who have limited life expectancy or severe comorbidities, the “standard” dialysis treatment regimen may be perceived as excessively burdensome and may not align well with the patient's own priorities. For such patients, a palliative approach to the provision of dialysis—whereby treatment is tailored to the needs of the individual so as to optimize quality of life and minimize disease‐related symptoms, but limit treatment burden—might offer a way to better align the delivery of care with the life goals of the patient. Here, we discuss the fundamental principles of palliative dialysis: the patients who might most benefit from this approach, treatment strategies and considerations for implementation, as well as potential barriers to its provision.
Maintenance hemodialysis (MHD) patients with larger body or fat mass have greater survival than those with normal to low mass. We hypothesized that mid-arm muscle circumference (MAMC), a conveniently measured surrogate of lean body mass (LBM), has a stronger association with clinical outcomes than triceps skinfold (TSF), a surrogate of fat mass.
The associations of TSF, MAMC, and serum creatinine, another LBM surrogate, with baseline short form 36 quality-of-life scores and 5-year survival were examined in 792 MHD patients. In a randomly selected subsample of 118 subjects, LBM was measured by dual-energy x-ray absorptiometry.
Dual-energy x-ray absorptiometry-assessed LBM correlated most strongly with MAMC and serum creatinine. Higher MAMC was associated with a better short form 36 mental health scale score and lower death hazard ratios (HRs) after adjustment for case-mix, malnutrition-inflammation-cachexia syndrome, and inflammatory markers. Adjusted death HRs were 1.00, 0.86, 0.69, and 0.63 for the first to fourth MAMC quartiles, respectively. Higher serum creatinine and TSF were also associated with lower death HRs, but these associations were attenuated after multivariate adjustments. When TSF and MAMC were dichotomized at their median values, high MAMC combined with either high or low TSF (compared with low MAMC/low TSF) exhibited the greatest survival, i.e., death HRs of 0.52 and 0.59, respectively.
Higher MAMC is a surrogate of larger LBM and an independent predictor of better mental health and greater survival in MHD patients. Sarcopenia-correcting interventions to improve clinical outcomes in this patient population warrant controlled trials.
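MAMC is indeed conveniently measured: it is derived from two bedside anthropometric measurements via the standard formula MAMC = MAC − π × TSF, where MAC is the mid-arm circumference. A small sketch, with the unit-handling convention (TSF recorded in mm, circumferences in cm) stated explicitly:

```python
import math

# Mid-arm muscle circumference (MAMC), the lean-body-mass surrogate
# discussed above, from mid-arm circumference (MAC) and triceps
# skinfold (TSF): MAMC = MAC - pi * TSF, with both terms in the same
# units. TSF is commonly recorded in mm, so it is converted to cm here.

def mamc_cm(mac_cm: float, tsf_mm: float) -> float:
    """Return MAMC in cm given MAC in cm and TSF in mm."""
    return mac_cm - math.pi * (tsf_mm / 10.0)

# Example: MAC 30 cm, TSF 12 mm gives MAMC of about 26.23 cm.
```

The π term arises because the skinfold caliper measures a double thickness of subcutaneous fat, which the formula subtracts from the arm circumference to approximate the muscle compartment's circumference.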
Uncorrected serum calcium concentration is the first mineral metabolism metric planned for use as a quality measure in the United States ESRD population. Few studies in patients undergoing either peritoneal dialysis (PD) or hemodialysis (HD) have assessed the association of uncorrected serum calcium concentration with clinical outcomes. We obtained data from 129,076 patients on dialysis (PD, 10,066; HD, 119,010) treated in DaVita, Inc. facilities between July 1, 2001, and June 30, 2006. After adjustment for potential confounders, uncorrected serum calcium <8.5 and ≥10.2 mg/dl were associated with excess mortality in patients on PD or HD (comparison group, uncorrected calcium 9.0 to <9.5 mg/dl). Additional adjustment for serum albumin concentration substantially attenuated the all-cause mortality hazard ratios (HRs) associated with uncorrected calcium <8.5 mg/dl (HR, 1.29; 95% CI, 1.16 to 1.44 for PD; HR, 1.17; 95% CI, 1.13 to 1.20 for HD) and amplified the HRs associated with calcium ≥10.2 mg/dl (HR, 1.65; 95% CI, 1.42 to 1.91 for PD; HR, 1.59; 95% CI, 1.53 to 1.65 for HD). Albumin-corrected calcium ≥10.2 mg/dl and serum phosphorus ≥6.4 mg/dl were also associated with increased risk for death, irrespective of dialysis modality. In summary, in a large nationally representative cohort of patients on dialysis, abnormalities in markers of mineral metabolism, particularly high concentrations of serum calcium and phosphorus, were associated with increased mortality risk. Additional studies are needed to investigate whether control of hypercalcemia and hyperphosphatemia in patients undergoing dialysis results in improved clinical outcomes.
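The uncorrected versus albumin-corrected distinction above hinges on the conventional Payne correction: add 0.8 mg/dl of calcium for each 1 g/dl that serum albumin falls below 4.0 g/dl. A minimal sketch:

```python
# Albumin-corrected serum calcium (conventional Payne correction):
# corrected Ca (mg/dl) = measured Ca + 0.8 * (4.0 - albumin in g/dl).
# This is why adjusting for albumin reshuffles the hazard ratios above:
# hypoalbuminemia makes uncorrected calcium read lower than the
# corrected value.

def corrected_calcium(calcium_mg_dl: float, albumin_g_dl: float) -> float:
    """Return albumin-corrected serum calcium in mg/dl."""
    return calcium_mg_dl + 0.8 * (4.0 - albumin_g_dl)

# Hypothetical hypoalbuminemic patient: an uncorrected calcium of
# 8.4 mg/dl with albumin 2.5 g/dl corrects to 9.6 mg/dl.
```

At a normal albumin of 4.0 g/dl, the correction is zero and the two values coincide, which is why the corrected and uncorrected measures diverge mainly in malnourished or inflamed patients.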