Frailty and Access to Kidney Transplantation
Haugen, Christine E; Chu, Nadia M; Ying, Hao; et al.
Clinical Journal of the American Society of Nephrology, April 2019, Volume 14, Issue 4
Journal Article
Peer reviewed
Open access
Frailty, a syndrome distinct from comorbidity and disability, is clinically manifested as decreased resistance to stressors and is present in up to 35% of patients with ESKD. It is associated with falls, hospitalizations, poor cognitive function, and mortality. Frailty is also associated with poor outcomes after kidney transplant, including delirium and mortality. Given its association with poor outcomes on dialysis and post-transplant, frailty is likely also associated with decreased access to kidney transplantation. Yet clinicians have difficulty identifying which patients are frail; we therefore sought to quantify whether frail kidney transplant candidates have the same access to kidney transplantation as nonfrail candidates.
We studied 7078 kidney transplant candidates (2009-2018) in a three-center prospective cohort study of frailty. Fried frailty (unintentional weight loss, grip strength, walking speed, exhaustion, and activity level) was measured at outpatient kidney transplant evaluation. We estimated time to listing and transplant rate by frailty status using Cox proportional hazards and Poisson regression, adjusting for demographic and health factors.
The mean age was 54 years (SD 13; range, 18-89), 40% were women, 34% were black, and 21% were frail. Frail participants were 38% less likely to be listed for kidney transplantation (hazard ratio, 0.62; 95% confidence interval, 0.56 to 0.69; P < 0.001) compared with nonfrail participants, independent of age and other demographic factors. Furthermore, frail candidates were transplanted 32% less frequently than nonfrail candidates (incidence rate ratio, 0.68; 95% confidence interval, 0.58 to 0.81; P < 0.001).
Frailty is associated with a lower chance of listing and a lower rate of transplant, and it is a potentially modifiable risk factor.
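The ratio measures reported above reduce to simple arithmetic. A minimal sketch, assuming hypothetical event counts and person-time; only the ratio values and their percent readings come from the abstract:

```python
# Hedged sketch: how an incidence rate ratio (IRR) translates into the
# percentages quoted in the abstract. Counts and person-years below are
# hypothetical; only the 0.68 / "32% less frequently" pairing is reported.

def incidence_rate_ratio(events_a, persontime_a, events_b, persontime_b):
    """IRR = event rate in group A divided by event rate in group B."""
    return (events_a / persontime_a) / (events_b / persontime_b)

def percent_lower(ratio):
    """A ratio r < 1 corresponds to a (1 - r) * 100 percent lower rate."""
    return (1 - ratio) * 100

# hypothetical: 17 transplants over 100 person-years (frail) vs
# 25 transplants over 100 person-years (nonfrail)
irr = incidence_rate_ratio(17, 100, 25, 100)
print(round(irr, 2), round(percent_lower(irr)))  # -> 0.68 32
```

The same (1 - r) reading applies to the hazard ratio of 0.62, i.e., roughly 38% lower instantaneous rate of listing.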
Objective: Guidelines from the Society for Vascular Surgery and the Choosing Wisely campaign recommend that peripheral vascular interventions (PVIs) be limited to claudication patients with lifestyle-limiting symptoms only after a failed trial of medical and exercise therapy. We sought to explore practice patterns and physician characteristics associated with early PVI after a new claudication diagnosis to evaluate adherence to these guidelines.
Methods: We used 100% Medicare fee-for-service claims to identify patients diagnosed with claudication for the first time between 2015 and 2017. Early PVI was defined as an aortoiliac or femoropopliteal PVI performed within 6 months of the initial claudication diagnosis. A physician-level PVI utilization rate was calculated for physicians who diagnosed >10 claudication patients and performed at least one PVI (regardless of indication) during the study period. Hierarchical multivariable logistic regression was used to identify physician-level factors associated with early PVI.
Results: Of 194,974 patients with a first-time diagnosis of claudication during the study period, 6286 (3.2%) underwent early PVI. Among the 5664 physicians included in the analysis, the median physician-level early PVI rate was low at 0% (range, 0%-58.3%). However, 320 physicians (5.6%) had an early PVI rate ≥14% (≥2 standard deviations above the mean). After accounting for patient characteristics, a higher percentage of services delivered in ambulatory surgery center or office settings was associated with higher PVI utilization (vs 0%-22%; 23%-47%: adjusted odds ratio [aOR], 1.23; 48%-68%: aOR, 1.49; 69%-100%: aOR, 1.72; all P < .05). Other risk-adjusted physician factors independently associated with high PVI utilization included male sex (aOR, 2.04), fewer years in practice (vs ≥31 years; 11-20 years: aOR, 1.23; 21-30 years: aOR, 1.13), rural location (aOR, 1.25), and a lower-volume claudication practice (vs ≥30 patients diagnosed during the study period; ≤17 patients: aOR, 1.30; 18-29 patients: aOR, 1.35; all P < .05).
Conclusions: Outlier physicians with a high early PVI rate for patients newly diagnosed with claudication are identifiable using a claims-based practice pattern measure. Given the shared Society for Vascular Surgery and Choosing Wisely initiative goal of avoiding interventions as first-line treatment of claudication, confidential data-sharing programs using national benchmarks and educational guidance may be useful to address high utilization in the management of claudication.
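The adjusted odds ratios above come from hierarchical logistic regression, but the quantity they generalize is the plain odds ratio from a 2x2 table. A minimal sketch with hypothetical counts (none of these numbers appear in the abstract):

```python
# Hedged sketch of the (unadjusted) odds ratio underlying the abstract's
# aOR estimates. All counts are hypothetical, for illustration only.

def odds_ratio(exposed_events, exposed_nonevents,
               unexposed_events, unexposed_nonevents):
    """OR = odds of early PVI in the exposed group divided by the odds
    in the unexposed group."""
    odds_exposed = exposed_events / exposed_nonevents
    odds_unexposed = unexposed_events / unexposed_nonevents
    return odds_exposed / odds_unexposed

# hypothetical: 30 of 100 office-based physicians are high utilizers
# vs 15 of 100 hospital-based physicians
print(round(odds_ratio(30, 70, 15, 85), 2))  # -> 2.43
```

The hierarchical model in the study additionally adjusts for patient characteristics and clusters physicians, so its aORs are not reproducible from a single table like this.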
Distal revascularization and interval ligation (DRIL) is an effective approach to the management of hemodialysis access-related ischemia that offers both symptom relief and access salvage. The great saphenous vein (GSV) has been the most commonly used conduit. However, use of an ipsilateral arm vein allows the operation to be performed with the patient under regional anesthesia and might result in lower harvest site morbidity than the GSV. We sought to determine the suitability of DRIL using an arm vein compared with a GSV conduit.
All patients who had undergone DRIL from 2008 to 2019 were retrospectively identified in the electronic medical records. The characteristics and outcomes of those with an arm vein vs a GSV conduit were compared using the Wilcoxon, log-rank, and χ2 tests. Access patency was examined using Kaplan-Meier methods, with censoring at loss to follow-up or death.
A total of 66 patients who had undergone DRIL for hand ischemia were included in the present study. An arm vein conduit was used in 40 patients (median age, 65 years; 25% male) and a GSV conduit in 26 patients (median age, 58 years; 19% male). No significant differences in comorbidities were found between the two groups, with the exception of diabetes mellitus (arm vein group, 78%; GSV group, 50%; P = .02). No difference in ischemia stage at presentation was present between the groups, with most patients presenting with stage 3 ischemia. Also, no differences in patency of hemodialysis access after DRIL were found between the two groups (P = .96). At 12 and 24 months after DRIL, 86.9% (95% confidence interval [CI], 68.3%-94.9%) and 82.0% (95% CI, 61.3%-92.3%) of patients with an arm vein conduit had access patency compared with 93.8% (95% CI, 63.2%-99.1%) and 76.9% (95% CI, 43.0%-92.2%) of those with a GSV conduit, respectively. All but one patient had symptom resolution. The incidence of wound complications was significantly greater in the GSV group than in the arm vein group (46% vs 11%; P = .003). The DRIL bypass had remained patent in all but one patient in each group, with a median follow-up of 18 months (range, 1-112 months) in the arm vein conduit group and 15 months (range, 0.25-105 months) in the GSV conduit group.
DRIL procedures using an arm vein have advantages over those performed with the GSV. In our series, symptom resolution and access salvage were similar, but distinctly fewer wound complications occurred in the arm vein group. Additionally, use of an arm vein conduit avoids the need for general anesthesia. If an ipsilateral arm vein is available, it should be the conduit of choice when performing DRIL.
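The access-patency percentages above come from Kaplan-Meier estimation. A minimal product-limit sketch, using hypothetical follow-up times rather than the study's data:

```python
# Hedged sketch of the Kaplan-Meier product-limit estimator used for the
# access-patency curves. Times and event flags below are hypothetical.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at event times. events[i] is 1 for an event
    (patency loss) and 0 for censoring (loss to follow-up or death)."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk          # product-limit step
            curve.append((t, surv))
        at_risk -= removed                        # drop events + censored
    return curve

# hypothetical months of follow-up: events at 3 and 10, censoring otherwise
km = kaplan_meier([3, 6, 10, 12, 15], [1, 0, 1, 0, 0])
print([(t, round(s, 3)) for t, s in km])  # -> [(3, 0.8), (10, 0.533)]
```

Censored patients still count in the risk set until their censoring time, which is why the second step divides by 3, not 4.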
Transplanting the Untransplantable
Holscher, Courtenay M.; Jackson, Kyle R.; Segev, Dorry L.
American Journal of Kidney Diseases, January 2020, Volume 75, Issue 1
Journal Article
Peer reviewed
With implementation of the Kidney Allocation System, the growth of kidney paired donation programs, and advances in desensitization and immunosuppression, the outlook for “untransplantable” kidney transplantation candidates has never been more promising. The Kidney Allocation System prioritized compatible matches for candidates with calculated panel-reactive antibody levels of 98%, 99%, or 100% and broadened allocation of non-A1 and non–A1-B subgroup kidneys to blood group type B candidates. Concurrently, the growth of kidney paired donation programs and the use of incompatible transplantation as part of kidney paired donation to achieve “more compatible” kidney transplantation have improved options for candidates with an incompatible living donor. Finally, advances in desensitization and immunosuppression have strengthened the ability to manage donor-specific antibodies and antibody-mediated rejection. Although no patient should be labeled “untransplantable” due to blood group type or donor-specific antibody, all candidates should be provided with individualized and realistic counseling regarding their anticipated wait times for deceased donor or kidney paired donation matching, with early referral to expert centers when needed. In this Perspective, we consider blood group type ABO incompatibility, HLA antigen incompatibility, antibody-mediated rejection, kidney paired donation, and recent developments in incompatible transplantation in more depth and recommend an approach to the sensitized candidate.
IMPORTANCE: In light of the growing population of older adults in the United States, older donors (aged ≥70 years) represent an expansion of the donor pool; however, their organs are underused. Liver grafts from older donors were historically associated with poor outcomes and higher discard rates, but clinical protocols, organ allocation, and the donor pool have changed in the past 15 years. OBJECTIVE: To evaluate trends in demographics, discard rates, and outcomes among older liver donors and transplant recipients of livers from older donors in a large national cohort. DESIGN, SETTING, AND PARTICIPANTS: Prospective cohort study of 4127 liver grafts from older donors and 3350 liver-only recipients of older donor grafts and 78,990 liver grafts from younger donors (aged 18-69 years) and 64,907 liver-only recipients of younger donor grafts between January 1, 2003, and December 31, 2016, in the United States. The Scientific Registry of Transplant Recipients, which includes data on all transplant recipients in the United States that are submitted by members of the Organ Procurement and Transplantation Network, was used. EXPOSURES: Year of liver transplant and age of liver donor. MAIN OUTCOMES AND MEASURES: Odds of graft discard and posttransplant outcomes of all-cause graft loss and mortality. RESULTS: In this study, 4127 liver grafts from older donors were recovered for liver transplant across the study period (2003-2016); 747 liver grafts from older donors were discarded, and 3350 liver grafts from older donors were used for liver-only recipients.
After adjusting for donor characteristics other than age and accounting for Organ Procurement Organization–level variation, liver grafts from older donors were more likely to be discarded compared with liver grafts from younger donors in 2003-2006 (adjusted odds ratio [aOR], 1.97; 95% CI, 1.68-2.31), 2007-2009 (aOR, 2.55; 95% CI, 2.17-3.01), 2010-2013 (aOR, 2.04; 95% CI, 1.68-2.46), and 2014-2016 (aOR, 2.37; 95% CI, 1.96-2.86) (P < .001 for all). Transplants of liver grafts from older donors represented a progressively lower proportion of all adult liver transplants, from 6.0% (n = 258 recipients) in 2003 to 3.2% (n = 211 recipients) in 2016 (P = .001). However, outcomes in recipients of grafts from older donors improved over time, with 40% lower graft loss risk (adjusted hazard ratio, 0.60; 95% CI, 0.53-0.68; P < .001) and 41% lower mortality risk (adjusted hazard ratio, 0.59; 95% CI, 0.52-0.68; P < .001) in 2010 through 2016 vs 2003 through 2009; these results were beyond the general temporal improvements in graft loss (interaction P = .03) and mortality risk (interaction P = .04) among recipients of liver grafts from younger donors. CONCLUSIONS AND RELEVANCE: These findings show that from 2003 to 2016, liver graft loss and mortality among recipients of liver grafts from older donors improved; however, discard of liver grafts from older donors remained elevated, and the number of transplants performed with liver grafts from older donors decreased. Expansion of the donor pool through broader use of liver grafts from older donors might be reasonable.
Transplant candidates who accept a kidney labeled increased risk for disease transmission (IRD) accept a low risk of window period infection, yet those who decline must wait for another offer that might harbor other risks or might never come. To characterize the survival benefit of accepting IRD kidneys, we used 2010-2014 Scientific Registry of Transplant Recipients data to identify 104,998 adult transplant candidates who were offered IRD kidneys that were eventually accepted by someone; the median (interquartile range) Kidney Donor Profile Index (KDPI) of these kidneys was 30 (16-49). We followed patients from the offer decision until death or end of study. After 5 years, only 31.0% of candidates who declined IRDs had received non-IRD deceased donor kidney transplants; the median KDPI of these non-IRD kidneys was 52, compared with 21 for the IRDs they had declined. After a brief risk period in the first 30 days following IRD acceptance (adjusted hazard ratio [aHR] accept vs decline, 2.06; 95% CI, 1.22-3.49; P = .008; absolute mortality, 0.8% vs 0.4%), those who accepted IRDs were at 33% lower risk of death 1-6 months postdecision (aHR, 0.67; 95% CI, 0.50-0.90; P = .006) and at 48% lower risk of death beyond 6 months postdecision (aHR, 0.52; 95% CI, 0.46-0.58; P < .001). Accepting an IRD kidney was associated with a substantial long-term survival benefit; providers should consider this benefit when counseling patients on IRD offer acceptance.
Kidney transplant candidates in the United States who accept kidneys classified as increased risk for disease transmission have better long‐term survival than those who decline these kidneys. See Kaul's editorial on page 535.
Deceased donor kidney transplantation (DDKT) rates for highly sensitized (HS) candidates increased early after implementation of the Kidney Allocation System (KAS) in 2014. However, this may represent a bolus effect, and a granular investigation of the current state of DDKT for HS candidates remains lacking. We studied 270,722 DDKT candidates from the SRTR from 12/4/2011 to 12/3/2014 (“pre-KAS”) and 12/4/2014 to 12/3/2017 (“post-KAS”), analyzing DDKT rates for HS candidates using adjusted negative binomial regression. Post-KAS, candidates with the highest levels of sensitization had an increased DDKT rate compared with pre-KAS (cPRA 98%: adjusted incidence rate ratio [aIRR], 1.77; 95% CI, 1.27-2.46; P = .001; cPRA 99%: aIRR, 4.36; 95% CI, 3.18-5.98; P < .001; cPRA 99.5%-99.9%: aIRR, 24.29; 95% CI, 16.91-34.89; P < .001; cPRA 99.9%+: aIRR, 11.58; 95% CI, 8.79-15.26; P < .001). To determine whether these changes produced more equitable access to DDKT, we compared DDKT rates of HS candidates with those of non-HS candidates (cPRA 0%-79%). Post-KAS, cPRA 98% candidates had a DDKT rate equivalent to that of non-HS candidates (aIRR, 0.94; 95% CI, 0.65-1.36; P = .8), whereas cPRA 99% candidates had a higher DDKT rate (aIRR, 1.68; 95% CI, 1.19-2.38; P = .02). Although cPRA 99.5%-99.9% candidates had an increased DDKT rate compared with non-HS candidates (aIRR, 3.50; 95% CI, 2.46-4.98; P < .001), cPRA 99.9%+ candidates had a significantly lower DDKT rate (aIRR, 0.40; 95% CI, 0.29-0.56; P < .001). KAS has improved access to DDKT for HS candidates, although a substantial imbalance exists between cPRA 99.5%-99.9% and 99.9%+ candidates.
The authors examine the current state of deceased donor kidney transplantation for highly sensitized candidates under the Kidney Allocation System and find that transplant rates have become more balanced across calculated panel reactive antibody levels, though waitlist mortality has not significantly changed.
OBJECTIVE: To determine whether the association of frailty with waitlist mortality varies by candidate age.
BACKGROUND: Frailty, a construct developed in geriatrics, is a state of decreased physiologic reserve and is associated with mortality while awaiting liver transplantation (LT). However, older candidates have a high comorbidity burden and less physiologic reserve, so the relationship between frailty and waitlist mortality may vary by candidate age.
METHODS: We studied adults listed for LT at 2 transplant centers. The liver frailty index (grip strength, chair stands, balance) was measured at evaluation, with frailty defined as a liver frailty index ≥4.5. We compared the prevalence of frailty in older (≥65 yr) and younger (18-64 yr) candidates. We studied the association between frailty, age, their interaction, and waitlist mortality using competing risks regression adjusted for sex, BMI, and MELDNa.
RESULTS: Among 882 LT candidates, 16.6% were ≥65 years. Older candidates were more likely to be frail (33.3% vs 21.7%; P = 0.002). Older age (adjusted subhazard ratio [aSHR], 2.16; 95% CI, 1.51-3.09; P < 0.001) and frailty (aSHR, 1.92; 95% CI, 1.38-2.67; P < 0.001) were independently associated with a higher risk of waitlist mortality. However, the association between frailty and waitlist mortality did not vary by candidate age (aSHR of frailty for younger patients, 1.90; 95% CI, 1.28-2.80; P = 0.001; aSHR of frailty for older patients, 1.98; 95% CI, 1.07-3.67; P = 0.03; P interaction = 0.9).
CONCLUSIONS: Older candidates experienced higher rates of frailty than younger candidates. However, regardless of age, frailty was associated with a nearly 2-fold increased risk of waitlist mortality. Our data support the applicability of the frailty concept to the whole LT population and can guide the development of prehabilitation programs targeting frailty in LT patients of all ages.
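The analysis above uses competing risks regression because death on the waitlist competes with transplantation. The descriptive quantity behind that model is the cumulative incidence function; a minimal Aalen-Johansen sketch with hypothetical follow-up data (not the authors' code):

```python
# Hedged sketch of a cumulative incidence function under competing risks.
# All times and causes below are hypothetical, for illustration only.

def cumulative_incidence(times, causes, cause=1):
    """Aalen-Johansen estimate of the cumulative incidence of `cause`.
    causes[i]: 0 = censored, 1 = event of interest (waitlist death),
    2 = competing event (e.g., transplant)."""
    data = sorted(zip(times, causes))
    at_risk = len(data)
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = removed = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            c = data[i][1]
            d_any += c != 0
            d_cause += c == cause
            removed += 1
            i += 1
        cif += surv * d_cause / at_risk   # incidence mass added at t
        surv *= 1 - d_any / at_risk       # any event depletes the risk set
        curve.append((t, cif))
        at_risk -= removed
    return curve

# hypothetical months of follow-up: death at 2 and 9, transplant at 5,
# censoring at 5 and 12
curve = cumulative_incidence([2, 5, 5, 9, 12], [1, 2, 0, 1, 0])
print(round(curve[-1][1], 2))  # -> 0.5
```

Unlike one minus a Kaplan-Meier curve, this estimate does not treat transplanted patients as if they could still die on the waitlist, which is the point of the competing-risks framework.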
Length of stay (LOS) is a major driver of cost and resource utilization following lower extremity bypass (LEB). However, the variable comorbidity burden and mobility status of LEB patients make implementing enhanced recovery after surgery pathways challenging. The aim of this study was to use a large national database to identify patient factors associated with ultrashort LOS among patients undergoing LEB for peripheral artery disease.
All patients undergoing LEB for peripheral artery disease in the National Surgical Quality Improvement Project database from 2011 to 2018 were included. Patients were divided into two groups based on postoperative length of stay: ultrashort (≤2 days) and standard (>2 days). Thirty-day outcomes were compared using descriptive statistics, and multivariable logistic regression was used to identify patient factors associated with an ultrashort LOS.
Overall, 17,510 patients who underwent LEB were identified, of whom 2678 (15.3%) had an ultrashort postoperative LOS (mean, 1.8 days) and 14,832 (84.7%) had a standard LOS (mean, 7.1 days). When compared with patients with a standard LOS, patients with an ultrashort LOS were more likely to be admitted from home (95.9% vs 88.0%; P < .001), undergo elective surgery (86.1% vs 59.1%; P < .001), and be active smokers (52.1% vs 40.4%; P < .001). Patients with an ultrashort LOS were also more likely to have claudication as the indication for LEB (53.1% vs 22.5%; P < .001), a popliteal revascularization target rather than a tibial/pedal target (76.7% vs 55.3%; P < .001), and a prosthetic conduit (40.0% vs 29.9%; P < .001). There was no significant difference in mortality between the two groups (1.4% vs 1.8%; P = .21); however, patients with an ultrashort LOS had a lower frequency of unplanned readmission (10.7% vs 18.8%; P < .001) and need for major reintervention (1.9% vs 5.6%; P < .001). On multivariable analysis, elective status (odds ratio [OR], 2.66; 95% confidence interval [CI], 2.33-3.04), active smoking (OR, 1.18; 95% CI, 1.07-1.30), and lack of vein harvest (OR, 1.55; 95% CI, 1.41-1.70) were associated with an ultrashort LOS. Presence of rest pain (OR, 0.57; 95% CI, 0.51-0.63), tissue loss (OR, 0.30; 95% CI, 0.27-0.34), and totally dependent functional status (OR, 0.54; 95% CI, 0.35-0.84) were negatively associated with an ultrashort LOS. When examining the subgroup of patients who underwent vein harvest, totally dependent (OR, 0.38; 95% CI, 0.19-0.75) and partially dependent (OR, 0.53; 95% CI, 0.32-0.88) functional status remained negatively associated with an ultrashort LOS.
Ultrashort LOS (≤2 days) after LEB is uncommon but feasible in select patients. Preoperative functional status and mobility are important factors to consider when identifying LEB patients who may be candidates for early discharge.