Since the first attempt of pig‐to‐primate liver xenotransplantation (LXT) in 1968, survival has been limited. We evaluated a model utilizing α‐1,3‐galactosyltransferase knockout donors, continuous posttransplant infusion of human prothrombin complex concentrate, and immunosuppression including anti–thymocyte globulin, FK‐506, methylprednisolone, and costimulation blockade (belatacept, n = 3 or anti‐CD40 mAb, n = 1) to extend survival. Baboon 1 remained well until postoperative day (POD) 25, when euthanasia was required because of cholestasis and plantar ulcers. Baboon 2 was euthanized following a seizure on POD 5, despite normal liver function tests (LFTs) and no apparent pathology. Baboon 3 demonstrated initial stable liver function but was euthanized on POD 8 because of worsening LFTs. Pathology revealed C4d positivity, extensive hemorrhagic necrosis, and a focal cytomegalovirus inclusion. Baboon 4 was clinically well with stable LFTs until POD 29, when euthanasia was again necessitated by plantar ulcerations and rising LFTs. Final pathology was C4d negative and without evidence of rejection, inflammation, or thrombotic microangiopathy. Thus, nearly 1‐mo rejection‐free survival has been achieved following LXT in two of four consecutive recipients, demonstrating that the porcine liver can support life in primates for several weeks and has encouraging potential for clinical application as a bridge to allotransplantation for patients with acute‐on‐chronic or fulminant hepatic failure.
The addition of costimulation blockade, in conjunction with exogenous coagulation factors, prolongs survival following pig‐to‐primate liver xenotransplantation, achieving the longest survivals reported to date.
Steroid-sparing strategies have been attempted in recent decades to avoid morbidity from long-term steroid intake among kidney transplant recipients. Previous systematic reviews of steroid withdrawal after kidney transplantation have shown a significant increase in acute rejection. There are various protocols to withdraw steroids after kidney transplantation and their possible benefits or harms are subject to systematic review. This is an update of a review first published in 2009.
To evaluate the benefits and harms of steroid withdrawal or avoidance for kidney transplant recipients.
We searched the Cochrane Kidney and Transplant Specialised Register to 15 February 2016 through contact with the Information Specialist using search terms relevant to this review.
All randomised and quasi-randomised controlled trials (RCTs) in which steroids were avoided or withdrawn at any time point after kidney transplantation were included.
Risk of bias assessment and data extraction were performed by two authors independently, and disagreements were resolved by discussion. Statistical analyses were performed using the random-effects model; dichotomous outcomes were reported as relative risk (RR) and continuous outcomes as mean difference (MD), both with 95% confidence intervals (CI).
We included 48 studies (224 reports) that involved 7803 randomised participants. Of these, three studies were conducted in children (346 participants). The 2009 review included 30 studies (94 reports, 5949 participants). Risk of bias was assessed as low for sequence generation in 19 studies and for allocation concealment in 14 studies. Incomplete outcome data were adequately addressed in 22 studies, and 37 were free of selective reporting. The 48 included studies evaluated three different comparisons: steroid avoidance versus steroid maintenance, steroid withdrawal versus steroid maintenance, and steroid avoidance versus steroid withdrawal. For the adult studies there was no significant difference in patient mortality either in studies comparing steroid withdrawal versus steroid maintenance (10 studies, 1913 participants, death at one year post transplantation: RR 0.68, 95% CI 0.36 to 1.30) or in studies comparing steroid avoidance versus steroid maintenance (10 studies, 1462 participants, death at one year after transplantation: RR 0.96, 95% CI 0.52 to 1.80). Similarly, no significant difference in graft loss was found comparing steroid withdrawal versus steroid maintenance (8 studies, 1817 participants, graft loss excluding death with functioning graft at one year after transplantation: RR 1.17, 95% CI 0.72 to 1.92) or comparing steroid avoidance versus steroid maintenance (7 studies, 1211 participants, graft loss excluding death with functioning graft at one year after transplantation: RR 1.09, 95% CI 0.64 to 1.86). The risk of acute rejection was significantly increased in patients treated with steroids for less than 14 days after transplantation (7 studies, 835 participants: RR 1.58, 95% CI 1.08 to 2.30) and in patients who were withdrawn from steroids at a later time point after transplantation (10 studies, 1913 participants: RR 1.77, 95% CI 1.20 to 2.61).
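As an aside, the relative risks and 95% confidence intervals reported above follow the standard log-RR (Wald) method used in meta-analysis software. The sketch below reproduces that calculation from a 2×2 table of event counts; the counts are purely hypothetical and are not taken from any of the included studies.

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs. group B with a Wald 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) via the delta method
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 30/400 acute rejections with early steroid cessation
# versus 20/420 with steroid maintenance
rr, lower, upper = relative_risk(30, 400, 20, 420)
print(f"RR {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```

A CI that crosses 1.0 (as here) would be read as "no significant difference", matching how the review interprets its mortality and graft-loss estimates.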
There was no evidence to suggest a difference in harmful events, such as infection and malignancy, in adult kidney transplant recipients. The effect of steroid withdrawal in children is unclear.
This updated review strengthens the evidence that steroid avoidance and withdrawal after kidney transplantation significantly increase the risk of acute rejection. There was no evidence to suggest a difference in patient mortality or graft loss up to five years after transplantation, but the long-term consequences of steroid avoidance and withdrawal remain unclear because prospective long-term studies have not been conducted.
Xenotransplantation has the potential to alleviate the organ shortage that prevents many patients with end‐stage renal disease from enjoying the benefits of kidney transplantation. Despite significant advances in other models, pig‐to‐primate kidney xenotransplantation has met limited success. Preformed anti‐pig antibodies are an important component of the xenogeneic immune response. To address this, we screened a cohort of 34 rhesus macaques for anti‐pig antibody levels. We then selected animals with both low and high titers of anti‐pig antibodies to proceed with kidney transplant from galactose‐α1,3‐galactose knockout/CD55 transgenic pig donors. All animals received T‐cell depletion followed by maintenance therapy with costimulation blockade (either anti‐CD154 mAb or belatacept), mycophenolate mofetil, and steroids. The animal with the high titer of anti‐pig antibody rejected the kidney xenograft within the first week. Low‐titer animals treated with anti‐CD154 mAb, but not those treated with belatacept, exhibited prolonged kidney xenograft survival (>133 and >126 days vs. 14 and 21 days, respectively). Long‐term surviving animals treated with the anti‐CD154‐based regimen continue to have normal kidney function and preserved renal architecture, without evidence of rejection on biopsies sampled at day 100. This description of the longest reported survival of pig‐to‐non‐human primate kidney xenotransplantation, now >125 days, provides promise for further study and potential clinical translation.
Xenotransplantation using pig organs could end the donor organ shortage for transplantation, but humans have xenoreactive antibodies that cause early graft rejection. Genome editing can eliminate xenoantigens in donor pigs to minimize the impact of these xenoantibodies. Here we determine whether an improved cross-match and chemical immunosuppression could result in prolonged kidney xenograft survival in a pig-to-rhesus preclinical model.
Double xenoantigen (Gal and Sda) knockout (DKO) pigs were created using CRISPR/Cas. Serum from rhesus monkeys (n = 43) was cross-matched with cells from the DKO pigs. Kidneys from the DKO pigs were transplanted into rhesus monkeys (n = 6) that had the least reactive cross-matches. The rhesus recipients were immunosuppressed with anti-CD4 and anti-CD8 T-cell depletion, anti-CD154, mycophenolic acid, and steroids.
Rhesus antibody binding to DKO cells was reduced, but all animals still had positive CDC and flow cross-matches. Three grafts were rejected early, at 5, 6, and 6 days; longer survival was achieved in the remaining recipients, with grafts surviving 35, 100, and 435 days. Each of the three early graft losses was secondary to IgM antibody-mediated rejection. The 435-day graft loss occurred secondary to IgG antibody-mediated rejection.
Reducing xenoantigens in donor pigs combined with chemical immunosuppression can achieve prolonged renal xenograft survival in a preclinical model, suggesting that if a negative cross-match can be obtained for humans, then prolonged survival could be achieved.
Extending the functional integrity of renal allografts is the primary goal of transplant medicine. The development of donor‐specific antibodies (DSAs) posttransplantation leads to chronic active antibody‐mediated rejection (cAMR) and transplant glomerulopathy (TG), resulting in the majority of graft losses that occur in the United States. This reduces the quality and length of life for patients and increases cost. There are no approved treatments for cAMR. Evidence suggests the proinflammatory cytokine interleukin 6 (IL‐6) may play an important role in DSA generation and cAMR. We identified 36 renal transplant patients with cAMR plus DSAs and TG who failed standard of care treatment with IVIg plus rituximab with or without plasma exchange. Patients were offered rescue therapy with the anti–IL‐6 receptor monoclonal antibody tocilizumab, given as monthly infusions, and were monitored for DSAs and long‐term outcomes. Tocilizumab‐treated patients demonstrated graft survival and patient survival rates of 80% and 91% at 6 years, respectively. Significant reductions in DSAs and stabilization of renal function were seen at 2 years. No significant or severe adverse events were seen. Tocilizumab provides good long‐term outcomes for patients with cAMR and TG, especially compared with historical published treatments. Inhibition of the IL‐6–IL‐6 receptor pathway may represent a novel approach to stabilize allograft function and extend patient lives.
The anti–interleukin‐6 receptor monoclonal antibody tocilizumab preserves allograft survival and renal function with minimal side effects in >80% of patients with biopsy‐proven chronic antibody‐mediated rejection and transplant glomerulopathy who failed conventional treatment.
Many patients with hematologic malignancies cannot tolerate hematopoietic cell transplantation (HCT), whereas others may not have a compatible human leukocyte antigen–matched donor. To overcome these limitations, we optimized a conditioning regimen employing anti-CD45 radioimmunotherapy (RIT) replacing total body irradiation (TBI) before haploidentical HCT in a murine model. Mice received 200 to 400 μCi 90Y-anti-CD45 antibody (30F11), with or without fludarabine (5 days starting day –8), with cyclophosphamide (CY; days –2 and +2) for graft-versus-host disease prophylaxis, and 1.5 × 107 haploidentical donor bone marrow cells (day 0). Haploidentical bone marrow transplantation (BMT) with 300 μCi 90Y-anti-CD45 RIT and CY, without TBI or fludarabine, led to mixed chimeras with 81.3 ± 10.6% mean donor origin CD8+ cells detected 1 month after BMT, and remained stable (85.5 ± 11% mean donor origin CD8+ cells) 6 months after haploidentical BMT. High chimerism levels were induced across multiple hematopoietic lineages 28 days after haploidentical BMT with 69.3 ± 14.1%, 75.6 ± 20.2%, and 88.5 ± 11.8% CD3+ T cells, B220+ B cells, and CD11b+ myeloid cells, respectively. Fifty percent of SJL leukemia-bearing mice treated with 400 μCi 90Y-DOTA-30F11, CY, and haploidentical BMT were cured and lived >200 days. Mice treated with 200 μCi 90Y-DOTA-30F11 had a median overall survival of 73 days, while untreated leukemic mice had a median overall survival of 34 days (P < .001, Mantel-Cox test). RIT-mediated haploidentical BMT without TBI may increase treatment options for aggressive hematologic malignancies.
• Anti-CD45 RIT may replace TBI and simplify BMT-preparative regimens.
• Anti-CD45 RIT and haploidentical BMT, without TBI, prolong survival in a murine leukemia model.
Recurrent primary biliary cholangitis (rPBC) develops in approximately 30% of patients and negatively impacts graft and overall patient survival after liver transplantation (LT). There is a lack of data regarding the response rate to ursodeoxycholic acid (UDCA) in rPBC. We evaluated a large, international, multi-center cohort to assess the performance of PBC scores in predicting the risk of graft and overall survival after LT in patients with rPBC.
A total of 332 patients with rPBC after LT were evaluated from 28 centers across Europe, North America, and South America. The median age at the time of rPBC was 58.0 years (IQR 53.2–62.6), and 298 patients (90%) were female. The biochemical response was measured with serum levels of alkaline phosphatase (ALP) and bilirubin, and with the Paris-2, GLOBE, and UK-PBC scores at 1 year after UDCA initiation.
During a median follow-up of 8.7 years (IQR 4.3–12.9) after rPBC diagnosis, 52 patients (16%) had graft loss and 103 (31%) died. After 1 year of UDCA initiation, the histological stage at rPBC (hazard ratio [HR] 3.97, 95% CI 1.36–11.55, p = 0.01), use of prednisone (HR 3.18, 95% CI 1.04–9.73, p = 0.04), ALP ×ULN (HR 1.59, 95% CI 1.26–2.01, p <0.001), Paris-2 criteria (HR 4.14, 95% CI 1.57–10.92, p = 0.004), GLOBE score (HR 2.82, 95% CI 1.71–4.66, p <0.001), and UK-PBC score (HR 1.06, 95% CI 1.03–1.09, p <0.001) were associated with graft survival in the multivariate analysis. Similar results were observed for overall survival.
Patients with rPBC and disease activity, as indicated by standard PBC risk scores, have impaired outcomes, supporting efforts to treat recurrent disease in similar ways to pre-transplant PBC.
One in three people who undergo liver transplantation for primary biliary cholangitis develop recurrent disease in their new liver. Patients with recurrent primary biliary cholangitis and an incomplete response to ursodeoxycholic acid, according to conventional prognostic scores, have worse clinical outcomes, with a higher risk of graft loss and mortality, much as in the disease before liver transplantation. Our results support efforts to treat recurrent disease in similar ways to pre-transplant primary biliary cholangitis.
• Recurrent PBC (rPBC) develops in approximately 30% of patients and negatively impacts graft and overall patient survival after LT.
• Levels of alkaline phosphatase after 1 year of UDCA predict graft loss and mortality in patients with rPBC after LT.
• Prognostic scores for UDCA-treated patients predict graft loss and mortality in patients with rPBC after LT.
• Future studies to evaluate second-line treatments in patients with rPBC and incomplete response to UDCA are warranted.
One of the main unresolved questions in solid organ transplantation is how to establish indefinite graft survival that is free from long-term treatment with immunosuppressive drugs and chronic rejection (i.e., the establishment of tolerance). The failure to achieve this goal may be related to the difficulty in identifying the phenotype and function of the cell subsets that participate in the induction of tolerance. To address this issue, we investigated the suppressive roles of recipient myeloid cells that may be manipulated to induce tolerance to transplanted hearts in mice. Using depleting mAbs, clodronate-loaded liposomes, and transgenic mice specific for depletion of CD11c+, CD11b+, or CD115+ cells, we identified a tolerogenic role for CD11b+CD115+Gr1+ monocytes during the induction of tolerance by costimulatory blockade with CD40L-specific mAb. Early after transplantation, Gr1+ monocytes migrated from the bone marrow into the transplanted organ, where they prevented the initiation of adaptive immune responses that lead to allograft rejection and participated in the development of Tregs. Our results suggest that mobilization of bone marrow CD11b+CD115+Gr1+ monocytes under sterile inflammatory conditions mediates the induction of indefinite allograft survival. We propose that manipulating the common bone marrow monocyte progenitor could be a useful clinical therapeutic approach for inducing transplantation tolerance.
Management of High-risk Corneal Transplantation. Di Zazzo, Antonio, MD; Kheirkhah, Ahmad, MD; Abud, Tulio B., MD ...
Survey of Ophthalmology, 11/2017, Volume 62, Issue 6. Journal article, peer-reviewed, open access.
The cornea is the most commonly transplanted tissue in medicine. The main cause of corneal graft failure is allograft rejection. The incidence of graft rejection depends on the presence of high-risk characteristics, most notably corneal neovascularization. Although corneal grafting has high success rates in the absence of these risk factors, high-risk keratoplasty is associated with low success rates because of a high incidence of immune-mediated graft rejection. To improve the survival of high-risk corneal transplants, various preoperative, intraoperative, and postoperative measures can be considered; however, the key step in the management of these grafts is the long-term use of local and/or systemic immunosuppressive agents. Although a number of immunosuppressive agents have been employed for this purpose, the results vary significantly across different studies. This is partly due to the lack of an optimized method for their use, as well as the lack of a precise stratification of the degree of risk in each individual patient. New targeted biologic treatments, as well as tolerance-inducing methods, show promising horizons in the management of high-risk corneal transplantation in the near future.
Predicting long‐term outcomes in renal transplant recipients is essential to optimize medical therapy and determine the frequency of posttransplant histologic and serologic monitoring. Nonadherence and human leukocyte antigen (HLA) mismatch are risk factors that have been associated with poor long‐term outcomes and may help individualize care. In the present study, class II HLA mismatches were determined at the HLA epitope level in 195 renal transplant recipients in whom medication adherence was prospectively measured using electronic monitors in medication vial caps. Recipients were grouped by medication adherence and high (≥10 HLA‐DR, ≥17 HLA‐DQ) or low epitope‐mismatch load. We found that the combination of higher epitope mismatch and poor adherence acted synergistically to determine the risk of rejection or graft loss. Nonadherent recipients with HLA‐DR epitope mismatch ≥10 had increased graft loss (35% vs. 8%, p < 0.01) compared to adherent recipients with low epitope mismatch. At the HLA‐DQ locus, nonadherent recipients with HLA‐DQ epitope mismatch ≥17 had increased graft loss (33% vs. 10%, p < 0.01) compared to adherent recipients with low epitope mismatch. Subclinical nonadherence early posttransplant combined with HLA class II epitope mismatch may help identify recipients who could benefit from increased clinical, histologic, and serologic monitoring.
Using electronic monitors in medication vial caps and HLA class II epitope mismatch analysis, the authors find that medication nonadherence and HLA class II epitope mismatch act synergistically to predict renal allograft rejection and graft loss. See editorial from Glotz and Tambur on page 2021.