Donor-specific antibodies are associated with an increased risk of antibody-mediated rejection and decreased allograft survival. Reducing these antibodies therefore remains an unmet clinical need in transplantation. Plasma cells are a logical therapeutic target given their critical role in antibody production.
To target plasma cells, we treated sensitized rhesus macaques with daratumumab (anti-CD38 mAb). Before transplant, we sensitized eight macaques with two sequential skin grafts from MHC-mismatched donors; four of them were also desensitized with daratumumab and plerixafor (anti-CXCR4). We also treated two patients with daratumumab in the context of transplant.
The animals treated with daratumumab had significantly reduced donor-specific antibody levels compared with untreated controls (57.9% versus 13% reduction; p < 0.05) and prolonged renal graft survival (28.0 days versus 5.2 days; p < 0.01). However, the reduction in donor-specific antibodies was not maintained: all recipients demonstrated rapid antibody rebound with profound T cell-mediated rejection. In the two clinical patients, a combined heart and kidney transplant recipient with refractory antibody-mediated rejection and a highly sensitized heart transplant candidate, we also observed a significant decrease in class I and class II donor-specific antibodies, leading to clinical improvement of the antibody-mediated rejection and to heart graft access, respectively.
Targeting CD38 with daratumumab significantly reduced anti-HLA antibodies, including anti-HLA donor-specific antibodies, in a nonhuman primate model and in two clinical transplant cases, before and after transplant. This supports investigation of daratumumab as a potential therapeutic strategy; however, further research is needed regarding its use for both antibody-mediated rejection and desensitization.
Pre-transplant serum screening for anti-HLA antibodies is recommended for solid organ transplantation. Many laboratories use the less expensive bead-based screening assay as the main technique and, if positive, turn to single-antigen beads (SAB). We studied the correlation between these two immunoassays by re-analyzing the raw data of both assays in 3030 first organ transplant recipients explored with the two tests. A ROC curve analysis of the screening ratio was performed to predict a positive SAB assay; the AUCs were 0.72 and 0.64 for class I and class II, respectively. The optimal screening-ratio thresholds were 3.28 (class I) and 2.11 (class II). For both classes, the negative predictive value was low, around 40%, with 36% of discordant sera, defined as negative screening with positive SAB. Testing class I discordant sera on acid-treated SAB showed that 54% of the antibodies reacted against denatured HLA molecules. However, these screening-negative sera contained donor-specific antibodies in 13.9% and 28.7% of cases for class I and class II, respectively, and these were involved in antibody-mediated rejection with the same frequency as in non-discordant sera. Given the low predictive value of screening, both assays should be performed at least once on the same serum before transplantation.
Background
Sensitized patients, i.e. recipients with preformed donor-specific HLA antibodies (pfDSA), are at high risk of developing antibody-mediated rejection (AMR) and of dying after heart transplantation (HTx). Perioperative desensitization procedures are associated with better outcomes, but the cause of sensitization may influence their efficacy.
Methods
In sensitized patients (pfDSA>1000 mean immunofluorescence (MFI) units), we assessed the effect of perioperative desensitization by comparing treated patients to a historical control cohort. Multivariable survival analyses were performed on the time to main outcome, a composite of death and biopsy-proven AMR with 5-year follow-up.
Results
The study included 68 patients: 31 controls and 37 treated patients. There was no difference in preoperative variables between the two groups, including cumulative pfDSA: 4026 (1788; 8725) vs 4560 (3162; 13392) MFI units, p = 0.28. The cause of sensitization was pregnancy in 24/68 (35.3%), transfusion in 61/68 (89.7%), and previous HTx in 4/68 (5.9%) patients. Multivariable analysis yielded a significant protective association between desensitization and events (adjusted hazard ratio (HR) = 0.44, 95% confidence interval (95% CI) = 0.25-0.79, p = 0.006) and a deleterious association between cumulative pfDSA and events per 1000-MFI increase (adjusted HR = 1.028, 95% CI = 1.002-1.053, p = 0.031). There was a sex difference in the efficacy of desensitization: the benefit was significant in men (n = 35; unadjusted HR = 0.33, 95% CI = 0.14-0.78, p = 0.01) but not in women (n = 33; unadjusted HR = 0.52, 95% CI = 0.23-1.17, p = 0.11). In terms of number needed to treat, 2.1 men had to be treated to prevent one event, versus 3.1 women.
Conclusion
Perioperative desensitization was associated with fewer AMR and deaths after HTx, and efficacy was more pronounced in men than women.
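For context, number-needed-to-treat figures like those quoted in the results above follow directly from the absolute risk reduction (NNT = 1/ARR). A minimal sketch of that arithmetic, with event rates chosen purely for illustration (hypothetical values, not taken from the study):

```python
def number_needed_to_treat(event_rate_control: float, event_rate_treated: float) -> float:
    """NNT = 1 / ARR, where ARR is the absolute risk reduction."""
    arr = event_rate_control - event_rate_treated
    if arr <= 0:
        raise ValueError("no absolute risk reduction; NNT is undefined")
    return 1.0 / arr

# Hypothetical event rates for illustration only:
# 70% events in untreated vs 22% in treated -> ARR = 0.48, NNT ~ 2.1
print(round(number_needed_to_treat(0.70, 0.22), 1))  # -> 2.1
```

An NNT of 2.1 means that, on average, treating roughly two patients prevents one composite event under these assumed rates.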
The allele-specific antibody response against the polymorphic HLA system is the alloresponse marker that determines the immunological risk for graft acceptance before and after organ transplantation, and is therefore routinely studied during the patient's workup. Experimentally, bead-bound antigen-antibody reactions are detected using a dedicated multicolor flow cytometer (Luminex). For each sample, antibody responses against 96 different HLA antigen groups are routinely measured simultaneously, creating a 96-dimensional immune response vector. Under a common experimental protocol, we applied unsupervised clustering algorithms to these intensity vectors of anti-HLA class II responses from a dataset of 1,748 patients, before or after renal transplantation, residing in a single country. Each patient contributed only one serum sample to the analysis. A population-level view of the linear correlations of hierarchically ordered fluorescence intensities reveals patterns in human immune responses with striking similarities to the previously described CREGs, and also brings new information on the antigenic properties of class II HLA molecules. The same analysis confirms that "public" anti-DP antigenic responses are not correlated with anti-DR and anti-DQ responses, which tend to cluster together. Principal component analysis (PCA) projections also demonstrate ordering patterns that clearly differentiate anti-DP responses from anti-DR and anti-DQ responses on several orthogonal planes. We conclude that a computational view of the human alloresponse, using several dimensionality reduction algorithms, rediscovers proven patterns of immune reactivity without any a priori assumption and might prove helpful for a more accurate definition of the public immunogenic antigenic structures of HLA molecules. Furthermore, applying eigendecomposition to the immune response data generates new hypotheses that may guide the design of more effective patient monitoring tests.
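The dimensionality-reduction step described above can be sketched generically: each patient is a 96-dimensional vector of fluorescence intensities, and PCA via SVD of the mean-centered matrix gives the principal planes onto which responses are projected. This is an illustration on synthetic random data, not the study's dataset or its actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for Luminex single-antigen data: each row is one
# patient's 96-dimensional vector of fluorescence intensities (MFI).
# Values are random; real data would come from the assay.
n_patients, n_beads = 200, 96
mfi = rng.gamma(shape=2.0, scale=500.0, size=(n_patients, n_beads))

# PCA via SVD of the mean-centered matrix (the eigendecomposition view):
# the principal axes are the right singular vectors.
centered = mfi - mfi.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained_variance_ratio = (s ** 2) / np.sum(s ** 2)

# Project each patient onto the first two principal components, i.e. the
# kind of 2-D plane on which anti-DP vs anti-DR/DQ clusters could separate.
scores = centered @ vt[:2].T
print(scores.shape)  # (200, 2)
```

With real sera, each row of `scores` would place one patient on the first principal plane, and correlated bead groups (e.g. DR/DQ vs DP) would appear as separated directions in the loadings `vt`.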
Natural killer cells are the first lymphocyte subset to reconstitute and play a major role in early immunity after allogeneic hematopoietic stem cell transplantation. Cells expressing the activating receptor NKG2C seem crucial to the resolution of cytomegalovirus episodes, even in the absence of T cells. We prospectively investigated natural killer-cell reconstitution in a cohort of 439 adult recipients who underwent non-T-cell-depleted allogeneic hematopoietic stem cell transplantation between 2005 and 2012. Freshly collected blood samples were analyzed 3, 6, 12 and 24 months after transplantation. Data were studied with respect to conditioning regimen, source of stem cells, underlying disease, occurrence of graft-versus-host disease, and profiles of cytomegalovirus reactivation. In multivariate analysis, the absolute numbers of CD56bright natural killer cells at month 3 were significantly higher after myeloablative conditioning than after reduced-intensity conditioning. Acute graft-versus-host disease impaired the reconstitution of total and CD56dim natural killer cells at month 3. In contrast, a high natural killer cell count at month 3 was associated with a lower incidence of chronic graft-versus-host disease, independently of a previous episode of acute graft-versus-host disease and of stem cell source. NKG2C+CD56dim and total natural killer cell counts at month 3 were lower in patients with cytomegalovirus reactivation between month 0 and month 3, but expanded greatly afterwards. These cells were also less numerous in patients who experienced later cytomegalovirus reactivation between month 3 and month 6. Our results support a direct role of NKG2C-expressing natural killer cells in the early control of cytomegalovirus reactivation after allogeneic hematopoietic stem cell transplantation.
After heart transplantation, adding everolimus (EVL) to the standard immunosuppressive regimen mostly relies on converting calcineurin inhibitors (CNIs) to EVL. The aim of this study was to describe the effects of combining low-dose EVL and CNIs in a maintenance immunosuppression regimen (quadritherapy) and to compare it with standard tritherapy combining standard-dose CNIs, mycophenolate mofetil, and corticosteroids. In a 3-year registry cohort of heart transplant patients, those who received quadritherapy were compared with those who received tritherapy; EVL was added 3 months posttransplant. Three analyses were performed to control for confounders: propensity score matching, multivariable survival analysis, and inverse probability score weighting. Among the 213 patients included (75 with quadritherapy), propensity score matching selected 64 unique pairs of patients with similar characteristics. In the matched cohort (n = 128), quadritherapy was associated with fewer deaths (3 [4.7%] vs 17 [21.9%], P = .007) and fewer biopsy-proven acute rejections (15 [23.4%] vs 31 [48.4%], P = .002). These results were confirmed in the overall cohort (n = 213) by the multivariable and inverse probability score weighting analyses. Renal function and donor-specific HLA antibodies remained similar in both groups. Low-dose combination quadritherapy was associated with fewer deaths and rejections compared with standard immunosuppression tritherapy.
Four‐ compared to three‐drug therapy is associated with fewer deaths and biopsy‐proven acute rejections in a cohort of heart transplant recipients.
The long-term benefits of conversion from calcineurin inhibitors (CNIs) to belatacept in kidney transplant recipients (KTr) are poorly documented. We conducted a single-center retrospective study of first-time CNI-to-belatacept conversion as a rescue therapy for eGFR <30 ml/min/1.73 m², chronic histological lesions, or CNI-induced thrombotic microangiopathy (TMA). Patient and kidney allograft survival, eGFR, severe adverse events, donor-specific antibodies (DSA), and histological data were recorded over the 36 months after conversion.
We included 115 KTr. The leading indication for switching was chronic histological lesions with non-optimal eGFR (56.5%). Three years after conversion, patient and death-censored kidney allograft survival were 88% and 92%, respectively; eGFR increased significantly from 31.5 ± 17.5 to 36.7 ± 15.7 ml/min/1.73 m² (p < 0.01); the rejection rate was 10.4%; and the incidence of opportunistic infections (OIs) was 5.2 (2.9-7.6) per 100 person-years. Older age was associated with death; eGFR was associated with neither death nor allograft loss. No patient developed DSA at month 36 after conversion. CNI-induced TMA resolved in all cases without the use of eculizumab. Microvascular inflammation and chronic lesions remained stable.
Conversion from CNIs to belatacept as a rescue therapy after kidney transplantation is safe and beneficial irrespective of the timing of the switch, and could represent a good compromise in the face of organ shortage. Age and eGFR at conversion should be considered when deciding whether to switch.
A major hurdle to improving clinical care in kidney transplantation is the lack of biomarkers of the response to antibody-mediated rejection (ABMR) treatment. To discover such biomarkers, we investigated the value of complement-binding donor-specific anti-HLA antibodies (DSAs) for evaluating the response to treatment. The study encompassed a prospective cohort of 139 kidney recipients with ABMR receiving the standard-of-care treatment, including plasma exchange, intravenous immunoglobulin and rituximab. Patients were systematically assessed at the time of diagnosis and three months after treatment initiation for clinical and allograft histological characteristics and for anti-HLA DSAs, including their C1q-binding ability. After adjusting for clinical and histological parameters, post-treatment C1q-binding anti-HLA DSA was an independent and significant determinant of allograft loss (adjusted hazard ratio 2.57, 95% confidence interval 1.29-5.12). The 101 patients without post-treatment C1q-binding anti-HLA DSA showed a significantly improved glomerular filtration rate and significantly reduced glomerulitis, peritubular capillaritis, interstitial inflammation, tubulitis, C4d deposition, and endarteritis compared with the 38 patients with post-treatment C1q-binding anti-HLA DSA. A conditional inference tree model identified five prognostic groups at the time of post-treatment evaluation based on glomerular filtration rate, presence of cg lesions and C1q-binding anti-HLA DSA (cross-validated accuracy: 0.77). Thus, circulating complement-binding anti-HLA DSAs are strong and independent predictors of allograft outcome after standard-of-care treatment in kidney recipients with ABMR.
Background
Kidney allograft survival in human immunodeficiency virus (HIV)-positive patients is lower than in the general population. Belatacept increases long-term patient and allograft survival rates compared with calcineurin inhibitors (CNIs), but its use in HIV-positive recipients remains poorly documented.
Methods
We retrospectively report a French cohort of HIV-positive kidney allograft recipients who were switched from CNI to belatacept between June 2012 and December 2018. Patient and allograft survival rates, HIV immunovirological and clinical outcomes, acute rejection, opportunistic infections (OIs) and HLA donor-specific antibodies (DSAs) were analysed at 3 and 12 months and at the end of follow-up (last clinical visit attended after transplantation). Results were compared with those of a group of HIV-positive recipients treated with CNIs.
Results
Twelve patients were switched to belatacept 10 (2-25) months after transplantation. One year after initiation of belatacept, patient and allograft survival rates were both 92%; two (17%) HIV virological rebounds occurred, due to antiretroviral therapy non-compliance; and CD4+ and CD8+ T-cell counts remained stable over time. Serious adverse events included two (17%) acute steroid-resistant T-cell-mediated rejections and three (25%) OIs. Kidney allograft function increased significantly over the 12 post-switch months (P = 0.009), and DSAs remained stable at 12 months after treatment. The control group showed similar results in terms of patient and kidney allograft survival rates, DSA characteristics and proteinuria.
Conclusions
Switching from CNI to belatacept appears safe and may increase long-term kidney allograft survival in HIV-positive kidney allograft recipients. These results need to be confirmed in a larger cohort.