Autophagy has been linked to longevity in many species, but the underlying mechanisms are unclear. Using a GFP-tagged and a new tandem-tagged Atg8/LGG-1 reporter, we quantified autophagic vesicles and performed autophagic flux assays in multiple tissues of wild-type and long-lived daf-2/insulin/IGF-1 and glp-1/Notch mutants throughout adulthood. Our data are consistent with an age-related decline in autophagic activity in the intestine, body-wall muscle, pharynx, and neurons of wild-type animals. In contrast, daf-2 and glp-1 mutants displayed unique age- and tissue-specific changes in autophagic activity, indicating that the two longevity paradigms have distinct effects on autophagy during aging. Although autophagy appeared active in the intestine of both long-lived mutants, inhibition of intestinal autophagy significantly abrogated lifespan extension only in glp-1 mutants. Collectively, our data suggest that autophagic activity normally decreases with age in Caenorhabditis elegans, whereas daf-2 and glp-1 long-lived mutants regulate autophagy in distinct spatiotemporal-specific manners to extend lifespan.
T cell activation is a complex process that requires multiple cell signaling pathways, including a primary recognition signal and additional costimulatory signals. TCR signaling in the absence of costimulatory signals can lead to an abortive attempt at activation and subsequent anergy. One of the best-characterized costimulatory pathways includes the Ig superfamily members CD28 and CTLA-4 and their ligands CD80 and CD86. The development of the fusion protein CTLA-4-Ig as an experimental and subsequent therapeutic tool is one of the major success stories in modern immunology. Abatacept and belatacept are clinically approved agents for the treatment of rheumatoid arthritis and renal transplantation, respectively. Future interventions may include selective CD28 blockade to block the costimulatory potential of CD28 while exploiting the coinhibitory effects of CTLA-4.
Xenotransplantation using pig organs could end the donor organ shortage for transplantation, but humans have xenoreactive antibodies that cause early graft rejection. Genome editing can eliminate xenoantigens in donor pigs to minimize the impact of these xenoantibodies. Here we determine whether an improved cross-match and chemical immunosuppression could result in prolonged kidney xenograft survival in a pig-to-rhesus preclinical model.
Double xenoantigen (Gal and Sda) knockout (DKO) pigs were created using CRISPR/Cas. Serum from rhesus monkeys (n = 43) was cross-matched with cells from the DKO pigs. Kidneys from the DKO pigs were transplanted into rhesus monkeys (n = 6) that had the least reactive cross-matches. The rhesus recipients were immunosuppressed with anti-CD4 and anti-CD8 T-cell depletion, anti-CD154, mycophenolic acid, and steroids.
Rhesus antibody binding to DKO cells was reduced, but all recipients still had positive CDC and flow cross-matches. Three grafts were rejected early, at 5, 6, and 6 days; the remaining recipients survived longer, to 35, 100, and 435 days. Each of the three early graft losses was secondary to IgM antibody-mediated rejection. The 435-day graft loss occurred secondary to IgG antibody-mediated rejection.
Reducing xenoantigens in donor pigs and chemical immunosuppression can be used to achieve prolonged renal xenograft survival in a preclinical model, suggesting that if a negative cross-match can be obtained for humans then prolonged survival could be achieved.
The shortage of available organs remains the greatest barrier to expanding access to transplant. Despite advances in genetic editing and immunosuppression, survival in experimental models of kidney xenotransplant has generally been limited to <100 days. We found that pretransplant selection of recipients with low titers of anti‐pig antibodies significantly improved survival in a pig‐to–rhesus macaque kidney transplant model (median survival time 6 days vs 235 days). Immunosuppression included transient pan–T cell depletion and an anti‐CD154–based maintenance regimen. Selective depletion of CD4+ T cells but not CD8+ T cells resulted in long‐term survival (median survival time >400 days vs 6 days). These studies suggested that CD4+ T cells may have a more prominent role in xenograft rejection compared with CD8+ T cells. Although animals that received selective depletion of CD8+ T cells showed signs of early cellular rejection (marked CD4+ infiltrates), animals receiving selective CD4+ depletion exhibited normal biopsy results until late, when signs of chronic antibody rejection were present. In vitro study results suggested that rhesus CD4+ T cells required the presence of SLA class II to mount an effective proliferative response. The combination of low pretransplant anti‐pig antibody and CD4 depletion resulted in consistent, long‐term xenograft survival.
CD4 T cell depletion is critical to long‐term (beyond 1 year) survival of pig‐to‐nonhuman primate kidney xenografts.
Here we use a chromosome-level genome assembly of a prairie rattlesnake (Crotalus viridis), together with Hi-C, RNA-seq, and whole-genome resequencing data, to study key features of genome biology and evolution in reptiles. We identify the rattlesnake Z chromosome, including the recombining pseudoautosomal region, and find evidence for partial dosage compensation driven by an evolutionary accumulation of a female-biased up-regulation mechanism. Comparative analyses with other amniotes provide new insight into the origins, structure, and function of reptile microchromosomes, which we demonstrate have markedly different structure and function compared to macrochromosomes. Snake microchromosomes are also enriched for venom genes, which we show have evolved through multiple tandem duplication events in multiple gene families. By overlaying chromatin structure information and gene expression data, we find evidence for venom gene-specific chromatin contact domains and identify how chromatin structure guides precise expression of multiple venom gene families. Further, we find evidence for venom gland-specific transcription factor activity and characterize a complement of mechanisms underlying venom production and regulation. Our findings reveal novel and fundamental features of reptile genome biology, provide insight into the regulation of snake venom, and broadly highlight the biological insight enabled by chromosome-level genome assemblies.
Xenotransplantation has the potential to alleviate the organ shortage that prevents many patients with end‐stage renal disease from enjoying the benefits of kidney transplantation. Despite significant advances in other models, pig‐to‐primate kidney xenotransplantation has met limited success. Preformed anti‐pig antibodies are an important component of the xenogeneic immune response. To address this, we screened a cohort of 34 rhesus macaques for anti‐pig antibody levels. We then selected animals with both low and high titers of anti‐pig antibodies to proceed with kidney transplant from galactose‐α1,3‐galactose knockout/CD55 transgenic pig donors. All animals received T‐cell depletion followed by maintenance therapy with costimulation blockade (either anti‐CD154 mAb or belatacept), mycophenolate mofetil, and steroid. The animal with the high titer of anti‐pig antibody rejected the kidney xenograft within the first week. Low‐titer animals treated with anti‐CD154 antibody, but not belatacept, exhibited prolonged kidney xenograft survival (>133 and >126 vs. 14 and 21 days, respectively). Long‐term surviving animals treated with the anti‐CD154‐based regimen continue to have normal kidney function and preserved renal architecture without evidence of rejection on biopsies sampled at day 100. This description of the longest reported survival of pig‐to‐non‐human primate kidney xenotransplantation, now >125 days, provides promise for further study and potential clinical translation.
Broad paradigms of vertebrate genomic repeat element evolution have been largely shaped by analyses of mammalian and avian genomes. Here, based on analyses of genomes sequenced from over 60 squamate reptiles (lizards and snakes), we show that patterns of genomic repeat landscape evolution in squamates challenge such paradigms. Despite low variance in genome size, squamate genomes exhibit surprisingly high variation among species in abundance (ca. 25-73% of the genome) and composition of identifiable repeat elements. We also demonstrate that snake genomes have experienced microsatellite seeding by transposable elements at a scale unparalleled among eukaryotes, leading to some snake genomes containing the highest microsatellite content of any known eukaryote. Our analyses of transposable element evolution across squamates also suggest that lineage-specific variation in mechanisms of transposable element activity and silencing, rather than variation in species-specific demography, may play a dominant role in driving variation in repeat element landscapes across squamate phylogeny.
The myriad of co-stimulatory signals expressed, or induced, upon T-cell activation suggests that these signalling pathways shape the character and magnitude of the resulting autoreactive or alloreactive T-cell responses during autoimmunity or transplantation, respectively. Reducing pathological T-cell responses by targeting T-cell co-stimulatory pathways has met with therapeutic success in many instances, but challenges remain. In this Review, we discuss the T-cell co-stimulatory molecules that are known to have critical roles during T-cell activation, expansion, and differentiation. We also outline the functional importance of T-cell co-stimulatory molecules in transplantation, tolerance and autoimmunity, and we describe how therapeutic blockade of these pathways might be harnessed to manipulate the immune response to prevent or attenuate pathological immune responses. Ultimately, understanding the interplay between individual co-stimulatory and co-inhibitory pathways engaged during T-cell activation and differentiation will lead to rational and targeted therapeutic interventions to manipulate T-cell responses and improve clinical outcomes.
After a person has been injured, administration of plasma in addition to standard resuscitation procedures in the prehospital environment may reduce the risk of downstream complications from hemorrhage and shock. Data from large clinical trials are lacking to show either the efficacy or the risks associated with plasma transfusion in the prehospital setting.
To determine the efficacy and safety of prehospital administration of thawed plasma in injured patients who are at risk for hemorrhagic shock, we conducted a pragmatic, multicenter, cluster-randomized, phase 3 superiority trial that compared the administration of thawed plasma with standard-care resuscitation during air medical transport. The primary outcome was mortality at 30 days.
A total of 501 patients were evaluated: 230 patients received plasma (plasma group) and 271 received standard-care resuscitation (standard-care group). Mortality at 30 days was significantly lower in the plasma group than in the standard-care group (23.2% vs. 33.0%; difference, -9.8 percentage points; 95% confidence interval, -18.6 to -1.0; P=0.03). A similar treatment effect was observed across nine prespecified subgroups (heterogeneity chi-square test, 12.21; P=0.79). Kaplan-Meier curves showed an early separation of the two treatment groups that began 3 hours after randomization and persisted until 30 days after randomization (log-rank chi-square test, 5.70; P=0.02). The median prothrombin-time ratio was lower in the plasma group than in the standard-care group (1.2 [interquartile range, 1.1 to 1.4] vs. 1.3 [interquartile range, 1.1 to 1.6]; P<0.001) after the patients' arrival at the trauma center. No significant differences between the two groups were noted with respect to multiorgan failure, acute lung injury-acute respiratory distress syndrome, nosocomial infections, or allergic or transfusion-related reactions.
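The reported risk difference can be reproduced directly from the group proportions. The sketch below computes the difference and a naive (unadjusted) Wald 95% confidence interval; because the trial was cluster-randomized, the published interval (-18.6 to -1.0 percentage points) is wider than this unadjusted one, which ignores the cluster design.

```python
import math

# 30-day mortality proportions and group sizes, as reported in the abstract
p_plasma, n_plasma = 0.232, 230        # plasma group
p_standard, n_standard = 0.330, 271    # standard-care group

# Risk difference (plasma minus standard care): -0.098, i.e. -9.8 points
diff = p_plasma - p_standard

# Unadjusted Wald standard error for a difference of two proportions;
# the trial's cluster randomization inflates the true variance, so the
# published 95% CI is wider than the one computed here.
se = math.sqrt(p_plasma * (1 - p_plasma) / n_plasma
               + p_standard * (1 - p_standard) / n_standard)
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"difference = {diff:.3f}; "
      f"unadjusted 95% CI = ({ci_low:.3f}, {ci_high:.3f})")
```

Even the narrower unadjusted interval excludes zero, consistent with the reported P=0.03 for the cluster-adjusted analysis.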
In injured patients at risk for hemorrhagic shock, the prehospital administration of thawed plasma was safe and resulted in lower 30-day mortality and a lower median prothrombin-time ratio than standard-care resuscitation. (Funded by the U.S. Army Medical Research and Materiel Command; PAMPer ClinicalTrials.gov number, NCT01818427 .).
Recent work suggests that thermally stable nanocrystallinity in metals is achievable in several binary alloys by modifying grain boundary energies via solute segregation. The remarkable thermal stability of these alloys has been demonstrated in recent reports, with many alloys exhibiting negligible grain growth during prolonged exposure to near‐melting temperatures. Pt–Au, a proposed stable alloy consisting of two noble metals, is shown to exhibit extraordinary resistance to wear. Ultralow wear rates, less than a monolayer of material removed per sliding pass, are measured for Pt–Au thin films at a maximum Hertz contact stress of up to 1.1 GPa. This is the first instance of an all‐metallic material exhibiting a specific wear rate on the order of 10^−9 mm^3 N^−1 m^−1, comparable to diamond‐like carbon (DLC) and sapphire. Remarkably, the wear rates of the sapphire and silicon nitride probes used in the wear experiments are higher than or comparable to that of the Pt–Au alloy, despite the substantially higher hardness of the ceramic probe materials. High‐resolution microscopy shows negligible surface microstructural evolution in the wear tracks after 100k sliding passes. Mitigation of fatigue‐driven delamination enables a transition to wear by atomic attrition, a regime previously limited to highly wear‐resistant materials such as DLC.
A stable nanocrystalline alloy of Pt and Au is shown to be extremely resistant to mechanical abrasion and fatigue, having volumetric or specific wear rates comparable to diamond‐like carbon. This is the first report of a metal having such wear resistance.
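The link between a specific wear rate near 10^−9 mm^3 N^−1 m^−1 and "less than a monolayer removed per pass" can be checked with back-of-the-envelope arithmetic. The load and wear-track geometry below are hypothetical illustrative values, not numbers from the study; the specific wear rate K is defined as volume removed per unit load per unit sliding distance.

```python
# Back-of-the-envelope check: at K ~ 1e-9 mm^3/(N*m), how much average
# depth is removed in one sliding pass? All inputs are assumed values.
K = 1e-9               # specific wear rate, mm^3 per N per m
load_N = 1.0           # assumed normal load, N
track_len_mm = 1.0     # assumed wear-track length, mm
track_width_mm = 0.01  # assumed wear-track width (10 um), mm

# Volume removed in one pass; sliding distance = one track length,
# converted from mm to metres to match the units of K.
vol_per_pass_mm3 = K * load_N * (track_len_mm / 1000.0)

# Average depth removed per pass over the track footprint, in nm
depth_per_pass_mm = vol_per_pass_mm3 / (track_len_mm * track_width_mm)
depth_per_pass_nm = depth_per_pass_mm * 1e6

monolayer_nm = 0.23  # roughly one Pt atomic layer
print(f"~{depth_per_pass_nm:.1e} nm removed per pass "
      f"({depth_per_pass_nm / monolayer_nm:.1e} monolayers)")
```

Under these assumed conditions the average removal is orders of magnitude below one atomic layer per pass, so even the 100k-pass experiments would abrade only nanometers of material, consistent with the reported sub-monolayer-per-pass regime.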