While students entering medical schools are becoming more diverse, trainees in residency programs in competitive specialties and academic medicine faculty have not increased in diversity. As part of an educational continuous quality improvement process at the University of California, San Francisco, School of Medicine, the authors examined data for the classes of 2013-2016 to determine whether differences existed between underrepresented in medicine (UIM) and not-UIM students' clinical performance (clerkship director ratings and number of clerkship honors grades awarded) and honor society membership, all of which influence residency selection and academic career choices. This analysis demonstrated differences that consistently favored not-UIM students. Although the differences in clerkship director ratings were small in magnitude, UIM students received approximately half as many honors grades as not-UIM students and were one third as likely to be selected for honor society membership. The authors use these findings to illustrate the amplification cascade, a phenomenon in which small differences in assessed performance lead to larger differences in grades and selection for awards. The amplification cascade raises concerns about opportunities for UIM students to compete successfully for competitive residency programs and potentially enter academic careers. Using a fishbone diagram, a continuous quality improvement root cause analysis tool, the authors contextualize their institutional results. They describe potential causes of group differences, drawing from the education disparities literature, and propose interventions and future research. They also share countermeasures adopted at their institution and encourage other medical schools to consider similar exploration of their institutional data.
A critical step in breast cancer progression is local tissue invasion, during which cells pass from the epithelial compartment to the stromal compartment. We recently showed that malignant leader cells can promote the invasion of otherwise non-invasive epithelial follower cells, but the effects of this induced-invasion phenomenon on follower cell phenotype remain unclear. Notably, this process can expose epithelial cells to the stromal extracellular matrix (ECM), which is distinct from the ECM within the normal epithelial microenvironment. Here, we used a 3D epithelial morphogenesis model in which cells were cultured in biochemically and mechanically defined matrices to examine matrix-mediated gene expression and the associated phenotypic response. We found that 3D collagen matrix promoted expression of mesenchymal genes including MT1-MMP, which was required for collagen-stimulated invasive behavior. Epithelial invasion required matrix anchorage as well as signaling through Src, PI3K, and Rac1, and increasingly stiff collagen promoted dispersive epithelial cell invasion. These results suggest that leader cell-facilitated access to the stromal ECM may trigger an invasive phenotype in follower epithelial cells that could enable them to actively participate in local tissue invasion.
Self-selection into residential neighbourhoods is a widely acknowledged, but under-studied problem in research investigating neighbourhood influences on physical activity and diet. Failure to handle neighbourhood self-selection can lead to biased estimates of the association between the neighbourhood environment and behaviour. This means that effects could be over- or under-estimated, both of which have implications for public health policies related to neighbourhood (re)design. Therefore, it is important that methods to deal with neighbourhood self-selection are identified and reviewed. The aim of this review was to assess how neighbourhood self-selection is conceived and accounted for in the literature.
Articles from a systematic search undertaken in 2017 were included if they examined associations between neighbourhood environment exposures and adult physical activity or dietary behaviour. Exposures could include any objective measurement of the built (e.g., supermarkets), natural (e.g., parks) or social (e.g., crime) environment. Articles had to explicitly state that a given method was used to account for neighbourhood self-selection. The systematic review was registered with the PROSPERO International Prospective Register of Systematic Reviews (number CRD42018083593) and was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement.
Of 31 eligible articles, almost all considered physical activity (30/31); few examined diet (2/31). Methods used to address neighbourhood self-selection varied. Most studies (23/31) accounted for items relating to participants' neighbourhood preferences or reasons for moving to the neighbourhood using multivariable adjustment in regression models (20/23) or propensity scores (3/23). Of 11 longitudinal studies, three controlled for neighbourhood self-selection as an unmeasured confounder using fixed effects regression.
Most studies accounted for neighbourhood self-selection by adjusting for measured attributes of neighbourhood preference. However, commonly the impact of adjustment could not be assessed. Future studies using adjustment should provide estimates of associations with and without adjustment for self-selection; consider temporality in the measurement of self-selection variables relative to the timing of the environmental exposure and outcome behaviours; and consider the theoretical plausibility of presumed pathways in cross-sectional research where causal direction is impossible to establish.
While platelets are primary mediators of hemostasis, emerging evidence suggests that they may also mediate pathologic thrombogenesis. Few data are available on the risks and benefits associated with platelet transfusions in thrombotic thrombocytopenic purpura (TTP), heparin-induced thrombocytopenia (HIT), and immune thrombocytopenic purpura (ITP). This study used the Nationwide Inpatient Sample to evaluate current in-hospital platelet transfusion practices and their association with arterial/venous thrombosis, acute myocardial infarction (AMI), stroke, and in-hospital mortality over 5 years (2007-2011). Age- and gender-adjusted odds ratios (adjOR) associated with platelet transfusions were calculated. There were 10,624 hospitalizations with TTP, 6,332 with HIT, and 79,980 with ITP. Platelet transfusions were reported in 10.1% of TTP, 7.1% of HIT, and 25.8% of ITP admissions. Platelet transfusions in TTP were associated with higher odds of arterial thrombosis (adjOR = 5.8, 95% CI = 1.3-26.6), AMI (adjOR = 2.0, 95% CI = 1.2-3.3), and mortality (adjOR = 2.0, 95% CI = 1.3-3.0), but not venous thrombosis. Platelet transfusions in HIT were associated with higher odds of arterial thrombosis (adjOR = 3.4, 95% CI = 1.2-9.5) and mortality (adjOR = 5.2, 95% CI = 2.6-10.5), but not venous thrombosis. Except for AMI, all relationships remained significant after adjusting for clinical severity and acuity. No associations were significant for ITP. Platelet transfusions are associated with higher odds of arterial thrombosis and mortality among TTP and HIT patients.
• Platelet transfusions are frequently administered to hospitalized patients with platelet consumptive/destructive disorders such as TTP, HIT, and ITP.
• Platelet transfusions are associated with higher odds of arterial thrombosis and mortality among TTP and HIT patients.
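The age- and gender-adjusted odds ratios above come from multivariable models fit to the Nationwide Inpatient Sample. As a simpler illustration of the underlying measure, an unadjusted odds ratio with a Woolf (log-scale) confidence interval can be computed from a 2×2 table; the counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI
    from a 2x2 table laid out as:

                   outcome   no outcome
        exposed       a          b
        unexposed     c          d
    """
    or_ = (a * d) / (b * c)
    # standard error of ln(OR) under the Woolf approximation
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

The study's adjOR values additionally adjust for age and gender (and, in sensitivity analyses, clinical severity and acuity), so they are not reproducible from marginal counts like these.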
More than 20,000 candidates for kidney transplantation in the United States are sensitized to HLA and may have a prolonged wait for a transplant, with a reduced transplantation rate and an increased rate of death. One solution is to perform live-donor renal transplantation after the depletion of donor-specific anti-HLA antibodies. Whether such antibody depletion results in a survival benefit as compared with waiting for an HLA-compatible kidney is unknown.
We used a protocol that included plasmapheresis and the administration of low-dose intravenous immune globulin to desensitize 211 HLA-sensitized patients who subsequently underwent renal transplantation (treatment group). We compared rates of death between the group undergoing desensitization treatment and two carefully matched control groups of patients on a waiting list for kidney transplantation who continued to undergo dialysis (dialysis-only group) or who underwent either dialysis or HLA-compatible transplantation (dialysis-or-transplantation group).
In the treatment group, Kaplan-Meier estimates of patient survival were 90.6% at 1 year, 85.7% at 3 years, 80.6% at 5 years, and 80.6% at 8 years, as compared with rates of 91.1%, 67.2%, 51.5%, and 30.5%, respectively, for patients in the dialysis-only group and rates of 93.1%, 77.0%, 65.6%, and 49.1%, respectively, for patients in the dialysis-or-transplantation group (P<0.001 for both comparisons).
Live-donor transplantation after desensitization provided a significant survival benefit for patients with HLA sensitization, as compared with waiting for a compatible organ. By 8 years, this survival advantage had more than doubled. These data provide evidence that desensitization protocols may help overcome incompatibility barriers in live-donor renal transplantation. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and the Charles T. Bauer Foundation.)
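The patient survival figures above are Kaplan-Meier estimates. A minimal sketch of the estimator (the product over event times of 1 − deaths/at-risk), using toy follow-up data rather than the study's, is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  : follow-up time for each patient
    events : 1 if death observed, 0 if censored at that time
    Returns a list of (time, survival_probability) at each death time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths > 0:
            # survival drops by the factor (1 - d_i / n_i) at each death time
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Toy data: 5 patients, deaths at t=1, 3, 4; censoring at t=2, 5
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

In practice survival analyses like the one above are fit with dedicated libraries that also handle log-rank comparisons and confidence bands; this sketch only shows the point estimate.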
Over recent decades, the southeastern United States (Southeast) has become increasingly well represented by the terrestrial climate proxy record. However, while the paleo proxy records capture the region's hydroclimatic history over the last several centuries, the understanding of near surface air temperature variability is confined to the comparatively shorter observational period (1895‐present). Here, we detail the application of blue intensity (BI) methods on a network of tree‐ring collections and examine their utility for producing robust paleotemperature estimates. Results indicate that maximum latewood BI (LWBI) chronologies exhibit positive and temporally stable correlations (r = 0.28–0.54, p < 0.01) with summer maximum temperatures. As such, we use a network of LWBI chronologies to reconstruct August‐September average maximum temperatures for the Southeast spanning the period 1760–2010 CE. Our work demonstrates the utility of applying novel dendrochronological techniques to improve the understanding of the multi‐centennial temperature history of the Southeast.
Plain Language Summary
Tree‐ring data are important sources of paleoclimate information, which allow for the longer‐term evaluation of modern climate values and trends. Compared to much of North America, the Southeastern United States (Southeast) contains substantially fewer paleoclimate records from tree rings, and no estimates of past temperature variability that extend before the observational period. Employing a recently developed technique, which uses the light reflectance properties of wood to obtain a representative metric of tree‐ring density, we develop a network of temperature‐sensitive tree‐ring records across the Southeast. These records enable us to reconstruct late summer maximum temperatures across the region spanning the period 1760–2023 CE. As few ground‐based, pre‐instrumental temperature records previously existed for this region, our reconstruction allows for an improved understanding of the region's multi‐centennial climatic history.
Key Points
Maximum latewood blue intensity from tree rings can effectively be used to develop paleotemperature estimates for the southeastern US
The fidelity of tree‐ring density parameters for paleoclimate reconstruction is influenced by disturbance regimes and microsite conditions
Compared to the last 260 years, regional 20th‐century maximum late summer temperatures are not characterized by an unprecedented positive trend
BACKGROUND
Red blood cell (RBC) transfusion thresholds have yet to be examined in large randomized trials in hematologic malignancies. This pilot study in acute leukemia compares a restrictive to a liberal transfusion strategy.
STUDY DESIGN AND METHODS
A randomized (2:1) study was conducted of a restrictive (LOW) hemoglobin (Hb) trigger (7 g/dL) compared to a higher (HIGH) Hb trigger (8 g/dL). The primary outcome was the feasibility of conducting a larger trial. Success required that more than 50% of eligible patients consent, that more than 75% of patients randomized to the LOW arm tolerate the transfusion trigger, that fewer than 15% of patients cross over from the LOW arm to the HIGH arm, and that no safety concern require pausing the study. Secondary outcomes included fatigue, bleeding, and RBC and platelet units transfused.
RESULTS
Ninety patients were consented and randomly assigned 2:1 to the LOW or HIGH arm. The four criteria for the primary objective of feasibility were met. When the number of units transfused was compared, adjusting for baseline Hb, the LOW arm received on average 8.0 (95% confidence interval [CI], 6.9‐9.1) units/patient while the HIGH arm received 11.7 (95% CI, 10.1‐13.2) units (p = 0.0003). There was no significant difference in bleeding events or neutropenic fevers between study arms.
CONCLUSION
This study establishes the feasibility of a trial of Hb thresholds in leukemia, with success in all primary outcome metrics and a favorable safety profile. Further study of this population is required to evaluate the equivalence of liberal and restrictive transfusion thresholds in this unique clinical setting.
High-dose granulocyte transfusion therapy has been available for 20 years, yet its clinical efficacy has never been conclusively demonstrated. We report here the results of RING (Resolving Infection in Neutropenia with Granulocytes), a multicenter randomized controlled trial designed to address this question. Eligible subjects were those with neutropenia (absolute neutrophil count <500/μL) and proven/probable/presumed infection. Subjects were randomized to receive either (1) standard antimicrobial therapy or (2) standard antimicrobial therapy plus daily granulocyte transfusions from donors stimulated with granulocyte colony-stimulating factor (G-CSF) and dexamethasone. The primary end point was a composite of survival plus microbial response, at 42 days after randomization. Microbial response was determined by a blinded adjudication panel. Fifty-six subjects were randomized to the granulocyte arm and 58 to the control arm. Transfused subjects received a median of 5 transfusions. Mean transfusion dose was 54.9 × 10⁹ granulocytes. Overall success rates were 42% and 43% for the granulocyte and control groups, respectively (P > .99), and 49% and 41%, respectively, for subjects who received their assigned treatments (P = .64). Success rates for granulocyte and control arms did not differ within any infection type. In a post hoc analysis, subjects who received an average dose per transfusion of ≥0.6 × 10⁹ granulocytes per kilogram tended to have better outcomes than those receiving a lower dose. In conclusion, there was no overall effect of granulocyte transfusion on the primary outcome, but because enrollment was half that planned, power to detect a true beneficial effect was low. RING was registered at www.clinicaltrials.gov as #NCT00627393.
• Overall, no benefit of granulocyte transfusion therapy was observed, but the power of the study was reduced due to low accrual.
• Post hoc secondary analysis suggested that patients receiving higher doses tended to have better outcomes than those receiving lower ones.
BACKGROUND: The incidence of allergic transfusion reactions (ATRs) ranges from 1% to 3% of all transfusions, and they are difficult to prevent. This study evaluated whether removing plasma from apheresis platelets (APs) or red blood cells (RBCs) by concentrating or washing transfusion products can decrease the incidence of ATRs.
STUDY DESIGN AND METHODS: A retrospective cohort study of 179 individuals who received unmanipulated and subsequently concentrated and/or washed APs was conducted. Poisson regression with generalized estimating equations was used to estimate the incident rate ratios and 95% confidence intervals (CIs) of ATRs.
RESULTS: The incidence of ATRs to unmanipulated APs was 5.5% (306 ATRs/5575 AP units). The incidence decreased to 1.7% (135 ATRs/4327 AP units) when individuals received concentrated APs (73% reduction; 95% CI, 65%‐79%) and 0.5% (21 ATRs/4082 AP units) when individuals received washed APs (95% reduction; 95% CI, 91%‐97%). Of the 39 individuals who received unmanipulated RBCs and subsequently washed RBCs, the incidence of ATRs decreased from 2.7% (33 ATRs/1236 RBC units) to 0.3% (2 ATRs/733 RBC units; 89.4% reduction; 95% CI, 55.5%‐97.5%). The median number of AP transfusions to first ATR was six (interquartile range [IQR], 2‐19) for unmanipulated APs and increased to 13 (IQR, 4‐32) for concentrated APs and 40 (IQR, 29‐73.5) for washed APs.
CONCLUSIONS: Concentrating APs and washing APs and RBCs substantially reduces ATRs, suggesting that the plasma component of APs and RBCs has an essential role in the etiology of ATRs.
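The percent reductions above were estimated with Poisson regression and generalized estimating equations, which account for repeated transfusions within the same individual. A crude, unadjusted version of the same calculation from the reported counts gives a similar, though not identical, figure. A sketch:

```python
def crude_rate_reduction(events_ref, units_ref, events_new, units_new):
    """Crude per-unit incidence rates and percent reduction.

    This is the unadjusted calculation only; it ignores within-patient
    clustering, so it will not exactly reproduce GEE-adjusted estimates.
    """
    rate_ref = events_ref / units_ref  # reference incidence per unit
    rate_new = events_new / units_new  # incidence per unit after intervention
    reduction = 1 - rate_new / rate_ref
    return rate_ref, rate_new, reduction

# Counts from the abstract: unmanipulated vs. washed APs
rate_ref, rate_new, reduction = crude_rate_reduction(306, 5575, 21, 4082)
```

With these counts the crude reduction comes out near 91%, close to, but not the same as, the 95% adjusted estimate reported in the abstract.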
Background
Allergic transfusion reaction (ATR) incidence ranges from 1% to 3% of all transfusions. We evaluated the impact of InterSol platelet additive solution (PAS) apheresis platelets (APs) on the incidence of ATRs and the posttransfusion platelet (PLT) increment.
Study Design and Methods
This retrospective study evaluated all ATRs among patients at a university hospital that maintained a mixed inventory of PAS APs and non‐PAS APs (standard plasma‐suspended PLTs). Corrected count increments (CCIs) were calculated for AP transfusions of individuals who received both a PAS and a non‐PAS AP transfusion within a 7‐day period. Hypothesis testing was performed with chi‐square test for dichotomous variables and t tests for continuous variables.
Results
The incidence of ATRs among the non‐PAS APs was 1.85% (72 ATRs/3884 transfusions) and 1.01% (12 ATRs/1194 transfusions) for PAS APs (risk ratio [RR], 0.54; 95% confidence interval [CI], 0.30‐0.99; p = 0.04). However, there was no difference in the incidence of febrile nonhemolytic transfusion reactions between non‐PAS APs (incidence, 0.70%; 27/3884) and PAS APs (incidence, 0.59%; 7/1194; p = 0.69). Among 223 individuals with paired non‐PAS and PAS AP transfusions, the mean CCI at 1 to 4 hours after transfusion was 4932 (95% CI, 4452‐5412) for non‐PAS APs and was lower for PAS APs (CCI, 3766; 95% CI, 3375‐4158; p ≤ 0.001). However, there was no significant difference in mean CCI at 12 to 24 hours between non‐PAS (CCI, 2135; 95% CI, 1696‐2573) and PAS APs (CCI, 1745; 95% CI, 1272‐2217; p = 0.14).
Conclusions
PAS APs substantially reduce the number of ATRs. CCIs for PAS APs were lower immediately after transfusion, but not significantly different at 12 to 24 hours.
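The corrected count increments reported above are conventionally computed from the platelet count rise, the patient's body surface area, and the platelet dose. A sketch of that standard formula, with illustrative values that are not drawn from the study:

```python
def corrected_count_increment(pre, post, bsa_m2, dose_e11):
    """Corrected count increment (CCI) for a platelet transfusion.

    pre, post : platelet counts (platelets/uL) before and after transfusion
    bsa_m2    : patient body surface area in m^2
    dose_e11  : platelet dose in units of 10^11 platelets
    CCI = (count increment) x BSA / dose
    """
    return (post - pre) * bsa_m2 / dose_e11

# Illustrative values only: a 20,000/uL rise in a 1.8 m^2 patient
# who received 3 x 10^11 platelets
cci = corrected_count_increment(10_000, 30_000, 1.8, 3.0)
```

Timing matters when interpreting a CCI, as the abstract's 1-to-4-hour versus 12-to-24-hour comparison shows: early increments reflect recovery, later ones reflect survival of the transfused platelets.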