Context
Perioperative red blood cell transfusion is commonly used to address anemia, an independent risk factor for morbidity and mortality after cardiac operations; however, evidence regarding optimal blood transfusion practice in patients undergoing cardiac surgery is lacking.
Objective
To define whether a restrictive perioperative red blood cell transfusion strategy is as safe as a liberal strategy in patients undergoing elective cardiac surgery.
Design, Setting, and Patients
The Transfusion Requirements After Cardiac Surgery (TRACS) study, a prospective, randomized, controlled clinical noninferiority trial conducted between February 2009 and February 2010 in an intensive care unit at a university hospital cardiac surgery referral center in Brazil. Consecutive adult patients (n = 502) who underwent cardiac surgery with cardiopulmonary bypass were eligible; analysis was by intention-to-treat.
Interventions
Patients were randomly assigned to a liberal strategy of blood transfusion (to maintain a hematocrit ≥30%) or to a restrictive strategy (hematocrit ≥24%).
Main Outcome Measures
Composite end point of 30-day all-cause mortality and severe morbidity (cardiogenic shock, acute respiratory distress syndrome, or acute renal injury requiring dialysis or hemofiltration) occurring during the hospital stay. The noninferiority margin was predefined at -8% (ie, an 8% minimal clinically important increase in occurrence of the composite end point).
Results
Hemoglobin concentrations were maintained at a mean of 10.5 g/dL (95% confidence interval [CI], 10.4-10.6) in the liberal-strategy group and 9.1 g/dL (95% CI, 9.0-9.2) in the restrictive-strategy group (P < .001). A total of 198 of 253 patients (78%) in the liberal-strategy group and 118 of 249 (47%) in the restrictive-strategy group received a blood transfusion (P < .001). Occurrence of the primary end point was similar between groups (10% liberal vs 11% restrictive; between-group difference, 1% [95% CI, -6% to 4%]; P = .85). Independent of transfusion strategy, the number of transfused red blood cell units was an independent risk factor for clinical complications or death at 30 days (hazard ratio for each additional unit transfused, 1.2 [95% CI, 1.1-1.4]; P = .002).
Conclusions
Among patients undergoing cardiac surgery, the use of a restrictive perioperative transfusion strategy compared with a more liberal strategy resulted in noninferior rates of the combined outcome of 30-day all-cause mortality and severe morbidity.
Trial Registration
clinicaltrials.gov Identifier: NCT01021631.
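To make the noninferiority logic above concrete, the following Python sketch recomputes the between-group risk difference from the reported percentages and checks it against the 8% margin. The event counts are inferred from the reported rates and the simple Wald interval is an assumed method, so the result is illustrative and will not exactly reproduce the published CI.

```python
import math

# Illustrative event counts inferred from the reported rates
# (10% of 253 liberal, 11% of 249 restrictive); the trial's exact
# counts and interval method may differ.
events_restrictive, n_restrictive = 27, 249
events_liberal, n_liberal = 25, 253

p_res = events_restrictive / n_restrictive
p_lib = events_liberal / n_liberal

# Excess risk of the restrictive strategy and a simple Wald 95% CI
# (an assumption; not necessarily the trial's own analysis).
excess = p_res - p_lib
se = math.sqrt(p_res * (1 - p_res) / n_restrictive
               + p_lib * (1 - p_lib) / n_liberal)
ci_low, ci_high = excess - 1.96 * se, excess + 1.96 * se

# With an 8-percentage-point margin, noninferiority holds if the upper
# bound of the restrictive arm's excess risk stays below 0.08.
margin = 0.08
print(f"excess risk = {excess:.1%}, 95% CI ({ci_low:.1%}, {ci_high:.1%})")
print("noninferior" if ci_high < margin else "noninferiority not shown")

# The reported hazard ratio of 1.2 per transfused unit compounds
# multiplicatively under a proportional-hazards model, e.g. three
# units imply a hazard ratio of roughly 1.2 ** 3, or about 1.7.
```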
Although disinfection is key to infection control, the colonization patterns and resistomes of hospital-environment microbes remain underexplored. We report the first extensive genomic characterization of microbiomes, pathogens and antibiotic resistance cassettes in a tertiary-care hospital, from repeated sampling (up to 1.5 years apart) of 179 sites associated with 45 beds. Deep shotgun metagenomics unveiled distinct ecological niches of microbes and antibiotic resistance genes characterized by biofilm-forming and human-microbiome-influenced environments with corresponding patterns of spatiotemporal divergence. Quasi-metagenomics with nanopore sequencing provided thousands of high-contiguity genomes, phage and plasmid sequences (>60% novel), enabling characterization of resistome and mobilome diversity and dynamic architectures in hospital environments. Phylogenetics identified multidrug-resistant strains as being widely distributed and stably colonizing across sites. Comparisons with clinical isolates indicated that such microbes can persist in hospitals for extended periods (>8 years), to opportunistically infect patients. These findings highlight the importance of characterizing antibiotic resistance reservoirs in hospitals and establish the feasibility of systematic surveys to target resources for preventing infections.
Abstract
The objectives of this study were to evaluate the growth rates, carcass quality, shelf-life, tenderness, sensory characteristics, volatile compounds, and fatty acid composition of wool, hair, and composite (wool × hair) lambs. Twenty-one wether lambs, comprising wool (Suffolk × Polypay/Targhee; n = 7), hair (Dorper × Dorper; n = 7), and composite (Dorper × Polypay/Targhee; n = 7) groups, were fed from weaning to finishing at the University of Idaho Sheep Center and subsequently harvested under United States Department of Agriculture inspection at the University of Idaho Meat Lab. At 48 h postmortem, carcass measurements were taken to determine the percent boneless closely trimmed retail cuts, yield grade, and quality grade. Loins were fabricated from each carcass and wet-aged at 0°C until 10 d postmortem. Following aging, 2.54-cm bone-in loin chops were cut and randomly assigned to 4 d of retail display, Warner-Bratzler shear force (WBSF) testing, or sensory analysis. Thiobarbituric acid reactive substances were analyzed on days 0 and 4 of retail display, while subjective and objective color measurements were taken once daily. Samples (24 g) were also collected for volatile compound and fatty acid analysis. A mixed model analysis of variance was used to assess breed differences, with effects considered discernable at P < 0.05. Wool lambs had heavier hot carcass weights (P < 0.001), larger rib-eye areas (P = 0.015), and higher dressing percentages (P < 0.001) than the other breeds. An interaction was observed between breed and day of retail display for browning (P = 0.006): on day 1, chops from the composite breed had more browning than chops from the wool breed. No differences were observed between groups for lean muscle L* values (P = 0.432), a* values (P = 0.757), or b* values (P = 0.615). Differences were also not observed in lipid oxidation (P = 0.159), WBSF (P = 0.540), or consumer acceptability (P = 0.295). Differences were found for 7 of the 45 fatty acids detected and 3 of the 67 volatile compounds detected. In conclusion, wool lamb carcasses were heavier and had a greater yield than hair lamb carcasses. Regardless of breed, consumers did not detect sensory traits that would impact their eating experience.
The current study found that the wool group maintained higher growth from weaning until harvest, with improved carcass yield and rib-eye area, providing a superior product for producers. There was minimal variation between groups in shelf-life and sensory analysis, indicating that U.S. consumers are unable to detect breed differences.
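As a rough illustration of the mixed-model analysis of variance described in the abstract, the sketch below fits breed as a fixed effect with animal as a random grouping factor using Python's statsmodels. The data file and column names (breed, animal, wbsf, browning, day) are hypothetical; the study's actual model terms and software are not stated here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per chop, with the breed group
# (wool / hair / composite), the animal ID, and a response such as
# Warner-Bratzler shear force. File and column names are assumptions.
df = pd.read_csv("lamb_chops.csv")

# Mixed model: breed as a fixed effect, animal as a random intercept.
model = smf.mixedlm("wbsf ~ C(breed)", data=df, groups=df["animal"])
fit = model.fit()
print(fit.summary())  # breed effects judged at the P < 0.05 threshold

# A breed x day interaction, as tested for browning during retail
# display, could be specified analogously:
# smf.mixedlm("browning ~ C(breed) * C(day)", data=df, groups=df["animal"])
```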
Lay Summary
The U.S. lamb industry has been steadily losing production operations since the 1990s because of a decline in lamb consumption, driven by consumer flavor preferences, the rise of synthetic fabrics, and imports from Australia and New Zealand. This has reduced overall demand for U.S.-produced lamb. The focus of this research was to evaluate differences among wool, hair, and composite groups of lambs in growth traits, carcass characteristics, shelf-life stability, consumer flavor, volatile compounds, and fatty acid profiles. The study found that the wool group maintained a higher overall weight, taller shoulder height, and larger heart girth circumference than the hair group. The wool group also had a superior final carcass size; however, quality grade did not differ between the composite and wool groups. There was no difference between groups in consumer sensory scores or cook loss. Minimal variation was seen throughout the retail shelf-life, although general discoloration increased as expected during retail display. Tenderness levels were also similar between groups. Three volatile compounds differed between groups, and fatty acid analysis identified seven fatty acids of interest; however, common fatty acids known to influence flavor did not differ between groups.