Intravenous fluids, an essential component of sepsis resuscitation, may paradoxically worsen outcomes by exacerbating endothelial injury. Preclinical models suggest that fluid resuscitation degrades the endothelial glycocalyx, a heparan sulfate-enriched structure necessary for vascular homeostasis. We hypothesized that endothelial glycocalyx degradation is associated with the volume of intravenous fluids administered during early sepsis resuscitation.
We used mass spectrometry to measure plasma heparan sulfate (a highly sensitive and specific index of systemic endothelial glycocalyx degradation) after 6 h of intravenous fluids in 56 septic shock patients, at presentation and after 24 h of intravenous fluids in 100 sepsis patients, and in two groups of non-infected patients. We compared plasma heparan sulfate concentrations between sepsis and non-sepsis patients, as well as between sepsis survivors and sepsis non-survivors. We used multivariable linear regression to model the association between volume of intravenous fluids and changes in plasma heparan sulfate.
Consistent with previous studies, median plasma heparan sulfate was elevated in septic shock patients (118 [IQR 113-341] ng/ml 6 h after presentation) compared to non-infected controls (61 [45-79] ng/ml), as well as in a second cohort of sepsis patients (283 [155-584] ng/ml at emergency department presentation) compared to controls (177 [144-262] ng/ml). In the larger sepsis cohort, heparan sulfate predicted in-hospital mortality. In both cohorts, multivariable linear regression adjusting for age and severity of illness demonstrated a significant association between volume of intravenous fluids administered during resuscitation and plasma heparan sulfate. In the second cohort, independent of disease severity and age, each 1 l of intravenous fluids administered was associated with a 200 ng/ml increase in circulating heparan sulfate (p = 0.006) at 24 h after enrollment.
Glycocalyx degradation occurs in sepsis and septic shock and is associated with in-hospital mortality. The volume of intravenous fluids administered during sepsis resuscitation is independently associated with the degree of glycocalyx degradation. These findings suggest a potential mechanism by which intravenous fluid resuscitation strategies may induce iatrogenic endothelial injury.
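To make the reported effect size concrete, the adjusted association in the second cohort can be sketched as a simple linear predictor. This is an illustration only: the function name is ours, the slope is the abstract's 200 ng/ml per litre estimate, and linearity beyond the observed fluid volumes is an assumption.

```python
def predicted_hs_increase(fluid_liters: float, slope_ng_ml_per_l: float = 200.0) -> float:
    """Expected rise in circulating heparan sulfate (ng/ml) at 24 h under the
    abstract's adjusted linear model: ~200 ng/ml per litre of IV fluid,
    holding age and illness severity fixed. Illustrative only."""
    return fluid_liters * slope_ng_ml_per_l

# e.g., a 3 L resuscitation corresponds to an expected ~600 ng/ml increase
increase = predicted_hs_increase(3.0)
```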
Laboratory and recent clinical data suggest that hyperoxemia after resuscitation from cardiac arrest is harmful; however, it remains unclear if the risk of adverse outcome is a threshold effect at a specific supranormal oxygen tension, or is a dose-dependent association. We aimed to define the relationship between supranormal oxygen tension and outcome in postresuscitation patients.
This was a multicenter cohort study using the Project IMPACT database (intensive care units at 120 US hospitals). Inclusion criteria were age >17 years, nontrauma, cardiopulmonary resuscitation preceding intensive care unit arrival, and postresuscitation arterial blood gas obtained. We excluded patients with hypoxia or severe oxygenation impairment. We defined the exposure by the highest partial pressure of arterial oxygen (PaO2) over the first 24 hours in the ICU. The primary outcome measure was in-hospital mortality. We tested the association between PaO2 (continuous variable) and mortality using multivariable logistic regression adjusted for patient-oriented covariates and potential hospital effects. Of 4459 patients, 54% died. The median postresuscitation PaO2 was 231 (interquartile range 149 to 349) mm Hg. Over ascending ranges of oxygen tension, we found significant linear trends of increasing in-hospital mortality and decreasing survival with functional independence. On multivariable analysis, a 100 mm Hg increase in PaO2 was associated with a 24% increase in mortality risk (odds ratio 1.24; 95% confidence interval 1.18 to 1.31). We observed no evidence supporting a single threshold for harm from supranormal oxygen tension.
In this large sample of postresuscitation patients, we found a dose-dependent association between supranormal oxygen tension and risk of in-hospital death.
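Because the logistic model treats PaO2 as continuous, the reported odds ratio compounds multiplicatively with exposure. A minimal sketch of that scaling (function name ours; assumes the log-odds relationship stays linear across the observed range, which is the model form used in the study):

```python
# Reported adjusted odds ratio per 100 mm Hg increase in peak PaO2 (from the abstract).
OR_PER_100_MMHG = 1.24

def mortality_odds_multiplier(delta_pao2_mmhg: float) -> float:
    """Multiplicative change in the odds of in-hospital death for a given
    increase in peak PaO2, assuming a linear log-odds relationship."""
    return OR_PER_100_MMHG ** (delta_pao2_mmhg / 100.0)

# Example: moving from the 25th percentile (149 mm Hg) to the 75th (349 mm Hg)
# of observed postresuscitation PaO2 is a 200 mm Hg increase in exposure.
multiplier = mortality_odds_multiplier(349 - 149)
```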
Acute appendicitis is the most common abdominal emergency requiring emergency surgery. However, the diagnosis is often challenging and the decision to operate, observe or further work-up a patient is often unclear. The utility of clinical scoring systems (namely the Alvarado score), laboratory markers, and the development of novel markers in the diagnosis of appendicitis remains controversial. This article presents an update on the diagnostic approach to appendicitis through an evidence-based review.
We performed a broad Medline search of radiological imaging, the Alvarado score, common laboratory markers, and novel markers in patients with suspected appendicitis.
Computed tomography (CT) is the most accurate mode of imaging for suspected cases of appendicitis, but the associated increase in radiation exposure is problematic. The Alvarado score is a clinical scoring system that is used to predict the likelihood of appendicitis based on signs, symptoms and laboratory data. It can help risk stratify patients with suspected appendicitis and potentially decrease the use of CT imaging in patients with certain Alvarado scores. White blood cell (WBC) count, C-reactive protein (CRP), granulocyte count and proportion of polymorphonuclear (PMN) cells are frequently elevated in patients with appendicitis, but are insufficient on their own as a diagnostic modality. When multiple markers are used in combination their diagnostic utility is greatly increased. Several novel markers have been proposed to aid in the diagnosis of appendicitis; however, while promising, most are only in the preliminary stages of being studied.
While CT is the most accurate mode of imaging in suspected appendicitis, the accompanying radiation is a concern. Ultrasound may help in the diagnosis while decreasing the need for CT in certain circumstances. The Alvarado Score has good diagnostic utility at specific cutoff points. Laboratory markers have very limited diagnostic utility on their own but show promise when used in combination. Further studies are warranted for laboratory markers in combination and to validate potential novel markers.
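For reference, the Alvarado score sums eight weighted clinical findings to a 10-point total. The sketch below uses the commonly published MANTRELS components and weights, which are general knowledge rather than taken from this abstract; verify against the primary reference before any clinical use.

```python
# Commonly published Alvarado (MANTRELS) components and weights
# (not drawn from this abstract; illustrative only).
ALVARADO_ITEMS = {
    "migration_of_pain": 1,
    "anorexia": 1,
    "nausea_or_vomiting": 1,
    "rlq_tenderness": 2,
    "rebound_tenderness": 1,
    "elevated_temperature": 1,
    "leukocytosis": 2,
    "left_shift": 1,
}  # maximum total = 10

def alvarado_score(findings: set) -> int:
    """Sum the weights of the findings present in the patient."""
    return sum(w for item, w in ALVARADO_ITEMS.items() if item in findings)

score = alvarado_score({"rlq_tenderness", "leukocytosis", "anorexia"})  # 5
```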
Background: We sought to compare the association of whole-blood lactate kinetics with survival in patients with septic shock undergoing early quantitative resuscitation. Methods: This was a preplanned analysis of a multicenter, ED-based, randomized, controlled trial of early sepsis resuscitation. Inclusion criteria were suspected infection, two or more systemic inflammation criteria, either systolic BP < 90 mm Hg after a fluid bolus or lactate level > 4 mM, two serial lactate measurements, and an initial lactate level > 2.0 mM. We calculated the relative lactate clearance, rate of lactate clearance, and occurrence of early lactate normalization (decline to < 2.0 mM in the first 6 h). Area under the receiver operating characteristic curve (AUC) and multivariate logistic regression were used to determine the lactate kinetic parameters that were the strongest predictors of survival. Results: The analysis included 187 patients, of whom 36% (n = 68) normalized their lactate level. Overall survival was 76.5% (143 of 187 patients), and the AUC of initial lactate to predict survival was 0.64. The AUCs for relative lactate clearance and lactate clearance rate were 0.67 and 0.58, respectively. Lactate normalization was the strongest predictor of survival (adjusted OR, 5.2; 95% CI, 1.7–15.8), followed by lactate clearance ≥ 50% (OR, 4.0; 95% CI, 1.6–10.0). Lactate clearance ≥ 10% (OR, 1.6; 95% CI, 0.6–4.4) was not a significant independent predictor in this cohort. Conclusions: In ED patients with a sepsis diagnosis, early lactate normalization during the first 6 h of resuscitation was the strongest independent predictor of survival and was superior to other measures of lactate kinetics. Trial registration: ClinicalTrials.gov No. NCT00372502.
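The lactate kinetic parameters compared above are simple arithmetic on serial measurements. A minimal sketch (function names ours), following the abstract's definitions of relative clearance and normalization:

```python
def relative_lactate_clearance(initial: float, repeat: float) -> float:
    """Percent decline from the initial to the repeat lactate (mmol/L).
    Negative values indicate a rising lactate."""
    return (initial - repeat) / initial * 100.0

def lactate_normalized(repeat: float, threshold: float = 2.0) -> bool:
    """Normalization as defined in the abstract: decline to < 2.0 mmol/L
    within the first 6 h of resuscitation."""
    return repeat < threshold

# Example: 4.0 -> 1.8 mmol/L is a 55% relative clearance and a normalization.
clearance = relative_lactate_clearance(4.0, 1.8)
normalized = lactate_normalized(1.8)
```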
Objectives
To create a risk prediction rule for delirium in elderly adults in the emergency department (ED) and to compare mortality and resource use of elderly adults in the ED with and without delirium.
Design
Prospective observational study.
Setting
Urban tertiary care ED.
Participants
Individuals aged 65 and older presenting for ED care (N = 700).
Measurements
A trained research assistant performed a structured mental status assessment and attention tests, after which delirium was determined using the Confusion Assessment Method. Data were collected on participant demographics, comorbidities, medications, ED course, hospital and intensive care unit (ICU) admission, length of stay, hospital charges, 30‐day rehospitalization, and mortality.
Results
Nine percent of elderly study participants had delirium. Using logistic regression, a delirium prediction rule consisting of older age, prior stroke or transient ischemic attack, dementia, suspected infection, and acute intracranial hemorrhage was created and had good predictive accuracy (area under the receiver operating characteristic curve = 0.77). Admitted participants with ED delirium had longer median lengths of stay (4 vs 2 days) and were more likely to require ICU admission (13% vs 6%) and to be discharged to a new long‐term care facility (37% vs 9%) than those without. In all participants, ED delirium was associated with higher 30‐day mortality (6% vs 1%) and 30‐day readmission (27% vs 13%).
Conclusion
This risk prediction rule may help identify a group of individuals in the ED at high risk of developing delirium who should undergo screening, but it requires external validation. Identification of delirium in the ED may enable physicians to implement strategies to decrease delirium duration and avoid inappropriate discharge of individuals with acute delirium, improving outcomes.
In septic shock, the first few hours of care are critical for survival. In this study, two protocols for the care of patients with septic shock were compared with usual care with respect to 60-day mortality and other outcomes. There were no significant differences in outcome.
There are more than 750,000 cases of severe sepsis and septic shock in the United States each year.[1] Most patients who present with sepsis receive initial care in the emergency department, and the short-term mortality is 20% or more.[2,3] In 2001, Rivers et al. reported that among patients with severe sepsis or septic shock in a single urban emergency department, mortality was significantly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy (EGDT) than among those who were given standard therapy (30.5% vs. 46.5%).[4] On the basis of the premise that usual care . . .
Disordered coagulation contributes to death in sepsis and lacks effective treatments. Existing markers of disseminated intravascular coagulation (DIC) reflect its sequelae rather than its causes, ...delaying diagnosis and treatment. Here we show that disruption of the endothelial Tie2 axis is a sentinel event in septic DIC. Proteomics in septic DIC patients revealed a network involving inflammation and coagulation with the Tie2 antagonist, angiopoietin-2 (Angpt-2), occupying a central node. Angpt-2 was strongly associated with traditional DIC markers including platelet counts, yet more accurately predicted mortality in 2 large independent cohorts (combined N = 1,077). In endotoxemic mice, reduced Tie2 signaling preceded signs of overt DIC. During this early phase, intravital imaging of microvascular injury revealed excessive fibrin accumulation, a pattern remarkably mimicked by Tie2 deficiency even without inflammation. Conversely, Tie2 activation normalized prothrombotic responses by inhibiting endothelial tissue factor and phosphatidylserine exposure. Critically, Tie2 activation had no adverse effects on bleeding. These results mechanistically implicate Tie2 signaling as a central regulator of microvascular thrombus formation in septic DIC and indicate that circulating markers of the Tie2 axis could facilitate earlier diagnosis. Finally, interventions targeting Tie2 may normalize coagulation in inflammatory states while averting the bleeding risks of current DIC therapies.
Background
Maintenance of mean arterial pressure (MAP) at levels sufficient to avoid tissue hypoperfusion is a key tenet in the management of distributive shock. We hypothesized that patients with distributive shock sometimes have a MAP below that typically recommended and that such hypotension is associated with increased mortality.
Methods
In this retrospective analysis of the Medical Information Mart for Intensive Care (MIMIC-III) database from Beth Israel Deaconess Medical Center, Boston, USA, we included all intensive care unit (ICU) admissions between 2001 and 2012 with distributive shock, defined as continuous vasopressor support for ≥ 6 h and no evidence of low cardiac output shock. Hypotension was evaluated using five MAP thresholds: 80, 75, 65, 60 and 55 mmHg. We evaluated the longest continuous episode below each threshold during vasopressor therapy. The primary outcome was ICU mortality.
Results
Of 5347 patients with distributive shock, 95.7%, 91.0%, 62.0%, 36.0% and 17.2%, respectively, had MAP < 80, < 75, < 65, < 60 and < 55 mmHg for more than two consecutive hours. On average, ICU mortality increased by 1.3, 1.8, 5.1, 7.9 and 14.4 percentage points for each additional 2 h with MAP < 80, < 75, < 65, < 60 and < 55 mmHg, respectively. Multivariable logistic modeling showed that, compared to patients in whom MAP was never < 65 mmHg, ICU mortality increased as duration of hypotension < 65 mmHg increased: for > 0 to < 2 h, odds ratio (OR) 1.76, p = 0.005; ≥ 6 to < 8 h, OR 2.90, p < 0.0001; ≥ 20 h, OR 7.10, p < 0.0001. When hypotension was defined as MAP < 60 or < 55 mmHg, the associations between duration and mortality were generally stronger than when hypotension was defined as MAP < 65 mmHg. There was no association between hypotension and mortality when hypotension was defined as MAP < 80 mmHg.
Conclusions
Within the limitations due to the nature of the study, most patients with distributive shock experienced at least one episode with MAP < 65 mmHg lasting > 2 h. Episodes of prolonged hypotension were associated with higher mortality.
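The exposure in this analysis is the longest continuous episode below each MAP threshold during vasopressor therapy. A simplified sketch of that computation over evenly spaced MAP readings (the study worked from the MIMIC-III vasopressor-period records; the function name and the uniform-sampling assumption are ours):

```python
def longest_hypotensive_episode(map_values, threshold=65.0, interval_h=1.0):
    """Longest continuous run (in hours) of MAP readings below a threshold,
    assuming readings are evenly spaced interval_h hours apart.
    A simplification of the study's episode definition."""
    longest = current = 0
    for m in map_values:
        current = current + 1 if m < threshold else 0
        longest = max(longest, current)
    return longest * interval_h

# Hourly MAPs: the two consecutive readings below 65 mmHg form a 2 h episode.
episode = longest_hypotensive_episode([70, 60, 58, 66, 62], threshold=65.0)
```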
BACKGROUND: Studies examining the association between hyperoxia exposure after resuscitation from cardiac arrest and clinical outcomes have reported conflicting results. Our objective was to test the hypothesis that early postresuscitation hyperoxia is associated with poor neurological outcome.
METHODS: This was a multicenter prospective cohort study. We included adult patients with cardiac arrest who were mechanically ventilated and received targeted temperature management after return of spontaneous circulation. We excluded patients with cardiac arrest caused by trauma or sepsis. Per protocol, partial pressure of arterial oxygen (PaO2) was measured at 1 and 6 hours after return of spontaneous circulation. Hyperoxia was defined as a PaO2 >300 mm Hg during the initial 6 hours after return of spontaneous circulation. The primary outcome was poor neurological function at hospital discharge, defined as a modified Rankin Scale score >3. Multivariable generalized linear regression with a log link was used to test the association between PaO2 and poor neurological outcome. To assess whether there was an association between other supranormal PaO2 levels and poor neurological outcome, we used other PaO2 cut points to define hyperoxia (ie, 100, 150, 200, 250, 350, 400 mm Hg).
RESULTS: Of the 280 patients included, 105 (38%) had exposure to hyperoxia. Poor neurological function at hospital discharge occurred in 70% of patients in the entire cohort, and in 77% versus 65% of patients with versus without exposure to hyperoxia, respectively (absolute risk difference, 12%; 95% confidence interval, 1–23). Hyperoxia was independently associated with poor neurological function (relative risk, 1.23; 95% confidence interval, 1.11–1.35). On multivariable analysis, a 1-hour-longer duration of hyperoxia exposure was associated with a 3% increase in risk of poor neurological outcome (relative risk, 1.03; 95% confidence interval, 1.02–1.05). We found that the association with poor neurological outcome began at ≥300 mm Hg.
CONCLUSIONS: Early hyperoxia exposure after resuscitation from cardiac arrest was independently associated with poor neurological function at hospital discharge.
CONTEXT Goal-directed resuscitation for severe sepsis and septic shock has been reported to reduce mortality when applied in the emergency department. OBJECTIVE To test the hypothesis of noninferiority between lactate clearance and central venous oxygen saturation (ScvO2) as goals of early sepsis resuscitation. DESIGN, SETTING, AND PATIENTS Multicenter randomized, noninferiority trial involving patients with severe sepsis and evidence of hypoperfusion or septic shock who were admitted to the emergency department from January 2007 to January 2009 at 1 of 3 participating US urban hospitals. INTERVENTIONS We randomly assigned patients to 1 of 2 resuscitation protocols. The ScvO2 group was resuscitated to normalize central venous pressure, mean arterial pressure, and ScvO2 of at least 70%; the lactate clearance group was resuscitated to normalize central venous pressure, mean arterial pressure, and lactate clearance of at least 10%. The study protocol was continued until all goals were achieved or for up to 6 hours. Clinicians who subsequently assumed the care of the patients were blinded to the treatment assignment. MAIN OUTCOME MEASURE The primary outcome was absolute in-hospital mortality rate; the noninferiority threshold was set at Δ equal to −10%. RESULTS Of the 300 patients enrolled, 150 were assigned to each group, and patients were well matched by demographics, comorbidities, and physiological features. There were no differences in treatments administered during the initial 72 hours of hospitalization. Thirty-four patients (23%) in the ScvO2 group died while in the hospital (95% confidence interval [CI], 17%-30%) compared with 25 (17%; 95% CI, 11%-24%) in the lactate clearance group. This observed difference between mortality rates did not reach the predefined −10% threshold (intent-to-treat analysis: 95% CI for the 6% difference, −3% to 15%). There were no differences in treatment-related adverse events between the groups.
CONCLUSION Among patients with septic shock who were treated to normalize central venous and mean arterial pressure, additional management to normalize lactate clearance compared with management to normalize ScvO2 did not result in significantly different in-hospital mortality. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00372502
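The noninferiority logic in this trial reduces to a one-sided comparison of a confidence bound against the prespecified margin. A minimal sketch (function name ours), using the abstract's numbers: the mortality difference is ScvO2 group minus lactate group, and the lactate arm is noninferior when the lower bound of that difference exceeds Δ = −10 percentage points.

```python
def lactate_arm_noninferior(ci_low_pct: float, margin_pct: float = -10.0) -> bool:
    """Noninferiority holds when the lower confidence bound of the mortality
    difference (ScvO2 group minus lactate group, in percentage points)
    exceeds the prespecified margin (delta = -10% in this trial)."""
    return ci_low_pct > margin_pct

# Observed: 6-point difference, 95% CI -3% to 15%; lower bound -3 > -10,
# so the noninferiority criterion is met.
result = lactate_arm_noninferior(-3.0)
```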