The purpose of this study was to assess whether the immediate availability of serum markers would increase the appropriate use of thrombolytic therapy.
Serum markers such as myoglobin and creatine kinase, MB fraction (CK-MB) are effective in detecting acute myocardial infarction (AMI) in the emergency setting. Appropriate candidates for thrombolytic therapy are not always identified in the emergency department (ED), as 20% to 30% of eligible patients go untreated, representing 10% to 15% of all patients with AMI. Patients presenting with chest pain consistent with acute coronary syndrome were evaluated in the EDs of 12 hospitals throughout North America.
In this randomized, controlled clinical trial, physicians received either the immediate myoglobin/CK-MB results at 0 and 1 h after enrollment (stat) or conventional reporting of myoglobin/CK-MB 3 h or more after hospital admission (control). The primary end point was the comparison of the proportion of patients within the stat group versus control group who received appropriate thrombolytic therapy. Secondary end points included the emergent use of any reperfusion treatment in both groups, initial hospital disposition of patients (coronary care unit, monitor or nonmonitor beds) and the proportion of patients appropriately discharged from the ED.
Of 6,352 patients enrolled, 814 (12.8%) were diagnosed as having AMI. For patients having AMI, there were no statistically significant differences in the proportion of patients treated with thrombolytic therapy between the stat and control groups (15.1% vs. 17.1%, p = 0.45). When only patients with ST segment elevation on their initial electrocardiogram were compared, there were still no significant differences between the groups. Also, there was no difference in the hospital placement of patients in critical care and non-critical care beds. The availability of early markers was associated with more hospital admissions as compared to the control group, as the number of patients discharged from the ED was decreased in the stat versus control groups (28.4% vs. 31.5%, p = 0.023).
The availability of 0- and 1-h myoglobin and CK-MB results after ED evaluation had no effect on the use of thrombolytic therapy for patients presenting with AMI, and it slightly increased the number of patients admitted to the hospital who had no evidence of acute myocardial necrosis.
Because of the difficulty and cost of collecting the time from collapse to placing the 911 call in instances of out-of-hospital cardiac arrest and because of the potential noise and bias that might be inherent in such data, a simulation study was conducted to quantitate the impact that such data might have on estimates of the relationship between time from collapse to defibrillation and probability of survival. In the absence of bias, an underestimate of the slope on the order of 20% to 30% might be expected. However, in the presence of bias, the impact on the slope estimate is unpredictable. The most likely bias would tend to cause an overestimate of the slope. It is suggested that unless the time from collapse to placing the 911 call can be obtained accurately and without bias, it is probably not worthwhile to do so.
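The predicted 20% to 30% slope underestimate under unbiased noise is the classic regression-dilution effect. A minimal simulation sketch of that effect, with all parameter values hypothetical rather than taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical linear relation between collapse-to-shock time and
# survival probability; the coefficients are illustrative only.
true_time = rng.uniform(0, 15, n)                  # minutes
p_survive = np.clip(0.6 - 0.03 * true_time, 0, 1)  # true slope -0.03/min
survived = (rng.random(n) < p_survive).astype(float)

# Unbiased measurement noise in the recorded collapse time (sd = 3 min).
noisy_time = true_time + rng.normal(0.0, 3.0, n)

slope_true = np.polyfit(true_time, survived, 1)[0]
slope_noisy = np.polyfit(noisy_time, survived, 1)[0]

# Classical attenuation factor var(x) / (var(x) + var(noise)) ~ 0.68,
# i.e. a slope underestimate on the order of 30%.
lam = true_time.var() / (true_time.var() + 3.0**2)

print(f"slope, true times : {slope_true:.4f}")
print(f"slope, noisy times: {slope_noisy:.4f}")
print(f"predicted attenuation factor: {lam:.2f}")
```

Because the sketch uses a linear probability model, the shrinkage of the noisy-time slope toward zero matches the closed-form attenuation factor; biased noise, as the abstract notes, has no such predictable effect.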
A prehospital computer-interpreted electrocardiogram (ECG) was obtained in 1,189 patients with chest pain of suspected cardiac origin during an ongoing trial of prehospital thrombolytic therapy in acute myocardial infarction. Electrocardiograms were performed by paramedics 1.5 ± 1.2 h after the onset of symptoms. Of 391 patients with evidence of acute myocardial infarction, 202 (52%) were identified as having ST segment elevation (acute injury) by the computer-interpreted ECG compared with 259 (66%) by an electrocardiographer (p < 0.001). Of 798 patients with chest pain but no infarction, 785 (98%) were appropriately excluded by computer compared with 757 (95%) by an electrocardiographer (p < 0.001). The positive predictive value of the computer and physician-interpreted ECG was, respectively, 94% and 86%, and the negative predictive value was 81% and 85%.
Prehospital screening of possible candidates for thrombolytic therapy with the aid of a computerized ECG is feasible and highly specific and, with further enhancement, can speed the care of all patients with acute myocardial infarction.
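The reported predictive values follow from the 2x2 counts given in the abstract (391 patients with infarction, 798 without); a short check of that arithmetic:

```python
def predictive_values(tp, fn, tn, fp):
    """Return (PPV, NPV) from the cells of a 2x2 diagnostic table."""
    return tp / (tp + fp), tn / (tn + fn)

# Counts from the abstract: 391 patients with infarction, 798 without.
computer = predictive_values(tp=202, fn=391 - 202, tn=785, fp=798 - 785)
physician = predictive_values(tp=259, fn=391 - 259, tn=757, fp=798 - 757)

print(f"computer : PPV {computer[0]:.0%}, NPV {computer[1]:.0%}")   # PPV 94%, NPV 81%
print(f"physician: PPV {physician[0]:.0%}, NPV {physician[1]:.0%}") # PPV 86%, NPV 85%
```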
•We quantify the nutritional, climate and land use effects of reduced meat consumption.
•Dietary recommendations entail a 25% cut, which notably reduces saturated fat intake.
•Restriction in meat consumption is most critical for the intake of iron and zinc.
•Greenhouse gases are approximately halved, but additional measures are needed.
•Land demand is roughly halved but still corresponds to a significant share.
Food consumption is one of the most important drivers of environmental pressures. Adoption of healthy diets is suggested to be an option for less environmentally intensive food habits and improved public health. In particular, changes in meat consumption are believed to bring potential benefits.
To quantify the impact of changes in meat consumption on the dietary contribution of nutrients, GHG emissions and on land requirement.
Scenario analysis is performed for three scenarios representing different variants of meat consumption in Sweden. The reference scenario is based on average Swedish meat consumption while NUTR-1 and NUTR-2 are hypothetical scenarios in line with prevailing dietary guidelines. The results are evaluated in relation to the recommended daily intake of nutrients, international climate goals and global capacity for sustainable expansion of agricultural land. Uncertainties and variations in data are captured by using Monte Carlo simulation.
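The Monte Carlo treatment of uncertainty can be sketched as follows; the intake and emission-factor distributions below are illustrative placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 10_000

# Illustrative inputs: per-capita meat intake (kg/yr) and an emission
# factor (kg CO2e per kg of meat), each given an assumed uncertainty.
intake_ref = rng.normal(85, 5, n_draws)        # reference scenario
intake_nutr = intake_ref * 0.75                # ~25% reduction scenario
emission_factor = rng.triangular(15, 25, 40, n_draws)

ghg_ref = intake_ref * emission_factor / 1000  # tonnes CO2e per capita/yr
ghg_nutr = intake_nutr * emission_factor / 1000

for name, ghg in (("reference", ghg_ref), ("guideline", ghg_nutr)):
    lo, med, hi = np.percentile(ghg, [2.5, 50, 97.5])
    print(f"{name:9s}: median {med:.2f} t CO2e (95% interval {lo:.2f}-{hi:.2f})")
```

Sampling every uncertain input jointly, rather than propagating point estimates, is what lets the resulting scenario comparisons be reported with intervals rather than single values.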
Meat consumption in line with nutritional guidelines, implying an approximate 25% reduction of Swedish average intake, reduces the contribution of total and saturated fat by 59–76%, energy, iron and zinc by about half and protein by one quarter. Restrictions in meat consumption are most critical for the intake of iron and zinc, whereas positive effects on public health are expected due to the reduced intake of saturated fat. Aligning meat consumption with dietary guidelines reduces GHG emissions from meat production from 40% to approximately 15–25% of the long-term (2050) per capita budget of sustainable GHG emissions and the share of per capita available cropland from 50% to 20–30%.
This quantitative analysis suggests that beneficial synergies, in terms of public health, GHG emissions and land use pressure, can be provided by reducing current Swedish meat consumption.
The mammalian brain is a complex organ composed of many specialized cells, harboring sets of both common, widely distributed proteins and specialized, discretely localized ones. Here we focus on the human brain, utilizing transcriptomics and publicly available Human Protein Atlas (HPA) data to analyze brain-enriched (frontal cortex) polyadenylated messenger RNA and long non-coding RNA and generate a genome-wide draft of global and cellular expression patterns of the brain. Based on transcriptomics analysis of altogether 27 tissues, we have estimated that approximately 3% (n=571) of all protein coding genes and 13% (n=87) of the long non-coding genes expressed in the human brain are enriched, having at least five times higher expression levels in brain as compared to any of the other analyzed peripheral tissues. Based on gene ontology analysis and detailed annotation using antibody-based tissue microarray analysis of the corresponding proteins, we found the majority of brain-enriched protein coding genes to be expressed in astrocytes, oligodendrocytes or in neurons with molecular properties linked to synaptic transmission and brain development. Detailed analysis of the transcripts and the genetic landscape of brain-enriched coding and non-coding genes revealed brain-enriched splice variants. Several clusters of neighboring brain-enriched genes were also identified, suggesting regulation of gene expression at the chromatin level. This multi-angle approach uncovered the brain-enriched transcriptome and linked genes to cell types and functions, providing novel insights into the molecular foundation of this highly specialized organ.
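The five-times enrichment criterion amounts to a simple per-gene filter; a sketch on a synthetic expression matrix (all values fabricated for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n_genes, n_tissues = 1000, 27

# Fake expression matrix (rows: genes, columns: tissues); column 0
# stands in for brain, the rest for peripheral tissues.
expr = rng.lognormal(1.0, 1.0, (n_genes, n_tissues))
brain = expr[:, 0]
peripheral_max = expr[:, 1:].max(axis=1)

# "Brain-enriched": at least five times the highest level seen in
# any other analyzed tissue.
enriched = brain >= 5 * peripheral_max
print(f"{enriched.sum()} of {n_genes} genes pass the 5x enrichment cut")
```

Comparing against the maximum, rather than the mean, across peripheral tissues makes the criterion conservative: a gene highly expressed in even one other tissue is excluded.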
Survival to hospital discharge was related to the clinical history and emergency care system factors in 285 patients with witnessed cardiac arrest due to ventricular fibrillation. Only the emergency care factors were associated with differences in outcome. Both the period from collapse until initiation of basic life support and the duration of basic life support before delivery of the first defibrillatory shock were shorter in patients who survived compared with those who died (3.6 ± 2.5 versus 6.1 ± 3.3 minutes and 4.3 ± 3.3 versus 7.3 ± 4.2 minutes; p < 0.05).
A linear regression model based on emergency response times for 942 patients discovered in ventricular fibrillation was used to estimate expected survival rates if the first-responding rescuers, in addition to paramedics, had been equipped and trained to defibrillate. Expected survival rates were higher with early defibrillation (38 ± 3%; 95% confidence limits) than the observed rate (28 ± 3%).
Because outcome from cardiac arrest is primarily influenced by delays in providing cardiopulmonary resuscitation and defibrillation, factors affecting response time should be carefully examined by all emergency care systems.
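The expected-survival figure above comes from fitting a regression of survival on response time and evaluating it at the shorter delays early defibrillation would allow. A toy version of that counterfactual calculation, with simulated data and assumed coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 942  # matches the cohort size; everything below is simulated

# Simulated call-to-first-shock delays under the paramedic-only system,
# and a hypothetical 4-minute improvement if first responders defibrillate.
observed_delay = rng.gamma(4.0, 2.5, n)                # minutes
early_delay = np.maximum(observed_delay - 4.0, 1.0)

# Assumed underlying linear survival model (illustrative coefficients).
p_survive = np.clip(0.55 - 0.025 * observed_delay, 0.0, 1.0)
survived = (rng.random(n) < p_survive).astype(float)

# Fit a linear model of survival on observed delay, then evaluate the
# fitted line at the shorter, counterfactual delays.
b1, b0 = np.polyfit(observed_delay, survived, 1)
observed_rate = survived.mean()
expected_rate = np.clip(b0 + b1 * early_delay, 0.0, 1.0).mean()

print(f"observed survival rate   : {observed_rate:.1%}")
print(f"expected with early defib: {expected_rate:.1%}")
```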
One hundred ninety-nine patients with out-of-hospital cardiac arrest who persisted in ventricular fibrillation after the first defibrillation attempt were randomly assigned to receive either epinephrine or lidocaine before the next two shocks. The resulting electrocardiographic rhythms and outcomes were compared between the two groups and also with results during the prior 2 years, a period when similar patients primarily received sodium bicarbonate as initial adjunctive therapy. Asystole occurred after defibrillation three times as frequently after repeated injection of lidocaine (15 of 59, 25%) as in patients treated with epinephrine (4 of 55, 7%) (p < 0.02). There was no difference in the proportion of patients resuscitated after treatment with lidocaine or epinephrine (51 of 106, 48% vs. 50 of 93, 54%) or in the proportion surviving (18, 19% vs. 21, 20%), respectively. Resuscitation rates (64% vs. 50%, p < 0.005), but not survival rates (24% vs. 20%), were higher during the prior 2-year period, in which initial adjunctive drug treatment for persistent ventricular fibrillation primarily consisted of a continuous infusion of sodium bicarbonate. The negative effect of lidocaine or epinephrine treatment was explained in part by their influence in delaying subsequent defibrillation attempts. Survival rates were highest (30%) in a subset of patients who received no drug therapy between shocks. We conclude that currently recommended doses of epinephrine and lidocaine are not useful for improving outcome in patients who persist in ventricular fibrillation.
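The threefold asystole difference can be checked with a Pearson chi-square on the 2x2 table implied by the abstract (15 of 59 lidocaine patients vs. 4 of 55 epinephrine patients); the uncorrected statistic computed here is consistent with the reported significance level:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: drug (lidocaine, epinephrine); columns: asystole (yes, no).
chi2 = chi2_2x2(15, 59 - 15, 4, 55 - 4)
print(f"chi-square = {chi2:.2f}")  # 6.75, beyond the 1-df cutoff of 5.41 for p = 0.02
```

A continuity-corrected or exact test would give a slightly larger p-value, which is presumably why the abstract states the looser bound p < 0.02 rather than the uncorrected p < 0.01.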
Background Previous clinical studies have shown that direct antithrombins can accelerate clot lysis after treatment with streptokinase in acute myocardial infarction (MI). Efegatran is a new direct antithrombin, which in experimental animals has been shown to enhance thrombolysis, reduce the rate of reocclusion, and limit infarct size. This study was designed to compare the efficacy of efegatran plus streptokinase versus heparin plus accelerated tissue plasminogen activator (TPA) in coronary reperfusion in acute MI.
Methods and Results In this randomized, dose-finding study (n = 245), we initially explored 4 doses of efegatran sulfate in combination with streptokinase (1.5 million U) given intravenously within 12 hours of symptom onset. The optimal dosage group of 0.5 mg/kg per hour was expanded and compared with heparin plus accelerated TPA. The primary end point was complete patency (Thrombolysis In Myocardial Infarction [TIMI] grade 3) at 90 minutes after thrombolytic therapy, assessed in a core angiographic laboratory. Infarct-related vessel patency (TIMI grade 2 or 3) and complete patency (TIMI grade 3) were 73% and 40% in the efegatran/streptokinase group versus 79% and 53% in the heparin/TPA group (P = not significant). In-hospital mortality rate was 5% for the efegatran/streptokinase group versus 0% for the heparin/TPA group (P = not significant). Major bleeding occurred in 23% of patients in the efegatran/streptokinase group versus 11% in the heparin/TPA group (P = not significant). No intracranial hemorrhage occurred.
Conclusions The combination of efegatran plus streptokinase is not superior to the current therapy of heparin and accelerated TPA in achieving early patency. In addition, there is no indication that this experimental treatment can achieve better clinical outcome. (Am Heart J 1999;138:696-704.)
Tumor budding and a proficient mismatch repair (pMMR) status are considered adverse prognostic factors in colorectal cancer (CRC). The aim of this pilot study was to assess tumor budding in primary CRC with pMMR versus that with deficient mismatch repair (dMMR).
Tumor budding was retrospectively examined in tumors from 134 patients with stage II and stage III CRC with known MMR status. The 29 available dMMR cases that developed recurrence or distant metastases (met+) were matched with a dMMR group with no recurrence or metastases (met-), and the pMMR/met+ group was matched with pMMR/met- cases.
Using tumor budding cut-offs of 5 and 10, a significantly higher percentage of high-grade tumor budding (≥5 and ≥10) was only found in the dMMR/met+ compared to pMMR/met+ subgroup (p=0.01 and p=0.02, respectively).
A significantly higher grade of tumor budding was observed in the dMMR/met+ group, suggesting that tumor budding can provide prognostic information for patients with a dMMR status.