Prompt identification of acute coronary syndrome is a challenge in clinical practice. The 12-lead electrocardiogram (ECG) is readily available during initial patient evaluation, but current rule-based interpretation approaches lack sufficient accuracy. Here we report machine learning-based methods for the prediction of underlying acute myocardial ischemia in patients with chest pain. Using 554 temporal-spatial features of the 12-lead ECG, we train and test multiple classifiers on two independent prospective patient cohorts (n = 1244). While maintaining a higher negative predictive value, our final fusion model achieves a 52% gain in sensitivity compared to commercial interpretation software and a 37% gain in sensitivity compared to experienced clinicians. Such an ultra-early, ECG-based clinical decision support tool, when combined with the judgment of trained emergency personnel, would help to improve clinical outcomes and reduce unnecessary costs in patients with chest pain.
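The sensitivity and negative predictive value (NPV) reported above both derive from a 2x2 confusion matrix. A minimal sketch of that arithmetic follows; all counts are hypothetical illustrations, not values from the study's cohorts.

```python
# Sketch: sensitivity and negative predictive value (NPV) from a 2x2
# confusion matrix. All counts below are HYPOTHETICAL, chosen only so
# the relative sensitivity gain works out to 52% as an illustration.
def sensitivity(tp, fn):
    """Fraction of true ischemia cases that the classifier flags."""
    return tp / (tp + fn)

def npv(tn, fn):
    """Fraction of negative calls that are truly non-ischemic."""
    return tn / (tn + fn)

# Hypothetical counts for a model vs. a rule-based comparator.
model = {"tp": 76, "fn": 24, "tn": 850, "fp": 50}
comparator = {"tp": 50, "fn": 50, "tn": 880, "fp": 20}

gain = (sensitivity(model["tp"], model["fn"])
        / sensitivity(comparator["tp"], comparator["fn"]) - 1)
print(f"relative sensitivity gain: {gain:.0%}")  # 0.76 / 0.50 - 1 = 52%
```

A "52% gain in sensitivity" is thus a relative improvement (e.g., 0.50 to 0.76), not an absolute 52-percentage-point increase.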
No studies have performed direct pairwise comparisons of the effectiveness and safety of warfarin and the new oral anticoagulants (NOACs) apixaban, dabigatran, and rivaroxaban. Using 2013 to 2014 claims from a 5% random sample of Medicare beneficiaries, we identified patients newly diagnosed with atrial fibrillation who initiated apixaban, dabigatran, rivaroxaban, warfarin, or no oral anticoagulation therapy. Outcomes included the composite of ischemic stroke, systemic embolism (SE), and death; any bleeding event; gastrointestinal bleeding; intracranial bleeding; and treatment persistence. We constructed Cox proportional hazards models to compare outcomes between each pair of treatment groups. The composite risk of ischemic stroke, SE, and death was lower for NOACs than for warfarin: hazard ratio (HR) 0.86, 95% confidence interval (CI) 0.76 to 0.98 for apixaban; HR 0.73, 95% CI 0.63 to 0.86 for dabigatran; and HR 0.82, 95% CI 0.75 to 0.89 for rivaroxaban, all compared with warfarin. There were no differences in effectiveness across NOACs. The risk of any bleeding was lower with apixaban than with warfarin, but higher with rivaroxaban than with warfarin. Apixaban (HR 0.69, 95% CI 0.60 to 0.79) and dabigatran (HR 0.79, 95% CI 0.69 to 0.92) were associated with lower bleeding risk than rivaroxaban. Treatment persistence was highest for apixaban (82%) and lowest for dabigatran and warfarin (64%) (p value <0.001). Compared with warfarin, NOACs are more effective in preventing stroke, but their risk of bleeding varies, with rivaroxaban having higher risk than warfarin. Altogether, apixaban had the most favorable effectiveness, safety, and persistence profile.
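The hazard ratios above are read against the warfarin reference: an HR below 1 whose entire 95% CI also lies below 1 indicates a statistically significant risk reduction. A small sketch of that reading rule, using the HRs and CIs reported in the abstract:

```python
# Sketch: reading the composite-outcome hazard ratios (HR) and 95%
# confidence intervals (CI) reported above, warfarin as reference.
# Tuples are (HR, CI lower bound, CI upper bound) from the abstract.
results = {
    "apixaban":    (0.86, 0.76, 0.98),
    "dabigatran":  (0.73, 0.63, 0.86),
    "rivaroxaban": (0.82, 0.75, 0.89),
}

def significantly_lower_risk(hr, lo, hi):
    """True when the whole 95% CI lies below 1 (lower risk than reference)."""
    return hr < 1 and hi < 1

for drug, (hr, lo, hi) in results.items():
    print(drug, significantly_lower_risk(hr, lo, hi))  # all True
```

This is only the interpretation step; the HRs themselves come from the Cox proportional hazards fits described in the abstract.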
Background:
Cardiovascular implantable electronic device (CIED) infections have been increasing out of proportion to the number of devices implanted, based on data available through 2003. We investigated recent trends and possible causes of the increasing numbers of CIED infections.
Methods:
We analyzed the occurrence of CIED infections and the associated changes in characteristics of CIED recipients, using the National Hospital Discharge Survey database from 1996 through 2006.
Results:
The number of CIED implantations continued to increase after 2003 from 199,516 in 2004 to 222,940 in 2006, representing a 12% increment. In the same period, the number of CIED infections increased from 8,273 in 2004 to 12,979 in 2006, representing a 57% increment. From 1996 to 2006, comorbid illnesses in recipients of new CIED devices became more prevalent with an increasing percentage of patients with end‐organ failures (6.5% in 1996 vs 8.0% in 2006, P < 0.001) and diabetes mellitus (14.5% in 1996 vs 16.5% in 2006, P = 0.005). The proportion of Caucasian recipients also decreased (65.6% in 1996 vs 57.6% in 2006, P < 0.001). During that same period, the number of implanted cardiac resynchronization devices increased dramatically while the age of CIED recipients did not change.
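The 12% and 57% increments reported above follow directly from the counts given in the abstract, as this arithmetic check shows:

```python
# Check of the growth rates reported in the abstract, computed from the
# counts given there (2004 vs. 2006).
implants_2004, implants_2006 = 199_516, 222_940
infections_2004, infections_2006 = 8_273, 12_979

def pct_increase(old, new):
    """Percent increase from old to new."""
    return (new - old) / old * 100

print(f"implantations: +{pct_increase(implants_2004, implants_2006):.0f}%")   # ~12%
print(f"infections:    +{pct_increase(infections_2004, infections_2006):.0f}%")  # ~57%
```

The infection count thus grew almost five times faster than the implantation count over the same two years, which is the central observation of the study.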
Conclusion:
The number of patients with CIED‐related infections in the United States continues to increase out of proportion to the increase in implantation rates. Possible causes for this ongoing epidemic include sicker patients, shifting racial demographics, and more complex procedures. These insights may help improve our ability to best select patients for CIED implantation in "real‐life" settings. (PACE 2010; 414–419)
Abstract Background Current guidelines suggest that patients with left bundle branch block (LBBB) be treated with cardiac resynchronization therapy (CRT); however, one-third do not have a significant activation delay, which can result in nonresponse. By identifying characteristic opposing wall contraction, 2-dimensional strain echocardiography (2DSE) may detect true LBBB activation. Objectives This study sought to investigate whether the absence of a typical LBBB mechanical activation pattern by 2DSE was associated with unfavorable long-term outcome and whether this is additive to electrocardiographic (ECG) morphology and duration. Methods From 2 centers, 208 CRT candidates (New York Heart Association classes II to IV, ejection fraction ≤35%, QRS duration ≥120 ms) with LBBB by ECG were prospectively included. Before CRT implantation, longitudinal strain in the apical 4-chamber view determined whether typical LBBB contraction was present. The pre-defined outcome was freedom from death, left ventricular assist device, or heart transplantation over 4 years. Results Two-thirds of patients (63%) had a typical LBBB contraction pattern. During 4 years, 48 patients (23%) reached the primary endpoint. Absence of a typical LBBB contraction was independently associated with increased risk of adverse outcome after adjustment for ischemic heart disease and QRS width (hazard ratio [HR]: 3.1; 95% CI: 1.64 to 5.88; p < 0.005). Adding pattern assessment to a risk prediction model including QRS duration and ischemic heart disease significantly improved the net reclassification index to 0.14 (p = 0.04) and improved the C-statistic (0.63, 95% CI: 0.54 to 0.72 vs. 0.71, 95% CI: 0.63 to 0.80; p = 0.02). Use of strict LBBB ECG criteria was not independently associated with outcome in the multivariate model (HR: 1.72; 95% CI: 0.89 to 3.33; p = 0.11). Assessment of LBBB contraction pattern was superior to time-to-peak indexes of dyssynchrony (p < 0.01 for all).
Conclusions Contraction pattern assessment to identify true LBBB activation provided important prognostic information in CRT candidates.
Background About 30% of patients with heart failure do not respond to cardiac resynchronization therapy (CRT). We hypothesized that scar burden can predict poor response to CRT in patients with ischemic cardiomyopathy (ICM). Methods Fifty patients (age, 68.5 ± 9.2 years; 84% men; mean left ventricular ejection fraction [LVEF], 19.7% ± 5.2%) with ICM who underwent CRT-defibrillator implantation and ²⁰¹Tl single photon emission computed tomography myocardial perfusion imaging were included. Myocardial perfusion imaging studies were read quantitatively, generating a summed perfusion score (SPS). Left ventricular (LV) lead position was determined by chest radiography. Echocardiograms were performed before and after (median, 11.0 months) CRT. Results Echocardiographic response, defined as a ≥15% relative increase in LVEF, was documented in 28 (56%) patients. The mean SPS (18.8 ± 11.3 vs 33.7 ± 11.1; P = .000025) and the average scar density in the segments immediately adjacent to the LV lead (0.70 ± 0.91 vs 1.64 ± 0.82; P = .0004) were significantly lower in responders versus nonresponders. Global scar burden (r = −0.53; P = .00007), scar burden near the LV lead (r = −0.49; P = .0003), and the number of segments with a perfusion score of 4 (r = −0.53; P = .0007) inversely correlated with increase in LVEF after CRT. The hazard ratio for nonresponse increased with increasing tertiles of global SPS, scar density in the vicinity of the LV lead, and number of segments with transmural scar (ie, perfusion score = 4). Conclusions Higher overall scar burden, a larger number of severely scarred segments, and greater scar density near the LV lead tip portend an unfavorable response to CRT in ICM patients. Prospective confirmation of these findings is warranted.
Previous methods to quantify dyssynchrony could not determine regional 3-dimensional (3-D) strain. We hypothesized that a novel 3-D speckle tracking strain imaging system can quantify left ventricular (LV) dyssynchrony and the site of latest mechanical activation. We studied 64 subjects: 54 patients with heart failure referred for cardiac resynchronization therapy (CRT), with an ejection fraction of 25 ± 6% and a QRS interval of 165 ± 29 ms, and 10 healthy volunteer controls. The 3-D speckle tracking system determined radial strain using a 16-segment model from a pyramidal 3-D dataset. Dyssynchrony was quantified as maximal opposing wall delay and SD in time to peak strain. The 3-D analysis was compared to standard 2-dimensional (2-D) strain datasets, and the site of 3-D latest mechanical activation, not possible by 2-D, was quantified. As expected, dyssynchrony in patients referred for CRT was significantly greater than in controls (maximal opposing wall delay 316 ± 112 vs 59 ± 12 ms and SD 124 ± 48 vs 28 ± 11 ms, p <0.001 vs normal). The 3-D opposing wall delay was closely correlated with the 3-D 16-segment SD (r = 0.95) and with 2-D mid-LV strain (r = 0.83) and SD (r = 0.85; all p values <0.001). The 3-D site of latest mechanical activation was most commonly midposterior (26%), basal posterior (22%), midlateral (20%), and basal lateral (17%). Eleven patients studied after CRT demonstrated improvements in 3-D synchrony (300 ± 124 to 94 ± 37 ms) and ejection fraction (24 ± 6% to 31 ± 7%, p <0.05). In conclusion, 3-D speckle tracking can successfully quantify 3-D dyssynchrony and the site of latest mechanical activation. This approach may play a clinical role in the management of patients undergoing CRT.
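The two dyssynchrony indices named above, maximal opposing wall delay and the SD of time to peak strain across the 16 segments, reduce to simple statistics over per-segment activation timings. A minimal sketch, with hypothetical timing values rather than patient data, and with the opposing-wall delay approximated as the earliest-to-latest spread:

```python
import statistics

# HYPOTHETICAL per-segment times to peak radial strain (ms) for a
# 16-segment model; illustrative values only, not study data.
times_to_peak_ms = [210, 250, 320, 410, 470, 495, 180, 220,
                    300, 390, 455, 480, 200, 260, 350, 430]

# Opposing-wall delay, here simplified to the spread between the
# earliest and latest activating segments.
max_delay = max(times_to_peak_ms) - min(times_to_peak_ms)

# Dyssynchrony index: SD of time to peak strain across all 16 segments.
sd_index = statistics.pstdev(times_to_peak_ms)

# Site of latest mechanical activation: the last segment to peak.
latest_segment = times_to_peak_ms.index(max(times_to_peak_ms))

print(max_delay, round(sd_index, 1), latest_segment)
```

In the study these timings come from 3-D radial strain curves; the point of the sketch is only that both indices, and the latest-activation site, follow mechanically once per-segment time-to-peak values exist.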
Cardiac resynchronization therapy (CRT) improves morbidity and mortality in patients with heart failure with QRS >120 ms, yet most patients studied in clinical trials manifested baseline left bundle branch block (LBBB). It is unclear whether the benefits of CRT extend to patients with right bundle branch block (RBBB) or a paced QRS at baseline despite QRS >120 ms. Orthotopic heart transplantation– and ventricular assist device–free survival, symptomatic response, and echocardiographic response were evaluated in the 636 patients who underwent CRT at our institution from 2000 to 2007 in whom the baseline electrocardiogram showed LBBB (n = 412; 65%), paced QRS (n = 162; 26%), or RBBB (n = 62; 10%). Mortality was assessed using the Social Security Death Index, and the medical record was analyzed for clinical data. A decrease in New York Heart Association class ≥0.5 after ≥6 months of CRT defined symptomatic response. Echocardiographic evidence of improved left ventricular function and reverse remodeling was evaluated after ≥6 months of CRT. Survival free from orthotopic heart transplantation and ventricular assist device placement was best in patients with LBBB and worst in those with RBBB, whereas patients with paced QRS had an intermediate prognosis (p = 0.003). This finding remained significant after controlling for baseline differences among the 3 groups. Symptomatic response was observed most often in patients with LBBB (60%), occurred least often in patients with RBBB (14%), and was intermediate in patients with paced QRS (46%; p <0.001). Echocardiographic improvement showed a similar stepwise trend. In conclusion, patients with RBBB undergoing CRT had low rates of symptomatic and echocardiographic response, and their survival free from orthotopic heart transplantation or ventricular assist device placement was significantly worse than in patients with LBBB. Patients with conventionally paced QRS experienced an intermediate response.
Mechanical dyssynchrony is a potential means to predict response to cardiac resynchronization therapy (CRT). We hypothesized that novel echocardiographic image speckle tracking can quantify dyssynchrony and predict response to CRT.
Seventy-four subjects were studied: 64 heart failure patients undergoing CRT (aged 64 ± 12 years, ejection fraction 26 ± 6%, QRS duration 157 ± 28 ms) and 10 normal controls. Speckle tracking applied to routine midventricular short-axis images calculated radial strain from multiple circumferential points averaged to 6 standard segments. Dyssynchrony from timing of speckle-tracking peak radial strain was correlated with tissue Doppler measures in 47 subjects (r = 0.94, P < 0.001; 95% CI 0.90 to 0.96). The ability of baseline speckle-tracking radial dyssynchrony (time difference in peak septal wall-to-posterior wall strain ≥130 ms) to predict response to CRT was then tested. It predicted an immediate increase in stroke volume in 48 patients studied the day after CRT with 91% sensitivity and 75% specificity. In 50 patients with long-term follow-up 8 ± 5 months after CRT, baseline speckle-tracking radial dyssynchrony predicted a significant increase in ejection fraction with 89% sensitivity and 83% specificity. Patients in whom left ventricular lead position was concordant with the site of latest mechanical activation by speckle-tracking radial strain had a greater increase in ejection fraction from baseline (10 ± 5%) than patients with discordant lead position (6 ± 5%; P < 0.05).
Speckle-tracking radial strain can quantify dyssynchrony and predict immediate and long-term response to CRT and has potential for clinical application.
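The dyssynchrony criterion used in the study above is a simple threshold rule: a patient is classified as dyssynchronous, and thus a likely CRT responder, when the delay between peak septal and peak posterior radial strain is at least 130 ms. A minimal sketch of that rule, with hypothetical timings:

```python
# Sketch of the speckle-tracking dyssynchrony criterion described above:
# septal-to-posterior peak radial strain delay >= 130 ms. The timing
# values in the usage examples are HYPOTHETICAL, not study data.
THRESHOLD_MS = 130

def radial_dyssynchrony(septal_peak_ms, posterior_peak_ms,
                        threshold_ms=THRESHOLD_MS):
    """True if the opposing-wall delay meets the CRT-response cutoff."""
    return abs(septal_peak_ms - posterior_peak_ms) >= threshold_ms

print(radial_dyssynchrony(220, 410))  # delay 190 ms -> True
print(radial_dyssynchrony(300, 360))  # delay 60 ms -> False
```

In practice the peak times come from the segmental radial strain curves computed by the speckle-tracking software; the 130 ms cutoff is the value reported in the abstract.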