Summary Background Digoxin is a widely used drug for ventricular rate control in patients with atrial fibrillation (AF), despite a scarcity of randomised trial data. We studied the use and outcomes of digoxin in patients in the Rivaroxaban Once Daily Oral Direct Factor Xa Inhibition Compared with Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation (ROCKET AF). Methods For this retrospective analysis, we included and classified patients from ROCKET AF on the basis of digoxin use at baseline and during the study. Patients in ROCKET AF were recruited from 45 countries and had AF and risk factors putting them at moderate-to-high risk of stroke, with or without heart failure. We used Cox proportional hazards regression models adjusted for baseline characteristics and drugs to investigate the association of digoxin with all-cause mortality, vascular death, and sudden death. ROCKET AF was registered with ClinicalTrials.gov, number NCT00403767. Findings In 14 171 randomly assigned patients, digoxin was used at baseline in 5239 (37%). Patients given digoxin were more likely to be female (42% vs 38%) and to have a history of heart failure (73% vs 56%), diabetes (43% vs 38%), and persistent AF (88% vs 77%; p<0·0001 for each comparison). After adjustment, digoxin was associated with increased all-cause mortality (5·41 vs 4·30 events per 100 patient-years; hazard ratio 1·17; 95% CI 1·04–1·32; p=0·0093), vascular death (3·55 vs 2·69 events per 100 patient-years; 1·19; 1·03–1·39; p=0·0201), and sudden death (1·68 vs 1·12 events per 100 patient-years; 1·36; 1·08–1·70; p=0·0076). Interpretation Digoxin treatment was associated with a significant increase in all-cause mortality, vascular death, and sudden death in patients with AF. This association was independent of other measured prognostic factors, and although residual confounding could account for these results, these data raise the possibility that digoxin has harmful effects in this population.
A randomised trial of digoxin in the treatment of patients with AF, with and without heart failure, is needed. Funding Janssen Research & Development and Bayer HealthCare AG.
A prospective cohort study.
To assess 10-year outcomes of patients with sciatica resulting from a lumbar disc herniation treated surgically or nonsurgically.
There is little information comparing long-term outcomes of surgical and conservative therapy of lumbar disc herniation in contemporary clinical practice. Prior studies suggest that these outcomes are similar.
Patients recruited from the practices of orthopedic surgeons, neurosurgeons, and occupational medicine physicians throughout Maine had baseline interviews with follow-up questionnaires mailed at regular intervals over 10 years. Clinical data were obtained at baseline from a physician questionnaire. Primary analyses were based on initial treatment received, either surgical or nonsurgical. Secondary analyses examined actual treatments received by 10 years. Outcomes included patient-reported symptoms of leg and back pain, functional status, satisfaction, and work and disability compensation status.
Of 507 eligible consenting patients initially enrolled, 10-year outcomes were available for 400 of 477 (84%) surviving patients; 217 of 255 (85%) treated surgically, and 183 of 222 (82%) treated nonsurgically. Patients undergoing surgery had worse baseline symptoms and functional status than those initially treated nonsurgically. By 10 years, 25% of surgical patients had undergone at least one additional lumbar spine operation, and 25% of nonsurgical patients had at least one lumbar spine operation. At 10-year follow-up, 69% of patients initially treated surgically reported improvement in their predominant symptom (back or leg pain) versus 61% of those initially treated nonsurgically (P = 0.2). A larger proportion of surgical patients reported that their low back and leg pain were much better or completely gone (56% vs. 40%, P = 0.006) and were more satisfied with their current status (71% vs. 56%, P = 0.002). Treatment group differences persisted after adjustment for other determinants of outcome in multivariate models. Change in the modified Roland back-specific functional status scale favored surgical treatment, and the relative benefit persisted over the follow-up period. Despite these differences, work and disability status at 10 years were comparable among those treated surgically or nonsurgically.
Surgically treated patients with a herniated lumbar disc had more complete relief of leg pain and improved function and satisfaction compared with nonsurgically treated patients over 10 years. Nevertheless, improvement in the patient's predominant symptom and work and disability outcomes were similar regardless of treatment received. For patients in whom elective discectomy is a treatment option, an individualized treatment plan requires patients and their physicians to integrate clinical findings with patient preferences based on their symptoms and goals.
The temporal relationship of atrial fibrillation (AF) and stroke risk is controversial. We evaluated this relationship via a case-crossover analysis of ischemic strokes in a large cohort of patients with cardiac implantable electronic devices.
We identified 9850 patients with cardiac implantable electronic devices remotely monitored in the Veterans Administration Health Care System between 2002 and 2012. There were 187 patients with acute ischemic stroke and continuous heart rhythm monitoring for 120 days before the stroke (age, 69±8.4 years; 98% with an implantable defibrillator). We compared each patient's daily AF burden in the 30 days before the stroke (case period) with their AF burden during days 91 to 120 before the stroke (control period). Defining positive AF burden as ≥5.5 hours of AF on any given day, 156 patients (83%) had no positive AF burden in either period and, in fact, had little to no AF; 15 (8%) patients had positive AF burden in both periods. Among the discordant (informative) patients, 13 exceeded 5.5 hours of AF in the case period but not in the control period, whereas 3 had positive AF burden in the control period but not in the case period (warfarin-adjusted odds ratio for stroke, 4.2; 95% confidence interval, 1.5-13.4). The odds ratio for stroke was highest (17.4; 95% confidence interval, 5.39-73.1) in the 5 days immediately after a qualifying occurrence of AF and decreased toward 1.0 as the interval after the AF occurrence increased beyond 30 days.
In this population with continuous heart rhythm recording, multiple hours of AF had a strong but transient effect raising stroke risk.
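The warfarin-adjusted odds ratio above is driven by the discordant patients (13 with AF burden only in the case window vs. 3 only in the control window). A crude matched-pair calculation, sketched below, already lands near the reported 4.2; this is an illustrative simplification, not the study's adjusted model, and the function name is our own.

```python
import math

def discordant_pair_or(case_only: int, control_only: int, z: float = 1.96):
    """Crude McNemar-type odds ratio for a 1:1 matched case-crossover design.

    The point estimate is the ratio of the two discordant counts; the CI is
    a Wald interval on the log-odds scale.
    """
    or_ = case_only / control_only
    se_log = math.sqrt(1 / case_only + 1 / control_only)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# 13 patients had >=5.5 h of AF only in the 30-day case window,
# 3 only in the day 91-120 control window (counts from the abstract).
or_, lo, hi = discordant_pair_or(13, 3)  # crude OR ~4.3
```

The crude interval is wider than the study's warfarin-adjusted 1.5-13.4, as expected for an unadjusted estimate on 16 informative patients.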
The rate of ischemic stroke associated with traditional risk factors for patients with atrial fibrillation has declined over the past 2 decades. Furthermore, new and potentially safer anticoagulants are on the horizon. Thus, the balance between risk factors for stroke and benefit of anticoagulation may be shifting.
A Markov state transition decision model was used to determine the CHADS(2) score above which anticoagulation is preferred, first using the stroke rate predicted for the CHADS(2) derivation cohort, and then using the stroke rate from the more contemporary AnTicoagulation and Risk Factors In Atrial Fibrillation cohort for each CHADS(2) score. The base case was a 69-year-old man with atrial fibrillation. Interventions included oral anticoagulant therapy with warfarin or a hypothetical "new and safer" anticoagulant (based on dabigatran), no antithrombotic therapy, or aspirin. Warfarin is preferred above a stroke rate of 1.7% per year, whereas aspirin is preferred at lower stroke rates. Anticoagulation with warfarin is preferred even for a score of 0 using the higher rates of the older CHADS(2) derivation cohort. Using more contemporary and lower estimates of stroke risk raises the threshold for use of warfarin to a CHADS(2) score ≥2. However, anticoagulation with a "new, safer" agent, modeled on the results of the Randomized Evaluation of Long-Term Anticoagulation Therapy trial of dabigatran, lowers the threshold for anticoagulation to a stroke rate of 0.9% per year.
Use of a more contemporary estimate of stroke risk shifts the "tipping point," such that anticoagulation is preferred at a higher CHADS(2) score, reducing the number of patients for whom anticoagulation is recommended. The introduction of "new, safer" agents, however, would shift the tipping point in the opposite direction.
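The CHADS(2) index referenced above is a standard bedside score; a minimal sketch of its arithmetic (the function is illustrative, not from the study):

```python
# CHADS2 scoring: congestive heart failure, hypertension, age >= 75,
# and diabetes each score 1 point; prior stroke/TIA scores 2 points.
def chads2(chf: bool, hypertension: bool, age: int,
           diabetes: bool, prior_stroke_tia: bool) -> int:
    return (chf + hypertension + (age >= 75) + diabetes
            + 2 * prior_stroke_tia)

# The base case above is a 69-year-old man; with no other risk factors
# his score is 0 - a score at which the older derivation-cohort stroke
# rates still favor warfarin, while contemporary rates do not.
score = chads2(chf=False, hypertension=False, age=69,
               diabetes=False, prior_stroke_tia=False)  # 0
```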
Anticoagulation prophylaxis for stroke is recommended for at-risk patients with either persistent or paroxysmal atrial fibrillation (AF). We compared outcomes in patients with persistent vs. paroxysmal AF receiving oral anticoagulation.
Patients randomized in the Rivaroxaban Once Daily Oral Direct Factor Xa Inhibition Compared With Vitamin K Antagonism for Prevention of Stroke and Embolism Trial in Atrial Fibrillation (ROCKET-AF) trial (n = 14 264) were grouped by baseline AF category: paroxysmal or persistent. Multivariable adjustment was performed to compare thrombo-embolic events, bleeding, and death between groups, in high-risk subgroups, and across treatment assignment (rivaroxaban or warfarin). Of 14 062 patients, 11 548 (82%) had persistent AF and 2514 (18%) had paroxysmal AF. Patients with persistent AF were marginally older (73 vs. 72 years, P = 0.03), less likely to be female (39 vs. 45%, P < 0.0001), and more likely to have previously used vitamin K antagonists (64 vs. 56%, P < 0.0001) compared with patients with paroxysmal AF. In patients randomized to warfarin, time in therapeutic range was similar (58 vs. 57%, P = 0.94). Patients with persistent AF had higher adjusted rates of stroke or systemic embolism (2.18 vs. 1.73 events per 100 patient-years, P = 0.048) and all-cause mortality (4.78 vs. 3.52, P = 0.006). Rates of major bleeding were similar (3.55 vs. 3.31, P = 0.77). Rates of stroke or systemic embolism in both types of AF did not differ by treatment assignment (rivaroxaban vs. warfarin, P for interaction = 0.6).
In patients with AF at moderate-to-high risk of stroke receiving anticoagulation, those with persistent AF have a higher risk of thrombo-embolic events and worse survival compared with paroxysmal AF.
The incidence of stroke in patients with atrial fibrillation is greatly reduced by oral anticoagulation, with the full effect seen at international normalized ratio (INR) values of 2.0 or greater. The effect of the intensity of oral anticoagulation on the severity of atrial fibrillation-related stroke is not known but is central to the choice of the target INR.
We studied incident ischemic strokes in a cohort of 13,559 patients with nonvalvular atrial fibrillation. Strokes were identified through hospitalization databases and validated on the basis of medical records, which also provided information on the use of warfarin or aspirin, the INR at admission, and coexisting illnesses. The severity of stroke was graded according to a modified Rankin scale. Thirty-day mortality was ascertained from hospitalization and mortality files.
Of 596 ischemic strokes, 32 percent occurred during warfarin therapy, 27 percent during aspirin therapy, and 42 percent during neither type of therapy. Among patients who were taking warfarin, an INR of less than 2.0 at admission, as compared with an INR of 2.0 or greater, independently increased the odds of a severe stroke in a proportional-odds logistic-regression model (odds ratio, 1.9; 95 percent confidence interval, 1.1 to 3.4) across three severity categories and the risk of death within 30 days (hazard ratio, 3.4; 95 percent confidence interval, 1.1 to 10.1). An INR of 1.5 to 1.9 at admission was associated with a mortality rate similar to that for an INR of less than 1.5 (18 percent and 15 percent, respectively). The 30-day mortality rate among patients who were taking aspirin at the time of the stroke was similar to that among patients who were taking warfarin and who had an INR of less than 2.0.
Among patients with nonvalvular atrial fibrillation, anticoagulation that results in an INR of 2.0 or greater reduces not only the frequency of ischemic stroke but also its severity and the risk of death from stroke. Our findings provide further evidence against the use of lower INR target levels in patients with atrial fibrillation.
Background
Cigarette smoking is a risk factor for severe COVID-19 disease. Understanding smokers’ responses to the pandemic will help assess its public health impact and inform future public health and provider messages to smokers.
Objective
To assess risk perceptions and change in tobacco use among current and former smokers during the COVID-19 pandemic.
Design
Cross-sectional survey conducted in May–July 2020 (55% response rate).
Participants
694 current and former daily smokers (mean age 53, 40% male, 78% white) who had been hospitalized pre-COVID-19 and enrolled into a smoking cessation clinical trial at hospitals in Massachusetts, Pennsylvania, and Tennessee.
Main Measures
Perceived risk of COVID-19 due to tobacco use; changes in tobacco consumption and interest in quitting tobacco use; self-reported quitting and relapse since January 2020.
Key Results
68% (95% CI, 65–72%) of respondents believed that smoking increases the risk of contracting COVID-19 or having a more severe case. In adjusted analyses, perceived risk was higher in Massachusetts, where COVID-19 had already surged, than in Pennsylvania and Tennessee, which were pre-surge during survey administration (AOR 1.56, 95% CI, 1.07–2.28). Higher perceived COVID-19 risk was associated with increased interest in quitting smoking (AOR 1.72, 95% CI 1.01–2.92). During the pandemic, 32% (95% CI, 27–37%) of smokers increased, 37% (95% CI, 33–42%) decreased, and 31% (95% CI, 26–35%) did not change their cigarette consumption. Increased smoking was associated with higher perceived stress (AOR 1.49, 95% CI 1.16–1.91). Overall, 11% (95% CI, 8–14%) of respondents who smoked in January 2020 (pre-COVID-19) had quit smoking by the time of the survey (mean, 6 months later), while 28% (95% CI, 22–34%) of former smokers had relapsed. Higher perceived COVID-19 risk was associated with higher odds of quitting and lower odds of relapse.
Conclusions
Most smokers believed that smoking increased COVID-19 risk. Smokers’ responses to the pandemic varied, with increased smoking related to stress and increased quitting associated with perceived COVID-19 vulnerability.
Comparison of Risk Stratification Schemes to Predict Thromboembolism in People With Nonvalvular Atrial Fibrillation Margaret C. Fang, Alan S. Go, Yuchiao Chang, Leila Borowsky, Niela K. Pomernacki, Daniel E. Singer, for the ATRIA Study Group The predictive ability of 5 major risk schemes used to predict atrial fibrillation (AF)–related thromboembolism was assessed in a cohort of 13,559 adults with AF. We identified 685 thromboembolic events occurring during 32,721 person-years off warfarin therapy. Schemes had only a fair ability to predict thromboembolism, and the proportion of patients categorized in individual risk categories varied widely across risk schemes. For example, the proportion of patients considered high risk ranged from 16% to 80%. Choice of antithrombotic therapy for individual patients may vary widely depending on which scheme is used. More informative ways to predict AF–related thromboembolism are needed.
Background Many patients with cirrhosis have concurrent nonvalvular atrial fibrillation (NVAF). Data are lacking regarding recent oral anticoagulant (OAC) usage trends among US patients with cirrhosis and NVAF. Methods and Results Using MarketScan claims data (2012-2019), we identified patients with cirrhosis and NVAF eligible for OACs (CHA2DS2-VASc score ≥2 in men or ≥3 in women). We calculated the yearly proportion of patients prescribed a direct OAC (DOAC), warfarin, or no OAC. We stratified by high-risk features (decompensated cirrhosis, thrombocytopenia, coagulopathy, chronic kidney disease, or end-stage renal disease). Among 32 487 patients (mean age=71.6 years, 38.5% women, 15.1% with decompensated cirrhosis, mean CHA2DS2-VASc=4.2), 44.6% used OACs within 180 days of NVAF diagnosis, including DOACs (20.2%) or warfarin (24.4%). Compared with OAC nonusers, OAC users were less likely to have decompensated cirrhosis (18.6% versus 10.7%), thrombocytopenia (19.5% versus 12.5%), or chronic kidney disease/end-stage renal disease (15.5% versus 14.0%). Between 2012 and 2019, warfarin use decreased by 21.0% (32.0% to 11.0%), whereas DOAC use increased by 30.6% (7.4% to 38.0%), and among all DOACs between 2012 and 2019, apixaban was the most commonly prescribed (46.1%). Warfarin use decreased and DOAC use increased in all subgroups, including in compensated and decompensated cirrhosis, thrombocytopenia, coagulopathy, chronic kidney disease/end-stage renal disease, and across CHA2DS2-VASc categories. Among OAC users (2012-2019), DOAC use increased by 58.9% (18.7% to 77.6%). Among DOAC users, the greatest proportional increase was with apixaban (61.2%; P<0.001). Conclusions Among US patients with cirrhosis and NVAF, DOAC use has increased substantially and surpassed warfarin, including in decompensated cirrhosis. Nevertheless, >55% of patients remain untreated, underscoring the need for clearer treatment guidance.
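The eligibility rule in this study (CHA2DS2-VASc ≥2 in men, ≥3 in women) rests on the standard scoring components; a minimal sketch of the arithmetic (functions are illustrative, not from the study):

```python
# CHA2DS2-VASc scoring: CHF, hypertension, diabetes, vascular disease,
# age 65-74, and female sex each score 1 point; age >= 75 and prior
# stroke/TIA/thromboembolism each score 2 points.
def cha2ds2_vasc(chf: bool, htn: bool, age: int, diabetes: bool,
                 stroke_tia_te: bool, vasc: bool, female: bool) -> int:
    pts = chf + htn + diabetes + vasc + female
    pts += 2 if age >= 75 else (1 if 65 <= age <= 74 else 0)
    pts += 2 * stroke_tia_te
    return pts

def oac_eligible(score: int, female: bool) -> bool:
    # The study's inclusion threshold: >=2 for men, >=3 for women
    # (the extra point offsets the female-sex category point).
    return score >= (3 if female else 2)

# Example: a 72-year-old woman with hypertension scores 1 (HTN)
# + 1 (age 65-74) + 1 (female) = 3, meeting the eligibility threshold.
example = cha2ds2_vasc(chf=False, htn=True, age=72, diabetes=False,
                       stroke_tia_te=False, vasc=False, female=True)
```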
Undiagnosed atrial fibrillation (AF) may cause preventable strokes. Guidelines differ regarding AF screening recommendations. We tested whether point-of-care screening with a handheld single-lead ECG at primary care practice visits increases diagnoses of AF.
We randomized 16 primary care clinics 1:1 to AF screening using a handheld single-lead ECG (AliveCor KardiaMobile) during vital sign assessments, or usual care. Patients included were ages ≥65 years. Screening results were provided to primary care clinicians at the encounter. All confirmatory diagnostic testing and treatment decisions were made by the primary care clinician. New AF diagnoses during the 1-year follow-up were ascertained electronically and manually adjudicated. Proportions and incidence rates were calculated. Effect heterogeneity was assessed.
Of 30 715 patients without prevalent AF (n=15 393 screening, 91% screened; n=15 322 control), 1.72% of individuals in the screening group had new AF diagnosed at 1 year versus 1.59% in the control group (risk difference, 0.13% [95% CI, -0.16 to 0.42]; P=0.38). In prespecified subgroup analyses, new AF diagnoses in the screening and control groups were greater among those aged ≥85 years (5.56% versus 3.76%, respectively; risk difference, 1.80% [95% CI, 0.18 to 3.30]). The difference in newly diagnosed AF between the screening period and the previous year was marginally greater in the screening versus control group (0.32% versus -0.12%; risk difference, 0.43% [95% CI, -0.01 to 0.84]). The proportion of individuals with newly diagnosed AF who were initiated on oral anticoagulants was not different in the screening (n=194, 73.5%) and control (n=172, 70.8%) arms (risk difference, 2.7% [95% CI, -5.5 to 10.4]).
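The primary risk difference can be approximated directly from the reported proportions and group sizes. The sketch below uses a simple two-sample Wald interval (an assumption on our part; a cluster-randomized trial's analysis is typically more involved), which happens to reproduce the reported interval to rounding:

```python
import math

def risk_difference(p1: float, n1: int, p2: float, n2: int, z: float = 1.96):
    """Two-sample risk difference with a Wald confidence interval."""
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# New AF diagnoses at 1 year: 1.72% of 15 393 screened patients vs.
# 1.59% of 15 322 controls (figures from the abstract).
rd, lo, hi = risk_difference(0.0172, 15_393, 0.0159, 15_322)
# rd = 0.0013 (0.13%); interval roughly -0.16% to 0.42%
```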
Screening for AF using a single-lead ECG at primary care visits did not affect new AF diagnoses among all individuals aged 65 years or older compared with usual care.
URL: https://www.clinicaltrials.gov; Unique identifier: NCT03515057.