Frameworks for balancing ischemic and bleeding risk are still evolving.
Our objectives were to simulate changes in risks for adverse events and event-driven costs with use of ticagrelor or prasugrel versus clopidogrel according to varying levels of ischemic and bleeding risk. Using the validated PARIS risk functions, we estimated 1-year ischemic (myocardial infarction or stent thrombosis) and bleeding (Bleeding Academic Research Consortium types 3 or 5) event rates among PARIS study participants who underwent percutaneous coronary intervention with drug-eluting stent implantation for an acute coronary syndrome and were discharged with aspirin and clopidogrel (n=1497). Simulated changes in adverse events with ticagrelor or prasugrel were calculated by applying treatment effects from randomized trials for a 1-year time horizon. Event costs were estimated using National Inpatient Sample data. Net costs were calculated between antiplatelet therapy groups according to level of ischemic and bleeding risk. After weighting events for quality-of-life impact, we calculated event rates and costs for risk-tailored treatment versus clopidogrel under multiple drug pricing assumptions. One-year rates (per 1000 person-years) for ischemic events were 12.6, 24.1, and 66.1, respectively, among those at low (n=630), intermediate (n=536), and high (n=331) ischemic risk. Analogous bleeding rates were 11.0, 23.9, and 66.2, respectively, among low (n=728), intermediate (n=634), and high (n=135) bleeding risk patients. Mean per-event costs were $22,174 (ischemic) and $12,203 (bleeding). When risks for ischemia matched or exceeded bleeding, simulated utility-weighted event rates favored ticagrelor/prasugrel, whereas clopidogrel reduced utility-weighted events when bleeding exceeded ischemic risk. One-year costs were sensitive to drug pricing assumptions, and risk-tailored treatment with either agent progressed from cost incurring to cost saving with increasing generic market share.
Tailoring antiplatelet therapy intensity to patient risk may improve health utility and could produce cost savings in the first year after percutaneous coronary intervention.
URL: https://www.clinicaltrials.gov . Unique identifier: NCT00998127.
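The cost simulation described above can be sketched as simple arithmetic. The event rates and per-event costs below come from the abstract, but the relative risks for ticagrelor/prasugrel versus clopidogrel are hypothetical placeholders (the study applied treatment effects from randomized trials, which are not reported here), so this is an illustrative sketch of the calculation, not the study's actual model.

```python
# Illustrative sketch: net 1-year event-driven costs per 1000 person-years
# for a potent P2Y12 inhibitor vs. clopidogrel, by risk stratum.
# Per-event costs and stratum event rates are from the abstract; the
# relative risks (rr_*) are HYPOTHETICAL placeholders, not study values.

ISCHEMIC_COST = 22174   # mean cost per ischemic event (USD, from abstract)
BLEEDING_COST = 12203   # mean cost per bleeding event (USD, from abstract)

def net_cost_per_1000(ischemic_rate, bleeding_rate,
                      rr_ischemic=0.84, rr_bleeding=1.30):
    """Change in event-driven costs per 1000 person-years when switching
    from clopidogrel to a more potent agent (hypothetical RRs)."""
    d_ischemic = ischemic_rate * (rr_ischemic - 1.0)   # ischemic events avoided (<0)
    d_bleeding = bleeding_rate * (rr_bleeding - 1.0)   # bleeding events added (>0)
    return d_ischemic * ISCHEMIC_COST + d_bleeding * BLEEDING_COST

# High ischemic / low bleeding stratum: potent therapy saves event costs
print(round(net_cost_per_1000(66.1, 11.0)))
# Low ischemic / high bleeding stratum: potent therapy incurs event costs
print(round(net_cost_per_1000(12.6, 66.2)))
```

Under these assumed relative risks, the sign of the net cost flips between the two strata, which mirrors the abstract's finding that the preferred agent depends on which risk dominates.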
Abstract Objectives The aim of this study was to examine the independent associations between actionable bleeding (AB) and coronary thrombotic events (CTE) on mortality risk after percutaneous coronary intervention (PCI). Background The independent impact of AB and CTE on mortality risk after PCI remains poorly characterized. Methods A post hoc analysis was conducted of the PARIS (Patterns of Non-Adherence to Dual Antiplatelet Therapy in Stented Patients) registry, a real-world cohort of 5,018 patients undergoing PCI with stent implantation. CTE included definite or probable stent thrombosis or myocardial infarction. AB was defined as Bleeding Academic Research Consortium type 2 or 3. Associations between CTE and AB, both of which were modeled as time-dependent covariates, and 2-year mortality risk were examined using extended Cox regression. Results Over 2 years, the cumulative incidence of CTE, AB, and all-cause mortality was 5.9% (n = 289), 8.1% (n = 391), and 4.7% (n = 227), respectively. Adjusted hazard ratios for mortality associated with CTE and AB were 3.3 (95% confidence interval: 2.2 to 4.9) and 3.5 (95% confidence interval: 2.3 to 5.4), respectively. Temporal gradients in risk after either event were highest in the first 30 days and declined rapidly thereafter. Thrombotic events occurring while patients were on versus off dual-antiplatelet therapy were associated with a higher mortality risk, whereas risk related to AB was not influenced by dual-antiplatelet therapy status at the time of bleeding. Conclusions Intracoronary thrombosis and AB are associated with mortality risks of comparable magnitude over a 2-year period after PCI, findings that might inform risk/benefit calculations for extension versus discontinuation of dual-antiplatelet therapy.
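Modeling an intercurrent event such as actionable bleeding as a time-dependent covariate in extended Cox regression relies on the counting-process (start/stop) data layout. The sketch below, with a hypothetical patient, shows how one subject's follow-up is split at the event time so the covariate can switch from 0 to 1.

```python
# Minimal sketch of the counting-process (start/stop) rows used when an
# event such as actionable bleeding (AB) is modeled as a time-dependent
# covariate in extended Cox regression. The patient below is hypothetical.

def split_followup(followup_days, event_day=None):
    """Return (start, stop, covariate) rows: the covariate switches from
    0 to 1 on the day the intercurrent event occurs."""
    if event_day is None or event_day >= followup_days:
        return [(0, followup_days, 0)]          # never exposed during follow-up
    return [(0, event_day, 0),                  # unexposed person-time
            (event_day, followup_days, 1)]      # exposed person-time

# A patient who bleeds on day 100 and is followed to day 400 contributes
# two rows: unexposed before the bleed, exposed after it.
print(split_followup(400, event_day=100))   # [(0, 100, 0), (100, 400, 1)]
```

Survival libraries that fit time-varying Cox models consume exactly this long-format table, one row per exposure interval.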
Bioluminescence imaging (BLI) has greatly facilitated the development of animal models of cancer, allowing sensitive detection of luciferase-expressing cancer cells in living mice. Previous efforts to characterize such models have involved small numbers of animals, limiting understanding of their performance features. We employed BLI to serially image the growth and distribution of a prostate cancer cell line, 22Rv1, after intracardiac injection into scid mice (n = 85). This approach models hematogenous dissemination of cancer cells and allows inquiry into the process of metastatic colonization at various organ sites, although accurately injecting cancer cells into the left ventricle remains challenging. Therefore, to predict injection success we measured the ratio of the thoracic bioluminescence signal to the whole body bioluminescence signal (T/WB ratio) immediately following intracardiac injection. A T/WB ratio less than 0.50 predicted the development of tumors outside of the thoracic cavity, while a T/WB ratio greater than 0.50 predicted the development of tumors entirely within the thoracic cavity, suggestive of a failed injection. Progressive tumor growth was quantified using BLI. Tumors colonized multiple organ sites including bone, liver, and adrenal glands, resembling the spectrum of metastases in autopsy studies of patients with prostate cancer. Tumors growing in bone exhibited mixed osteolytic and osteoblastic features, eliciting a spiculated periosteal response. With the ability to more accurately predict injection success, we can now monitor efficacy of intracardiac injections, facilitating the performance of this model.
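The T/WB heuristic described above reduces to a single ratio and threshold. The sketch below implements it directly; the flux values in the example are hypothetical, and the 0.50 cutoff is the one reported in the abstract.

```python
# Sketch of the abstract's injection-success heuristic: a thoracic-to-
# whole-body bioluminescence ratio (T/WB) above 0.50 immediately after
# intracardiac injection suggests the signal stayed in the thorax, i.e.,
# a likely failed left-ventricular injection.

def injection_assessment(thoracic_flux, whole_body_flux, threshold=0.50):
    """Return the T/WB ratio and a qualitative call at the given cutoff."""
    ratio = thoracic_flux / whole_body_flux
    call = "likely failed" if ratio > threshold else "likely successful"
    return ratio, call

# Hypothetical photon-flux readings from a post-injection image
ratio, call = injection_assessment(thoracic_flux=8.2e6, whole_body_flux=1.1e7)
print(f"T/WB = {ratio:.2f}: {call}")   # T/WB = 0.75: likely failed
```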
Background & Aims A common cause of liver donor ineligibility is macrosteatosis. Recovery of such livers could enhance donor availability. Living donor studies have shown diet-induced reduction of macrosteatosis enables transplantation. However, macrosteatotic reduction in cadaveric livers must be performed ex vivo within hours. Towards this goal, we investigated the effect of accelerated macrosteatosis reduction on hepatocyte viability and function using a novel system of macrosteatotic hepatocytes. Methods Hepatocytes isolated from lean Zucker rats were cultured in a collagen sandwich, incubated for 6 days in fatty acid-supplemented medium to induce steatosis, and then switched for 2 days to medium supplemented with lipid metabolism-promoting agents. Intracellular lipid droplet size distribution and triglyceride, viability, albumin and urea secretion, and bile canalicular function were measured. Results Fatty acid-supplemented medium induced microsteatosis in 3 days and macrosteatosis in 6 days, the latter evidenced by large lipid droplets dislocating the nucleus to the cell periphery. Macrosteatosis significantly impaired all functions tested. Macrosteatosis decreased upon returning hepatocytes to standard medium, and the rate of decrease was 4-fold faster with supplemented agents, yielding 80% reduction in 2 days. Viability of macrosteatosis-reduced hepatocytes was similar to control lean cells. Accelerated macrosteatotic reduction led to faster recovery of urea secretion and bile canalicular function, but not of albumin secretion. Conclusions Macrosteatosis reversibly decreases hepatocyte function, and supplementary agents accelerate macrosteatosis reduction and some functional restoration with no effect on viability. This in vitro model may be useful to screen agents for macrosteatotic reduction in livers before transplantation.
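As back-of-envelope arithmetic on the reported kinetics: if macrosteatosis reduction followed simple first-order decay (an assumption for illustration, not a claim from the paper), an 80% reduction in 2 days fixes the rate constant with agents, and a 4-fold slower unsupplemented rate would need about 8 days to reach the same reduction.

```python
import math

# Back-of-envelope sketch assuming FIRST-ORDER decay (an illustrative
# assumption, not the paper's model): 80% macrosteatosis reduction in
# 2 days with supplemented agents implies a rate constant k, and a
# 4-fold slower unsupplemented rate implies ~8 days for the same drop.

k_fast = -math.log(1 - 0.80) / 2.0         # per day, with agents
k_slow = k_fast / 4.0                      # per day, standard medium
t_slow_80 = -math.log(1 - 0.80) / k_slow   # days to 80% reduction, no agents

print(round(k_fast, 2), round(t_slow_80, 1))  # 0.8 8.0
```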
Abstract
This study evaluated the effects of providing inorganic, chelated, and inorganic plus injectable trace mineral supplementation strategies, with or without the inclusion of bismuth subsalicylate (BSS), on dry matter intake (DMI), water intake, ruminal hydrogen sulfide (H2S) concentration, and trace mineral status of growing beef heifers provided high-sulfate (5,055 ± 228 mg/L) water. This study consisted of an 84-d feeding period with a 2 × 2 + 1 factorial treatment arrangement conducted using 2 blocks. Beef heifers (n = 15/block; 367 ± 33 kg BW) in each block were stratified according to their initial liver copper concentration. Mineral treatments included inorganic (CuSO4, 7.76 mg/kg; ZnSO4, 22.92 mg/kg; MnSO4, 15.28 mg/kg), chelated (Cu-methionine, 7.68 mg/kg; Zn-methionine, 22.89 mg/kg; Mn-methionine, 15.27 mg/kg), and injectable trace mineral (15 mg/mL Cu, 10 mg/mL Mn, 60 mg/mL Zn) provided in combination with the inorganic mineral treatment. The BSS treatments were 0.0 (CON) or 0.2% (BSS) on a DM basis. Feed and water intake (weekly), ruminal H2S concentration (d 42 and 84), and liver (pre-study, d 42, and d 84) and serum trace mineral concentrations (d 1, 28, 56, and 84) were evaluated. Initial liver trace mineral concentration was used as a covariate. Dry matter intake tended to be greater (mineral × BSS, P = 0.05) for heifers provided chelated minerals and fed CON compared with BSS, and tended to be less for heifers fed inorganic and CON than BSS. Heifers fed chelated minerals drank 6.1 L/d more (P < 0.05) water than those provided inorganic minerals. Injectable trace mineral provision did not affect DMI or water intake. Ruminal H2S was not affected by mineral type or BSS (P ≥ 0.33). The inclusion of BSS reduced (P < 0.01) liver Cu concentration from 60.5 to 31.3 ppm. Heifers provided the injectable minerals had greater (P < 0.01) liver Cu concentration than heifers provided the inorganic treatment without BSS.
Serum copper was not affected by BSS or mineral treatment (P ≥ 0.44). The liver concentration of Se was reduced (P < 0.01) by the inclusion of BSS. The serum Se concentration was not affected by mineral type, BSS, time, or any two- or three-way interactions (P ≥ 0.24). Bismuth subsalicylate did not affect ruminal H2S, and negatively affected liver Cu concentration. The use of organic trace mineral supplementation strategies did not affect the trace mineral status of beef heifers drinking high sulfate water, but use of an injectable trace mineral supplement increased liver Cu.
Abstract
Cattle consuming increased concentrations of sulfur (S) are at an increased risk for depletion of copper (Cu) and/or S-induced polioencephalomalacia, and there are limited mitigation strategies to alleviate risk for cattle consuming high-sulfate water. This study evaluated the effects of feeding growing beef heifers bismuth subsalicylate (BSS; 0.0 vs. 0.4% DM basis) when provided water with a low (LS; 346 ± 13 mg/L) or high (HS; 4,778 ± 263 mg/L) sulfate concentration on dry matter intake (DMI), water intake, ruminal hydrogen sulfide (H2S) concentration, and trace mineral status. Twenty-four beef heifers (221 ± 41 kg) were stratified by initial liver Cu concentration (collected 13 d before the start of the study) into a completely randomized block design using a high-forage diet fed for 98 d. Feed and water intake (weekly), ruminal H2S concentration (d 42 and 91), and liver (pre-study and d 91) and serum trace mineral concentrations (d 1, 28, 56, and 91) were evaluated. Initial liver trace mineral concentration was used as a covariate in the statistical model. Water intake tended to be reduced with the inclusion of BSS (P = 0.10) but was not affected by water sulfate (P = 0.40). Water sulfate and BSS did not affect DMI (P ≥ 0.89), but total S intake increased (P < 0.01) from LS to HS (17.6 to 51.0 g/d), resulting in diets that contained 0.28 and 0.78% S (DM basis). Heifers consuming HS had 1.58 µg/mL more (P < 0.01) ruminal H2S than LS. The inclusion of BSS reduced (P = 0.04) ruminal H2S concentration by 46%. Regardless of water sulfate concentration, heifers fed BSS had lesser liver Cu (average of 4.08 mg/kg) than heifers not provided BSS, and when not provided BSS, HS had lesser liver Cu than LS (42.2 vs. 58.3 mg/kg; sulfate × BSS, P = 0.02). Serum concentration of Cu did not differ over time for heifers not provided BSS, whereas heifers provided BSS had less serum Cu on d 91 than on d 28 and 56 (BSS × time, P < 0.01).
The liver concentration of selenium (Se) was reduced (P < 0.01) with BSS inclusion. The Se concentration in serum was not affected by sulfate, BSS, or time (P ≥ 0.16). Bismuth subsalicylate reduces ruminal H2S concentration, but depletes liver Cu and Se. Moreover, sulfate concentration in water does not appear to affect DMI or water intake but reduces liver Cu concentration.
Postmastectomy radiation therapy (PMRT) has been shown to benefit breast cancer patients with 1 to 3 positive lymph nodes, but it is unclear how modern changes in management have affected the benefits of PMRT.
We retrospectively analyzed the locoregional recurrence (LRR) rates in 1027 patients with T1,2 breast cancer with 1 to 3 positive lymph nodes treated with mastectomy and adjuvant chemotherapy with or without PMRT during an early era (1978-1997) and a later era (2000-2007). These eras were selected because they represented periods before and after the routine use of sentinel lymph node surgery, taxane chemotherapy, and aromatase inhibitors.
Nineteen percent of the 505 patients treated in the early era and 25% of the 522 patients treated in the later era received PMRT. Patients who received PMRT had significantly higher-risk disease features. PMRT reduced the rate of LRR in the early era cohort, with 5-year rates of 9.5% without PMRT and 3.4% with PMRT (log-rank P=.028) and 15-year rates of 14.5% versus 6.1%, respectively (Cox regression analysis: adjusted hazard ratio [AHR] 0.37, P=.035). However, PMRT did not appear to benefit patients treated in the later era, with 5-year LRR rates of 2.8% without PMRT and 4.2% with PMRT (P=.48; Cox analysis: AHR 1.41, P=.48). The most significant factor predictive of LRR for the patients who did not receive PMRT was the era in which the patient was treated (AHR 0.35 for later era, P<.001).
The risk of LRR for patients with T1,2 breast cancer with 1 to 3 positive lymph nodes treated with mastectomy and systemic treatment is highly dependent on the era of treatment. Modern treatment advances and the selected use of PMRT for those with high-risk features have allowed for identification of a cohort at very low risk for LRR without PMRT.
The aim of this study was to compare ticagrelor monotherapy with dual-antiplatelet therapy (DAPT) after percutaneous coronary intervention (PCI) with drug-eluting stents.
The role of abbreviated DAPT followed by an oral P2Y12 inhibitor after PCI remains uncertain.
Two randomized trials, including 14,628 patients undergoing PCI, comparing ticagrelor monotherapy with standard DAPT on centrally adjudicated endpoints were identified, and individual patient data were analyzed using 1-step fixed-effect models. The protocol was registered in PROSPERO (CRD42019143120). The primary outcomes were the composite of Bleeding Academic Research Consortium type 3 or 5 bleeding tested for superiority and, if met, the composite of all-cause death, myocardial infarction, or stroke at 1 year, tested for noninferiority against a margin of 1.25 on a hazard ratio (HR) scale.
Bleeding Academic Research Consortium type 3 or 5 bleeding occurred in fewer patients with ticagrelor than DAPT (0.9% vs. 1.7%, respectively; HR: 0.56; 95% confidence interval [CI]: 0.41 to 0.75; p < 0.001). The composite of all-cause death, myocardial infarction, or stroke occurred in 231 patients (3.2%) with ticagrelor and in 254 patients (3.5%) with DAPT (HR: 0.92; 95% CI: 0.76 to 1.10; p < 0.001 for noninferiority). Ticagrelor was associated with lower risk for all-cause (HR: 0.71; 95% CI: 0.52 to 0.96; p = 0.027) and cardiovascular (HR: 0.68; 95% CI: 0.47 to 0.99; p = 0.044) mortality. Rates of myocardial infarction (2.01% vs. 2.05%; p = 0.88), stent thrombosis (0.29% vs. 0.38%; p = 0.32), and stroke (0.47% vs. 0.36%; p = 0.30) were similar.
Ticagrelor monotherapy was associated with a lower risk for major bleeding compared with standard DAPT, without a concomitant increase in ischemic events.
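The two-step testing scheme in this meta-analysis can be sketched as a simple gatekeeping check: bleeding is tested for superiority first, and only if that succeeds is the ischemic composite tested for noninferiority against the 1.25 hazard-ratio margin (the upper CI bound must fall below the margin). The decision rule below is an illustrative simplification of that logic.

```python
# Sketch of the gatekeeping logic described in the abstract: superiority
# for major bleeding is required before noninferiority of the ischemic
# composite is tested against a hazard-ratio margin of 1.25.

def gatekeeper(bleed_hr_upper_ci, ischemic_hr_upper_ci, ni_margin=1.25):
    """Return (bleeding superior?, ischemic noninferior?) from the upper
    bounds of the 95% CIs for the two hazard ratios."""
    superior_bleeding = bleed_hr_upper_ci < 1.0
    noninferior_ischemic = superior_bleeding and ischemic_hr_upper_ci < ni_margin
    return superior_bleeding, noninferior_ischemic

# Upper CI bounds reported in the abstract: bleeding HR 0.56 (0.41-0.75),
# ischemic composite HR 0.92 (0.76-1.10)
print(gatekeeper(0.75, 1.10))   # (True, True)
```

With the reported bounds, both gates pass, matching the abstract's conclusion of less bleeding without an increase in ischemic events.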
Patients With Prior Myocardial Infarction, Stroke, or Symptomatic Peripheral Arterial Disease in the CHARISMA Trial Deepak L. Bhatt, MD, FACC, Marcus D. Flather, MD, Werner Hacke, MD, Peter B. Berger, MD, FACC, Henry R. Black, MD, William E. Boden, MD, FACC, Patrice Cacoub, MD, Eric A. Cohen, MD, Mark A. Creager, MD, FACC, J. Donald Easton, MD, Christian W. Hamm, MD, FACC, Graeme J. Hankey, MD, S. Claiborne Johnston, PhD, MD, Koon-Hou Mak, MD, FACC, Jean-Louis Mas, MD, Gilles Montalescot, MD, PhD, Thomas A. Pearson, MD, FACC, P. Gabriel Steg, MD, FACC, Steven R. Steinhubl, MD, FACC, Michael A. Weber, MD, FACC, Liz Fabry-Ribaudo, MSN, RN, Tingfei Hu, MS, Eric J. Topol, MD, FACC, Keith A. A. Fox, MBChB, for the CHARISMA Investigators Dual antiplatelet therapy with clopidogrel plus aspirin has already been validated in the settings of acute coronary syndromes and coronary stenting. We identified 9,478 patients in the CHARISMA (Clopidogrel for High Atherothrombotic Risk and Ischemic Stabilization, Management, and Avoidance) trial who were enrolled with documented prior myocardial infarction (MI), ischemic stroke, or symptomatic peripheral arterial disease. The median duration of follow-up was 27.6 months. The rate of cardiovascular death, MI, or stroke was significantly lower in the clopidogrel plus aspirin arm than in the placebo plus aspirin arm: 7.3% versus 8.8% (hazard ratio 0.83, 95% confidence interval 0.72 to 0.96, p = 0.01); this benefit persisted after multivariate modeling and was not dependent on the time from the ischemic event.
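The abstract's 7.3% versus 8.8% primary-endpoint rates imply an absolute risk reduction and a number needed to treat over the follow-up period. The sketch below is illustrative arithmetic on those reported rates only; the abstract itself does not report an NNT.

```python
# Illustrative arithmetic on the abstract's primary endpoint rates
# (7.3% with clopidogrel + aspirin vs. 8.8% with placebo + aspirin):
# absolute risk reduction (ARR) and number needed to treat (NNT)
# over the trial's follow-up period.

def arr_nnt(rate_treat, rate_control):
    """ARR and NNT from two event proportions (treatment vs. control)."""
    arr = rate_control - rate_treat
    return arr, 1.0 / arr

arr, nnt = arr_nnt(rate_treat=0.073, rate_control=0.088)
print(f"ARR = {arr:.1%}, NNT = {nnt:.0f}")   # ARR = 1.5%, NNT = 67
```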