Much has been written about real-world evidence (RWE), a concept that offers an understanding of the effects of healthcare interventions using routine clinical data. The reflection of diverse real-world practices is a double-edged sword that makes RWE attractive but also opens the door to several biases that need to be minimised in both the design and analytical phases of non-experimental studies. Additionally, it is critical to ensure that researchers who conduct these studies possess adequate methodological expertise and the ability to implement these methods accurately. Critical design elements to consider include a clearly defined research question framed within a causal inference framework, choice of a fit-for-purpose data source, inclusion of new users of a treatment alongside comparators that are as similar as possible to that group, accurate classification of person-time, and well-justified censoring approaches. Having taken measures to minimise bias ‘by design’, the next step is to implement appropriate analytical techniques (for example, propensity scores) to minimise the remaining potential biases. A clear protocol should be provided before the study begins, and results should be reported afterwards, including caveats to consider. We also point readers to readings on some novel analytical methods as well as newer areas of application of RWE. While there is no one-size-fits-all solution to evaluating RWE studies, we have focused our discussion on key methods and issues commonly encountered in comparative observational cohort studies, with the hope that readers will be better equipped to evaluate non-experimental studies they encounter in the future.
Graphical abstract
Confounding can cause substantial bias in nonexperimental studies that aim to estimate causal effects. Propensity score methods allow researchers to reduce bias from measured confounding by summarizing the distributions of many measured confounders in a single score based on the probability of receiving treatment. This score can then be used to mitigate imbalances in the distributions of these measured confounders between those who received the treatment of interest and those in the comparator population, resulting in less biased treatment effect estimates. This methodology was formalized by Rosenbaum and Rubin in 1983 and, since then, has been used increasingly often across a wide variety of scientific disciplines. In this review article, we provide an overview of propensity scores in the context of real‐world evidence generation with a focus on their use in the setting of single treatment decisions, that is, choosing between two therapeutic options. We describe five aspects of propensity score analysis: alignment with the potential outcomes framework, implications for study design, estimation procedures, implementation options, and reporting. We add context to these concepts by highlighting how the types of comparator used, the implementation method, and balance assessment techniques have changed over time. Finally, we discuss evolving applications of propensity scores.
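The mechanics described in this abstract can be illustrated with a minimal sketch (hypothetical data, not drawn from the review): with a single binary confounder, the propensity score e(x) = P(treated | x) can be estimated empirically within each confounder level and then used to form inverse-probability-of-treatment weights, yielding a weighted treated-vs-control outcome difference. The function name and data below are illustrative assumptions only; in practice the score is typically estimated with a regression model over many covariates.

```python
def propensity_iptw(records):
    """Estimate an average treatment effect by inverse-probability-of-
    treatment weighting (IPTW) with an empirically estimated propensity score.

    records: list of (x, treated, outcome) tuples, where x is a binary
    confounder, treated is 0/1, and outcome is numeric (hypothetical data).
    """
    # Empirical propensity score e(x) = P(T = 1 | x) within each level of x
    ps = {}
    for level in {x for x, _, _ in records}:
        treated_flags = [t for x, t, _ in records if x == level]
        ps[level] = sum(treated_flags) / len(treated_flags)

    # IPTW: weight treated subjects by 1/e(x), controls by 1/(1 - e(x)),
    # then compare the weighted mean outcomes of the two groups
    tw = ty = cw = cy = 0.0
    for x, t, y in records:
        w = 1 / ps[x] if t else 1 / (1 - ps[x])
        if t:
            tw += w
            ty += w * y
        else:
            cw += w
            cy += w * y
    return ty / tw - cy / cw


# Hypothetical cohort: outcome = 2*treated + 3*x, so the true effect is 2,
# but treatment is far more common when x = 1, confounding a naive contrast.
records = [
    (0, 1, 2), (0, 0, 0), (0, 0, 0), (0, 0, 0),
    (1, 1, 5), (1, 1, 5), (1, 1, 5), (1, 0, 3),
]
print(propensity_iptw(records))  # weighted estimate recovers the true effect, 2.0
```

In this toy cohort the naive (unweighted) treated-vs-control difference is 3.5, while the IPTW estimate recovers the true effect of 2, which is the bias reduction from measured confounding that the abstract describes.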
Purpose
To provide guidance on data linkage appropriateness and feasibility to plan purposeful and sustainable new linkages that advance pharmacoepidemiology and healthcare research. Planning a new data linkage requires careful evaluation to weigh the resources required with the potential overall benefits.
Methods
In response to an International Society for Pharmacoepidemiology (ISPE) call for manuscripts, a working group composed of members from academia, industry, and government determined priority content areas; appropriateness and feasibility of data linkage was selected. Within this topic, scientific and operational considerations were identified, reviewed, and formulated into key areas, then translated into 12 consensus recommendations.
Results
Guidance for feasibility assessment was categorized into five key areas: (1) research objectives and justification; (2) data quality and completeness; (3) the linkage process; (4) data ownership and governance; and (5) overall value added by linkage. Within these key areas, recommendations to consider prior to initiation were developed to evaluate suitability of the linkage to meet research objectives, assess source data completeness and population coverage, and ensure well‐defined data governance standards and protections. When creating novel linked datasets, researchers must assess the feasibility of both scientific (data quality and linkage methods) and operational (access, data use and transfer, governance, and cost) aspects.
Conclusions
The data linkage feasibility assessment considerations outlined can be used as a guide when designing sustainable linked data resources to generate actionable evidence in healthcare research. These recommendations were constructed for wide applicability and can be adapted depending on the geographic, structural, and data components of the linkage.
•There is unmet medical need in high-risk locally advanced cervical cancer patients.
•Most patients received initial cisplatin-based concurrent chemoradiotherapy.
•Many patients experienced recurrence or had persistent disease following treatment.
•Therapies that delay recurrence/progression may clinically benefit these patients.
To characterize the real-world treatment patterns and outcomes of patients with high-risk locally advanced cervical cancer (HR-LACC).
This retrospective study identified and randomly selected adults diagnosed between 2010 and 2018 from the ConcertAI Oncology Dataset. For patients initially treated with concurrent chemoradiotherapy (CCRT), we estimated real-world progression-free survival (rwPFS) among those with persistent disease, real-world time on CCRT, and recurrence-free survival (rwRFS) using Kaplan-Meier methods.
The cohort included 300 patients. Median age at diagnosis was 51 years; 53.7% were White and 30.0% were Black; 52.0% were premenopausal; 89.3% had squamous cell histology; 75.3% had stage III disease; and 92.7% had no evidence of performance status impairment. Initial treatment included CCRT (N = 229), surgery (N = 28), antineoplastics only (N = 11), and radiation only (N = 5). Twenty-seven patients were untreated. Baseline characteristics of the CCRT-first patients were similar to the overall cohort; their median real-world time on treatment was 1.6 months; 78.2% received cisplatin for a median of 1.2 months; 28.4% received antineoplastics after CCRT, and 11.8% initiated a second antineoplastic therapy. Of the CCRT-first patients, 27/143 with a complete response had subsequent recurrent disease (median rwRFS not reached). 179 patients had persistent disease, among whom median (95% confidence interval [CI]) rwPFS was 29.7 (16.9–59.3) months.
In this study of United States-based clinical practices, most HR-LACC patients received CCRT as initial treatment. Many patients developed persistent disease after CCRT, indicating a need for improved first-treatment and maintenance options.
Background:
A growing interest in long-term sequelae of COVID-19 has prompted several systematic literature reviews (SLRs) to evaluate long-COVID-19 effects. However, many of these reviews lack in-depth information on the timing, duration, and severity of these conditions.
Objectives:
Our aim was to synthesize both qualitative and quantitative evidence on the prevalence and outcomes of long-term effects of COVID-19 through an umbrella review.
Design:
Umbrella review of relevant SLRs on long-COVID-19 prolonged symptoms and clinical conditions, comprehensively synthesizing the latest existing evidence.
Data Sources and Methods:
We systematically identified and appraised prior systematic reviews/meta-analyses published from 2020 to 2021 using MEDLINE, Embase, and the Cochrane Database of Systematic Reviews, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidance. We summarized and categorized all relevant clinical symptoms and outcomes in adults with COVID-19 using the Medical Dictionary for Regulatory Activities System Organ Class (MedDRA SOC).
Results:
We identified 967 systematic reviews/meta-analyses; 36 were retained for final data extraction. The most prevalent SOCs were social circumstances (40%), blood and lymphatic system disorders (39%), and metabolism and nutrition disorders (38%). The most frequently reported outcomes within each MedDRA SOC category were poor quality of life (59%), wheezing and dyspnea (19–49%), fatigue (30–64%), chest pain (16%), decreased or loss of appetite (14–17%), abdominal discomfort or digestive disorder (12–18%), arthralgia with or without myalgia (16–24%), paresthesia (27%), hair loss (14–25%), and hearing loss or tinnitus (15%).
Conclusion:
This study confirmed a high prevalence of several long-COVID-19 outcomes according to the MedDRA categories and indicated that the majority of the evidence was rated as moderate to low quality.
Registration:
The review was registered at PROSPERO (https://www.crd.york.ac.uk/prospero/) (CRD42022303557).
Inhaled medications are the cornerstone of treatment and management of asthma and COPD. However, inhaler device errors are common among patients and have been linked with reduced symptom control, an increased risk of exacerbations, and increased healthcare utilisation. These observations have prompted GINA (Global INitiative for Asthma) and GOLD (Global initiative for chronic Obstructive Lung Disease) to recommend regular assessment of inhaler technique in a bid to improve therapeutic outcomes. To better define the relationship between device errors and health outcomes (clinical outcomes, quality of life, and healthcare utilisation) in asthma and COPD, we conducted a systematic review of the literature, with a particular focus on the methods used to assess the relationship between device errors and outcomes. Sixteen studies were identified (12 in patients with asthma, one in patients with COPD, and three in both asthma and COPD) with varying study designs, endpoints, and patient populations. Most of the studies reported that inhalation errors were associated with worse disease outcomes in patients with asthma or COPD. Patients who had a reduction in errors over time had improved outcomes. These findings suggest that time invested by healthcare professionals is vital to improving inhalation technique in asthma and COPD patients to improve health outcomes.
•Of 307 patients in the study, 72.0% received bevacizumab in 1 L.
•The most common first-line regimen was bevacizumab/carboplatin/paclitaxel (40.7%).
•Median (95% CI) overall survival from first-line start was 16.5 (14.2–19.9) months.
•The real-world survival seen in this study agrees with previously reported data.
Patients with persistent, recurrent, or metastatic cervical cancer have poor prognosis. While recent advances have expanded treatment options, real-world data on treatment patterns and outcomes in this population are lacking.
This retrospective study identified adult females with persistent, recurrent, or metastatic cervical cancer from the ConcertAI Oncology Dataset who received systemic therapy on or after August 15, 2014. Patients were followed from persistent, recurrent, or metastatic diagnosis through third-line (3 L) therapy, death, end of record, or study end (June 2021). Data collection included patient characteristics, treatment patterns, and clinical outcomes. Kaplan-Meier methods were used for the three most common first-line (1 L) regimens to analyze real-world time on treatment (rwToT), real-world progression-free survival (rwPFS), and real-world overall survival (rwOS). Analyses were stratified by bevacizumab receipt by treatment line.
307 patients were included (mean [standard deviation] age 51.5 [13.2] years; 70.7% White). 91.2% of patients had metastatic disease, 8.5% had persistent disease, and <1% had recurrent disease. The most common 1 L regimen was carboplatin+paclitaxel+bevacizumab (40.7%), with median (95% confidence interval [CI]) rwToT of 3.5 (2.9–4.4) months. 57.0% of patients proceeded to second line (2 L), and 25.7% went on to 3 L. Median (95% CI) rwPFS was 7.2 (6.4–8.1) months, and median (95% CI) rwOS was 16.5 (14.2–19.9) months, from initiation of 1 L.
The 1 L regimens received by patients with persistent, recurrent, or metastatic cervical cancer generally followed clinical guidelines, and the rwOS agrees with that reported in clinical trials. This study highlights the burden of disease and the unmet need for specific treatments in these patients.
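The real-world time-to-event endpoints reported above (rwToT, rwPFS, rwOS) rely on the Kaplan-Meier estimator, which multiplies conditional survival probabilities across observed event times while accounting for censored follow-up. A minimal sketch of the estimator follows; the function name and data are hypothetical and not drawn from the study, where a validated statistical package would be used in practice.

```python
from itertools import groupby


def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times: follow-up durations (e.g., months); events: 1 if the event
    (death/progression) was observed, 0 if the subject was censored.
    Returns [(t, S(t))] at each distinct observed event time, where
    S(t) is the product of (1 - d_i/n_i) over event times t_i <= t.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        d = sum(e for _, e in group)  # events observed at time t
        if d:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= len(group)  # events and censored both leave the risk set
    return curve


# Hypothetical follow-up for five patients; two are censored (event=0)
times = [1, 2, 2, 3, 4]
events = [1, 0, 1, 1, 0]
print(kaplan_meier(times, events))
```

The censored observation at t = 2 contributes to the risk set up to that time but is not counted as an event, which is exactly what lets real-world cohorts with incomplete follow-up yield unbiased median survival estimates under the usual non-informative-censoring assumption.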
Background
Data on asthma burden in pediatric patients are limited; this real‐world study investigated exacerbation frequency and health care resource utilization (HCRU) in pediatric asthma patients from the US and England.
Methods
Data from pediatric patients (aged 6‐17 years) in the Optum claims database (US) or the Clinical Practice Research Datalink with linkage to Hospital Episode Statistics (England) were analyzed. Patients were categorized into four hierarchical groups: treated asthma (patients with ≥1 baseline asthma medication), severe asthma (treated asthma plus Global Initiative for Asthma Step 4/5), severe refractory asthma (SRA; severe asthma plus ≥2 baseline severe asthma exacerbations), and eosinophilic SRA (SRA plus blood eosinophil count ≥150 cells/µL). Exacerbation frequency and HCRU during the 12 months postindex were described.
Results
Of 151 549 treated asthma patients in the US, 18 086 had severe asthma, 2099 SRA, and 109 eosinophilic SRA. There were 32 893 treated asthma patients in England, of whom 2711 had severe asthma, 265 SRA, and 8 eosinophilic SRA. In the 12 months postindex, ≥1 exacerbation occurred in 12.4% and 10.8% of patients with severe asthma, and 32.6% and 42.6% with SRA in the US and England, respectively. The proportions of patients with ≥1 asthma hospitalization in the 30 days after the first asthma exacerbation were 2.7% and 4.4% (treated), 3.5% and 8.2% (severe asthma), and 6.0% and 16.8% (SRA) in the US and England, respectively.
Conclusion
This study provides insights into current asthma management practices in the US and England and indicates that some patients with severe disease have an unmet need for effective management.
Real‐world data from the US and England were analyzed to investigate exacerbation frequency and health care resource utilization (HCRU) in pediatric asthma patients. Exacerbation frequency and HCRU tended to increase with asthma severity, indicating that disease burden increases with disease severity. This study provides insights into current asthma management practices in the US and England and indicates that some patients with severe disease have an unmet need for effective management.
e18698
Background: Endometrial cancer (EC) is the most common gynecologic cancer in the US, yet real-world disease burden is poorly understood. To address this, we conducted a retrospective study exploring patient (pt) characteristics, treatment patterns, overall survival (OS), and healthcare resource utilization (HCRU) among elderly US pts with EC. Methods: Using Surveillance, Epidemiology, and End Results (SEER)-Medicare linked data, we identified beneficiaries aged ≥65 y with newly diagnosed EC between Jan 1, 2007–Dec 31, 2013. Pts were followed from the EC diagnosis date to the earliest of death, loss to follow-up, or Dec 31, 2014. Descriptive analyses were conducted for pt characteristics assessed in the 6-mo baseline period and for treatment patterns and HCRU assessed during follow-up. Median OS was estimated from the start of each line of systemic therapy using the Kaplan-Meier method. Lines of therapy started when pts received a new systemic therapy regimen and ended when pts switched to another regimen, after a 90-d treatment gap, or at end of follow-up. For pts with surgery, systemic therapy starting >120 d after surgery or after discontinuation of adjuvant therapy was defined as first-line (1L) therapy. Adjuvant therapy was defined as any systemic therapy starting ≤120 d after surgery for EC and ending after a 90-d treatment gap following the last prescription. Results: There were 12,710 eligible pts with EC during 2007–2013 in the SEER-Medicare database; median age at diagnosis was 73 y. At initial diagnosis, 9395 (73.9%) pts had stage I/II EC, 2042 (16.1%) had stage III, and 1273 (10.0%) had stage IV. 778 pts did not receive surgery/radiation, 1230 pts received surgery/radiation plus adjuvant therapy, 9729 pts received surgery/radiation only, and 973 (7.7%) pts received 1L systemic therapy. Of these 973 pts, 370 (38.0%) received second-line (2L) and 157 (16.1%) received third-line (3L) treatment.
Pts receiving 1L therapy had a mean of 5.6 outpatient physician office visits per month, 22.1% had ≥1 hospitalization, and 38.5% had ≥1 emergency room visit during follow-up. Carboplatin-based regimens were the most frequently used 1L therapies (56.8%), typically combined with paclitaxel (43.5%). Median OS was generally short, particularly for those diagnosed with stage III/IV EC (Table). Conclusions: Medicare beneficiaries receiving systemic chemotherapy for EC generally had high HCRU and poor survival, particularly among pts diagnosed at later stages. This highlights the underlying disease burden and unmet need for more effective treatments in these pts. Table: see text.
e17511
Background: Concurrent chemoradiotherapy (CCRT) is standard treatment for patients with locally advanced cervical cancer (LACC). However, little is known about the real-world treatment patterns and outcomes among LACC patients. This study evaluated patient characteristics and treatment patterns of LACC patients, and real-world outcomes among patients receiving CCRT as the first treatment after diagnosis (CCRT-first), in US academic and community settings. Methods: Data were drawn from the ConcertAI Oncology Dataset, a US-based electronic medical record dataset. We included adult patients diagnosed between 2010-2018 with LACC, defined as stage IB2-IIB with node involvement or stage III-IVA cervical cancer, with squamous cell carcinoma (excl. verrucous), adenocarcinoma (excl. clear cell and endometrioid), or adenosquamous carcinoma. Patients with prior immunotherapy or a second primary cancer (other than in situ or non-melanoma skin cancer) were excluded. Patients were followed from initial diagnosis through end of the second regimen of systemic anti-cancer therapy, end of record, or death, whichever occurred first. Patient characteristics and treatments were reported overall and in CCRT-first patients. Among CCRT-first patients, real-world time on CCRT treatment (rwTOT), recurrence-free survival (rwRFS), and progression-free survival (rwPFS) among patients with persistent disease were estimated using Kaplan-Meier methods. Results: Overall, 300 patients with LACC were included. At LACC diagnosis, median age of patients was 51 years, 53.7% were White, 30.0% were Black, 48.0% were peri/postmenopausal, 89.3% had squamous cell histology, 75.3% had stage III disease, 92.7% had no evidence of performance status impairment, 50.3% were treated in community settings, and 21.7% had only public insurance (11.0% Medicaid, 10.7% Medicare; 56.3% had no documentation of insurance). Distributions were similar among CCRT-first patients.
First treatment after diagnosis included CCRT (N=229), surgery (N=28), systemic therapy (N=11), and radiation therapy alone (N=5). 27 were untreated, and 29 patients received CCRT after another therapy. Of the 229 CCRT-first patients, median (95% CI) rwTOT was 1.6 (1.4-1.7) months; 78.2% received cisplatin within CCRT, and median duration of cisplatin treatment was 35 days; 28.4% received a systemic therapy after CCRT, and 11.8% further initiated a second systemic therapy. 27 patients had recurrent disease after complete response (median rwRFS not reached). 179 patients had persistent disease after CCRT, among whom median (95% CI) rwPFS was 29.7 (16.9-59.3) months from CCRT start. Conclusions: In US clinical practice during 2010-2018, most LACC patients received CCRT as the first treatment after diagnosis. The high proportion of patients who develop persistent disease after CCRT indicates a need for improved first treatment options.