This study explores how socio-demographic and health factors shape the relationship between multimorbidity and one-year acute care service use (i.e., hospitalizations and emergency department visits) in older adults in Ontario, Canada.
We linked multiple cycles (2005-2006, 2007-2008, 2009-2010, 2011-2012) of the Canadian Community Health Survey (CCHS) to health administrative data to create a cohort of adults aged 65 and older. Administrative data were used to estimate one-year service use and to identify 12 chronic conditions used to measure multimorbidity. We examined the relationship between multimorbidity and service use stratified by a range of socio-demographic and health variables available from the CCHS. Logistic and Poisson regressions were used to explore the association between multimorbidity and service use and the role of socio-demographic factors in this relationship.
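As a minimal sketch of the kind of stratified comparison described above, the odds ratio of service use at a given multimorbidity level versus a reference group can be computed from a 2x2 table. All counts and strata below are hypothetical illustrations, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: a/b are events/non-events in the
    exposed group (e.g., 3+ chronic conditions), c/d in the reference
    group (e.g., 0 chronic conditions)."""
    return (a * d) / (b * c)

# Hypothetical counts of one-year hospitalization (yes/no) by
# multimorbidity level -- illustrative numbers only.
or_high_multimorbidity = odds_ratio(300, 700, 100, 900)  # (300*900)/(700*100)

# The same calculation can be repeated within strata (sex, age group,
# income) to inspect whether the association differs across subgroups.
or_men = odds_ratio(180, 320, 60, 440)
or_women = odds_ratio(120, 380, 40, 460)
```

In the actual analysis these associations were estimated with logistic and Poisson regression, which additionally adjust for covariates; the raw 2x2 odds ratio shown here is only the unadjusted building block.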
Of the 28,361 members of the study sample, 60% were between the ages of 65 and 74 years, 57% were female, 72% were non-immigrant, and over 75% lived in an urban area. Emergency department visits and hospitalizations consistently increased with the level of multimorbidity. This study did not find strong evidence of moderator or interaction effects across a range of socio-demographic factors. Stratified analyses revealed further patterns, with many being similar for both services - e.g., the odds ratios were higher at all levels of multimorbidity for men, older age groups, and those with lower household income. Rurality and immigrant status influenced emergency department use (higher in rural residents and non-immigrants) but not hospitalizations. Multimorbidity and the range of socio-demographic variables remained significant predictors of service use in the regressions.
Strong evidence links multimorbidity with increased acute care service use. This study showed that a range of factors did not modify this relationship. Nevertheless, the factors were independently associated with acute care service use, pointing to modifiable risk factors that can be the focus of resource allocation and intervention design to reduce service use in those with multimorbidity. The study's results suggest that optimizing acute care service use in older adults requires attention to both multimorbidity and social determinants, with programs that are multifactorial and integrated across the health and social service sectors.
Abstract
Background
Researchers often use survey data to study the effect of health and social variables on physician use, but how self-reported physician use compares to administrative data, the gold standard, remains unclear, particularly in the context of multimorbidity and functional limitations. We examine whether multimorbidity and functional limitations are related to agreement between self-reported and administrative data for physician use.
Methods
Cross-sectional data from 52,854 Ontario participants of the Canadian Community Health Survey linked to administrative data were used to assess agreement on physician use. The number of general practitioner (GP) and specialist visits in the previous year was assessed using both data sources; multimorbidity and functional limitation were from self-report.
Results
Fewer participants self-reported GP visits (84.8%) compared to administrative data (89.1%), but more self-reported specialist visits (69.2% vs. 64.9%). Sensitivity was higher for GP visits (≥90% at all multimorbidity levels) than for specialist visits (approximately 75% for those with no chronic conditions, rising to 90% for those with 4+ chronic conditions). Specificity started higher for GP than for specialist visits but decreased more swiftly with multimorbidity level; in both cases, specificity fell below 50%. Functional limitations, age, and sex did not alter the patterns of sensitivity and specificity seen across levels of multimorbidity.
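A minimal sketch of how sensitivity and specificity of self-report are computed against administrative records treated as the gold standard. The paired indicators below are toy data, not the study's:

```python
def sens_spec(self_report, admin):
    """Sensitivity and specificity of self-reported physician use,
    with administrative records as the reference standard.
    Inputs are paired per-person booleans (True = any visit)."""
    tp = sum(s and a for s, a in zip(self_report, admin))          # both say visit
    fn = sum((not s) and a for s, a in zip(self_report, admin))    # missed by self-report
    tn = sum((not s) and (not a) for s, a in zip(self_report, admin))
    fp = sum(s and (not a) for s, a in zip(self_report, admin))    # over-reported
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 10 people; administrative data show 8 had a GP visit.
sr = [bool(x) for x in [1, 1, 1, 1, 1, 1, 1, 0, 1, 0]]
ad = [bool(x) for x in [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]]
sens, spec = sens_spec(sr, ad)  # sens = 7/8, spec = 1/2
```

In the study these quantities were computed within strata of multimorbidity level, which is just this calculation repeated on each subgroup.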
Conclusions
Countries around the world collect health surveys to inform health policy and planning, but the extent to which these are linked with administrative, or similar, data is limited. Our study illustrates the potential for misclassification of physician use in self-report data and the need for sensitivity analyses or other corrections.
Background Preliminary evidence suggests that providing longer duration prescriptions at discharge may improve long-term adherence to secondary preventative cardiac medications among post-myocardial infarction (MI) patients. We implemented and assessed the effects of two hospital-based interventions--(1) standardized prolonged discharge prescription forms (90-day supply with 3 repeats for recommended cardiac medications) plus education and (2) education only--on long-term cardiac medication adherence among elderly patients post-MI. Methods We conducted an interrupted time series study of all post-MI patients aged 65-104 years in Ontario, Canada, discharged from hospital between September 2015 and August 2018 with ≥1 dispensation for a statin, beta blocker, angiotensin system inhibitor, and/or secondary antiplatelet within 7 days post-discharge. The standardized prolonged discharge prescription forms plus education and education-only interventions were implemented at 2 (1,414 patients) and 4 (926 patients) non-randomly selected hospitals, respectively, in September 2017 for 12 months, with all other Ontario hospitals (n = 143; 18,556 patients) comprising an external control group. The primary outcome, long-term cardiac medication adherence, was defined at the patient level as an average proportion of days covered (over 1 year post-discharge) of ≥80% across the cardiac medication classes dispensed at the index fill. Primary outcome data were aggregated within hospital groups (intervention 1, intervention 2, or control) to monthly proportions and independently analyzed using segmented regression to evaluate intervention effects. A process evaluation was conducted to assess intervention fidelity.
Results At 12 months post-implementation, there was no statistically significant effect on long-term cardiac medication adherence for either intervention--standardized prolonged discharge prescription forms plus education (5.4%; 95% CI -6.4% to 17.2%) or education only (1.0%; 95% CI -28.6% to 30.6%)--over and above the counterfactual trend; similarly, no change was observed in the control group (-0.3%; 95% CI -3.6% to 3.1%). During the intervention period, only 10.8% of patients in the intervention groups received ≥90 days' supply, on average, of cardiac medications at their index fill. Conclusions Given that intervention fidelity was low at the pharmacy level and no statistically significant post-implementation differences in adherence were found, the trends in this study--coupled with other published retrospective analyses of administrative data--support further evaluation of this simple intervention to improve long-term adherence to cardiac medications. Trial registration ClinicalTrials.gov: NCT03257579, registered June 16, 2017. Protocol available at: Keywords: Post-myocardial infarction, Adherence, Standardized discharge prescription form, Secondary prevention, Policy change
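A minimal sketch of the proportion-of-days-covered (PDC) adherence measure defined in the Methods: the fraction of days in the year after discharge on which the patient had medication on hand, with adherence defined as PDC ≥80%. The fill dates and supplies below are hypothetical:

```python
from datetime import date, timedelta

def pdc(fills, start, window=365):
    """Proportion of days covered: fraction of the `window` days from
    `start` (exclusive of the window end) on which at least one
    dispensation's supply applies. `fills` is a list of
    (fill_date, days_supplied) tuples; overlapping fills are not
    double-counted because covered days are collected in a set."""
    covered = set()
    end = start + timedelta(days=window)
    for fill_date, days in fills:
        for k in range(days):
            d = fill_date + timedelta(days=k)
            if start <= d < end:
                covered.add(d)
    return len(covered) / window

# Hypothetical statin history: a 90-day index fill plus three 90-day
# refills, each picked up a few days late (small coverage gaps).
start = date(2017, 9, 1)
fills = [(date(2017, 9, 1), 90), (date(2017, 12, 5), 90),
         (date(2018, 3, 10), 90), (date(2018, 6, 15), 90)]
adherent = pdc(fills, start) >= 0.80  # True for this history
```

In the study this measure was averaged across each patient's cardiac medication classes before applying the 80% threshold; the sketch shows a single class for clarity.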
Biomarkers lie at the heart of precision medicine. Surprisingly, while rapid genomic profiling is becoming ubiquitous, the development of biomarkers usually involves the application of bespoke techniques that cannot be directly applied to other datasets. There is an urgent need for a systematic methodology to create biologically-interpretable molecular models that robustly predict key phenotypes. Here we present SIMMS (Subnetwork Integration for Multi-Modal Signatures): an algorithm that fragments pathways into functional modules and uses these to predict phenotypes. We apply SIMMS to multiple data types across five diseases, and in each it reproducibly identifies known and novel subtypes, and makes superior predictions to the best bespoke approaches. To demonstrate its ability on a new dataset, we profile 33 genes/nodes of the PI3K pathway in 1734 FFPE breast tumors and create a four-subnetwork prediction model. This model out-performs a clinically-validated molecular test in an independent cohort of 1742 patients. SIMMS is generic and enables systematic data integration for robust biomarker discovery.
Knowledge of HIV drug resistance informs the choice of regimens and ensures that the most efficacious options are selected. In January 2014, a policy change to routine resistance testing was implemented in Ontario, Canada. The objective of this study was to investigate the impact of this policy change on people with HIV in Ontario since January 2014.
We used data on people with HIV living in Ontario from administrative databases of the Institute for Clinical Evaluative Sciences (ICES) and Public Health Ontario (PHO), and ran ordinary least squares (OLS) models of interrupted time series to measure the levels and trends of 2-year mortality, 2-year hospitalizations and 2-year emergency department visits before (2005-2013) and after the policy change (2014-2017). Outcomes were collected in biannual periods, generating 18 periods before the intervention and 8 periods after. We included a control series of people who did not receive a resistance test within 3 months of HIV diagnosis.
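A minimal sketch of the segmented (interrupted time series) regression described above: an OLS model with an intercept, a pre-intervention trend, and post-intervention level- and trend-change terms. The series below is synthetic, with an illustrative built-in trend change, not the study's data:

```python
def fit_ols(X, y):
    """Solve the normal equations X'X b = X'y by Gauss-Jordan
    elimination with partial pivoting (pure-Python OLS)."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)]
           for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    M = [XtX[a][:] + [Xty[a]] for a in range(k)]          # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    return [M[a][k] / M[a][a] for a in range(k)]

def its_design(n_pre, n_post):
    """Design matrix: intercept, time, post-intervention indicator
    (level change), and time-since-intervention (trend change)."""
    return [[1.0, float(t), float(t >= n_pre),
             float(max(0, t - n_pre + 1))] for t in range(n_pre + n_post)]

# 18 biannual pre-policy periods and 8 post-policy periods, as in the
# study's layout; outcome values are synthetic: baseline 10, pre-trend
# -0.2 per period, plus an extra -0.8 per-period trend change after
# the policy and no immediate level shift.
X = its_design(18, 8)
y = [10 - 0.2 * t - (0.8 * (t - 17) if t >= 18 else 0) for t in range(26)]
b0, pre_trend, level_change, trend_change = fit_ols(X, y)
```

The fitted `trend_change` coefficient is the per-period change in slope attributable to the intervention; in the study, the analogous coefficient for 2-year mortality was a 0.8% decrease per six-month period relative to the control series.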
Data included 12,996 people with HIV, of whom 8881 (68.3%) were diagnosed between 2005 and 2013, and 4115 (31.7%) were diagnosed between 2014 and 2017. The policy change to routine resistance testing within 3 months of HIV diagnosis led to a decreasing trend in 2-year mortality of 0.8% every six months compared to the control group. No significant differences in hospitalizations or emergency department visits were noted.
The policy of routine resistance testing within three months of diagnosis is beneficial at the population level.
Abstract Background and purpose Recent data suggest that in vitro and in vivo derived hypoxia gene-expression signatures have prognostic power in breast and possibly other cancers. However, both tumour hypoxia and the biological adaptation to this stress are highly dynamic. Assessment of time-dependent gene-expression changes in response to hypoxia may thus provide additional biological insights and assist in predicting the impact of hypoxia on patient prognosis. Materials and methods Transcriptome profiling was performed for three cell lines derived from diverse tumour-types after hypoxic exposure at eight time-points, which include a normoxic time-point. Time-dependent sets of co-regulated genes were identified from these data. Subsequently, gene ontology (GO) and pathway analyses were performed. The prognostic power of these novel signatures was assessed in parallel with previous in vitro and in vivo derived hypoxia signatures in a large breast cancer microarray meta-dataset (n = 2312). Results We identified seven recurrent temporal and two general hypoxia signatures. GO and pathway analyses revealed regulation of both common and unique underlying biological processes within these signatures. None of the new or previously published in vitro signatures consisting of hypoxia-induced genes were prognostic in the large breast cancer dataset. In contrast, signatures of repressed genes, as well as the in vivo derived signatures of hypoxia-induced genes, showed clear prognostic power. Conclusions Only a subset of hypoxia-induced genes in vitro demonstrates prognostic value when evaluated in a large clinical dataset. Despite clear evidence of temporal patterns of gene-expression in vitro, the subset of prognostic hypoxia-regulated genes cannot be identified based on temporal pattern alone. In vivo derived signatures appear to identify the prognostic hypoxia-induced genes.
The prognostic value of hypoxia-repressed genes is likely a surrogate for the known importance of proliferation in breast cancer outcome.
• New biomedical technologies generate measurements at scale and in multiple dimensions.
• Large and diverse biomedical data present fundamentally new challenges for machine learning.
• Integrative approaches combine different types of data to provide a comprehensive systems view.
• Data integration creates a holistic picture of the cell, human body, and disease.
• Advances in machine learning bring an exciting future for biomedical data integration.
New technologies have enabled the investigation of biology and human health at an unprecedented scale and in multiple dimensions. These dimensions include a myriad of properties describing genome, epigenome, transcriptome, microbiome, phenotype, and lifestyle. No single data type, however, can capture the complexity of all the factors relevant to understanding a phenomenon such as a disease. Integrative methods that combine data from multiple technologies have thus emerged as critical statistical and computational approaches. The key challenge in developing such approaches is the identification of effective models to provide a comprehensive and relevant systems view. An ideal method can answer a biological or medical question, identifying important features and predicting outcomes, by harnessing heterogeneous data across several dimensions of biological variation. In this Review, we describe the principles of data integration and discuss current methods and available implementations. We provide examples of successful data integration in biology and medicine. Finally, we discuss current challenges in biomedical integrative methods and our perspective on the future development of the field.
Animated representations of outcomes drawn from distributions (hypothetical outcome plots, or HOPs) are used in the media and other public venues to communicate uncertainty. HOPs greatly improve multivariate probability estimation over conventional static uncertainty visualizations and leverage the ability of the visual system to quickly, accurately, and automatically process the summary statistical properties of ensembles. However, it is unclear how well HOPs support applied tasks resembling real world judgments posed in uncertainty communication. We identify and motivate an appropriate task to investigate realistic judgments of uncertainty in the public domain through a qualitative analysis of uncertainty visualizations in the news. We contribute two crowdsourced experiments comparing the effectiveness of HOPs, error bars, and line ensembles for supporting perceptual decision-making from visualized uncertainty. Participants infer which of two possible underlying trends is more likely to have produced a sample of time series data by referencing uncertainty visualizations which depict the two trends with variability due to sampling error. By modeling each participant's accuracy as a function of the level of evidence presented over many repeated judgments, we find that observers are able to correctly infer the underlying trend in samples conveying a lower level of evidence when using HOPs rather than static aggregate uncertainty visualizations as a decision aid. Modeling approaches like ours contribute theoretically grounded and richly descriptive accounts of user perceptions to visualization evaluation.