Raman optical spectroscopy promises label-free bacterial detection, identification, and antibiotic susceptibility testing in a single step. However, achieving clinically relevant speeds and accuracies remains challenging owing to the weak Raman signal from bacterial cells and the large number of bacterial species and phenotypes. Here we generate an extensive dataset of bacterial Raman spectra and apply deep learning approaches to accurately identify 30 common bacterial pathogens. Even on low signal-to-noise spectra, we achieve average isolate-level accuracies exceeding 82% and antibiotic treatment identification accuracies of 97.0±0.3%. We also show that this approach distinguishes between methicillin-resistant and -susceptible isolates of Staphylococcus aureus (MRSA and MSSA) with 89±0.1% accuracy. We validate our results on clinical isolates from 50 patients. Using just 10 bacterial spectra from each patient isolate, we achieve treatment identification accuracies of 99.7%. Our approach has potential for culture-free pathogen identification and antibiotic susceptibility testing, and could be readily extended for diagnostics on blood, urine, and sputum.
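To illustrate the shape of the classification task in this abstract (many noisy 1D spectra, one class label each), here is a minimal sketch. Everything in it is synthetic and illustrative, and the classifier is a simple nearest-centroid baseline, not the deep learning architecture the authors use:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 "species", each with a characteristic spectrum over
# 1000 wavenumber bins; measured spectra are the class template plus heavy
# noise, mimicking the low signal-to-noise regime the abstract describes.
n_bins, n_train, n_test = 1000, 200, 50
templates = rng.random((3, n_bins))

def simulate(label, n):
    """Draw n noisy spectra for one class (noise level is illustrative)."""
    return templates[label] + rng.normal(0.0, 0.5, size=(n, n_bins))

# Nearest-centroid baseline: average the training spectra per class, then
# assign each test spectrum to the closest class mean.
centroids = np.stack([simulate(k, n_train).mean(axis=0) for k in range(3)])

correct = 0
for k in range(3):
    test = simulate(k, n_test)
    dists = ((test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    correct += int((dists.argmin(axis=1) == k).sum())

accuracy = correct / (3 * n_test)
print(f"accuracy: {accuracy:.2f}")
```

Even this trivial baseline separates well-spaced synthetic templates; the hard part the paper addresses is that real bacterial spectra of different species are highly similar, which is where the deep network earns its keep.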
There is growing concern that racial and ethnic minority communities around the world are experiencing a disproportionate burden of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and coronavirus disease 2019 (COVID-19). We investigated racial and ethnic disparities in patterns of COVID-19 testing (i.e., who received testing and who tested positive) and subsequent mortality in the largest integrated healthcare system in the United States.
This retrospective cohort study included 5,834,543 individuals receiving care in the US Department of Veterans Affairs; most (91%) were men, 74% were non-Hispanic White (White), 19% were non-Hispanic Black (Black), and 7% were Hispanic. We evaluated associations between race/ethnicity and receipt of COVID-19 testing, a positive test result, and 30-day mortality, with multivariable adjustment for a wide range of demographic and clinical characteristics including comorbid conditions, health behaviors, medication history, site of care, and urban versus rural residence. Between February 8 and July 22, 2020, 254,595 individuals were tested for COVID-19, of whom 16,317 tested positive and 1,057 died. Black individuals were more likely to be tested (rate per 1,000 individuals: 60.0, 95% CI 59.6-60.5) than Hispanic (52.7, 95% CI 52.1-53.4) and White individuals (38.6, 95% CI 38.4-38.7). While individuals from minority backgrounds were more likely to test positive (Black versus White: odds ratio [OR] 1.93, 95% CI 1.85-2.01, p < 0.001; Hispanic versus White: OR 1.84, 95% CI 1.74-1.94, p < 0.001), 30-day mortality did not differ by race/ethnicity (Black versus White: OR 0.97, 95% CI 0.80-1.17, p = 0.74; Hispanic versus White: OR 0.99, 95% CI 0.73-1.34, p = 0.94). The disparity between Black and White individuals in testing positive for COVID-19 was stronger in the Midwest (OR 2.66, 95% CI 2.41-2.95, p < 0.001) than the West (OR 1.24, 95% CI 1.11-1.39, p < 0.001). The disparity in testing positive for COVID-19 between Hispanic and White individuals was consistent across region, calendar time, and outbreak pattern. Study limitations include underrepresentation of women and a lack of detailed information on social determinants of health.
In this nationwide study, we found that Black and Hispanic individuals are experiencing an excess burden of SARS-CoV-2 infection not entirely explained by underlying medical conditions or where they live or receive care. There is an urgent need to proactively tailor strategies to contain and prevent further outbreaks in racial and ethnic minority communities.
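For readers unfamiliar with the statistic, the odds ratios reported above compare the odds of testing positive between groups. A minimal sketch of a crude (unadjusted) OR with a Wald 95% confidence interval follows; the counts are made up, and the study's estimates are multivariable-adjusted, so this does not reproduce them:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% CI.

    a: exposed & event      b: exposed & no event
    c: unexposed & event    d: unexposed & no event
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for the Wald interval.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative (made-up) counts: positive/negative tests in two groups.
or_, lo, hi = odds_ratio_ci(a=500, b=4500, c=600, d=10400)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Adjusted ORs like the study's come from multivariable logistic regression rather than a raw 2×2 table, but the interpretation of the ratio and interval is the same.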
Warfarin anticoagulation reduces thromboembolic complications in patients with atrial fibrillation or mechanical heart valves, but effective management is complex, and the international normalized ratio (INR) is often outside the target range. As compared with venous plasma testing, point-of-care INR measuring devices allow greater testing frequency and patient involvement and may improve clinical outcomes.
We randomly assigned 2922 patients who were taking warfarin because of mechanical heart valves or atrial fibrillation and who were competent in the use of point-of-care INR devices to either weekly self-testing at home or monthly high-quality testing in a clinic. The primary end point was the time to a first major event (stroke, major bleeding episode, or death).
The patients were followed for 2.0 to 4.75 years, for a total of 8730 patient-years of follow-up. The time to the first primary event was not significantly longer in the self-testing group than in the clinic-testing group (hazard ratio, 0.88; 95% confidence interval, 0.75 to 1.04; P=0.14). The two groups had similar rates of clinical outcomes except that the self-testing group reported more minor bleeding episodes. Over the entire follow-up period, the self-testing group had a small but significant improvement in the percentage of time during which the INR was within the target range (absolute difference between groups, 3.8 percentage points; P<0.001). At 2 years of follow-up, the self-testing group also had a small but significant improvement in patient satisfaction with anticoagulation therapy (P=0.002) and quality of life (P<0.001).
As compared with monthly high-quality clinic testing, weekly self-testing did not delay the time to a first stroke, major bleeding episode, or death to the extent suggested by prior studies. These results do not support the superiority of self-testing over clinic testing in reducing the risk of stroke, major bleeding episode, and death among patients taking warfarin therapy. (Funded by the Department of Veterans Affairs Cooperative Studies Program; ClinicalTrials.gov number, NCT00032591.)
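The "percentage of time during which the INR was within the target range" is conventionally computed by Rosendaal linear interpolation between consecutive INR measurements. The abstract does not state the trial's exact method, so the sketch below is a generic illustration with hypothetical measurement values:

```python
def time_in_range(days, inrs, low=2.0, high=3.0):
    """Percent time in therapeutic range by Rosendaal linear interpolation.

    Assumes the INR changes linearly between consecutive measurements.
    days: measurement days in ascending order; inrs: INR values at those days.
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total_days += span
        if i0 == i1:  # flat segment: in range entirely or not at all
            in_range_days += span if low <= i0 <= high else 0.0
            continue
        # Parameterize the segment on t in [0, 1] and find where the
        # interpolated INR crosses the low and high thresholds.
        lo_t = (low - i0) / (i1 - i0)
        hi_t = (high - i0) / (i1 - i0)
        t0, t1 = sorted((lo_t, hi_t))
        frac = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += frac * span
    return 100.0 * in_range_days / total_days

# Example: weekly INR checks over three weeks, drifting through the range.
ttr = time_in_range([0, 7, 14, 21], [1.5, 2.5, 3.5, 2.5])
print(f"TTR: {ttr:.1f}%")
```

A 3.8-percentage-point between-group difference in this quantity, as reported above, is small per patient but detectable across thousands of patient-years.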
Early threat detection and situational awareness are vital to achieving a comprehensive and accurate view of health-related events for federal, state, and local health agencies. Key to this are public health and syndromic surveillance systems that can analyze large data sets to discover patterns, trends, and correlations of public health significance. In 2020, the Department of Veterans Affairs (VA) evaluated its public health surveillance system and identified areas for improvement.
Using the Centers for Disease Control and Prevention (CDC) Guidelines for Evaluating Public Health Surveillance Systems, we assessed the ability of the Praedico Surveillance System to perform public health surveillance for a variety of health issues and evaluated its performance compared to an enterprise data solution (VA Corporate Data Warehouse), a legacy surveillance system (VA ESSENCE), and a national collaborative syndromic surveillance platform (CDC NSSP BioSense).
Review of system attributes found that the system was simple, flexible, and stable. Representativeness, timeliness, sensitivity, and predictive value positive were acceptable but could be further improved. Data quality and acceptability issues present challenges that potentially affect the overall usefulness of the system.
Praedico is a customizable surveillance and data analytics platform built on big data technologies. Functionality is straightforward, with rapid query generation and runtimes. Data can be graphed, mapped, analyzed, and shared with key decision makers and stakeholders. Evaluation findings suggest that future development and system enhancements should focus on addressing Praedico data quality issues and improving user acceptability. Because Praedico is designed to handle big data queries and work with data from a variety of sources, it could be enlisted as a tool for interdepartmental and interagency collaboration and public health data sharing. We suggest that future system evaluations include measurements of value and effectiveness along with additional organizations and functional assessments.
HIV-1 protease (PR), reverse transcriptase (RT), and integrase (IN) variability presents a challenge to laboratories performing genotypic resistance testing. This challenge will grow with increased sequencing of samples enriched for proviral DNA such as dried blood spots and increased use of next-generation sequencing (NGS) to detect low-abundance HIV-1 variants. We analyzed PR and RT sequences from >100,000 individuals and IN sequences from >10,000 individuals to characterize variation at each amino acid position, identify mutations indicating APOBEC-mediated G-to-A editing, and identify mutations resulting from selective drug pressure. Forty-seven percent of PR, 37% of RT, and 34% of IN positions had one or more amino acid variants with a prevalence of ≥1%. Seventy percent of PR, 60% of RT, and 60% of IN positions had one or more variants with a prevalence of ≥0.1%. Overall, 201 PR, 636 RT, and 346 IN variants had a prevalence of ≥0.1%. The median intersubtype prevalence ratios were 2.9-, 2.1-, and 1.9-fold for these PR, RT, and IN variants, respectively. Only 5.0% of PR, 3.7% of RT, and 2.0% of IN variants had a median intersubtype prevalence ratio of ≥10-fold. Variants at lower prevalences were more likely to differ biochemically and to be part of an electrophoretic mixture compared to high-prevalence variants. There were 209 mutations indicative of APOBEC-mediated G-to-A editing and 326 nonpolymorphic treatment-selected mutations. Identification of viruses with a high number of APOBEC-associated mutations will facilitate the quality control of dried blood spot sequencing. Identifying sequences with a high proportion of rare mutations will facilitate the quality control of NGS.
Most antiretroviral drugs target three HIV-1 proteins: PR, RT, and IN. These proteins are highly variable: many different amino acids can be present at the same position in viruses from different individuals. Some of the amino acid variants cause drug resistance and occur mainly in individuals receiving antiretroviral drugs. Some variants result from a human cellular defense mechanism called APOBEC-mediated hypermutation. Many variants result from naturally occurring mutation. Some variants may represent technical artifacts. We studied PR and RT sequences from >100,000 individuals and IN sequences from >10,000 individuals to quantify variation at each amino acid position in these three HIV-1 proteins. We performed analyses to determine which amino acid variants resulted from antiretroviral drug selection pressure, APOBEC-mediated editing, and naturally occurring variation. Our results provide information essential to clinical, research, and public health laboratories performing genotypic resistance testing by sequencing HIV-1 PR, RT, and IN.
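One plausible reading of the "intersubtype prevalence ratio" used above is the ratio of a variant's highest to lowest per-subtype prevalence; the paper's exact definition may differ (e.g., it may restrict the comparison to well-sampled subtypes). A minimal sketch with made-up counts:

```python
def prevalence_ratio(counts):
    """Ratio of the highest to the lowest per-subtype prevalence of a variant.

    counts: {subtype: (n_with_variant, n_sequenced)}. Subtypes with zero
    observations of the variant would need special handling (ratio undefined);
    this sketch assumes every subtype has at least one occurrence.
    """
    prevalences = [v / n for v, n in counts.values() if n > 0]
    return max(prevalences) / min(prevalences)

# Illustrative (made-up) counts for one hypothetical RT amino acid variant.
ratio = prevalence_ratio({
    "B": (120, 60000),    # 0.2% in subtype B
    "C": (90, 15000),     # 0.6% in subtype C
    "01_AE": (30, 10000), # 0.3% in CRF01_AE
})
print(f"intersubtype prevalence ratio: {ratio:.1f}-fold")
```

A median ratio near 2- to 3-fold, as the paper reports, means most variants occur at broadly similar frequencies across subtypes, while the small ≥10-fold minority is strongly subtype-associated.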
HIV infection in the elderly. Nguyen, Nancy; Holodniy, Mark. Clinical Interventions in Aging, 01/2008, Volume 3, Issue 3.
In the US, an estimated 1 million people are infected with HIV, although one-third of this population is unaware of the diagnosis. While HIV infection is commonly thought to affect younger adults, there is an increasing number of patients over 50 years of age living with the condition. UNAIDS and WHO estimate that of the 40 million people living with HIV/AIDS in the world, approximately 2.8 million are 50 years and older. With the introduction of highly active antiretroviral therapy (HAART) in the mid-1990s, survival following HIV diagnosis has risen dramatically, and HIV infection has evolved from an acute disease process to a condition managed as a chronic illness. As treated HIV-infected patients live longer and the number of new HIV diagnoses in older patients rises, clinicians need to be aware of these trends and become familiar with the management of HIV infection in the older patient. This article is intended for the general clinician, including geriatricians; it reviews epidemiologic data and HIV treatment and discusses medical management issues affecting the older HIV-infected patient.
We applied lymphogranuloma venereum (LGV) clinical case criteria to a cohort of 1381 Veterans positive for HIV and Chlamydia trachomatis (CT) from 2016 to 2023 and analyzed variables to ascertain risk factors for LGV and factors associated with the use of standard treatment regimens. In total, 284/1381 (20.6%) met the criteria for LGV. A total of 179/284 (63%) were probable cases, and 105/284 (37%) were possible cases (those meeting clinical criteria but with concurrent sexually transmitted infections (STI) associated with LGV-like symptoms). None had confirmatory CT L1–L3 testing. A total of 230 LGV cases (81%) presented with proctitis, 71 (25%) with ulcers, and 57 (20.1%) with lymphadenopathy. In total, 66 (23.2%) patients had >1 symptom of LGV. A total of 43 (15%) LGV cases were hospitalized. Primary risk factors for LGV were male birth sex (p = 0.004), men who have sex with men (p < 0.001), and the presence of STIs other than gonorrhea or syphilis (p = 0.011). In total, 124/284 (43.7%) LGV cases received standard recommended treatment regimens. Probable cases were more likely to receive standard treatment than possible cases (p = 0.003). We report that 20.6% of CT cases met clinical criteria for LGV among HIV-infected Veterans and that less than half of cases received recommended treatment regimens, indicating that LGV is likely underestimated and inadequately treated in this US population.
Cardiac injury is a known potential complication of influenza infection. Because U.S. veterans cared for at the U.S. Department of Veterans Affairs are older and have more cardiovascular disease (CVD) risk factors than the general U.S. population, veterans are at risk for cardiac complications of influenza infection. We investigated the characteristics of cardiac injury biomarkers and associated cardiac events among veterans who received cardiac biomarker testing ≤30 days after laboratory-confirmed influenza virus infection.
Laboratory-confirmed influenza cases among veterans cared for at U.S. Department of Veterans Affairs' facilities during October 2010-December 2012 were identified using electronic medical records (EMRs). Influenza confirmation was based on respiratory specimen viral culture or antigen or nucleic acid detection. Acute cardiac injury (ACI) was defined as a cardiac biomarker (troponin I or creatine kinase isoenzyme MB) elevated above the 99th percentile upper reference limit ≤30 days after influenza specimen collection. EMRs were reviewed for demographics, CVD history and risk factors, and ACI-associated cardiac events.
Among 38,197 patients with influenza testing results, 4,469 (12%) had a positive result; 600 of those patients had cardiac biomarker testing performed ≤30 days after influenza testing, and 143 (24%) had one or more elevated cardiac biomarkers. Among these 143, median age was 73 years (range 44-98 years), and 98 (69%) were non-Hispanic white. All patients had one or more CVD risk factors, and 98 (69%) had a history of CVD. Eighty-six percent of ACI-associated events occurred within 3 days of the influenza specimen collection date. Seventy patients (49%) had documented or probable acute myocardial infarction, 8 (6%) acute congestive heart failure, 6 (4%) myocarditis, and 4 (3%) atrial fibrillation. Eleven (8%) had non-cardiac explanations for elevated cardiac biomarkers, and 44 (31%) had no documented explanation. Sixty-eight (48%) patients had received influenza vaccination during the related influenza season.
Among veterans with laboratory-confirmed influenza infection and cardiac biomarker testing ≤30 days after influenza testing, approximately 25% had evidence of ACI, the majority within 3 days. Approximately half were myocardial infarctions. Our findings emphasize the importance of considering ACI associated with influenza infection among patients at high risk, including this older population with prevalent CVD risk factors.
With next-generation DNA sequencing technologies, one can interrogate a specific genomic region of interest at very high depth of coverage and identify less prevalent, rare mutations in heterogeneous clinical samples. However, mutation detection levels are limited by the error rate of the sequencing technology as well as by the availability of variant-calling algorithms with high statistical power and low false positive rates. We demonstrate that we can robustly detect mutations at 0.1% fractional representation, that is, accurate detection of one mutant per 1000 wild-type alleles. To achieve this sensitive level of mutation detection, we integrate a high-accuracy indexing strategy and reference replication for estimating sequencing error variance. We employ a statistical model to estimate the error rate at each position of the reference and to quantify the fraction of variant bases in the sample. Validated on a known 0.1% admixture of two synthetic DNA samples, our method is highly specific (99%) and sensitive (100%). As a clinical application of this method, we analyzed nine clinical samples of H1N1 influenza A and detected an oseltamivir (antiviral therapy) resistance mutation in the H1N1 neuraminidase gene at a sample fraction of 0.18%.
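The position-wise error model described above can be caricatured with an exact binomial test: call a variant only when the non-reference read count is improbable under the sequencing error rate alone. This is a simplified stand-in for the paper's method (the authors additionally estimate position-specific error rates and their variance from replicate sequencing of a reference); all numbers are illustrative:

```python
from math import comb

def binom_pvalue(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), via the complement of the lower tail.

    Accurate enough for thresholding; very small p-values bottom out at
    float precision (~1e-16) and may round to 0.
    """
    lower = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))
    return 1.0 - lower

def is_variant(depth, alt_count, error_rate, alpha=1e-6):
    """Call a position variant when the observed non-reference read count is
    unlikely to arise from sequencing error alone."""
    return binom_pvalue(alt_count, depth, error_rate) < alpha

# At 50,000x depth with a 0.02% per-base error rate (~10 error reads
# expected), 50 non-reference reads (a 0.1% variant) is called,
# while 13 non-reference reads is consistent with error.
print(is_variant(depth=50_000, alt_count=50, error_rate=0.0002))
print(is_variant(depth=50_000, alt_count=13, error_rate=0.0002))
```

The example makes the core constraint in the abstract concrete: detecting a 0.1% variant is only possible when the per-position error rate (plus its variance, which this sketch ignores) sits well below 0.1%.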