To develop a new method to quantify visually-enhanced vestibulo-ocular reflex (VVOR) gain in patients with vestibular function loss that is mathematically suitable given the nature of the test, and to determine the reliability of the method by comparing results with those of the gold standard, the video head impulse test (vHIT).
We developed a new method for VVOR gain quantification and conducted a cross-sectional study in patients diagnosed with vestibular function loss and in controls, with all participants undergoing both a VVOR test and a vHIT. We measured VVOR gain with three different methods: area under the curve (AUC), slope regression, and a Fourier method (VVOR-AUC, VVOR-slope, and VVOR-Fourier, respectively), and compared these gain values with vHIT gain calculated using the AUC method.
Overall, 111 participants were included: 29 healthy subjects and 82 patients with vestibular function loss. Intraclass correlation coefficients (ICC(1,1)) between gain from the gold standard and each of the VVOR gain methods were 0.68 (CI: 0.61-0.75) for VVOR-AUC, 0.66 (CI: 0.58-0.73) for VVOR-slope, and 0.71 (CI: 0.64-0.77) for VVOR-Fourier. No interference was found between the VVOR gain calculation methods and the potentially influential variables considered (p ≥ 0.98).
The new method for quantifying VVOR gain showed good concordance with the vHIT method.
Level of Evidence: 2 (individual cross-sectional studies with consistently applied reference standard and blinding; Diagnosis). Laryngoscope, 133:3554-3563, 2023.
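The three VVOR gain methods named in the abstract (AUC, slope regression, Fourier) can be sketched for a sinusoidal head-rotation trace as follows. This is a generic illustration of each idea, not the authors' implementation; the function and variable names are hypothetical, and the eye trace is assumed to be pre-processed (de-saccaded and sign-inverted so that perfect compensation gives eye_vel equal to head_vel).

```python
import numpy as np

def vvor_gains(t, head_vel, eye_vel):
    """Three sketch estimates of VVOR gain from head and eye
    velocity traces (deg/s) sampled at times t (s)."""
    def auc(y):
        # Trapezoidal area under the rectified velocity trace.
        y = np.abs(y)
        return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

    # AUC method: ratio of areas under the absolute velocity curves.
    gain_auc = auc(eye_vel) / auc(head_vel)

    # Slope method: intercept-free least-squares slope of eye
    # velocity regressed on head velocity.
    gain_slope = float(np.dot(head_vel, eye_vel) / np.dot(head_vel, head_vel))

    # Fourier method: amplitude ratio at the dominant oscillation
    # frequency of the sinusoidal head movement.
    H, E = np.fft.rfft(head_vel), np.fft.rfft(eye_vel)
    k = int(np.argmax(np.abs(H[1:]))) + 1   # skip the DC bin
    gain_fourier = float(np.abs(E[k]) / np.abs(H[k]))

    return gain_auc, gain_slope, gain_fourier
```

On a clean proportional trace all three estimates agree; they diverge in how they weight noise, saccade residue, and asymmetry, which is what the reliability comparison in the study is about.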
Protecting riparian vegetation around streams is vital to reducing the detrimental effects of environmental change on freshwater ecosystems and to maintaining aquatic biodiversity. Thus, identifying ecological thresholds is useful for defining regulatory limits and for guiding the management of riparian zones towards the conservation of freshwater biota.
Using nationwide data on fish and invertebrates occurring in small Brazilian streams, we estimated thresholds of native vegetation loss in which there are abrupt changes in the occurrence and abundance of freshwater bioindicators and tested whether there are congruent responses among different biomes, biological groups and riparian buffer sizes.
Mean thresholds of native vegetation cover loss varied widely among biomes, buffer sizes and biological groups: ranging from 0.5% to 77.4% for fish, from 2.9% to 37.0% for aquatic invertebrates and from 3.8% to 43.2% for a subset of aquatic invertebrates. Confidence intervals for thresholds were wide, but the minimum values of these intervals were lower for the smaller riparian buffers (50 and 100 m) than larger ones (200 and 500 m), indicating that land use should be kept away from the streams. Also, thresholds occurred at a lower percentage of riparian vegetation loss in the smaller buffers, and were critically lower for invertebrates: reducing only 6.5% of native vegetation cover within a 50‐m riparian buffer is enough to cross thresholds for invertebrates.
Synthesis and applications. The high variability in biodiversity responses to loss of native riparian vegetation suggests caution in the use of a single riparian width for conservation actions or policy definitions nationwide. The most sensitive bioindicators can be used as early warning signals of abrupt changes in freshwater biodiversity. In practice, maintaining riparian reserves at least 50 m wide on each side of streams would be more effective in protecting freshwater biodiversity in Brazil. However, incentives and conservation strategies that protect even wider riparian reserves (~100 m) and take the regional context into consideration will promote a greater benefit. This information should be used to set conservation goals and to create complementary mechanisms and policies to protect wider riparian reserves than those currently required by federal law.
Background
Cardiovascular disease (CVD) disproportionately affects Black adults in the United States. This is increasingly acknowledged to be due to inequitable distribution of health-promoting resources. One potential contributor is inequity in educational opportunities, although it is unclear which aspects of education are most salient. School racial segregation may affect cardiovascular health by increasing stress, constraining socioeconomic opportunities, and altering health behaviors. We investigated the association between school segregation and Black adults' CVD risk.
Methods and findings
We leveraged a natural experiment created by quasi-random (i.e., arbitrary) timing of local court decisions since 1991 that released school districts from court-ordered desegregation. We used the Panel Study of Income Dynamics (PSID) (1991 to 2017), linked with district-level school segregation measures and desegregation court order status. The sample included 1,053 Black participants who ever resided in school districts that were under a court desegregation order in 1991. The exposure was mean school segregation during observed schooling years. Outcomes included several adult CVD risk factors and outcomes. We fitted standard ordinary least squares (OLS) multivariable linear regression models, then conducted instrumental variables (IV) analysis, using the proportion of schooling years spent in districts that had been released from court-ordered desegregation as an instrument. We adjusted for individual- and district-level preexposure confounders, birth year, and state fixed effects. In standard linear models, school segregation was associated with a lower probability of good self-rated health (−0.05 percentage points per SD of the segregation index; 95% CI: −0.08, −0.03; p < 0.001) and a higher probability of binge drinking (0.04 percentage points; 95% CI: 0.002, 0.07; p = 0.04) and heart disease (0.01 percentage points; 95% CI: 0.002, 0.15; p = 0.007). IV analyses also found that school segregation was associated with a lower probability of good self-rated health (−0.09 percentage points; 95% CI: −0.17, −0.02; p = 0.02) and a higher probability of binge drinking (0.17 percentage points; 95% CI: 0.04, 0.30; p = 0.008). For IV estimates, only binge drinking was robust to adjustments for multiple hypothesis testing. Limitations included self-reported outcomes and potential residual confounding and exposure misclassification.
Conclusions
School segregation exposure in childhood may have longstanding impacts on Black adults’ cardiovascular health. Future research should replicate these analyses in larger samples and explore potential mechanisms. Given the recent rise in school segregation, this study has implications for policies and programs to address racial inequities in CVD.
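The instrumental-variables design described in the Methods can be illustrated with a generic two-stage least squares (2SLS) sketch: the exposure is first predicted from the instrument, and the outcome is then regressed on that prediction, which purges bias from unobserved confounders that affect both exposure and outcome. This is the textbook estimator, not the authors' exact specification; all names are illustrative.

```python
import numpy as np

def two_stage_least_squares(y, x, z, covars):
    """Sketch of 2SLS: instrument the exposure x (e.g. school
    segregation) with z (e.g. share of schooling years after release
    from a desegregation order), adjusting for observed covariates."""
    n = len(y)
    C = np.column_stack([np.ones(n), covars])
    # First stage: predict the exposure from instrument + covariates.
    Z = np.column_stack([z, C])
    g, *_ = np.linalg.lstsq(Z, x, rcond=None)
    x_hat = Z @ g
    # Second stage: regress the outcome on the predicted exposure.
    X2 = np.column_stack([x_hat, C])
    b, *_ = np.linalg.lstsq(X2, y, rcond=None)
    return float(b[0])   # IV estimate of the exposure effect
```

The estimator is consistent only if the instrument is quasi-random and affects the outcome solely through the exposure, which is exactly what the arbitrary timing of court-release decisions is meant to supply.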
Abstract
In patients with a left ventricular assist device (LVAD), infections and thrombotic events represent severe complications. We investigated device-specific local and systemic inflammation and its impact on cerebrovascular events (CVE) and mortality. In 118 LVAD patients referred for 18F-FDG-PET/CT, the metabolic activity of the LVAD components, thoracic aortic wall, and lymphoid and hematopoietic organs was quantified and correlated with clinical characteristics, laboratory findings, and outcome. Driveline infection was detected in 92/118 (78%) patients by 18F-FDG-PET/CT. Activity at the driveline entry site was associated with increased signals in the aortic wall (r = 0.32, p < 0.001), spleen (r = 0.20, p = 0.03), and bone marrow (r = 0.20, p = 0.03), indicating systemic interactions. Multivariable analysis revealed independent associations of aortic wall activity with the activity of the spleen (β = 0.43, 95% CI 0.18–0.68, p < 0.001) and the driveline entry site (β = 0.04, 95% CI 0.01–0.06, p = 0.001). Twenty-two (19%) patients suffered CVE after PET/CT. In a binary logistic regression analysis, metabolic activity at the driveline entry site narrowly missed significance as an influencing factor for CVE after adjusting for anticoagulation (OR = 1.16, 95% CI 1–1.33, p = 0.05). Metabolic activity of the subcutaneous driveline (OR = 1.13, 95% CI 1.02–1.24, p = 0.016) emerged as an independent risk factor for mortality. Molecular imaging revealed a systemic inflammatory interplay between the thoracic aorta, hematopoietic organs, and infected device components in LVAD patients, with the latter predicting CVE and mortality.
Current guidelines recommend assessment for minimal hepatic encephalopathy (HE) in patients with liver cirrhosis. Various efforts have been made to find tools that simplify the diagnosis. Here, we compare the six most frequently used tests for their validity and their predictive value for overt hepatic encephalopathy (oHE), rehospitalization, and death.
One hundred thirty-two patients with cirrhosis underwent the Portosystemic Encephalopathy-Syndrome-Test yielding the psychometric hepatic encephalopathy score (PHES), the Animal Naming Test (ANT), Critical Flicker Frequency (CFF), the Inhibitory Control Test (ICT), EncephalApp (Stroop), and the Continuous Reaction Time Test (CRT). Patients were monitored for 365 days regarding oHE development, rehospitalization, and death. Twenty-three patients showed clinical signs of HE grade 1-2 at baseline. Of the remaining 109 neurologically unimpaired patients, 35.8% had abnormal PHES and 44% abnormal CRT results. The percentages of abnormal Stroop (79.8% vs. 52.3%), ANT (19.3% vs. 51.4%), ICT (28.4% vs. 36.7%), and CFF results (18.3% vs. 25.7%) changed significantly when adjusted norms were used for evaluation instead of fixed cutoffs. All test results correlated significantly with each other (p < 0.05), except for CFF. During follow-up, 24 patients developed oHE, 58 were readmitted to the hospital, and 20 died. Abnormal PHES results were linked to oHE development in the multivariable model. No other adjusted test demonstrated predictive value for any of the investigated endpoints.
Where applicable, the diagnosis of minimal HE should be based exclusively on adjusted norm values for the tests. The minimal HE tests cannot be equated with one another and have an overall limited value in predicting clinical outcomes.
(1) Background: Patients with acute ischaemic stroke (AIS) are at high risk for stroke-associated infections (SAIs). We hypothesised that increased concentrations of systemic inflammation markers predict SAIs and unfavourable outcomes; (2) Methods: In 223 patients with AIS, blood samples were taken at ≤24 h, 3 d and 7 d after stroke to determine IL-6, IL-10, CRP and LBP. Outcome was assessed using the modified Rankin Scale at 90 d. Patients were thoroughly examined regarding the development of SAIs; (3) Results: 47 patients developed SAIs, including 15 lower respiratory tract infections (LRTIs). IL-6 and LBP at 24 h differed between patients with and without SAIs (IL-6: p < 0.001; LBP: p = 0.042). However, these associations could not be confirmed after adjustment for age, white blood cell count, reduced consciousness and NIHSS. In the subgroup of LRTIs among patients who presented early (≤12 h after stroke, n = 139), IL-6 was independently associated with LRTIs (OR: 1.073, 95% CI: 1.002−1.148). ROC analysis for the prediction of LRTIs showed an AUC of 0.918 for the combination of IL-6 and clinical factors; (4) Conclusions: Blood biomarkers were not predictive of total SAIs. At early stages, IL-6 was independently associated with outcome-relevant LRTIs. Further studies need to clarify the use of biochemical markers to identify patients prone to SAIs.
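The reported AUC of 0.918 is an area under the ROC curve for a combined risk score. One way to compute ROC AUC for any score is via its equivalence to the Mann-Whitney statistic: the probability that a randomly chosen case scores higher than a randomly chosen non-case. A minimal sketch (not the authors' software; names are illustrative):

```python
import numpy as np

def roc_auc(scores, labels):
    """ROC AUC of a risk score (e.g. an IL-6-plus-clinical-factors
    score) against binary outcome labels, via pairwise comparison
    of cases and non-cases; ties count half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

An AUC of 0.5 means the score is uninformative and 1.0 means perfect separation, so 0.918 indicates strong discrimination of LRTI cases.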
Aim: Higher-elevation areas on islands and continental mountains tend to be separated by longer distances, predicting higher endemism at higher elevations; our study is the first to test the generality of the predicted pattern. We also compare it empirically with contrasting expectations from hypotheses invoking higher speciation with area, temperature and species richness. Location: Thirty-two insular and 18 continental elevational gradients from around the world. Methods: We compiled entire floras with elevation-specific occurrence information, and calculated the proportion of native species that are endemic ('percent endemism') in 100-m bands for each of the 50 elevational gradients. Using generalized linear models, we tested the relationships between percent endemism and elevation, isolation, temperature, area and species richness. Results: Percent endemism consistently increased monotonically with elevation, globally. This was independent of richness-elevation relationships, which had varying shapes but decreased with elevation at high elevations. The endemism-elevation relationships were consistent with isolation-related predictions, but inconsistent with hypotheses related to area, richness and temperature. Main conclusions: Higher per-species speciation rates caused by increasing isolation with elevation are the most plausible and parsimonious explanation for the globally consistent pattern of higher endemism at higher elevations that we identify. We suggest that topography-driven isolation increases speciation rates in mountainous areas, across all elevations and increasingly towards the equator. If so, it represents a mechanism that may contribute to generating latitudinal diversity gradients in a way that is consistent with both present-day and palaeontological evidence.
Deforestation in the Amazon and the social vulnerability of its settler communities have been associated with increased malaria incidence. The feeding biology of the most important malaria vectors in the region, notably Nyssorhynchus darlingi, compounds efforts to control vectors and reduce transmission of what has become known as "Frontier Malaria". Exploring Anophelinae mosquito diversity is fundamental to understanding the species responsible for transmission and to developing appropriate management and intervention strategies for malaria control in the Amazon River basin.
This study describes Anophelinae mosquito diversity from settler communities affected by Frontier Malaria in the states of Acre, Amazonas and Rondônia by analysing COI gene data using cluster and tree-based species delimitation approaches.
In total, 270 specimens from the collection sites were sequenced, and these were combined with 151 reference (GenBank) sequences in the analysis to assist in species identification. Conservative estimates placed the number of species collected at these sites between 23 (mPTP partition) and 27 (strict ABGD partition), up to 13 of which appeared to be new. Nyssorhynchus triannulatus and Nyssorhynchus braziliensis displayed exceptional levels of intraspecific genetic diversity, but there was little to no support for putative species complex status.
This study demonstrates that Anophelinae mosquito diversity continues to be underestimated in poorly sampled areas where frontier malaria is a major public health concern. The findings will help shape future studies of vector incrimination and transmission dynamics in these areas and support efforts to develop more effective vector control and transmission reduction strategies in settler communities in the Amazon River basin.
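The cluster-based delimitation mentioned above (e.g., ABGD) rests on grouping COI sequences whose pairwise distances fall below a between-species "barcode gap". A toy single-linkage sketch of that idea with a fixed p-distance cutoff follows; the real ABGD and mPTP algorithms infer the partition from the data rather than using a hard-coded 3% heuristic, and all names here are illustrative.

```python
def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def delimit(seqs, cutoff=0.03):
    """Toy single-linkage clustering of aligned COI sequences at a
    fixed p-distance cutoff; returns the number of putative species.
    Union-find keeps track of which sequences have been merged."""
    n = len(seqs)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if p_distance(seqs[i], seqs[j]) <= cutoff:
                parent[find(i)] = find(j)   # merge the two clusters
    return len({find(i) for i in range(n)})
```

The sensitivity of the species count to the cutoff (23 vs. 27 species under mPTP vs. strict ABGD in the abstract) is exactly why such partitions are reported as ranges.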
The relationship between deforestation and malaria is a spatiotemporal process of variation in Plasmodium incidence in human-dominated Amazonian rural environments. The present study aimed to assess the underlying mechanisms of malarial exposure risk at a fine scale in 5-km sites across the Brazilian Amazon, using field-collected data with a longitudinal, spatiotemporally structured approach. Anopheline mosquitoes were sampled from 80 sites to investigate the Plasmodium infection rate in mosquito communities and to estimate the malaria exposure risk in rural landscapes. The remaining amount of forest cover (accumulated deforestation) and the deforestation timeline were estimated for each site to represent the main parameters of both the frontier malaria hypothesis and an alternative scenario, the deforestation-malaria hypothesis, proposed herein. The maximum frequency of pathogenic sites occurred at the intermediate forest cover level (50% of accumulated deforestation) at two temporal deforestation peaks, i.e., 10 and 35 years after the beginning of the organization of a settlement. The incidence density of infected anophelines in sites where the original forest cover decreased by more than 50% in the first 25 years of settlement development was at least twice as high as the incidence density calculated for the other sites studied (adjusted incidence density ratio = 2.25; 95% CI, 1.38-3.68; p = 0.001). The results of this study support frontier malaria as a unifying hypothesis for explaining malaria emergence and for designing specific control interventions in the Brazilian Amazon.
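The incidence density ratio quoted above (2.25) compares infected-anopheline counts per unit of collection effort between site groups. Its crude, unadjusted form is a simple rate ratio; the study's 2.25 is a covariate-adjusted estimate, so the function below (with hypothetical names) shows only the underlying calculation.

```python
def incidence_density_ratio(cases_a, time_a, cases_b, time_b):
    """Crude incidence density ratio: events per unit person- or
    trap-time in group a (e.g. heavily deforested sites) divided by
    the same rate in group b (the remaining sites)."""
    return (cases_a / time_a) / (cases_b / time_b)
```

A ratio above 1 means group a accumulates infected mosquitoes faster per unit of sampling effort than group b.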