This study evaluated the effects of early time-restricted eating (eTRE) on shifting the timing of sleep among late sleepers. Primary outcomes included actigraphy- and sleep diary-derived sleep onset, midsleep phase, and wake time, with total sleep time as a secondary outcome.
Fifteen healthy adults with habitual late sleep timing were randomized to receive either eTRE or sleep and nutrition hygiene (control) via a single 30-minute synchronous video session. Participants completed an initial 1-week baseline phase followed by a 2-week intervention phase. Measures included continuous sleep monitoring and sleep and nutrition diaries.
Linear mixed-effects modeling demonstrated that eTRE significantly advanced sleep timing compared with controls. Self-reported sleep onset (56.1 minutes; 95% confidence interval [CI]: 20.5, 91.7), midpoint (19.5 minutes; 95% CI: 7.2, 31.9), and offset (42.2 minutes; 95% CI: 2.9, 81.5) each moved earlier with eTRE as compared with controls. Similarly, objectively determined sleep onset (66.5 minutes; 95% CI: 29.6, 103.4), midpoint (21.9 minutes; 95% CI: 9.1, 34.7), and offset (39.3 minutes; 95% CI: 1.3, 77.3) each moved earlier with eTRE as compared with controls. Total sleep time showed a nonsignificant increase in the eTRE group as compared with controls.
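The group-by-phase contrast that a linear mixed-effects model estimates can be illustrated with a minimal difference-in-differences sketch. All values below are invented placeholders, not the trial's data; the participant counts and onset times are assumptions chosen only to show the arithmetic.

```python
# Illustrative difference-in-differences for the sleep-timing contrast that a
# linear mixed-effects model estimates. All numbers are hypothetical.

def mean(xs):
    return sum(xs) / len(xs)

def timing_advance(baseline, intervention):
    """Mean shift (minutes) from baseline to intervention; negative = earlier."""
    return mean(intervention) - mean(baseline)

# Hypothetical sleep-onset times, minutes after midnight, per participant-night.
etre_baseline = [95, 110, 102, 120]
etre_intervention = [40, 55, 48, 60]      # earlier onsets under eTRE
ctrl_baseline = [100, 98, 115, 105]
ctrl_intervention = [96, 97, 110, 104]    # little change in controls

etre_shift = timing_advance(etre_baseline, etre_intervention)
ctrl_shift = timing_advance(ctrl_baseline, ctrl_intervention)

# Group-by-phase interaction: how much more eTRE advanced onset than control.
advance_vs_control = ctrl_shift - etre_shift
print(f"eTRE shift: {etre_shift:.1f} min, control shift: {ctrl_shift:.1f} min")
print(f"eTRE advanced onset {advance_vs_control:.1f} min more than control")
```

In the actual analysis this interaction would be estimated within a mixed model that also accounts for repeated nights per participant; the sketch shows only the contrast being estimated.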
Late sleepers who were instructed in a single session about eTRE significantly advanced their sleep timing, especially sleep onset. eTRE shows potential as a clinical strategy for advancing sleep timing in late sleepers.
Registry: Chinese Clinical Trial Registry; Name: FAST Asleep: It's All About Timing; URL: https://www.chictr.org.cn/showproj.html?proj=122504; Identifier: ChiCTR2100043691.
Blum DJ, Hernandez B, Zeitzer JM. Early time-restricted eating advances sleep in late sleepers: a pilot randomized controlled trial. 2023;19(12):2097-2106.
•Around half of total deforestation is illegal in the tropics and subtropics.
•We analyzed 244 illegal deforestation events in the Argentine Dry Chaco.
•We used a Bayesian network to explain the type and size of illegal events.
•Subnational institutions and power of landholders are the most influential factors.
•Limiting land concentration and enforcing forest regulations can reduce illegality.
Deforestation is a main threat to the biosphere due to its contribution to biodiversity loss, carbon emissions, and land degradation. Illegal deforestation continues unabated, representing around half of total deforestation in the tropics and subtropics. Quantifying illegal deforestation is challenging, let alone assessing the social and institutional processes underlying its occurrence. We tackle this challenge by quantifying the relative influence of individual (i.e., landholders’ power, landholding size) and contextual (i.e., subnational institutions, agricultural suitability) factors on the type and size of illegal deforestation in the Argentine Dry Chaco, a major commodity production frontier and global deforestation hotspot. We build a Bayesian network fed with data on 244 illegal deforestation events, obtained from journalistic articles, grey literature, key informant interviews, and geospatial analyses. The results reveal that more powerful landholders were associated with larger illegal deforestation events. Policy simulations suggest that higher concentration of land in the hands of powerful landholders and more flexible subnational forest regulations would escalate illegal deforestation. This points to the need for a smart policy mix that integrates across economic, agricultural, and environmental sectors to halt illegal deforestation at commodity production frontiers. A land tenure reform can facilitate forest protection, while incentives for land-use diversification and the criminal prosecution of illegal deforestation are critical to shift landholder behavior towards more balanced production and conservation outcomes.
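The core of a discrete Bayesian network like the one described is a set of conditional probability tables that can be marginalized to answer policy queries. A toy version with two binary drivers (landholder power and flexibility of subnational forest rules) influencing event size is sketched below; the structure mirrors the idea in the abstract, but every probability is an invented placeholder, not an estimate from the study.

```python
# Minimal hand-rolled discrete Bayesian network: two binary drivers influence
# the size of an illegal deforestation event. All probabilities are invented.

# Priors over the two binary drivers.
P_power = {"high": 0.3, "low": 0.7}
P_flexible = {"yes": 0.4, "no": 0.6}

# CPT: P(event size = "large" | power, flexible rules) -- placeholder values.
P_large = {
    ("high", "yes"): 0.80,
    ("high", "no"): 0.55,
    ("low", "yes"): 0.35,
    ("low", "no"): 0.10,
}

def p_large_given_power(power):
    """Marginalize over rule flexibility: P(large | power)."""
    return sum(P_flexible[f] * P_large[(power, f)] for f in P_flexible)

# "Policy simulation": compare event-size risk under high vs low power.
print(f"P(large | high power) = {p_large_given_power('high'):.3f}")
print(f"P(large | low power)  = {p_large_given_power('low'):.3f}")
```

A policy scenario (e.g., more flexible regulations) corresponds to changing a prior or fixing a node's state and re-propagating the probabilities, which is what the simulations in the abstract do at larger scale.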
Biodiversity and natural resources constitute a social safety net for forest-dependent communities and represent their main source of livelihood. Agricultural expansion driven by global food demand is not only deeply altering landscapes at the local level but also affecting rural ways of life and culture. These changes are increasing inequalities between stakeholders in developing countries and causing the direct displacement of numerous rural families. In this article, we focus on the Argentine Dry Chaco, one of the most threatened forest systems in the world, to analyse evidence about how land-use changes asymmetrically affect social wellbeing across landscapes and generate conflicts between stakeholders regarding the use of and access to natural resources. This information needs to be considered for better territorial planning and to propose conflict resolution strategies towards more just and sustainable relationships between people and nature in complex landscapes.
Introduction Emerging research links gut microbial health with sleep. One common sleep disorder in which the microbiome may play a role is restless legs syndrome (RLS). While the pathogenesis of RLS is not fully understood, a relative state of brain iron deficiency has been described in patients with RLS and appears to induce changes in several pathways (adenosinergic, glutamatergic and dopaminergic) known to be involved in the disease. Insufficient iron may be secondary to dietary iron deficiency or, potentially, gut inflammation. We hypothesized that small intestinal bacterial overgrowth (SIBO), a condition associated with gut dysbiosis (i.e., normally rare gut-residing bacteria are over-represented in the gut), is associated with RLS and may moderate the observed inter-patient variability in serum iron availability. Methods Participants are being recruited at the Stanford Sleep Center for three groups: RLS and low peripheral iron stores (<50 ng/mL and/or transferrin saturation <18%), RLS and normal peripheral iron stores, and insomnia (control). Participants complete questionnaires concerning sleep and SIBO symptoms and are sent home with a fecal collection kit (Fecal Swab Collection and Preservation System, Norgen Biotek) and a SIBO kit (SIBO Home Breath Test Kit, Quintron). Fecal samples are assayed by the University of Minnesota Genomics Center with microbial community profiling evaluated by 16S ribosomal RNA (16S rRNA) gene sequencing protocols. SIBO breath samples are evaluated by Aerodiagnostics for hydrogen and methane abnormalities. Results Seven participants diagnosed with RLS (3 men, 4 women) have thus far completed the protocol. All indicated poor sleep quality (PSQI ≥ 5) and moderate to severe symptoms of RLS (IRLS scores ranging from 13 to 34/40). SIBO was present in all 7 participants (100%), whereas general population rates are estimated to be 6-15%.
Conclusion These preliminary data suggest that SIBO may be more prevalent among patients with RLS. Additional analyses will examine fecal microbial composition, subtypes of RLS iron deficiency, and comparisons with insomnia. Support (If Any): Pau Innovation Gift Fund Seed Grant
Posttraumatic stress disorder (PTSD) is highly comorbid with sleep dysfunction. This association has previously been explained in terms of cognitive and emotional dysfunction. The current study extends this literature by investigating the symptom-level comorbidity of sleep dysfunction and DSM-5 PTSD using a network approach. Participants were trauma-exposed female Filipino domestic workers (N = 1241). Network analysis was applied to 23 items: 18 items from the PCL-5 measuring PTSD (Community 1) and 5 items from the PSQI assessing sleep dysfunction (Community 2). The results showed that the symptoms within each community had the strongest correlations. Bridge connections were identified between the sleep dysfunction and PTSD symptom communities. Symptoms with the highest bridge strength were concentration difficulties, recklessness, irritability, and sleep disturbance. This is among the first studies investigating the comorbidity between PTSD and sleep dysfunction from a network approach. Future interventions may be developed that emphasize the bridge symptoms to address comorbidity among trauma-exposed migrants.
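Bridge strength, the statistic that identifies the symptoms above, is simply the sum of the absolute weights of a node's edges that cross into the other symptom community. The sketch below shows the computation on a tiny graph; the node names echo symptoms mentioned in the abstract, but all edge weights are invented for illustration.

```python
# Sketch of the bridge-strength statistic from symptom network analysis:
# for each node, sum |edge weight| over edges into the other community.
# Edge weights below are hypothetical, not estimates from the study.

edges = {  # (node_a, node_b): partial-correlation weight (invented)
    ("irritability", "sleep_disturbance"): 0.25,
    ("concentration", "sleep_disturbance"): 0.20,
    ("irritability", "concentration"): 0.30,
    ("sleep_disturbance", "sleep_latency"): 0.40,
}
community = {
    "irritability": "PTSD",
    "concentration": "PTSD",
    "sleep_disturbance": "sleep",
    "sleep_latency": "sleep",
}

def bridge_strength(node):
    """Sum of |weight| over edges from `node` into the other community."""
    total = 0.0
    for (a, b), w in edges.items():
        if a == node and community[b] != community[node]:
            total += abs(w)
        elif b == node and community[a] != community[node]:
            total += abs(w)
    return total

for n in community:
    print(n, round(bridge_strength(n), 2))
```

Within-community edges (e.g., the two PTSD symptoms, or the two sleep items) contribute nothing, which is why a node can be strongly connected overall yet have low bridge strength.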
•Pdr5 reconstituted in planar lipid bilayers displays ion channel activity, with a slight cation selectivity (PK+:PCl- = 2.6:1).
•ATP/Mg2+ and substrate are required to induce a membrane potential of Vm = 58 mV, corresponding to Vm = −58 mV in the cytosol in vivo.
•Pdr5 co-transports H+ and substrate in an ATP/Mg2+-dependent manner.
•H+ transport can be visualized in an in vitro transport assay.
The two major efflux pump systems involved in multidrug resistance (MDR) are (i) ATP binding cassette (ABC) transporters and (ii) secondary transporters. While the former use binding and hydrolysis of ATP to facilitate export of cytotoxic compounds, the latter utilize electrochemical gradients to expel their substrates. Pdr5 from Saccharomyces cerevisiae is a prominent member of the eukaryotic ABC transporters involved in MDR and a frequently studied model system. Although investigated for decades, the underlying molecular mechanisms of drug transport and substrate specificity remain elusive. Here, we provide electrophysiological data on reconstituted Pdr5 demonstrating that this MDR efflux pump not only actively translocates its substrates across the lipid bilayer but at the same time generates a proton motive force in the presence of Mg2+-ATP and substrates by acting as a proton/drug co-transporter. Importantly, a strictly substrate-dependent co-transport of protons was also observed in in vitro transport studies using Pdr5-enriched plasma membranes. We conclude from these results that the mechanism of MDR conferred by Pdr5, and likely other transporters, is more complex than the sole extrusion of cytotoxic compounds and involves secondary coupled processes that increase its effectiveness.
Anticoagulation with either a vitamin K antagonist or a direct oral anticoagulant may be associated with AKI. Our objective was to assess the risk of AKI among elderly individuals with atrial fibrillation newly prescribed a direct oral anticoagulant (dabigatran, rivaroxaban, or apixaban) versus warfarin.
Our population-based cohort study included 20,683 outpatients in Ontario, Canada, ≥66 years with atrial fibrillation who were prescribed warfarin, dabigatran, rivaroxaban, or apixaban between 2009 and 2017. Inverse probability of treatment weighting on the basis of derived propensity scores for the treatment with each direct oral anticoagulant was used to balance baseline characteristics among patients receiving each of the three direct oral anticoagulants compared with warfarin. Cox proportional hazards regression was performed in the weighted population to compare the association between the prescribed anticoagulant and the outcomes of interest. The exposure was an outpatient prescription of warfarin or one of the direct oral anticoagulants. The primary outcome was a hospital encounter with AKI, defined using Kidney Disease Improving Global Outcomes thresholds. Prespecified subgroup analyses were conducted by eGFR category and by the percentage of international normalized ratio measurements in range, a validated marker of anticoagulation control.
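The inverse probability of treatment weighting step described above assigns each patient a weight of 1/ps if they received the treatment and 1/(1-ps) otherwise, where ps is the propensity score for treatment. A minimal sketch follows; the propensity scores and patient list are hypothetical, and in the actual study the scores came from models of baseline characteristics.

```python
# Sketch of inverse probability of treatment weighting (IPTW): weight each
# patient by the inverse probability of the treatment actually received.
# Propensity scores below are invented placeholders.

def iptw_weight(treated, propensity):
    """Unstabilized IPTW weight: 1/ps if treated, 1/(1 - ps) otherwise."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# Hypothetical patients: (received DOAC?, propensity score for DOAC).
patients = [(True, 0.8), (True, 0.5), (False, 0.5), (False, 0.2)]

weights = [iptw_weight(t, ps) for t, ps in patients]
print([round(w, 2) for w in weights])
```

Patients who receive a treatment they were unlikely to receive get large weights, so the weighted treated and untreated groups each resemble the full cohort on measured covariates; the Cox model is then fit in this weighted population.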
Each direct oral anticoagulant was associated with a significantly lower risk of AKI compared with warfarin (weighted hazard ratio, 0.65; 95% confidence interval, 0.53 to 0.80 for dabigatran; weighted hazard ratio, 0.85; 95% confidence interval, 0.73 to 0.98 for rivaroxaban; and weighted hazard ratio, 0.81; 95% confidence interval, 0.72 to 0.93 for apixaban). In the subgroup analysis, the lower risk of AKI associated with each direct oral anticoagulant was consistent across eGFR strata. The risk of AKI was significantly lower among users of each of the direct oral anticoagulants compared with warfarin users who had a percentage of international normalized ratio measurements ≤56%.
Direct oral anticoagulants were associated with a lower risk of AKI compared with warfarin.
Surveillance blood work is routinely performed in maintenance hemodialysis (HD) recipients. Although more frequent blood testing may confer better outcomes, there is little evidence to support any particular monitoring interval.
Retrospective population-based cohort study.
All prevalent HD recipients in Ontario, Canada, as of April 1, 2011, and a cohort of incident patients commencing maintenance HD in Ontario, Canada, between April 1, 2011, and March 31, 2016.
Frequency of surveillance blood work, monthly versus every 6 weeks.
The primary outcome was all-cause mortality. Secondary outcomes were major adverse cardiovascular events, all-cause hospitalization, and episodes of hyperkalemia.
Cox proportional hazards with adjustment for demographic and clinical characteristics was used to evaluate the association between blood testing frequency and all-cause mortality. Secondary outcomes were evaluated using the Andersen-Gill extension of the Cox model to allow for potential recurrent events.
7,454 prevalent patients received care at 17 HD programs with monthly blood sampling protocols (n=5,335 patients) and at 8 programs with blood sampling every 6 weeks (n=2,119 patients). More frequent monitoring was not associated with a lower risk for all-cause mortality compared to blood sampling every 6 weeks (adjusted HR, 1.16; 95% CI, 0.99-1.38). Monthly monitoring was not associated with a lower risk for any of the secondary outcomes. Results were consistent among incident HD recipients.
Unmeasured confounding; limited data for center practices unrelated to blood sampling frequency; no information on frequency of unscheduled blood work performed outside the prescribed sampling interval.
Monthly routine blood testing in HD recipients was not associated with a lower risk for death, cardiovascular events, or hospitalizations as compared with testing every 6 weeks. Given the health resource implications, the frequency of routine blood sampling in HD recipients deserves careful reassessment.