Changes in climate patterns are dramatically influencing some agricultural areas. Arid, semi-arid and coastal agricultural areas are especially vulnerable to climate change impacts on soil salinity. Inventorying and monitoring climate change impacts on salinity are crucial to evaluate the extent of the problem, to recognize trends and to formulate irrigation and crop management strategies that will maintain the agricultural productivity of these areas. Over the past three decades, Corwin and colleagues at the U.S. Salinity Laboratory (USSL) have developed proximal sensor and remote imagery methodologies for assessing soil salinity at multiple scales. The objective of this paper is to evaluate the impact climate change has had on selected agricultural areas experiencing weather pattern changes, with a focus on the use of proximal and satellite sensors to assess salinity development. Evidence presented in case studies for California's San Joaquin Valley (SJV) and Minnesota's Red River Valley (RRV) demonstrates the utility of these sensor approaches in assessing soil salinity changes due to changes in weather patterns. Agricultural areas are discussed where changes in weather patterns have increased root-zone soil salinity, particularly areas with shallow water tables (SJV and RRV), coastal areas with seawater intrusion (e.g., Bangladesh and the Gaza Strip) and water-scarce areas potentially relying on degraded groundwater as an irrigation source (SJV and the Murray-Darling River Basin). Trends in salinization due to climate change indicate that infrastructure and protocols to monitor soil salinity from field to regional to national to global scales are needed.
Highlights
Climate change will have a negative impact on agriculture, particularly in arid regions.
Proximal/remote sensors are useful to assess climate change impact on soil salinity across scales.
Salt‐water intrusion, shallow water tables and degraded water reuse will increase soil salinity.
Infrastructure and protocols to monitor soil salinity across multiple scales are needed.
Anemia, which is common in the critically ill, is often treated with red-cell transfusions, which are associated with poor clinical outcomes. We hypothesized that therapy with recombinant human erythropoietin (epoetin alfa) might reduce the need for red-cell transfusions.
In this prospective, randomized, placebo-controlled trial, we enrolled 1460 medical, surgical, or trauma patients between 48 and 96 hours after admission to the intensive care unit. Epoetin alfa (40,000 U) or placebo was administered weekly, for a maximum of 3 weeks; patients were followed for 140 days. The primary end point was the percentage of patients who received a red-cell transfusion. Secondary end points were the number of red-cell units transfused, mortality, and the change in hemoglobin concentration from baseline.
As compared with the use of placebo, epoetin alfa therapy did not result in a decrease in either the number of patients who received a red-cell transfusion (relative risk for the epoetin alfa group vs. the placebo group, 0.95; 95% confidence interval [CI], 0.85 to 1.06) or the mean (+/-SD) number of red-cell units transfused (4.5+/-4.6 units in the epoetin alfa group and 4.3+/-4.8 units in the placebo group, P=0.42). However, the hemoglobin concentration at day 29 increased more in the epoetin alfa group than in the placebo group (1.6+/-2.0 g per deciliter vs. 1.2+/-1.8 g per deciliter, P<0.001). Mortality tended to be lower at day 29 among patients receiving epoetin alfa (adjusted hazard ratio, 0.79; 95% CI, 0.56 to 1.10); this effect was also seen in prespecified analyses in those with a diagnosis of trauma (adjusted hazard ratio, 0.37; 95% CI, 0.19 to 0.72). A similar pattern was seen at day 140 (adjusted hazard ratio, 0.86; 95% CI, 0.65 to 1.13), particularly in those with trauma (adjusted hazard ratio, 0.40; 95% CI, 0.23 to 0.69). As compared with placebo, epoetin alfa was associated with a significant increase in the incidence of thrombotic events (hazard ratio, 1.41; 95% CI, 1.06 to 1.86).
The use of epoetin alfa does not reduce the incidence of red-cell transfusion among critically ill patients, but it may reduce mortality in patients with trauma. Treatment with epoetin alfa is associated with an increase in the incidence of thrombotic events. (ClinicalTrials.gov number, NCT00091910.)
Research has focused on understanding how overeating can affect brain reward mechanisms and subsequent behaviors, both preclinically and in clinical research settings. This work is partly driven by the need to uncover the etiology and possible treatments for the ongoing obesity epidemic. However, overeating, or non-homeostatic feeding behavior, can occur independent of obesity. Isolating the variable of overeating from the consequence of increased body weight is of great utility, as it is well known that increased body weight or obesity can impart its own deleterious effects on physiology, neural processes, and behavior. In this review, we present data from three selected animal models of normal-weight non-homeostatic feeding behavior that have been significantly influenced by Bart Hoebel's more than 40-year career studying motivation, feeding, reinforcement, and the neural mechanisms that participate in the regulation of these processes. First, a model of sugar bingeing is described (Avena/Hoebel), in which animals with repeated, intermittent access to a sugar solution develop behaviors and brain changes that are similar to the effects of some drugs of abuse, serving as the first animal model of food addiction. Second, another model is described (Boggiano) in which a history of dieting and stress can perpetuate further binge eating of palatable and non-palatable food. In addition, a model (Boggiano) is described that allows animals to be classified as having a binge-prone vs. binge-resistant behavioral profile. Lastly, a limited-access model is described (Corwin) in which non-food-deprived rats with sporadic limited access to a high-fat food develop binge-type behaviors. These models are considered within the context of their effects on brain reward systems, including dopamine, the opioids, cholinergic systems, serotonin, and GABA.
Collectively, the data derived from the use of these models clearly show that the behavioral and neuronal consequences of bingeing on a palatable food, even at a normal body weight, differ from those of simply consuming the palatable food in a non-binge manner. These findings may be important in understanding how overeating can influence behavior and brain chemistry.
Water for irrigation is a major limitation to agricultural production in many parts of the world. Use of waters with elevated levels of salinity is one likely option for meeting increased demands. The sources of these waters include drainage water generated by irrigated agriculture, municipal wastewater, and poor-quality groundwater. Soil salinity leaching requirements that were established several decades ago were based on steady-state conditions. Recently, transient-state models have been developed that can potentially predict the dynamics of the chemical-physical-biological interactions in an agricultural system more accurately. The University of California Center for Water Resources appointed a workgroup to review the development of steady-state analyses and transient-state models, and to determine whether the current recommended guidelines for leaching requirement based on steady-state analyses need to be revised. The workgroup concludes that the present guidelines overestimate the leaching requirement and the negative consequences of irrigating with saline waters. This error is particularly large at low leaching fractions. This is a fortuitous finding because irrigating to achieve low leaching fractions provides a more efficient use of limited water supplies.
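For context, the steady-state guidelines discussed above are commonly traced to a classic leaching-requirement approximation (Rhoades). The sketch below illustrates that relationship; the salinity values used are hypothetical examples, not figures from the workgroup report.

```python
def leaching_requirement(ec_iw: float, ec_e: float) -> float:
    """One widely cited steady-state leaching-requirement approximation
    (Rhoades): LR = ECiw / (5 * ECe - ECiw), where ECiw is the electrical
    conductivity of the irrigation water and ECe is the crop's salt-
    tolerance threshold (saturation-extract EC), both in dS/m.
    """
    return ec_iw / (5 * ec_e - ec_iw)

# Hypothetical example (not from the paper): 2 dS/m irrigation water
# applied to a crop with a 4 dS/m tolerance threshold.
print(round(leaching_requirement(2.0, 4.0), 3))  # 0.111
```

The output is the fraction of applied water that must drain below the root zone to keep root-zone salinity at steady state; saltier irrigation water or a more salt-sensitive crop pushes the required fraction up.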
To determine whether the administration of recombinant human erythropoietin (rHuEPO) to critically ill patients in the intensive care unit (ICU) would reduce the number of red blood cell (RBC) transfusions required.
A prospective, randomized, double-blind, placebo-controlled, multicenter trial.
ICUs at three academic tertiary care medical centers.
A total of 160 patients who were admitted to the ICU and met the eligibility criteria were enrolled in the study (80 into the rHuEPO group; 80 into the placebo group).
Patients were randomized to receive either rHuEPO or placebo. The study drug (300 units/kg of rHuEPO or placebo) was administered by subcutaneous injection beginning ICU day 3 and continuing daily for a total of 5 days (until ICU day 7). The subsequent dosing schedule was every other day to achieve a hematocrit (Hct) concentration of >38%. The study drug was given for a minimum of 2 wks or until ICU discharge (for subjects with ICU lengths of stay >2 wks) up to a total of 6 wks (42 days) postrandomization.
The cumulative number of units of RBCs transfused was significantly less in the rHuEPO group than in the placebo group (p<.002, Kolmogorov-Smirnov test). The rHuEPO group was transfused with a total of 166 units of RBCs vs. 305 units of RBCs transfused in the placebo group. The final Hct concentration of the rHuEPO patients was significantly greater than that of the placebo patients (35.1+/-5.6 vs. 31.6+/-4.1; p<.01). A total of 45% of patients in the rHuEPO group received a blood transfusion between days 8 and 42 or died before study day 42, compared with 55% of patients in the placebo group (relative risk, 0.8; 95% confidence interval, 0.6 to 1.1). There were no significant differences between the two groups in either mortality or the frequency of adverse events.
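The relative risk and confidence interval reported above can be reproduced approximately from the stated percentages (45% of 80 vs. 55% of 80, i.e. 36 and 44 patients). The sketch below uses the standard log-RR normal approximation; the counts are reconstructed from those percentages, not taken from the trial dataset.

```python
import math

def relative_risk_ci(events_t: int, n_t: int, events_c: int, n_c: int,
                     z: float = 1.96) -> tuple:
    """Relative risk with a 95% CI from the log-RR normal approximation."""
    p_t, p_c = events_t / n_t, events_c / n_c
    rr = p_t / p_c
    # standard error of ln(RR): sqrt(1/a - 1/n1 + 1/c - 1/n2)
    se = math.sqrt((1 - p_t) / events_t + (1 - p_c) / events_c)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Counts reconstructed from the reported percentages (45% and 55% of 80).
rr, lo, hi = relative_risk_ci(36, 80, 44, 80)
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR=0.82 (95% CI 0.60-1.12)
```

Rounded to one decimal place, this agrees with the abstract's reported 0.8 (0.6, 1.1).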
The administration of rHuEPO to critically ill patients is effective in raising their Hct concentrations and in reducing the total number of units of RBCs they require.
The field-scale application of apparent soil electrical conductivity (ECa) to agriculture has its origin in the measurement of soil salinity, which is an arid-zone problem associated with irrigated agricultural land and with areas having shallow water tables. Apparent soil electrical conductivity is influenced by a combination of physico-chemical properties including soluble salts, clay content and mineralogy, soil water content, bulk density, organic matter, and soil temperature; consequently, measurements of ECa have been used at field scales to map the spatial variation of several edaphic properties: soil salinity, clay content or depth to clay-rich layers, soil water content, the depth of flood-deposited sands, and organic matter. In addition, ECa has been used at field scales to determine a variety of anthropogenic properties: leaching fraction, irrigation and drainage patterns, and compaction patterns due to farm machinery. Since its early agricultural use as a means of measuring soil salinity, the agricultural application of ECa has evolved into a widely accepted means of establishing the spatial variability of several soil physico-chemical properties that influence the ECa measurement. Apparent soil electrical conductivity is a quick, reliable, easy-to-take soil measurement that often, but not always, relates to crop yield. For these reasons, the measurement of ECa is among the most frequently used tools in precision agriculture research for the spatio-temporal characterization of edaphic and anthropogenic properties that influence crop yield. The objective of this paper is to review the development and use of ECa measurements for agricultural purposes, particularly from the perspective of precision agriculture applications. Background information is presented to provide the reader with (i) an understanding of the basic theories and principles of the ECa measurement, (ii) an overview of various ECa measurement techniques, (iii) applications of ECa measurements in agriculture, particularly site-specific crop management, (iv) guidelines for conducting an ECa survey, and (v) current trends and future developments in the application of ECa to precision agriculture. Unquestionably, ECa is an invaluable agricultural tool that provides spatial information for soil quality assessment and precision agriculture applications, including the delineation of site-specific management units. Technologies such as geo-referenced ECa measurement techniques have brought precision agriculture from a 1980s concept to a promising tool for achieving sustainable agriculture.
Acute kidney injury is common in critically ill patients, with an incidence of 20% to 30%. It has been associated with increased mortality, hospital length of stay, and total cost. A number of strategies may be beneficial in identifying at-risk patients. In addition, using preventive measures and avoiding nephrotoxic medications are paramount in reducing the overall incidence. Although multifactorial, drug-induced acute kidney injury may account for up to 25% of all cases of acute kidney injury in this population. This review focuses on the mechanisms of drug-induced acute kidney injury in critically ill adults and offers preventive strategies when appropriate.
To quantify the incidence of anemia and red blood cell (RBC) transfusion practice in critically ill patients and to examine the relationship of anemia and RBC transfusion to clinical outcomes.
Prospective, multiple-center, observational cohort study of intensive care unit (ICU) patients in the United States. Enrollment period was from August 2000 to April 2001. Patients were enrolled within 48 hrs of ICU admission. Patient follow-up was for 30 days, hospital discharge, or death, whichever occurred first.
A total of 284 ICUs (medical, surgical, or medical-surgical) in 213 hospitals participated in the study.
A total of 4,892 patients were enrolled in the study.
The mean hemoglobin level at baseline was 11.0 +/- 2.4 g/dL. Hemoglobin level decreased throughout the duration of the study. Overall, 44% of patients received one or more RBC units while in the ICU (mean, 4.6 +/- 4.9 units). The mean pretransfusion hemoglobin was 8.6 +/- 1.7 g/dL. The mean time to first ICU transfusion was 2.3 +/- 3.7 days. More RBC transfusions were given in study week 1; however, in subsequent weeks, subjects received one to two RBC units per week while in the ICU. The number of RBC transfusions a patient received during the study was independently associated with longer ICU and hospital lengths of stay and an increase in mortality. Patients who received transfusions also had more total complications and were more likely to experience a complication. Baseline hemoglobin was related to the number of RBC transfusions, but it was not an independent predictor of length of stay or mortality. However, a nadir hemoglobin level of <9 g/dL was a predictor of increased mortality and length of stay.
Anemia is common in the critically ill and results in a large number of RBC transfusions. Transfusion practice has changed little during the past decade. The number of RBC units transfused is an independent predictor of worse clinical outcome.
Culture of Neisseria gonorrhoeae is essential for surveillance of complete antimicrobial susceptibility profiles. In 2014, the culture success rate of N. gonorrhoeae from samples taken at the clinic for sexually transmitted infections (STI clinic), Oslo University Hospital, Norway, was only 20%. The present study aimed to improve gonococcal culture rates using bedside inoculation of patient samples on gonococcal agar plates and incubation at the STI clinic.
This prospective quality improvement study was conducted by the STI clinic and the Department of Microbiology at Oslo University Hospital from May 2016 to October 2017. When culture of N. gonorrhoeae was clinically indicated, we introduced a parallel 'bedside culture' at the STI clinic and compared results with the standard culture at the microbiology department. Samples were taken from the urethra, anorectum, pharynx and cervix. Culture rates were compared across symptomatic and asymptomatic anatomical sites.
Of 596 gonococcal-positive PCR samples, bedside culture had a significantly higher success rate of 57%, compared to 41% with standard culture (p < 0.05). Overall, the culture rate from symptomatic sites was 91% vs. 45% from asymptomatic sites. The culture rates from the different anatomical sites were as follows: urethra 93%, anorectum 64%, pharynx 28% and cervix 70%. Bedside culture significantly (p < 0.05) improved the culture rates for symptomatic urethral and asymptomatic pharyngeal samples.
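As a rough check on the headline comparison (57% vs. 41% of 596 samples), the difference can be sketched with a pooled two-proportion z-test. Note that this treats the two culture methods as independent groups, whereas the study cultured the same samples both ways (a paired design), so this is only an illustration of the magnitude of the difference; the counts (340 and 244) are reconstructed from the reported rates.

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple:
    """Pooled two-proportion z-test with a two-sided normal p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided tail area
    return z, p_value

# Counts reconstructed from the reported rates: 57% and 41% of 596
# gonococcal-positive PCR samples (pairing ignored for illustration).
z, p = two_proportion_z(340, 596, 244, 596)
print(f"z={z:.2f}, p={p:.2g}")
# z comes out well above 1.96, i.e. p far below 0.05, consistent with
# the significance reported in the abstract.
```

A paired analysis (e.g. McNemar's test on the discordant sample pairs) would be the appropriate test for the actual study design.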
Where feasible, bedside inoculation of samples from patients with gonorrhoea on gonococcal agar plates, with incubation at the clinic, is recommended. This will improve culture diagnostics and provide additional gonococcal isolates for antimicrobial resistance surveillance.