Small cell carcinoma of the ovary of hypercalcemic type (SCCOHT) is an extremely rare, aggressive cancer affecting children and young women. We identified germline and somatic inactivating mutations in the SWI/SNF chromatin-remodeling gene SMARCA4 in 75% (9/12) of SCCOHT cases, in addition to SMARCA4 protein loss in 82% (14/17) of SCCOHT tumors but in only 0.4% (2/485) of other primary ovarian tumors. These data implicate SMARCA4 in SCCOHT oncogenesis.
Background
Current guidelines for the treatment of adult severe traumatic brain injury (sTBI) consist of high-quality evidence reports, but they are no longer accompanied by management protocols, as these require expert opinion to bridge the gap between published evidence and patient care. We aimed to establish a modern sTBI protocol for adult patients with both intracranial pressure (ICP) and brain oxygen monitors in place.
Methods
Our consensus working group consisted of 42 experienced and actively practicing sTBI opinion leaders from six continents. Having previously established a protocol for the treatment of patients with ICP monitoring alone, we addressed patients who have a brain oxygen monitor in addition to an ICP monitor. The management protocols were developed through a Delphi-method-based consensus approach and were finalized at an in-person meeting.
Results
We established three distinct treatment protocols, each with three tiers whereby higher tiers involve therapies with higher risk. One protocol addresses the management of ICP elevation when brain oxygenation is normal. A second addresses management of brain hypoxia with normal ICP. The third protocol addresses the situation when both intracranial hypertension and brain hypoxia are present. The panel considered issues pertaining to blood transfusion and ventilator management when designing the different algorithms.
Conclusions
These protocols are intended to assist clinicians in the management of patients with both ICP and brain oxygen monitors in place, but they reflect neither a standard of care nor a substitute for thoughtful individualized management. They should be used in conjunction with the recommendations for basic care, management of critical neuroworsening, and treatment weaning recently published in conjunction with the Seattle International Brain Injury Consensus Conference.
Advanced biliary tract cancer has a poor prognosis. Cisplatin and gemcitabine is the standard first-line chemotherapy regimen, but no robust evidence is available for second-line chemotherapy. The aim of this study was to determine the benefit derived from second-line FOLFOX (folinic acid, fluorouracil, and oxaliplatin) chemotherapy in advanced biliary tract cancer.
The ABC-06 clinical trial was a phase 3, open-label, randomised trial done in 20 sites with expertise in managing biliary tract cancer across the UK. Adult patients (aged ≥18 years) who had histologically or cytologically verified locally advanced or metastatic biliary tract cancer (including cholangiocarcinoma and gallbladder or ampullary carcinoma) with documented radiological disease progression on first-line cisplatin and gemcitabine chemotherapy and an Eastern Cooperative Oncology Group performance status of 0–1 were randomly assigned (1:1) centrally to active symptom control (ASC) and FOLFOX or ASC alone. FOLFOX chemotherapy was administered intravenously every 2 weeks for a maximum of 12 cycles (oxaliplatin 85 mg/m2, L-folinic acid 175 mg or folinic acid 350 mg, fluorouracil 400 mg/m2 bolus, and fluorouracil 2400 mg/m2 as a 46-h continuous intravenous infusion). Randomisation was done following a minimisation algorithm using platinum sensitivity, serum albumin concentration, and stage as stratification factors. The primary endpoint was overall survival, assessed in the intention-to-treat population. Safety was also assessed in the intention-to-treat population. The study is complete and the final results are reported. This trial is registered with ClinicalTrials.gov, NCT01926236, and EudraCT, 2013-001812-30.
Between March 27, 2014, and Jan 4, 2018, 162 patients were enrolled and randomly assigned to ASC plus FOLFOX (n=81) or ASC alone (n=81). Median follow-up was 21·7 months (IQR 17·2–30·8). Overall survival was significantly longer in the ASC plus FOLFOX group than in the ASC alone group, with a median overall survival of 6·2 months (95% CI 5·4–7·6) in the ASC plus FOLFOX group versus 5·3 months (4·1–5·8) in the ASC alone group (adjusted hazard ratio 0·69, 95% CI 0·50–0·97; p=0·031). The overall survival rate in the ASC alone group was 35·5% (95% CI 25·2–46·0) at 6 months and 11·4% (5·6–19·5) at 12 months, compared with 50·6% (39·3–60·9) at 6 months and 25·9% (17·0–35·8) at 12 months in the ASC plus FOLFOX group. Grade 3–5 adverse events were reported in 42 (52%) of 81 patients in the ASC alone group and 56 (69%) of 81 patients in the ASC plus FOLFOX group, including three chemotherapy-related deaths (one each due to infection, acute kidney injury, and febrile neutropenia). The most frequently reported grade 3–5 FOLFOX-related adverse events were neutropenia (ten [12%] patients), fatigue or lethargy (nine [11%] patients), and infection (eight [10%] patients).
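The median survival times and the 6- and 12-month survival rates reported above are Kaplan-Meier estimates. As an illustration only (the follow-up times below are hypothetical, not trial data), a minimal from-scratch sketch of the estimator:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times: follow-up time for each patient.
    events: 1 = death observed, 0 = censored at that time.
    Returns a list of (time, survival_probability) steps at event times."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Group all patients who leave the risk set at time t.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

def survival_at(curve, t):
    """Read the step-function survival probability at time t."""
    s = 1.0
    for time, prob in curve:
        if time <= t:
            s = prob
        else:
            break
    return s

# Hypothetical follow-up (months) and event indicators:
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

Censored patients (events = 0) reduce the risk set without contributing a death, which is why the estimator differs from a naive fraction-surviving calculation.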
The addition of FOLFOX to ASC improved median overall survival in patients with advanced biliary tract cancer after progression on cisplatin and gemcitabine, with a clinically meaningful increase in 6-month and 12-month overall survival rates. To our knowledge, this trial is the first prospective, randomised study providing reliable, high-quality evidence to allow an informed discussion with patients of the potential benefits and risks from second-line FOLFOX chemotherapy in advanced biliary tract cancer. Based on these findings, FOLFOX should become standard-of-care chemotherapy in second-line treatment for advanced biliary tract cancer and the reference regimen for further clinical trials.
Cancer Research UK, StandUpToCancer, AMMF (The UK Cholangiocarcinoma Charity), and The Christie Charity, with additional funding from The Cholangiocarcinoma Foundation and the Conquer Cancer Foundation Young Investigator Award for translational research.
Irrigation’s effects on precipitation during an exceptionally dry summer (June–August 2012) in the United States were quantified by incorporating a novel dynamic irrigation scheme into the Weather Research and Forecasting (WRF) Model. The scheme is designed to represent a typical application strategy for farmlands across the conterminous United States (CONUS) and a satellite-derived irrigation map was incorporated into the WRF-Noah-Mosaic module to realistically trigger the irrigation. Results show that this new irrigation approach can dynamically generate irrigation water amounts that are in close agreement with the actual irrigation water amounts across the high plains (HP), where the prescribed scheme best matches real-world irrigation practices. Surface energy and water budgets have been substantially altered by irrigation, leading to modified large-scale atmospheric circulations. In the studied dry summer, irrigation was found to strengthen the dominant interior high pressure system over the southern and central United States and deepen the trough over the upper Midwest. For the HP and central United States, the rainfall amount is slightly reduced over irrigated areas, likely as a result of a reduction in both local convection and large-scale moisture convergence resulting from interactions and feedbacks between the land surface and atmosphere. In areas downwind of heavily irrigated regions, precipitation is enhanced, resulting in a 20%–100% reduction in the dry biases (relative to the observations) simulated over a large portion of the downwind areas without irrigation in the model. The introduction of irrigation reduces the overall mean biases and root-mean-square errors in the simulated daily precipitation over the CONUS.
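Dynamic irrigation schemes of the kind described above are typically triggered by a soil-moisture deficit. The following sketch illustrates the general idea only; the function name, parameters, and threshold are hypothetical and do not reproduce the actual WRF-Noah-Mosaic implementation:

```python
def irrigation_demand(soil_moisture, field_capacity, wilting_point,
                      trigger_frac=0.5):
    """Hypothetical soil-moisture-deficit irrigation trigger.
    All arguments are volumetric soil-moisture fractions (m3/m3).
    Irrigate when plant-available water drops below trigger_frac of
    its maximum, applying enough water to refill to field capacity."""
    available = (soil_moisture - wilting_point) / (field_capacity - wilting_point)
    if available < trigger_frac:
        return field_capacity - soil_moisture  # water to apply
    return 0.0
```

Applying such a rule only within satellite-mapped irrigated pixels, as the abstract describes, is what lets the model generate water amounts dynamically instead of prescribing them.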
There has been an increase in tile drained area across the US Midwest and other regions worldwide due to agricultural expansion, intensification, and climate variability. Despite this growth, spatially explicit tile drainage maps remain scarce, which limits the accuracy of hydrologic modeling and implementation of nutrient reduction strategies. Here, we developed a machine-learning model to provide a Spatially Explicit Estimate of Tile Drainage (SEETileDrain) across the US Midwest in 2017 at a 30-m resolution. This model used 31 satellite-derived and environmental features after removing less important and highly correlated features. It was trained with 60,938 tile and non-tile ground truth points within the Google Earth Engine cloud-computing platform. We also used multiple feature importance metrics and Accumulated Local Effects to interpret the machine learning model. The results show that our model achieved good accuracy, with 96% of points classified correctly and an F1 score of 0.90. When tile drainage area is aggregated to the county scale, it agreed well (r2 = 0.69) with the reported area from the Ag Census. We found that Land Surface Temperature (LST) along with climate- and soil-related features were the most important factors for classification. The top-ranked feature is the median summer nighttime LST, followed by median summer soil moisture percent. This study demonstrates the potential of applying satellite remote sensing to map spatially explicit agricultural tile drainage across large regions. The results should be useful for land use change monitoring and hydrologic and nutrient models, including those designed to achieve cost-effective agricultural water and nutrient management strategies. The algorithms developed here should also be applicable for other remote sensing mapping applications.
•A 30-m resolution spatially explicit tile drainage map was created for the US Midwest through Google Earth Engine services.
•The map has an overall accuracy of 96%, precision of 85%, and recall of 96%.
•Estimated tile drainage presence correlates with land surface temperature, soil, and weather.
•A novel measure is proposed to identify feature importance in random forest classification.
•The proposed method can be applied to map tile drainage areas across years and regions.
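The reported F1 score of 0.90 is consistent with the stated precision (85%) and recall (96%), since F1 is their harmonic mean. A minimal sketch of these metrics; the confusion-matrix counts in the usage example are hypothetical:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from binary confusion-matrix counts:
    tp = true positives, fp = false positives, fn = false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def f1_from_pr(precision, recall):
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts chosen to match the reported precision and recall:
p, r, f = precision_recall_f1(tp=96, fp=17, fn=4)
```

Because F1 is a harmonic mean, it is pulled toward the weaker of the two components, which is why it sits closer to the 85% precision than to the 96% recall here.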
Understanding how irrigated areas change over time is vital to effectively manage limited agricultural water resources, but long-term, high-resolution, and spatially explicit datasets are rare. The High Plains Aquifer (HPA) in the central United States is one of the largest and most stressed aquifer systems in the world. It supports a $20 billion economy, but groundwater use is unsustainable over much of the aquifer. Emerging cloud computing tools like Google Earth Engine (GEE) make it possible to use the full Landsat record to monitor regional systems like the HPA with high spatial and temporal resolution over multiple decades. However, challenges remain to develop irrigation classification methods that are robust to a wide range of climate conditions and crop types, evolving management, and missing data. Here, we addressed these challenges to produce an annual, moderately high resolution (30 m) irrigation map time series from 1984 to 2017 over the aquifer. Leveraging GEE's extensive data catalog, we combined Landsat imagery, environmental covariables, and a large heterogeneous ground truth dataset to create a single random forest classifier applied annually to the entire region. Following classification, we applied the Bayesian Updating of Land-Cover (BULC) algorithm to fill imagery gaps and reduce commission errors in the provisional irrigation time series. Novel neighborhood greenness indices contributed to an overall 91.4% map accuracy across years; county statistics (r2 = 0.86) were similarly well-matched. Trend analysis of irrigated area through time identified regions of stable, expanding, and declining irrigated area. Given declining aquifer storage, we estimate that up to 24% of currently irrigated area may be lost this century. To date, the map dataset is the longest, highest resolution large-scale record of where and when irrigation occurs.
It is freely available for stakeholders, managers, and researchers to inform policies and management decisions, as well as for use in hydrology, agronomy, and climate models.
•Stressed systems like the High Plains Aquifer need irrigation data for management.
•Google Earth Engine enables annual tracking of irrigated area over multiple decades.
•Data gaps in the early Landsat record can be addressed using Bayesian statistics.
•New neighborhood greenness indices contributed to 91% map accuracy across 34 years.
•This is the longest, highest resolution large-scale record of irrigation to date.
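The Bayesian Updating of Land-Cover (BULC) step mentioned above amounts to treating each year's provisional classification as evidence and updating per-pixel class probabilities via Bayes' rule, with likelihoods drawn from the classifier's confusion matrix. A simplified single-pixel sketch; the class names, probabilities, and confusion values are hypothetical, and the published BULC algorithm involves additional bookkeeping:

```python
def bulc_update(prior, observed_class, likelihood):
    """One simplified Bayesian land-cover update for a single pixel.
    prior: dict mapping class -> current probability.
    observed_class: label produced by this year's provisional classifier.
    likelihood: dict mapping (true_class, observed_class) ->
                P(observed_class | true_class), e.g. from a confusion matrix.
    Returns the normalized posterior class probabilities."""
    posterior = {c: prior[c] * likelihood[(c, observed_class)] for c in prior}
    z = sum(posterior.values())
    return {c: p / z for c, p in posterior.items()}

# Hypothetical two-class example: a pixel starts at 50/50 and the
# classifier (90% sensitive, 20% false-positive) flags it as irrigated.
prior = {"irrigated": 0.5, "non_irrigated": 0.5}
likelihood = {
    ("irrigated", "irrigated"): 0.9,
    ("irrigated", "non_irrigated"): 0.1,
    ("non_irrigated", "irrigated"): 0.2,
    ("non_irrigated", "non_irrigated"): 0.8,
}
posterior = bulc_update(prior, "irrigated", likelihood)
```

Repeating this update year after year is what lets noisy annual classifications converge on a stable label, and a year with missing imagery simply contributes no update, which is how gaps in the early Landsat record are tolerated.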
Monitoring and controlling 2 language systems is fundamental to language use in bilinguals. Here, we reveal in a combined functional (event-related functional magnetic resonance imaging) and structural neuroimaging (voxel-based morphometry) study that dorsal anterior cingulate cortex (ACC), a structure tightly bound to domain-general executive control functions, is a common locus for language control and resolving nonverbal conflict. We also show an experience-dependent effect in the same region: Bilinguals use this structure more efficiently than monolinguals to monitor nonlinguistic cognitive conflicts. They adapted better to conflicting situations showing less ACC activity while outperforming monolinguals. Importantly, for bilinguals, brain activity in the ACC, as well as behavioral measures, also correlated positively with local gray matter volume. These results suggest that early learning and lifelong practice of 2 languages exert a strong impact upon human neocortical development. The bilingual brain adapts better to resolve cognitive conflicts in domain-general cognitive tasks.
Electroencephalography (EEG) holds promise as a neuroimaging technology that can be used to understand how the human brain functions in real-world, operational settings while individuals move freely in perceptually-rich environments. In recent years, several EEG systems have been developed that aim to increase the usability of the neuroimaging technology in real-world settings. Here, the usability of three wireless EEG systems from different companies are compared to a conventional wired EEG system, BioSemi's ActiveTwo, which serves as an established laboratory-grade 'gold standard' baseline. The wireless systems compared include Advanced Brain Monitoring's B-Alert X10, Emotiv Systems' EPOC and the 2009 version of QUASAR's Dry Sensor Interface 10-20. The design of each wireless system is discussed in relation to its impact on the system's usability as a potential real-world neuroimaging system. Evaluations are based on having participants complete a series of cognitive tasks while wearing each of the EEG acquisition systems. This report focuses on the system design, usability factors and participant comfort issues that arise during the experimental sessions. In particular, the EEG systems are assessed on five design elements: adaptability of the system for differing head sizes, subject comfort and preference, variance in scalp locations for the recording electrodes, stability of the electrical connection between the scalp and electrode, and timing integration between the EEG system, the stimulus presentation computer and other external events.
Background
Management algorithms for adult severe traumatic brain injury (sTBI) were omitted in later editions of the Brain Trauma Foundation’s sTBI Management Guidelines, as they were not evidence-based.
Methods
We used a Delphi-method-based consensus approach to address management of sTBI patients undergoing intracranial pressure (ICP) monitoring. Forty-two experienced, clinically active sTBI specialists from six continents comprised the panel. Eight surveys iterated queries and comments. An in-person meeting included whole- and small-group discussions and blinded voting. Consensus required 80% agreement. We developed heatmaps based on a traffic-light model where panelists’ decision tendencies were the focus of recommendations.
Results
We provide comprehensive algorithms for ICP-monitor-based adult sTBI management. Consensus established 18 interventions as fundamental and ten treatments not to be used. We provide a three-tier algorithm for treating elevated ICP. Treatments within a tier are considered empirically equivalent. Higher tiers involve higher risk therapies. Tiers 1, 2, and 3 include 10, 4, and 3 interventions, respectively. We include inter-tier considerations, and recommendations for critical neuroworsening to assist the recognition and treatment of declining patients. Novel elements include guidance for autoregulation-based ICP treatment based on MAP Challenge results, and two heatmaps to guide (1) ICP-monitor removal and (2) consideration of sedation holidays for neurological examination.
Conclusions
Our modern and comprehensive sTBI-management protocol is designed to assist clinicians managing sTBI patients monitored with ICP monitors alone. Consensus-based (class III evidence), it provides management recommendations based on combined expert opinion. It reflects neither a standard of care nor a substitute for thoughtful individualized management.
Serious medication errors are common in hospitals and often occur during order transcription or administration of medication. To help prevent such errors, technology has been developed to verify medications by incorporating bar-code verification technology within an electronic medication-administration system (bar-code eMAR).
We conducted a before-and-after, quasi-experimental study in an academic medical center that was implementing the bar-code eMAR. We assessed rates of errors in order transcription and medication administration on units before and after implementation of the bar-code eMAR. Errors that involved early or late administration of medications were classified as timing errors and all others as nontiming errors. Two clinicians reviewed the errors to determine their potential to harm patients and classified those that could be harmful as potential adverse drug events.
We observed 14,041 medication administrations and reviewed 3082 order transcriptions. Observers noted 776 nontiming errors in medication administration on units that did not use the bar-code eMAR (an 11.5% error rate) versus 495 such errors on units that did use it (a 6.8% error rate), a 41.4% relative reduction in errors (P<0.001). The rate of potential adverse drug events (other than those associated with timing errors) fell from 3.1% without the use of the bar-code eMAR to 1.6% with its use, representing a 50.8% relative reduction (P<0.001). The rate of timing errors in medication administration fell by 27.3% (P<0.001), but the rate of potential adverse drug events associated with timing errors did not change significantly. Transcription errors occurred at a rate of 6.1% on units that did not use the bar-code eMAR but were completely eliminated on units that did use it.
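The relative reductions quoted above are of the form (rate before − rate after) / rate before. Recomputing them from the rounded rates gives slightly different values (about 41% and 48%) than the published 41.4% and 50.8%, presumably because the published figures were derived from the unrounded underlying counts. A sketch of the calculation:

```python
def relative_reduction(rate_before, rate_after):
    """Relative reduction of an error rate, as a fraction of the baseline."""
    return (rate_before - rate_after) / rate_before

# Using the rounded rates quoted in the abstract:
nontiming = relative_reduction(0.115, 0.068)  # nontiming administration errors
pade = relative_reduction(0.031, 0.016)       # potential adverse drug events
```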
Use of the bar-code eMAR substantially reduced the rate of errors in order transcription and in medication administration as well as potential adverse drug events, although it did not eliminate such errors. Our data show that the bar-code eMAR is an important intervention to improve medication safety. (ClinicalTrials.gov number, NCT00243373.)