The hippocampus in schizophrenia is characterized by both hypermetabolism and reduced size. It remains unknown whether these abnormalities are mechanistically linked. Here we addressed this question by using MRI tools that can map hippocampal metabolism and structure in patients and mouse models. In at-risk patients, hypermetabolism was found to begin in CA1 and spread to the subiculum after psychosis onset. CA1 hypermetabolism at baseline predicted hippocampal atrophy, which occurred during progression to psychosis, most prominently in similar regions. Next, we used ketamine to model conditions of acute psychosis in mice. Acute ketamine reproduced a similar regional pattern of hypermetabolism, while repeated exposure shifted the hippocampus to a hypermetabolic basal state with concurrent atrophy and pathology in parvalbumin-expressing interneurons. Parallel in vivo experiments using the glutamate-reducing drug LY379268 and direct measurements of extracellular glutamate showed that glutamate drives both neuroimaging abnormalities. These findings show that hippocampal hypermetabolism leads to atrophy in psychotic disorder and suggest glutamate as a pathogenic driver.
► In psychotic disorder and a mouse model, hippocampal hypermetabolism predicts atrophy
► The longitudinal patterns of hypermetabolism and atrophy overlap and spread
► Excess extracellular glutamate drives hypermetabolism and leads to hippocampal interneuronal pathology and atrophy
► Regulating glutamate release in the model prevents hypermetabolism and atrophy
Schobel et al. use functional and structural MRI in individuals at high risk for psychosis to show that hippocampal hypermetabolism predicts hippocampal atrophy across progression to first episode psychosis. A rodent model shows that alterations in extracellular glutamate may contribute to these abnormalities.
To survive, cancer cells must resist numerous internal and environmental insults associated with neoplasia that jeopardize proteostasis within the endoplasmic reticulum (ER). Solid and hematopoietic tumors often experience genomic instability, oncogene activation, increased protein secretion demands, and somatic mutations in proteins handled by the secretory pathway that impede their folding. Invasion or metastasis into foreign environments can expose tumor cells to hypoxia, oxidative stress, lack of growth signals, inadequate amino acid supplies, glucose deprivation, and lactic acidosis, all of which pose challenges for protein processing in the ER. Together, these conditions can promote the buildup of misfolded proteins in the ER to cause ER stress, which then activates the unfolded protein response (UPR). An intracellular signaling network largely initiated by three ER transmembrane proteins, the UPR constantly surveils protein folding conditions within the ER lumen and when necessary initiates counteractive measures to maintain ER homeostasis. Under mild or moderate levels of ER stress, the homeostatic UPR sets in motion transcriptional and translational changes that promote cell adaptation and survival. However, if these processes are unsuccessful at resolving ER stress, a terminal UPR program dominates and actively signals cell suicide. This article summarizes the mounting evidence that cancer cells are predisposed to ER stress and vulnerable to targeted interventions against ongoing UPR signaling.
Most estimates of global mean sea-level rise this century fall below 2 m. This quantity is comparable to the positive vertical bias of the principal digital elevation model (DEM) used to assess global and national population exposures to extreme coastal water levels, NASA's SRTM. CoastalDEM is a new DEM utilizing neural networks to reduce SRTM error. Here we show, employing CoastalDEM, that 190 M people (150-250 M, 90% CI) currently occupy global land below projected high tide lines for 2100 under low carbon emissions, up from 110 M today, for a median increase of 80 M. These figures triple SRTM-based values. Under high emissions, CoastalDEM indicates up to 630 M people live on land below projected annual flood levels for 2100, and up to 340 M for mid-century, versus roughly 250 M at present. We estimate one billion people now occupy land less than 10 m above current high tide lines, including 250 M below 1 m.
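Conceptually, the exposure estimate rests on a simple masking operation: count the population on grid cells whose DEM elevation lies below a projected water level. A minimal sketch using hypothetical NumPy arrays is given below; the study's actual pipeline, CoastalDEM's error correction, and its confidence-interval propagation are far more involved.

```python
import numpy as np

def population_below(elevation_m, population, water_level_m):
    """Sum the population on grid cells whose elevation lies below a
    projected water level. Inputs are hypothetical co-registered 2D grids."""
    exposed = elevation_m < water_level_m   # boolean exposure mask
    return population[exposed].sum()

# Illustrative toy grid (not real data): elevations in metres, people per cell
elevation_m = np.array([[0.5, 1.2],
                        [3.0, 0.8]])
population  = np.array([[1000,  500],
                        [2000, 1500]])

print(population_below(elevation_m, population, water_level_m=1.0))  # 2500
```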
An optical timekeeper
Optical clocks, based on optical transitions of atoms, operate at much higher frequency than the microwave atomic clocks currently used as our timing standards. They have been shown to exhibit better stability and are poised to redefine the second. The development of stable, self-referenced optical frequency combs that span the microwave to optical wavelengths has been key to these efforts. Diddams et al. reviewed developments and refinements of these optical combs over the past 20 years and provide an overview of where they are finding application, from precision timing to high-resolution spectroscopy and imaging, ranging, and navigation.
Science, this issue p. eaay3676
BACKGROUND
The generation and control of coherent electromagnetic waves, such as those provided by electronic oscillators in the radio frequency domain or lasers in the optical domain, have had an unparalleled impact on human society over the past century. For example, precise timing with radio waves referenced to atomic transitions undergirds navigation with the Global Positioning System. And modern communication systems are built around the properties of such waves to carry data through the air or within optical fibers that circumnavigate the globe. Such technical advances rely upon an exceptionally well developed and unified theoretical understanding of radio and optical waves, as well as the devices for their generation and control. Nonetheless and surprisingly, just 20 years ago, radio and optical technology realms remained largely distinct and isolated from one another. Although light waves could be modulated at radio rates and likewise electrical currents could be produced by demodulating optical signals, a simple coherent connection between radio and optical fields did not exist. As a result, many common technologies for the synthesis and control of radio frequencies, including those central to navigation, communications, and measurement, seemed futuristic for optics. Conversely, the full potential of optics for time standards, metrology, and science was not accessible, despite a decades-long recognition of the opportunities and attempts to harness their technological impact. This situation, which arose as a consequence of the enormously high frequencies of electromagnetic waves in the optical domain, thereby limited scientific progress and technical capabilities.
ADVANCES
The invention of the laser in 1960 represented an optical analog to radio oscillators, which were invented much earlier. This development motivated efforts to create a coherent bridge between the radio and optical realms with multiple oscillators of successively higher frequencies being chained together. However, the sheer complexity and size of such approaches made it clear that these systems would never be widely available and that their capability would be limited. Such efforts were upended after nearly four decades of work when an unanticipated breakthrough enabled by combined advances in femtosecond laser technology, nonlinear optics, and precision frequency metrology finally solved this problem. The key to overcoming these issues was a new approach to generate and control the spectrum of a mode-locked laser, which was called an optical frequency comb in accordance with the regularly spaced comb of frequencies it contained. Even though mode-locked lasers and optical frequency combs had existed previously, in 2000 it was demonstrated how their spectra could be expanded over an octave of optical bandwidth. This critical advance enabled the technique of self-referencing, by which the optical frequencies of the comb are locked in an absolute sense to a radio frequency reference. In a simple and elegant way, this produced an optical clockwork, analogous to a gearbox (see the figure), that provided the bidirectional coherent connection between the optical and radio frequency domains. Moreover, because this connection is simultaneously established to many thousands of optical frequencies within the comb, self-referencing makes the coherent translation of frequencies possible across hundreds of terahertz of the optical spectrum.
OUTLOOK
In the two decades since the introduction of the frequency comb, entirely new scientific and technology vistas have been opened. New applications are leveraging an unprecedented sharing of the respective strengths of electronics and photonics, as well as a new freedom to work seamlessly across the broad optical frequency expanse. Frequency combs are used to realize and compare ultraprecise optical clocks at the 19th decimal place, which provides powerful approaches to test relativity and quantum theory as well as the search for physics beyond our present understanding. In addition, combs transfer this exceptional precision across the spectrum to perform massively parallel spectroscopy, generate the lowest-noise microwaves and few-cycle attosecond waveforms, and even help astronomers search for Earth-like exoplanets. Just as applications of frequency combs have expanded, there are new developments in frequency comb technology. Among these are combs that do not use mode locking for comb formation, as well as new approaches built on integrated photonics and microresonators that enable frequency combs on a silicon chip. Indeed, it is now clear that the once-enormous gap between optics and electronics will be further blurred as the coherent electromagnetic spectrum is united by frequency combs.
An optical clockwork.
A frequency comb is a type of laser that functions in a manner analogous to a set of gears, as illustrated here, to link radio frequencies to a vast array of optical frequencies ("the comb") with values precisely determined by the expression f_n = n·f_r + f_0, where f_r and f_0 are radio frequencies and n is an integer on the order of 10^5.
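As a hedged numerical illustration of this expression, with f_r and f_0 set to typical orders of magnitude rather than values taken from the paper, the optical frequency of a single comb line can be computed directly:

```python
# Illustration of the comb equation f_n = n * f_r + f_0.
# f_r and f_0 are typical orders of magnitude, not values from the paper.
f_r = 1.0e9      # repetition rate, ~1 GHz (radio frequency)
f_0 = 0.25e9     # carrier-envelope offset frequency, ~250 MHz (radio frequency)
n   = 385_000    # comb-line index, on the order of 10^5

f_n = n * f_r + f_0              # optical frequency of the n-th comb line, in Hz
print(f"{f_n / 1e12:.3f} THz")   # ~385 THz, i.e. near-infrared light (~780 nm)
```

The index n is what supplies the roughly 10^5 upconversion factor from radio to optical frequencies noted in the abstract below.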
Optical frequency combs were introduced around 20 years ago as a laser technology that could synthesize and count the ultrafast rate of the oscillating cycles of light. Functioning in a manner analogous to a clockwork of gears, the frequency comb phase-coherently upconverts a radio frequency signal by a factor of ≈10^5 to provide a vast array of evenly spaced optical frequencies, which is the comb for which the device is named. It also divides an optical frequency down to a radio frequency, or translates its phase to any other optical frequency across hundreds of terahertz of bandwidth. We review the historical backdrop against which this powerful tool for coherently uniting the electromagnetic spectrum developed. Advances in frequency comb functionality, physical implementation, and application are also described.
Synthetic phenolic antioxidants (SPAs) are widely used in various industrial and commercial products to retard oxidative reactions and lengthen product shelf life. In recent years, numerous studies have been conducted on the environmental occurrence, human exposure, and toxicity of SPAs. Here, we summarize the current understanding of these issues and provide recommendations for future research directions. SPAs have been detected in various environmental matrices including indoor dust, outdoor air particulates, sea sediment, and river water. Recent studies have also observed the occurrence of SPAs, such as 2,6-di-tert-butyl-4-methylphenol (BHT) and 2,4-di-tert-butyl-phenol (DBP), in humans (fat tissues, serum, urine, breast milk, and fingernails). In addition to these parent compounds, some transformation products have also been detected both in the environment and in humans. Human exposure pathways include food intake, dust ingestion, and use of personal care products. For breastfeeding infants, breast milk may be an important exposure pathway. Toxicity studies suggest some SPAs may cause hepatic toxicity, have endocrine disrupting effects, or even be carcinogenic. The toxicity effects of some transformation products are likely worse than those of the parent compound. For example, 2,6-di-tert-butyl-p-benzoquinone (BHT-Q) can cause DNA damage at low concentrations. Future studies should investigate the contamination and environmental behaviors of novel high molecular weight SPAs, toxicity effects of coexposure to several SPAs, and toxicity effects on infants. Future studies should also develop novel SPAs with low toxicity and low migration ability, decreasing the potential for environmental pollution.
Summary
Background
Nivolumab has shown improved survival in the treatment of advanced non-small-cell lung cancer (NSCLC) previously treated with chemotherapy. We assessed the safety and activity of combination nivolumab plus ipilimumab as first-line therapy for NSCLC.
Methods
The open-label, phase 1, multicohort study (CheckMate 012) cohorts reported here were enrolled at eight US academic centres. Eligible patients were aged 18 years or older with histologically or cytologically confirmed recurrent stage IIIb or stage IV, chemotherapy-naive NSCLC. Patients were randomly assigned (1:1:1) by an interactive voice response system to receive nivolumab 1 mg/kg every 2 weeks plus ipilimumab 1 mg/kg every 6 weeks, nivolumab 3 mg/kg every 2 weeks plus ipilimumab 1 mg/kg every 12 weeks, or nivolumab 3 mg/kg every 2 weeks plus ipilimumab 1 mg/kg every 6 weeks until disease progression, unacceptable toxicities, or withdrawal of consent. Data from the latter two cohorts, which were considered potentially suitable for further clinical development, are presented in this report; data from the other cohort (as well as several earlier cohorts) are described in the appendix. The primary outcome was safety and tolerability, assessed in all treated patients. This ongoing study is registered with ClinicalTrials.gov, number NCT01454102.
Findings
Between May 15, 2014, and March 25, 2015, 78 patients were randomly assigned to receive nivolumab every 2 weeks plus ipilimumab every 12 weeks (n=38) or nivolumab every 2 weeks plus ipilimumab every 6 weeks (n=40). One patient in the ipilimumab every-6-weeks cohort was excluded before treatment; therefore 77 patients actually received treatment (38 in the ipilimumab every-12-weeks cohort; 39 in the ipilimumab every-6-weeks cohort). At data cut-off on Jan 7, 2016, 29 (76%) patients in the ipilimumab every-12-weeks cohort and 32 (82%) in the ipilimumab every-6-weeks cohort had discontinued treatment. Grade 3–4 treatment-related adverse events occurred in 14 (37%) patients in the ipilimumab every-12-weeks cohort and 13 (33%) patients in the every-6-weeks cohort; the most commonly reported grade 3 or 4 treatment-related adverse events were increased lipase (three [8%] and no patients), pneumonitis (two [5%] and one [3%] patients), adrenal insufficiency (one [3%] and two [5%] patients), and colitis (one [3%] and two [5%] patients). Treatment-related serious adverse events were reported in 12 (32%) patients in the ipilimumab every-12-weeks cohort and 11 (28%) patients in the every-6-weeks cohort. Treatment-related adverse events (any grade) prompted treatment discontinuation in four (11%) patients in the every-12-weeks cohort and five (13%) patients in the every-6-weeks cohort. No treatment-related deaths occurred. Confirmed objective responses were achieved in 18 (47%; 95% CI 31–64) patients in the ipilimumab every-12-weeks cohort and 15 (38%; 95% CI 23–55) patients in the ipilimumab every-6-weeks cohort; median duration of response was not reached in either cohort, with median follow-up times of 12·8 months (IQR 9·3–15·5) in the ipilimumab every-12-weeks cohort and 11·8 months (6·7–15·9) in the ipilimumab every-6-weeks cohort. In patients with PD-L1 of 1% or greater, confirmed objective responses were achieved in 12 (57%) of 21 patients in the ipilimumab every-12-weeks cohort and 13 (57%) of 23 patients in the ipilimumab every-6-weeks cohort.
Interpretation
In NSCLC, first-line nivolumab plus ipilimumab had a tolerable safety profile and showed encouraging clinical activity characterised by a high response rate and durable response. To our knowledge, the results of this study are the first suggestion of improved benefit compared with anti-PD-1 monotherapy in patients with NSCLC, supporting further assessment of this combination in a phase 3 study.
Funding
Bristol-Myers Squibb.
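The reported response-rate confidence intervals are consistent with an exact binomial (Clopper-Pearson) interval, although the trial's actual statistical method is not stated in this summary. A minimal sketch, assuming that method, reproduces the every-12-weeks figure (18 confirmed responses among 38 treated patients):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided confidence interval for a binomial proportion."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# 18 confirmed responses among 38 treated patients (every-12-weeks cohort)
lo, hi = clopper_pearson(18, 38)
print(f"{18/38:.0%} (95% CI {lo:.0%}-{hi:.0%})")  # ~47% (95% CI ~31%-64%)
```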
Machine learning has demonstrated potential in analyzing large, complex biological data. In practice, however, biological information is required in addition to machine learning for successful application.
As aircraft have become more reliable, humans have played a progressively more important causal role in aviation accidents. Consequently, a growing number of aviation organizations are tasking their safety personnel with developing accident investigation and other safety programs to address the highly complex and often nebulous issue of human error. Yet, many safety professionals are ill-equipped to perform these new duties.
The purpose of the present book is to remedy this situation by presenting a comprehensive, user-friendly framework to assist practitioners in effectively investigating and analyzing human error in aviation. Coined the Human Factors Analysis and Classification System (HFACS), the framework is based on James Reason's (1990) well-known "Swiss cheese" model of accident causation. In essence, HFACS bridges the gap between theory and practice in a way that helps improve both the quantity and quality of information gathered in aviation accidents and incidents.
The HFACS framework was originally developed for, and subsequently adopted by, the U.S. Navy/Marine Corps as an accident investigation and data analysis tool. The U.S. Army, Air Force, and Coast Guard, as well as other military and civilian aviation organizations around the world, are also currently using HFACS to supplement their preexisting accident investigation systems. In addition, HFACS has been taught to literally thousands of students and safety professionals through workshops and courses offered at professional meetings and universities. Indeed, HFACS is now relatively well known within many sectors of aviation, and an increasing number of organizations worldwide are interested in exploring its usage. Consequently, the authors currently receive numerous requests for more information about the system, often on what seems to be a daily basis.
The Mayo Clinic recently introduced a diagnostic test that quantifies plasma ceramides in order to identify patients at risk of major adverse cardiac events. By comparing recent discoveries about these biomarker ceramides with the exhaustive body of literature surrounding cholesterol, Summers aims to highlight important advances and critically needed areas of investigation on this exciting class of bioactive lipids.
Scott Summers evaluates the current status of research on ceramides, a lipid implicated in cardiometabolic disorders that shows intriguing parallels with cholesterol.
• Omnichannel marketing is a continuum of strategies.
• The continuum unifies the customer journey and channel choice.
• Omnichannel strategy is vertical (integrate journey) and horizontal (integrate channels).
• Consumer behavior and marketing determine where a firm should position along the continuum.
• Ten consumer and marketing determinants of omnichannel strategy have strong empirical support.
This paper provides a framework for conceptualizing omnichannel integration as a continuum, identifies phenomena that determine how firms should position along that continuum, and summarizes empirical research regarding these phenomena. The framework combines the customer journey (search to purchase to aftersales) and channel choice (online vs. offline). This generates a range of omnichannel strategies, anchored by “Unconnected” on one extreme and “Complete” on the other. In between, “Vertical” strategies integrate channels over the customer journey, while “Horizontal” strategies integrate across channels at a given stage in the customer journey. We draw on more than 200 articles to identify 10 consumer and marketing phenomena (“determinants”) that influence where a firm should position along the continuum. This, however, raises challenges. For example, empirical research surprisingly finds many customers belong to an offline-focused segment. This suggests a Vertical strategy linking offline channels. However, today's turbulent retail environment questions whether the offline-focused segment will endure. Should the retailer cater to offline-focused customers or facilitate their progression to “multichannelism”? Another finding is that consumers strongly prefer consistency across channels. This suggests a Horizontal strategy. However, consistency might create channel cannibalization. How can the retailer avoid this? We discuss these and several other findings regarding the impact of the 10 determinants on omnichannel continuum strategy. We identify issues researchers need to investigate and managers need to consider when developing omnichannel continuum strategy.
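To make the continuum concrete, a retailer's integration choices can be represented as a small journey-stage by channel structure and then classified. The sketch below is an illustrative assumption of my own; the stage names and classification rules are not the authors' operationalization.

```python
# Hypothetical sketch: classify a retailer's omnichannel position from a
# journey-stage x channel integration map. Names and rules are illustrative.
STAGES = ["search", "purchase", "aftersales"]
CHANNELS = ["online", "offline"]

def classify(integrated):
    """integrated[(stage, channel)] is True if that touchpoint is linked
    with the rest of the retailer's system."""
    # Vertical: at least one channel is integrated across the whole journey
    vertical = any(all(integrated[(s, c)] for s in STAGES) for c in CHANNELS)
    # Horizontal: both channels are integrated at some stage of the journey
    horizontal = any(all(integrated[(s, c)] for c in CHANNELS) for s in STAGES)
    if vertical and horizontal and all(integrated.values()):
        return "Complete"
    if vertical:
        return "Vertical"
    if horizontal:
        return "Horizontal"
    return "Unconnected"

# Example: offline journey fully linked, online touchpoints not linked
example = {(s, c): (c == "offline") for s in STAGES for c in CHANNELS}
print(classify(example))  # "Vertical"
```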