We introduce Sailfish, a computational method for quantifying the abundance of previously annotated RNA isoforms from RNA-seq data. Because Sailfish entirely avoids mapping reads, a time-consuming step in all current methods, it provides quantification estimates much faster than do existing approaches (typically 20 times faster) without loss of accuracy. By facilitating frequent reanalysis of data and reducing the need to optimize parameters, Sailfish exemplifies the potential of lightweight algorithms for efficiently processing sequencing reads.
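The alignment-free idea behind this abstract can be illustrated with a toy k-mer counter: reads are decomposed into k-mers and looked up in a hash index of the annotated transcripts, with no per-read alignment step. This is a minimal sketch, not Sailfish's actual algorithm (which resolves ambiguous k-mers with an EM procedure); all sequences and names here are hypothetical.

```python
from collections import Counter

def kmers(seq, k):
    """Yield all overlapping k-mers of a sequence."""
    for i in range(len(seq) - k + 1):
        yield seq[i:i + k]

def build_index(transcripts, k):
    """Map each k-mer to the set of transcripts containing it."""
    index = {}
    for name, seq in transcripts.items():
        for km in set(kmers(seq, k)):
            index.setdefault(km, set()).add(name)
    return index

def count_kmer_hits(reads, index, k):
    """Count, per transcript, how many read k-mers hash to it.
    Ambiguous k-mers are split fractionally here; real methods
    resolve them with an expectation-maximization step."""
    hits = Counter()
    for read in reads:
        for km in kmers(read, k):
            owners = index.get(km)
            if owners:
                share = 1.0 / len(owners)
                for t in owners:
                    hits[t] += share
    return hits

# Toy transcripts and reads (hypothetical)
transcripts = {"tA": "ACGTACGTTT", "tB": "TTTTGGGGCC"}
index = build_index(transcripts, k=5)
counts = count_kmer_hits(["ACGTACGT", "GGGGCC"], index, k=5)
```

Because the lookup is a hash-table probe rather than an alignment, the per-read cost is constant in transcriptome size, which is the source of the speedup the abstract reports.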
Understanding the amazingly complex human cerebral cortex requires a map (or parcellation) of its major subdivisions, known as cortical areas. Making an accurate areal map has been a century-old objective in neuroscience. Using multi-modal magnetic resonance images from the Human Connectome Project (HCP) and an objective semi-automated neuroanatomical approach, we delineated 180 areas per hemisphere bounded by sharp changes in cortical architecture, function, connectivity, and/or topography in a precisely aligned group average of 210 healthy young adults. We characterized 97 new areas and 83 areas previously reported using post-mortem microscopy or other specialized study-specific approaches. To enable automated delineation and identification of these areas in new HCP subjects and in future studies, we trained a machine-learning classifier to recognize the multi-modal 'fingerprint' of each cortical area. This classifier detected the presence of 96.6% of the cortical areas in new subjects, replicated the group parcellation, and could correctly locate areas in individuals with atypical parcellations. The freely available parcellation and classifier will enable substantially improved neuroanatomical precision for studies of the structural and functional organization of human cerebral cortex and its variation across individuals and in development, aging, and disease.
We report data on the martian meteorite Northwest Africa (NWA) 7034, which shares some petrologic and geochemical characteristics with known martian meteorites of the SNC (i.e., shergottite, nakhlite, and chassignite) group, but also has some unique characteristics that would exclude it from that group. NWA 7034 is a geochemically enriched crustal rock compositionally similar to basalts and average martian crust measured by recent Rover and Orbiter missions. It formed 2.089 ± 0.081 billion years ago, during the early Amazonian epoch in Mars' geologic history. NWA 7034 has an order of magnitude more indigenous water than most SNC meteorites, with up to 6000 parts per million extraterrestrial H2O released during stepped heating. It also has bulk oxygen isotope values of Δ¹⁷O = 0.58 ± 0.05 per mil and a heat-released water oxygen isotope average value of Δ¹⁷O = 0.330 ± 0.011 per mil, suggesting the existence of multiple oxygen reservoirs on Mars.
The overturning circulation of the global ocean is critically shaped by deep-ocean mixing, which transforms cold waters sinking at high latitudes into warmer, shallower waters. The effectiveness of mixing in driving this transformation is jointly set by two factors: the intensity of turbulence near topography and the rate at which well-mixed boundary waters are exchanged with the stratified ocean interior. Here, we use innovative observations of a major branch of the overturning circulation—an abyssal boundary current in the Southern Ocean—to identify a previously undocumented mixing mechanism, by which deep-ocean waters are efficiently laundered through intensified near-boundary turbulence and boundary–interior exchange. The linchpin of the mechanism is the generation of submesoscale dynamical instabilities by the flow of deep-ocean waters along a steep topographic boundary. As the conditions conducive to this mode of mixing are common to many abyssal boundary currents, our findings highlight an imperative for its representation in models of oceanic overturning.
Protein aggregation is a phenomenon that has attracted considerable attention within the pharmaceutical industry, both from a developability standpoint (to ensure stability of protein formulations) and from a research perspective on neurodegenerative diseases. Experimental identification of aggregation behavior in proteins can be expensive; hence, the development of accurate computational approaches is crucial. The existing methods for predicting protein aggregation rely mostly on the primary sequence and are typically trained on amyloid‐like proteins. However, the training bias toward beta amyloid peptides may worsen prediction accuracy of such models when applied to larger protein systems. Here, we present a novel algorithm to identify aggregation‐prone regions in proteins termed “AggScore” that is based entirely on three‐dimensional structure input. The method uses the distribution of hydrophobic and electrostatic patches on the surface of the protein, factoring in the intensity and relative orientation of the respective surface patches into an aggregation propensity function that has been trained on a benchmark set of 31 adnectin proteins. AggScore can accurately identify aggregation‐prone regions in several well‐studied proteins and also reliably predict changes in aggregation behavior upon residue mutation. The method is agnostic to an amyloid‐specific aggregation context and thus may be applied to globular proteins, small peptides and antibodies.
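The abstract describes a propensity function built from the intensity and relative orientation of surface patches. The toy score below illustrates that idea only; it is not the published AggScore function, and every weight and patch descriptor is invented for illustration.

```python
def patch_propensity(patches, hydro_weight=1.0, elec_weight=0.5):
    """Toy aggregation-propensity score over surface patches.
    Each patch is a dict with 'intensity' (patch strength), 'kind'
    ('hydrophobic' or 'electrostatic'), and 'normal' (a unit vector).
    Pairs of co-oriented hydrophobic patches reinforce the score,
    loosely mimicking an intensity-and-orientation term."""
    score = 0.0
    for i, p in enumerate(patches):
        weight = hydro_weight if p["kind"] == "hydrophobic" else -elec_weight
        score += weight * p["intensity"]
        for q in patches[i + 1:]:
            if p["kind"] == q["kind"] == "hydrophobic":
                # Dot product of unit normals: +1 when the patches face
                # the same way, -1 when they face opposite directions.
                align = sum(a * b for a, b in zip(p["normal"], q["normal"]))
                score += 0.25 * align * p["intensity"] * q["intensity"]
    return score

# Two co-oriented hydrophobic patches score higher than opposed ones
parallel = [{"intensity": 1.0, "kind": "hydrophobic", "normal": (0, 0, 1)},
            {"intensity": 1.0, "kind": "hydrophobic", "normal": (0, 0, 1)}]
opposed = [{"intensity": 1.0, "kind": "hydrophobic", "normal": (0, 0, 1)},
           {"intensity": 1.0, "kind": "hydrophobic", "normal": (0, 0, -1)}]
```

The design point the sketch captures is that a structure-based score can distinguish two surfaces with identical composition but different patch geometry, which a sequence-only predictor cannot.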
Cell viability, an essential measurement for cell therapy products, lacks traceability. One of the most common cell viability tests is trypan blue dye exclusion, where blue-stained cells are counted via brightfield imaging. Typically, live and dead cells are classified based on their pixel intensities, which may vary arbitrarily, making it difficult to compare results. Herein, a traceable absorbance microscopy method to determine the intracellular uptake of trypan blue is demonstrated. The intensity pixels of the brightfield images are converted to absorbance images, which are used to calculate moles of trypan blue per cell. Trypan blue cell viability measurements, where trypan blue content in each cell is quantified, enable traceable live-dead classifications. To implement the absorbance microscopy method, we developed an open-source AbsorbanceQ application that generates quantitative absorbance images. The validation of absorbance microscopy is demonstrated using neutral density filters. Results from four different microscopes demonstrate a mean absolute deviation of 3% from the expected optical density values. When assessing trypan blue-stained Jurkat cells, the difference in intracellular uptake of trypan blue in heat-shock-killed cells using two different microscopes is 3.8%. Cells killed with formaldehyde take up ~50% less trypan blue as compared to the heat-shock-killed cells, suggesting that the killing mechanism affects trypan blue uptake. In a test mixture of approximately 50% live and 50% dead cells, 53% of cells were identified as dead (±6% standard deviation). Finally, to mimic batches of low-viability cells that may be encountered during a cell manufacturing process, viability was assessed for cells that were 1) overgrown in the cell culture incubator for five days or 2) incubated in DPBS at room temperature for five days.
Instead of making live-dead classifications using arbitrary intensity values, absorbance imaging yields traceable units of moles that can be compared, which is useful for assuring quality for biomanufacturing processes.
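The intensity-to-absorbance conversion the abstract describes follows the Beer–Lambert law: each pixel's optical density is A = log10(I0/I), and integrating absorbance over a cell's pixels yields moles of dye once a molar extinction coefficient is assumed. A minimal sketch; the extinction coefficient and pixel geometry below are illustrative numbers, not measured values for trypan blue.

```python
import math

def pixel_absorbance(intensity, blank_intensity):
    """Optical density of one pixel: A = log10(I0 / I)."""
    return math.log10(blank_intensity / intensity)

def moles_in_cell(pixel_absorbances, pixel_area_cm2, epsilon_l_mol_cm):
    """Integrate absorbance over a cell's pixels (Beer-Lambert law).
    A = epsilon * c * l, so A / epsilon has units of mol/cm^2 once
    epsilon (L mol^-1 cm^-1) is converted to cm^2/mol (factor 1000).
    Multiplying by pixel area (cm^2) and summing gives moles of dye."""
    eps_cm2_per_mol = epsilon_l_mol_cm * 1000.0
    return sum(a / eps_cm2_per_mol * pixel_area_cm2
               for a in pixel_absorbances)

# Illustrative numbers: a 10-pixel cell footprint, OD 1.0 per pixel,
# and a hypothetical extinction coefficient of 6e4 L mol^-1 cm^-1
od = pixel_absorbance(0.1, 1.0)
moles = moles_in_cell([od] * 10, pixel_area_cm2=1e-8,
                      epsilon_l_mol_cm=6e4)
```

The point of the unit bookkeeping is traceability: the result is expressed in moles, a quantity comparable across instruments, rather than in instrument-specific intensity counts.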
Waldenström macroglobulinemia (WM) is preceded by asymptomatic WM (AWM), for which the risk of progression to overt disease is not well defined.
We studied 439 patients with AWM, who were diagnosed and observed at Dana-Farber Cancer Institute between 1992 and 2014.
During the 23-year study period, with a median follow-up of 7.8 years, 317 patients (72%) progressed to symptomatic WM. Immunoglobulin M 4,500 mg/dL or greater, bone marrow lymphoplasmacytic infiltration 70% or greater, β2-microglobulin 4.0 mg/dL or greater, and albumin 3.5 g/dL or less were all identified as independent predictors of disease progression. To assess progression risk in patients with AWM, we trained and cross-validated a proportional hazards model using bone marrow infiltration, immunoglobulin M, albumin, and β2-microglobulin values as continuous measures. The model divided the cohort into three distinct risk groups: a high-risk group with a median time to progression (TTP) of 1.8 years, an intermediate-risk group with a median TTP of 4.8 years, and a low-risk group with a median TTP of 9.3 years. We validated this model in two external cohorts, demonstrating robustness and generalizability. For clinical applicability, we made the model available as a Web page application (www.awmrisk.com). By combining two cohorts, we were powered to identify wild-type MYD88 as an independent predictor of progression (hazard ratio, 2.7).
This classification system is positioned to inform patient monitoring and care and, for the first time to our knowledge, to identify patients with high-risk AWM who may need closer follow-up or benefit from early intervention.
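A proportional-hazards model of the kind the abstract describes stratifies patients via a linear predictor over the continuous covariates, thresholded into risk groups. The sketch below shows only that mechanics; every coefficient and cutoff here is invented for illustration, and the actual fitted model is the one served at www.awmrisk.com.

```python
def risk_score(bm_infiltration_pct, igm_mg_dl, albumin_g_dl, b2m_mg_dl,
               coefs=(0.02, 0.0003, -0.5, 0.3)):
    """Linear predictor of a Cox proportional-hazards model:
    score = sum(beta_i * x_i). Coefficients here are made up;
    note albumin enters with a negative sign (lower albumin,
    higher risk), matching the direction reported in the abstract."""
    x = (bm_infiltration_pct, igm_mg_dl, albumin_g_dl, b2m_mg_dl)
    return sum(b * v for b, v in zip(coefs, x))

def risk_group(score, low_cut=1.0, high_cut=2.0):
    """Map the linear predictor to three groups (cutoffs hypothetical)."""
    if score < low_cut:
        return "low"
    if score < high_cut:
        return "intermediate"
    return "high"

# A patient at the abstract's high-risk thresholds vs. a low-marker patient
high = risk_group(risk_score(70, 4500, 3.5, 4.0))
low = risk_group(risk_score(10, 500, 4.5, 1.0))
```

Using the covariates as continuous measures, rather than dichotomizing at the cutoffs, is what lets the model order patients within each marker range before the three-way grouping is applied.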
The direct‐from‐model and tool‐less manufacturing process of 3D printing (3DP) embodies a general‐purpose technology, facilitating capacity sharing and outsourcing. Starting from a case study of a 3DP company (Shapeways) and a new market entrant (Panalpina), we develop dynamic practices for partial outsourcing in build‐to‐model manufacturing. We propose a new outsourcing scheme, bidirectional partial outsourcing (BPO), where 3D printers share capacity by alternating between the role of outsourcer and subcontractor based on need. Coupled with order book smoothing (OBS), where orders are released gradually to production, this provides 3D printers with two distinct ways to manage demand variability. By combining demand and cost field data with an analytical model, we find that BPO improves 3DP cost efficiency and delivery performance as the number of 3DP firms in the network increases. OBS is sufficient for an established 3D printer when alternatives to in‐house manufacturing are few, or of limited capacity. Nevertheless, OBS comes at the cost of reduced responsiveness, whereas BPO shifts the cost and delivery performance frontier. Our analysis shows how BPO combined with OBS makes 3DP companies more resilient to downward movements in both demand and price levels.
Attempts to delineate an immune subtype of schizophrenia have not yet led to the clear identification of potential treatment targets. An unbiased informatic approach at the level of individual immune cytokines and symptoms may reveal organisational structures underlying heterogeneity in schizophrenia, and potential for future therapies. The aim was to determine the network and relative influence of pro- and anti-inflammatory cytokines on depressive, positive, and negative symptoms. We further aimed to determine the effect of exposure to minocycline or placebo for 6 months on cytokine-symptom network connectivity and structure. Network analysis was applied to baseline and 6-month data from the large multi-center BeneMin trial of minocycline (N = 207) in schizophrenia. Pro-inflammatory cytokines IL-6, TNF-α, and IFN-γ had the greatest influence in the inflammatory network and were associated with depressive symptoms and suspiciousness at baseline. At 6 months, the placebo group network connectivity was 57% stronger than the minocycline group, due to significantly greater influence of TNF-α, early wakening, and pathological guilt. IL-6 and its downstream impact on TNF-α, and IFN-γ, could offer novel targets for treatment if offered at the relevant phenotypic profile including those with depression. Future targeted experimental studies of immune-based therapies are now needed.
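A group-level connectivity comparison of the kind the abstract reports is often quantified as global strength, the sum of absolute edge weights in each group's network. A minimal sketch with made-up adjacency matrices (the real analysis estimates edges from the cytokine and symptom data):

```python
def global_strength(adj):
    """Sum of absolute edge weights over the upper triangle of a
    symmetric weighted adjacency matrix, a standard network-analysis
    measure of overall connectivity."""
    n = len(adj)
    return sum(abs(adj[i][j]) for i in range(n) for j in range(i + 1, n))

# Hypothetical 3-node networks (e.g. TNF-alpha, a depressive symptom,
# and suspiciousness); edge weights are invented for illustration
placebo = [[0.0, 0.4, 0.3],
           [0.4, 0.0, 0.5],
           [0.3, 0.5, 0.0]]
minocycline = [[0.0, 0.2, 0.1],
               [0.2, 0.0, 0.3],
               [0.1, 0.3, 0.0]]
ratio = global_strength(placebo) / global_strength(minocycline)
```

A "57% stronger" finding corresponds to this kind of ratio between the two groups' global strengths; significance is then typically assessed by permuting group labels.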
As researchers who have published over recent years on the issue of comparing the climate effects of different greenhouse gases, we would like to highlight a simple innovation that would enhance the transparency of stocktakes of progress towards achieving any multi-decade-timescale global temperature goal. In addition to specifying targets for total CO2-equivalent emissions of all greenhouse gases, governments and corporations could also indicate the separate contribution to these totals from greenhouse gases with lifetimes around 100 years or longer, notably CO2 and nitrous oxide, and the contribution from Short-Lived Climate Forcers (SLCFs), notably methane and some hydrofluorocarbons. This separate indication would support an objective assessment of the implications of aggregated emission targets for global temperature, in alignment with the UNFCCC Parties' Decision (4/CMA.1)1 to provide "information necessary for clarity, transparency and understanding" in nationally determined contributions (NDCs) and long-term low-emission development strategies (LT-LEDSs). While differences remain between us regarding how best to set fair yet ambitious targets for individual emitters2-5, including how any additional information might be used, and the interpretation of the Paris Agreement, it is important to emphasise the high level of agreement on the underlying science of how different greenhouse gases affect global temperature. The 2018 IPCC Special Report on 1.5 °C (SR1.5)6 stated "Reaching and sustaining net-zero global anthropogenic CO2 emissions and declining net non-CO2 radiative forcing (planetary energy imbalance resulting directly from human-induced changes) would halt anthropogenic global warming on multi-decadal timescales (high confidence).
The maximum temperature reached is then determined by cumulative net global anthropogenic CO2 emissions up to the time of net zero CO2 emissions (high confidence) and the level of non-CO2 radiative forcing in the decades prior to the time that maximum temperatures are reached (medium confidence)". The IPCC 6th Assessment Report (AR6)7 confirmed "limiting human-induced global warming to a specific level requires limiting cumulative CO2 emissions, reaching at least net zero CO2 emissions, along with strong reductions in other greenhouse gas emissions". Parties to the Paris Agreement agreed in Katowice in 2018 (Decision 18/CMA.1)1 to report past emissions of individual gases separately and use 100-year Global Warming Potentials (GWP100).
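The separate-reporting proposal amounts to splitting an aggregate GWP100 CO2-equivalent total by gas lifetime. A sketch with GWP100 values roughly in line with IPCC AR6 (the exact values depend on the assessment report and the gas source, so treat them as illustrative):

```python
# GWP100 values, approximately per IPCC AR6 (illustrative)
GWP100 = {"CO2": 1.0, "N2O": 273.0, "CH4": 27.0}
LONG_LIVED = {"CO2", "N2O"}  # atmospheric lifetimes ~100 years or longer

def co2e_split(emissions_tonnes):
    """Aggregate emissions (tonnes of each gas) into CO2-equivalent
    totals, reported separately for long-lived gases and SLCFs,
    as the passage proposes."""
    long_lived = sum(m * GWP100[g] for g, m in emissions_tonnes.items()
                     if g in LONG_LIVED)
    slcf = sum(m * GWP100[g] for g, m in emissions_tonnes.items()
               if g not in LONG_LIVED)
    return {"long_lived_tCO2e": long_lived,
            "slcf_tCO2e": slcf,
            "total_tCO2e": long_lived + slcf}

# Hypothetical inventory: 1000 t CO2, 10 t CH4, 1 t N2O
split = co2e_split({"CO2": 1000.0, "CH4": 10.0, "N2O": 1.0})
```

The total is unchanged by the split; what the two sub-totals add is the information needed to judge temperature implications, since sustained long-lived emissions accumulate warming while sustained SLCF emissions roughly maintain it.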