13C flux analysis studies have become an essential component of metabolic engineering research. The scope of these studies has gradually expanded to include both isotopically steady-state and transient labeling experiments, the latter of which are uniquely applicable to photosynthetic organisms and slow-to-label mammalian cell cultures. Isotopomer network compartmental analysis (INCA) is the first publicly available software package that can perform both steady-state metabolic flux analysis and isotopically non-stationary metabolic flux analysis. The software provides a framework for comprehensive analysis of metabolic networks using mass balances and elementary metabolite unit balances. The generation of balance equations and their computational solution is completely automated and can be performed on networks of arbitrary complexity.
Availability and implementation: MATLAB p-code files are freely available for non-commercial use and can be downloaded at http://mfa.vueinnovations.com. Commercial licenses are also available.
Contact: j.d.young@vanderbilt.edu
Droplet-based single-cell RNA sequence analyses assume that all acquired RNAs are endogenous to cells. However, any cell-free RNAs contained within the input solution are also captured by these assays. This sequencing of cell-free RNA constitutes a background contamination that confounds the biological interpretation of single-cell transcriptomic data.
We demonstrate that contamination from this "soup" of cell-free RNAs is ubiquitous, with experiment-specific variations in composition and magnitude. We present a method, SoupX, for quantifying the extent of the contamination and estimating "background-corrected" cell expression profiles that seamlessly integrate with existing downstream analysis tools. Applying this method to several datasets using multiple droplet sequencing technologies, we demonstrate that its application improves biological interpretation of otherwise misleading data, as well as improving quality control metrics.
We present SoupX, a tool for removing ambient RNA contamination from droplet-based single-cell RNA sequencing experiments. This tool has broad applicability, and its application can improve the biological utility of existing and future datasets.
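The core idea behind this kind of background correction — subtracting each cell's expected share of the ambient "soup" from its observed counts — can be sketched as follows. This is an illustrative simplification in Python, not the SoupX algorithm itself (SoupX is an R package); in practice the soup profile would be estimated from empty droplets and the contamination fraction `rho` from marker genes.

```python
import numpy as np

def correct_counts(observed, soup_profile, rho):
    """Subtract the expected ambient-RNA contribution from observed counts.

    observed     : (cells x genes) raw UMI count matrix
    soup_profile : (genes,) ambient expression profile summing to 1
                   (estimated from empty droplets in practice)
    rho          : contamination fraction, scalar or (cells,) array
    """
    observed = np.asarray(observed, dtype=float)
    totals = observed.sum(axis=1, keepdims=True)  # total UMIs per cell
    # Expected soup counts: fraction rho of each cell's UMIs, distributed
    # across genes according to the ambient profile
    expected_soup = np.atleast_1d(rho).reshape(-1, 1) * totals * soup_profile
    # Corrected counts cannot go negative
    return np.clip(observed - expected_soup, 0.0, None)

# Toy example: 2 cells x 3 genes, 10% contamination
obs = np.array([[90.0, 10.0, 0.0],
                [5.0, 80.0, 15.0]])
soup = np.array([0.5, 0.3, 0.2])
corrected = correct_counts(obs, soup, rho=0.1)
```

The clipping step reflects the constraint that a background-corrected expression profile must remain non-negative; the real method goes further and redistributes counts so that totals remain consistent.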
Specialized nucleoside transporter (NT) proteins are required for passage of nucleosides and hydrophilic nucleoside analogues across biological membranes. Physiologic nucleosides serve as central salvage metabolites in nucleotide biosynthesis, and nucleoside analogues are used as chemotherapeutic agents in the treatment of cancer and antiviral diseases. The nucleoside adenosine modulates numerous cellular events via purino-receptor cell signalling pathways. Human NTs are divided into two structurally unrelated protein families: the SLC28 concentrative nucleoside transporter (CNT) family and the SLC29 equilibrative nucleoside transporter (ENT) family. Human CNTs are inwardly directed Na(+)-dependent nucleoside transporters found predominantly in intestinal and renal epithelial and other specialized cell types. Human ENTs mediate bidirectional fluxes of purine and pyrimidine nucleosides down their concentration gradients and are ubiquitously found in most, possibly all, cell types. Both protein families are evolutionarily old: CNTs are present in both eukaryotes and prokaryotes; ENTs are widely distributed in mammalian, lower vertebrate and other eukaryote species. This mini-review describes a 30-year collaboration with Professor Stephen Baldwin to identify and understand the structures and functions of these physiologically and clinically important transport proteins.
This review investigated three research questions: (i) What is the utility of social cognitive theory (SCT) to explain physical activity (PA)? (ii) Is the effectiveness of SCT moderated by sample or methodological characteristics? (iii) What is the frequency of significant associations between the core SCT constructs and PA? Ten electronic databases were searched with no date or sample restrictions. Forty‐four studies were retrieved containing 55 SCT models of PA. Methodological quality was assessed using a standardized tool. A random‐effects meta‐analysis revealed that SCT accounted for 31% of the variance in PA. However, methodological quality was mostly poor for these models. Methodological quality and sample age moderated the PA effect size, with increases in both associated with greater variance explained. Although self‐efficacy and goals were consistently associated with PA, outcome expectations and socio‐structural factors were not. This review determined that SCT is a useful framework to explain PA behaviour. Higher-quality models explained more PA variance, but overall methodological quality was poor. As such, high‐quality studies examining the utility of SCT to explain PA are warranted.
Spatial patterns of functional organization, resolved by microelectrode mapping, comprise a core principle of sensory cortices. In auditory cortex, however, recent two-photon Ca2+ imaging challenges this precept, as the traditional tonotopic arrangement appears weakly organized at the level of individual neurons. To resolve this fundamental ambiguity about the organization of auditory cortex, we developed multiscale optical Ca2+ imaging of unanesthetized GCaMP transgenic mice. Single-neuron activity monitored by two-photon imaging was precisely registered to large-scale cortical maps provided by transcranial widefield imaging. Neurons in the primary field responded well to tones; neighboring neurons were appreciably cotuned, and preferred frequencies adhered tightly to a tonotopic axis. By contrast, nearby secondary-field neurons exhibited heterogeneous tuning. The multiscale imaging approach also readily localized vocalization regions and neurons. Altogether, these findings cohere electrode and two-photon perspectives, resolve new features of auditory cortex, and offer a promising approach generalizable to any cortical area.
•High-sensitivity mode of transcranial imaging of cortex in unanesthetized mice•Spectral organization of auditory cortex under widefield imaging is highly regular•Neighboring neurons in AI are appreciably cotuned•Increased spectral integration is observed in neurons of AII
Understanding cortical organization across large-to-small spatial scales is critical to brain mapping. Issa et al. develop an imaging approach in GCaMP-expressing mice to characterize sound responses of auditory cortex from the level of individual neurons up to large-scale cortical maps.
A highly stretchable and transparent electrical heater is demonstrated by constructing a partially embedded silver nanowire percolative network on an elastic substrate. The stretchable network heater is applied on human wrists under real‐time strain, bending, and twisting, and has potential for lightweight, biocompatible, and versatile wearable applications.
The Asteroid Terrestrial-impact Last Alert System (ATLAS) consists of two 0.5 m Schmidt telescopes with cameras covering 29 square degrees at a plate scale of 1.86 arcsec per pixel. Working in tandem, the telescopes routinely survey the whole sky visible from Hawaii (above δ > −50°) every two nights, exposing four times per night, typically reaching o < 19 magnitude per exposure when the moon is illuminated and c < 19.5 magnitude per exposure in dark skies. Construction is underway of two further units to be sited in Chile and South Africa, which will result in an all-sky daily cadence from 2021. Initially designed for detecting potentially hazardous near-Earth objects, the ATLAS data enable a range of astrophysical time-domain science. Extracting transients from the data stream requires a computing system to process the data, assimilate detections in time and space, and associate them with known astrophysical sources. Here we describe the hardware and software infrastructure used to produce a stream of clean, real, astrophysical transients in real time. This involves machine learning and boosted decision tree algorithms to identify extragalactic and Galactic transients. Typically we detect 10-15 supernova candidates per night, which we immediately announce publicly. The ATLAS discoveries not only enable rapid follow-up of interesting sources but will also provide complete statistical samples within the local volume of 100 Mpc. A simple comparison of the detected supernova rate within 100 Mpc, with no corrections for completeness, is already significantly higher (by a factor of 1.5 to 2) than currently accepted rates.
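The real/bogus filtering step described above — a boosted decision tree scoring each detection — can be illustrated with a short sketch. The feature set and data below are hypothetical stand-ins for whatever detection metrics such a pipeline computes (not the actual ATLAS features); the point is only the shape of the technique.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in features per detection (e.g. PSF shape, flux ratio,
# distance to nearest catalogued star) -- illustrative only
n = 400
X_real = rng.normal(loc=[1.0, 0.0, 2.0], scale=0.5, size=(n, 3))
X_bogus = rng.normal(loc=[0.0, 1.5, 0.0], scale=0.5, size=(n, 3))
X = np.vstack([X_real, X_bogus])
y = np.array([1] * n + [0] * n)  # 1 = real transient, 0 = artefact

# Boosted decision trees: an ensemble of shallow trees fit sequentially
clf = GradientBoostingClassifier(n_estimators=100, max_depth=3)
clf.fit(X, y)

# Score a new detection; keep it in the transient stream if p(real) is high
p_real = clf.predict_proba([[1.1, 0.1, 1.9]])[0, 1]
```

In a survey pipeline, the classifier threshold trades off completeness of the transient stream against the human vetting load from false positives.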
A large and growing body of "big data" is generated by internet search engines such as Google. Because people often search for information about public health and medical issues, researchers may be able to use search engine data to monitor and predict public health problems, such as HIV. We sought to assess the feasibility of using Google search data to analyze and predict new HIV diagnoses in the United States.
From 2007 to 2014, we collected search volume data on HIV-related Google search keywords across the United States. State-level new HIV diagnosis data were collected from the Centers for Disease Control and Prevention (CDC) and AIDSVu.org. We developed a negative binomial model to predict HIV cases using a subset of significant predictor keywords identified by LASSO. The Google search data were combined with state-level HIV case reports provided by the CDC. We used historical data to train the model and predicted new HIV diagnoses from 2011 to 2014, achieving an average R2 value of 0.99 between predicted and actual cases and an average root-mean-square error (RMSE) of 108.75.
Results indicate that Google Trends is a feasible tool to predict new cases of HIV at the state level. We discuss the implications of integrating visualization maps and tools based on these models into public health and HIV monitoring and surveillance.