In vitro biosensors have become an integral component of early cancer diagnosis in the clinic. Among them, no-wash biosensors, which depend only on the simple mixing of the signal-generating probes and the sample solution without additional washing and separation steps, have proven particularly attractive. The facile, convenient, and rapid response of no-wash biosensors makes them especially suitable for point-of-care testing (POCT). One fast-growing field of no-wash biosensor design involves the use of nanomaterials as signal amplification carriers or as direct signal-generating elements. The analytical capacity of no-wash biosensors with respect to sensitivity or limit of detection, specificity, stability, and multiplexed detection is largely improved by the large surface area and the excellent optical, electrical, catalytic, and magnetic properties of these nanomaterials. This review provides a comprehensive overview of various nanomaterial-enhanced no-wash biosensing technologies and focuses on the underlying mechanisms of these technologies as applied to the early detection of cancer biomarkers, ranging from small molecules to proteins and even whole cancerous cells. Representative examples are selected to demonstrate proof-of-concept studies with promising applications for in vitro cancer diagnostics. Finally, a brief discussion of common unresolved issues and a perspective outlook on the field are provided.
An approach to the problem of estimating the size of inhomogeneous crowds, which are composed of pedestrians travelling in different directions, without using explicit object segmentation or tracking is proposed. Instead, the crowd is segmented into components of homogeneous motion using the mixture of dynamic textures motion model. A set of holistic low-level features is extracted from each segmented region, and a function that maps features into estimates of the number of people per segment is learned with Bayesian regression. Two Bayesian regression models are examined. The first is a combination of Gaussian process regression with a compound kernel, which accounts for both the global and local trends of the count mapping but is limited by real-valued outputs that do not match the discrete counts. We address this limitation with a second model, based on a Bayesian treatment of Poisson regression that introduces a prior distribution on the linear weights of the model. Since exact inference is analytically intractable, a closed-form approximation is derived that is computationally efficient and kernelizable, enabling the representation of nonlinear functions. An approximate marginal likelihood is also derived for kernel hyperparameter learning. The two regression-based crowd counting methods are evaluated on a large pedestrian data set containing very distinct camera views, pedestrian traffic, and outliers, such as bikes or skateboarders. Experimental results show that regression-based counts are accurate regardless of the crowd size, outperforming the count estimates produced by state-of-the-art pedestrian detectors. Results on 2 h of video demonstrate the efficiency and robustness of regression-based crowd size estimation over long periods of time.
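The core of the second model, Poisson regression with a Gaussian prior on the linear weights, can be illustrated with a toy maximum-a-posteriori sketch. This is not the paper's closed-form kernelized approximation; it is a minimal numpy version (the synthetic segment-size feature and all names are invented for illustration) showing how a prior-regularized Poisson model yields discrete, non-negative count estimates:

```python
import numpy as np

def poisson_map_weights(X, y, prior_var=10.0, n_iter=50):
    """MAP estimate for Bayesian Poisson regression.

    Model: y_i ~ Poisson(exp(x_i . w)), with prior w ~ N(0, prior_var * I).
    Newton's method on the (concave) log posterior.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        mu = np.exp(X @ w)                           # predicted Poisson rates
        grad = X.T @ (y - mu) - w / prior_var        # gradient of log posterior
        H = -(X.T * mu) @ X - np.eye(d) / prior_var  # Hessian of log posterior
        w = w - np.linalg.solve(H, grad)             # Newton update
    return w

# Synthetic example: counts grow with a single "segment size" feature.
rng = np.random.default_rng(0)
sizes = rng.uniform(0.5, 3.0, size=200)
X = np.column_stack([np.ones(200), sizes])           # bias + feature
true_w = np.array([0.2, 1.0])
y = rng.poisson(np.exp(X @ true_w))

w_hat = poisson_map_weights(X, y)
counts = np.rint(np.exp(X @ w_hat))                  # discrete count estimates
```

Because the link function is exponential, the predicted rates are always positive, and rounding them gives integer counts, which is the mismatch with Gaussian process regression that motivates the Poisson treatment in the abstract.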
Single-cell RNA sequencing (scRNA-seq) is a popular and powerful technology that allows researchers to profile the whole transcriptome of a large number of individual cells. However, the analysis of the large volumes of data generated from these experiments requires specialized statistical and computational methods. Here we present an overview of the computational workflow involved in processing scRNA-seq data. We discuss some of the most common tasks and the tools available for addressing central biological questions. In this article and on our companion website ( https://scrnaseq-course.cog.sanger.ac.uk/website/index.html ), we provide guidelines on best practices for performing computational analyses. This tutorial provides a hands-on guide for experimentalists interested in analyzing their data, as well as an overview for bioinformaticians seeking to develop new computational methods.
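Two of the earliest steps in most scRNA-seq workflows, depth normalization with a log transform followed by dimensionality reduction, can be sketched in plain numpy. Real analyses use dedicated toolkits and include additional steps (QC, feature selection, clustering); the matrix shapes and scale factor below are illustrative assumptions, not a specific pipeline's defaults:

```python
import numpy as np

def normalize_log_pca(counts, scale=1e4, n_pcs=10):
    """Minimal scRNA-seq preprocessing sketch.

    counts: (cells x genes) matrix of raw UMI counts.
    Steps: per-cell depth normalization, log1p transform,
    then PCA on the gene-centred expression matrix via SVD.
    """
    depth = counts.sum(axis=1, keepdims=True)        # library size per cell
    norm = counts / depth * scale                    # "counts per 10k"
    logged = np.log1p(norm)                          # variance stabilization
    centred = logged - logged.mean(axis=0)           # centre each gene
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    return (U * S)[:, :n_pcs]                        # per-cell PC embedding

rng = np.random.default_rng(1)
counts = rng.poisson(2.0, size=(100, 500))           # 100 cells, 500 genes
embedding = normalize_log_pca(counts)
```

The resulting low-dimensional embedding is what downstream tasks such as neighbour-graph construction and clustering typically operate on.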
Fluorescence lifetime imaging microscopy (FLIM) is used in diverse disciplines, including biology, chemistry and biophysics, but its use has been limited by the complexity of the data analysis. The phasor approach to FLIM has the potential to markedly reduce this complexity and at the same time provide a powerful visualization of the data content. Phasor plots for fluorescence lifetime analysis were originally developed as a graphical representation of excited-state fluorescence lifetimes for in vitro systems. The method's simple mathematics and specific rules avoid errors and confusion common in the study of complex and heterogeneous fluorescence. In the case of FLIM, the phasor approach has become a powerful method for simple and fit-free analyses of the information contained in the many thousands of pixels constituting an image. At present, the phasor plot is used not only for FLIM, but also for hyperspectral imaging, wherein phasors provide an unprecedented understanding of heterogeneous fluorescence. Undoubtedly, phasor plots will be increasingly important in the future analysis and understanding of FLIM and hyperspectral confocal imaging. This protocol presents the principle of the method and guides users through one of the popular interfaces for FLIM phasor analysis, namely, the SimFCS software. Implementation of the analysis takes only minutes to complete for a dataset containing hundreds of files.
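The "simple mathematics" behind the phasor plot is just a pair of trigonometric sums over each pixel's decay histogram: the first-harmonic cosine and sine transforms, normalized by total intensity. A minimal numpy sketch (the 2 ns lifetime, 80 MHz repetition rate, and bin width are example values, not prescriptions):

```python
import numpy as np

def phasor(decay, dt, freq):
    """First-harmonic phasor coordinates (g, s) of a decay curve.

    decay: photon counts per time bin; dt: bin width; freq: laser
    repetition frequency (same time units as dt). Each pixel of a
    FLIM image maps to one (g, s) point; no curve fitting is needed.
    """
    t = (np.arange(decay.size) + 0.5) * dt   # bin-centre times
    w = 2 * np.pi * freq                     # angular modulation frequency
    total = decay.sum()
    g = (decay * np.cos(w * t)).sum() / total
    s = (decay * np.sin(w * t)).sum() / total
    return g, s

# A single-exponential decay (tau = 2 ns, 80 MHz excitation) lands on the
# "universal semicircle" (g - 0.5)^2 + s^2 = 0.25; mixtures fall inside it.
tau, freq, dt = 2.0, 0.08, 0.01              # ns, GHz, ns
t = (np.arange(5000) + 0.5) * dt             # 50 ns observation window
g, s = phasor(np.exp(-t / tau), dt, freq)
```

This semicircle rule is one of the "specific rules" the abstract refers to: pure single-exponential species sit on the circle, and heterogeneous pixels plot inside it as intensity-weighted combinations.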
•Sample-to-sample amount variation can be larger than analytical variation.
•Sample normalization is a critical step in quantitative metabolomics.
•Sample normalization should be incorporated into a metabolomic profiling workflow.
•There is no unified method, but a number of methods have been reported.
•The performance of sample normalization methods needs to be carefully considered to select a proper method.
To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. Among these factors, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
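One widely used post-acquisition option among the reported methods is probabilistic quotient normalization (PQN), which estimates a per-sample dilution factor from the median ratio of each sample to a reference spectrum. The sketch below is a generic numpy illustration (the matrix shape and synthetic dilution are invented), not a reimplementation of any specific tool discussed in the review:

```python
import numpy as np

def pqn(intensities):
    """Probabilistic quotient normalization (PQN).

    intensities: (samples x features) matrix of metabolite signals.
    Each sample is divided by the median ratio of its features to a
    reference spectrum, correcting total-amount (dilution) effects
    while leaving genuine per-metabolite differences intact.
    """
    ref = np.median(intensities, axis=0)        # reference spectrum
    quotients = intensities / ref               # per-feature ratios
    dilution = np.median(quotients, axis=1)     # one factor per sample
    return intensities / dilution[:, None], dilution

# A twofold-diluted copy of a sample should collapse onto the original.
rng = np.random.default_rng(2)
base = rng.uniform(1.0, 10.0, size=(5, 50))     # 5 samples, 50 features
data = np.vstack([base, base[0] * 0.5])         # last row: diluted replicate
normalized, factors = pqn(data)
```

Using the median of the quotients, rather than the total signal, makes the estimate robust to a handful of metabolites that genuinely change between samples.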
Food safety is increasingly becoming an important public health issue, as foodborne diseases present a widespread and growing public health problem in both developed and developing countries. The rapid and precise monitoring and detection of foodborne pathogens are among the most effective ways to control and prevent human foodborne infections. Traditional microbiological detection and identification methods for foodborne pathogens are well known to be time-consuming and laborious, and they are increasingly perceived as insufficient to meet the demands of rapid food testing. Recently, various kinds of rapid detection, identification, and monitoring methods have been developed for foodborne pathogens, including nucleic-acid-based methods, immunological methods, and biosensor-based methods. This article reviews the principles, characteristics, and applications of recent rapid detection methods for foodborne pathogens.
An update of evidence-based guidelines concerning liberation from mechanical ventilation is needed as new evidence has become available. The American College of Chest Physicians (CHEST) and the American Thoracic Society (ATS) have collaborated to provide recommendations to clinicians concerning liberation from the ventilator.
Comprehensive evidence syntheses, including meta-analyses, were performed to summarize all available evidence relevant to the guideline panel's questions. The evidence was appraised using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) approach, and the results were summarized in evidence profiles. The evidence syntheses were discussed and recommendations developed and approved by a multidisciplinary committee of experts in mechanical ventilation.
Recommendations for three population, intervention, comparator, outcome (PICO) questions concerning ventilator liberation are presented in this document. The guideline panel considered the balance of desirable (benefits) and undesirable (burdens, adverse effects, costs) consequences, quality of evidence, feasibility, and acceptability of various interventions with respect to the selected questions. Conditional (weak) recommendations were made to use inspiratory pressure augmentation in the initial spontaneous breathing trial (SBT) and to use protocols to minimize sedation for patients ventilated for more than 24 h. A strong recommendation was made to use preventive noninvasive ventilation (NIV) for high-risk patients ventilated for more than 24 h immediately after extubation to improve selected outcomes. The recommendations were limited by the quality of the available evidence.
The guideline panel provided recommendations for inspiratory pressure augmentation during an initial SBT, protocols minimizing sedation, and preventive NIV, in relation to ventilator liberation.
We recently introduced Cleavage Under Targets & Tagmentation (CUT&Tag), an epigenomic profiling strategy in which antibodies are bound to chromatin proteins in situ in permeabilized nuclei. These antibodies are then used to tether the cut-and-paste transposase Tn5. Activation of the transposase simultaneously cleaves DNA and adds adapters ('tagmentation') for paired-end DNA sequencing. Here, we introduce a streamlined CUT&Tag protocol that suppresses DNA accessibility artefacts to ensure high-fidelity mapping of the antibody-targeted protein and improves the signal-to-noise ratio over current chromatin profiling methods. Streamlined CUT&Tag can be performed in a single PCR tube, from cells to amplified libraries, providing low-cost genome-wide chromatin maps. By simplifying library preparation, CUT&Tag requires less than a day at the bench, from live cells to sequencing-ready barcoded libraries. As a result of low background levels, barcoded and pooled CUT&Tag libraries can be sequenced for as little as $25 per sample. This enables routine genome-wide profiling of chromatin proteins and modifications and requires no special skills or equipment.
Streptococcus pyogenes (also known as group A Streptococcus, Strep A) is an obligate human pathogen with significant global morbidity and mortality. Transmission is believed to occur primarily between individuals via respiratory droplets, but knowledge about other potential sources of transmission via aerosols or the environment is limited. Such knowledge is required to design optimal interventions to control transmission, particularly in endemic settings. We aim to detail an experimental methodology to assess the transmission potential of Strep A in a clinical environment. We will examine potential sources of transmission in up to 20 participants recruited to the Controlled human infection for penicillin against Streptococcus pyogenes (CHIPS) Trial. Three approaches to understanding transmission will be used: the use of selective agar settle plates to capture possible droplet or airborne spread of Strep A; measurement of the possible distance of Strep A droplet spread during conversation; and environmental swabbing of personal and common high-touch items to detect the presence of Strep A on hard and soft surfaces. All methods are designed to allow for an assessment of transmission potential by symptomatic, asymptomatic and non-cases. Ethical approval has been obtained through Bellberry Human Research Ethics Committee (approval 2021-03-295). Trial registration number: ACTRN12621000751875. Any results elicited from these experiments will be of benefit to the scientific literature in improving our knowledge of opportunities to prevent Strep A transmission as a direct component of the primordial prevention of rheumatic fever. Findings will be reported at local, national and international conferences and in peer-reviewed journals.
Silks are natural fibrous protein polymers that are spun by silkworms and spiders. Among silk variants, there has been increasing interest devoted to the silkworm silk of B. mori, due to its availability in large quantities along with its unique material properties. Silk fibroin can be extracted from the cocoons of the B. mori silkworm and combined synergistically with other biomaterials to form biopolymer composites. With the development of recombinant DNA technology, silks can also be rationally designed and synthesized via genetic control. Silk proteins can be processed in aqueous environments into various material formats, including films, sponges, electrospun mats and hydrogels. The versatility and sustainability of silk-based materials provide an impressive toolbox for tailoring materials to meet specific applications via eco-friendly approaches. Historically, silkworm silk has been used by the textile industry for thousands of years due to its excellent physical properties, such as light weight, high mechanical strength, flexibility, and luster. Recently, due to these properties, along with its biocompatibility, biodegradability and non-immunogenicity, silkworm silk has become a candidate for biomedical utility. Further, the FDA has approved silk medical devices for sutures and as support structures during reconstructive surgery. With increasing needs for implantable and degradable devices, silkworm silk has attracted interest for electronics and photonics for implantable yet degradable medical devices, along with a broader range of utility in different device applications. This Tutorial review summarizes and highlights recent advances in the use of silk-based materials in bio-nanotechnology, with a focus on fabrication and functionalization methods for in vitro and in vivo applications in the fields of tissue engineering, degradable devices and controlled release systems.