The brain and the gastrointestinal (GI) tract are intimately connected, forming a bidirectional neurohumoral communication system. This communication between gut and brain, known as the gut-brain axis, is so well established that the functional status of the gut is always related to the condition of the brain. Research on the gut-brain axis traditionally focused on how psychological status affects the function of the GI tract. However, recent evidence shows that the gut microbiota communicates with the brain via the gut-brain axis to modulate brain development and behavioral phenotypes. These recent findings on the new role of the gut microbiota in the gut-brain axis imply that the gut microbiota could be associated with brain functions, as well as neurological diseases, via the gut-brain axis. To elucidate the role of the gut microbiota in the gut-brain axis, precise identification of the composition of the microbes constituting the gut microbiota is an essential step. However, such identification remains a major technological challenge, owing to the massive number of intestinal microbes and the difficulty of culturing them. Current methods for identifying the microbes constituting the gut microbiota depend on omics analyses using advanced high-throughput equipment. Here, we review the association of the gut microbiota with the gut-brain axis, including the pros and cons of the current high-throughput methods for identifying the microbes constituting the gut microbiota, to elucidate its role in the gut-brain axis.
•The Appelmans protocol broadens the host range of a phage cocktail targeting Acinetobacter baumannii.
•Prophage induction and recombination contribute to the broadening of the phage cocktail's host range.
•Recombination with prophages from encountered bacterial strains generates expanded-host-range phages.
•The induced phages demonstrated limited stability and are unsuitable for therapy.
Infections caused by carbapenem-resistant Acinetobacter baumannii (CRAB) present significant healthcare challenges due to limited treatment options. Bacteriophage (phage) therapy offers potential as an alternative treatment; however, the high host specificity of phages poses challenges for their therapeutic application. To broaden the phage spectrum, laboratory-based phage training using the Appelmans protocol was employed in this study, and the protocol successfully expanded the host range of a phage cocktail targeting CRAB. Further analysis revealed that the expanded-host-range phages isolated from the output cocktail were recombinant derivatives of prophages induced from the encountered bacterial strains. These findings provide valuable genetic insights into the protocol's mechanism when applied to phages infecting A. baumannii, a setting in which it had not previously been investigated. It is noteworthy, however, that the expanded-host-range phages obtained from this protocol exhibited limited stability, raising concerns about their suitability for therapeutic purposes.
Purpose
Variations in the vessel radius of segmented surfaces of intracranial aneurysms significantly influence the fluid velocities given by computer simulations. It is important to generate models that capture the effect of these variations in order to better interpret the numerically predicted hemodynamics. It is also highly relevant to develop methods that combine experimental observations with uncertainty modeling to obtain a closer approximation to the true blood flow behavior.
Methods
This work applies polynomial chaos expansion to model the effect of geometric uncertainty on the simulated fluid velocities of intracranial aneurysms. The vessel radius is defined as the uncertain variable. Proper orthogonal decomposition is applied to characterize the solution space of fluid velocities. Next, projecting the 4D-Flow MRI velocities onto the basis vectors, followed by coefficient mapping using generalized dynamic mode decomposition, enables the merging of 4D-Flow MRI with the uncertainty-propagated fluid velocities.
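To make the uncertainty propagation step concrete, the following is a minimal sketch of non-intrusive polynomial chaos expansion for a single velocity quantity of interest driven by an uncertain radius; the velocity function, flow rate, and radius statistics are hypothetical stand-ins for the CFD solver and patient geometry, not the paper's implementation.

```python
# Minimal sketch of non-intrusive polynomial chaos expansion (PCE) for a
# velocity quantity of interest driven by an uncertain vessel radius.
# The `velocity` function is a toy stand-in; in the paper each evaluation
# would be a full CFD solve.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

mu_r, sigma_r = 2.0e-3, 0.1e-3          # assumed radius mean/std [m]

def velocity(r):
    """Toy surrogate for CFD: peak velocity ~ flow / (pi r^2)."""
    q = 5.0e-6                           # assumed volumetric flow [m^3/s]
    return q / (np.pi * r**2)

order, nquad = 4, 10
xi, w = He.hermegauss(nquad)             # Gauss-Hermite nodes/weights, weight e^{-x^2/2}
w = w / sqrt(2.0 * pi)                   # normalize to the standard-normal density

# Spectral projection: c_n = E[u(R) He_n(xi)] / n!
samples = velocity(mu_r + sigma_r * xi)
coeffs = np.array([np.sum(w * samples * He.hermeval(xi, np.eye(order + 1)[n]))
                   / factorial(n) for n in range(order + 1)])

mean = coeffs[0]                                          # PCE mean
var = sum(factorial(n) * coeffs[n]**2 for n in range(1, order + 1))
print(f"mean velocity {mean:.3f} m/s, std {np.sqrt(var):.3f} m/s")
```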
Results
Polynomial chaos expansion propagates the fluid velocities with an error of 2% in velocity magnitude relative to computer simulations. The bifurcation region (or impingement location) shows a standard deviation of 0.17 m/s (because a previously reported variance in the vessel radius was adopted to model the uncertainty, the actual standard deviation may differ). Numerical phantom experiments indicate that the proposed approach reconstructs the fluid velocities with 0.3% relative error in the presence of geometric uncertainties.
Conclusion
Polynomial chaos expansion is an effective approach for propagating the effect of the uncertain vessel radius to the blood flow velocities of intracranial aneurysms. Merging 4D-Flow MRI with the uncertainty-propagated fluid velocities leads to more realistic flow trends than ignoring the uncertainty in the vessel radius.
Agent-based modeling is a technique for modeling dynamic systems from the bottom up. Individual elements of the system are represented computationally as agents, and system-level behaviors emerge from the micro-level interactions of the agents. Contemporary state-of-the-art agent-based modeling toolkits are essentially discrete-event simulators designed to execute serially on the Central Processing Unit (CPU). They simulate Agent-Based Models (ABMs) by executing agent actions one at a time. In addition to imposing an unnatural execution order, these toolkits have limited scalability. In this article, we investigate data-parallel computer architectures such as Graphics Processing Units (GPUs) to simulate large-scale ABMs. We have developed a series of efficient data-parallel algorithms for handling environment updates, various agent interactions, agent death and replication, and gathering statistics. We present three fundamental innovations that provide unprecedented scalability. The first is a novel stochastic memory allocator which enables parallel agent replication in O(1) average time. The second is a technique for resolving precedence constraints for agent actions in parallel. The third is a method that uses specialized graphics hardware to gather and process statistical measures. These techniques have been implemented on a modern-day GPU, resulting in a substantial performance increase. We believe that our system is the first completely GPU-based agent simulation framework. Although GPUs are the focus of our current implementations, our techniques can easily be adapted to other data-parallel architectures. We have benchmarked our framework against contemporary toolkits using two popular ABMs, namely SugarScape and StupidModel.
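The general flavor of data-parallel agent death and replication can be illustrated with a prefix-sum (scan) slot-allocation sketch; NumPy stands in here for GPU data-parallel primitives, and this is not the paper's stochastic O(1) allocator.

```python
# Minimal NumPy sketch of data-parallel agent death and replication using an
# exclusive prefix sum (scan) to assign output slots in parallel.
import numpy as np

rng = np.random.default_rng(0)
agents = rng.uniform(size=(1000, 2))           # toy agent state: 2 attributes each
alive = rng.random(1000) > 0.1                 # death decided independently per agent
replicate = (rng.random(1000) < 0.05) & alive  # which survivors spawn a child

# Compact survivors: the exclusive scan gives each survivor a unique output slot.
slots = np.cumsum(alive) - alive               # exclusive prefix sum
survivors = np.empty((alive.sum(), 2))
survivors[slots[alive]] = agents[alive]

# Allocate child slots the same way and append copies of the parents.
child_slots = np.cumsum(replicate) - replicate
children = np.empty((replicate.sum(), 2))
children[child_slots[replicate]] = agents[replicate]

agents = np.vstack([survivors, children])
print(len(agents), "agents after death and replication")
```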
This multi-institutional study assessed the efficacy of Enterococcus faecium NRRL B-2354 as a nonpathogenic Salmonella surrogate for thermal processing of nonfat dry milk powder, peanut butter, almond meal, wheat flour, ground black pepper, and date paste. Each product was analyzed by two laboratories (five independent laboratories in total), with the lead laboratory inoculating the product (with E. faecium or a five-strain Salmonella enterica cocktail of serovars Agona, Reading, Tennessee, Mbandaka, and Montevideo) and equilibrating it to the target water activity before shipping. Both laboratories subjected samples to three isothermal treatments (between 65 and 100°C). A combined log-linear and Bigelow model was fitted to the survivor data via one-step regression. On the basis of D80°C values estimated from the combined model, E. faecium was more thermally resistant (P < 0.05) than Salmonella in nonfat dry milk powder (D80°C for E. faecium, 100.2 ± 5.8 min; for Salmonella, 28.9 ± 1.0 min), peanut butter (133.5 ± 3.1 min vs. 57.6 ± 1.5 min), almond meal (34.2 ± 0.4 min vs. 26.1 ± 0.2 min), ground black pepper (3.2 ± 0.8 min vs. 1.5 ± 0.1 min), and date paste (1.5 ± 0.0 min vs. 0.5 ± 0.0 min). Although the combined-laboratory D80°C for E. faecium was lower (P < 0.05) than that for Salmonella in wheat flour (9.4 ± 0.1 min vs. 10.1 ± 0.2 min), the difference was ~7%. The z-values for Salmonella in all products and for E. faecium in milk powder, almond meal, and date paste were not different (P > 0.05) between laboratories. This study therefore demonstrated the impact of standardized methodologies on the repeatability of microbial inactivation results. Overall, E. faecium NRRL B-2354 was more thermally resistant than Salmonella, which supports its use as a surrogate for validating thermal processing of multiple low-moisture products. However, product composition should always be considered before making that decision.
•E. faecium was more thermally resistant than Salmonella.
•Thermal resistance of E. faecium and Salmonella was impacted by product type.
•Inactivation results were statistically similar across five independent laboratories.
•Standardized methodologies yielded low cross-laboratory differences.
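A one-step regression of this kind can be sketched as follows; the survivor data, parameter values, and noise level are synthetic placeholders, not the study's measurements.

```python
# Hedged sketch of a one-step fit of a log-linear primary model with a
# Bigelow secondary model: log10(N/N0) = -t / (D_ref * 10**((T_ref - T)/z)).
import numpy as np
from scipy.optimize import curve_fit

T_REF = 80.0  # reference temperature [degC]

def log_linear_bigelow(X, log_d_ref, z):
    t, T = X
    d_T = 10.0 ** (log_d_ref + (T_REF - T) / z)  # D-value at temperature T
    return -t / d_T                              # predicted log10 reduction

# Synthetic isothermal survivor data: times [min], temperatures [degC].
t = np.array([0, 5, 10, 20, 0, 5, 10, 20, 0, 2, 5, 10], float)
T = np.array([70] * 4 + [80] * 4 + [90] * 4, float)
logS = log_linear_bigelow((t, T), np.log10(30.0), 10.0)       # "true" model
logS += np.random.default_rng(1).normal(0, 0.05, t.size)      # measurement noise

popt, pcov = curve_fit(log_linear_bigelow, (t, T), logS, p0=(1.0, 8.0))
print(f"D80 = {10**popt[0]:.1f} min, z = {popt[1]:.1f} degC")
```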
•A new method for 3D surface reconstruction of microscopic samples from SEM is proposed.
•A slanted-support-window-based formulation achieves more accurate reconstructions.
•The need for post-processing disparity smoothing and edge corrections is eliminated.
This work addresses the limitations of 2D Scanning Electron Microscopy (SEM) micrographs in providing the 3D topographical information necessary for various types of analysis in the biological and biomedical sciences, as well as in mechanical and materials engineering, by investigating modern stereo vision methodologies for 3D surface reconstruction of microscopic samples. To achieve this, micrograph pairs of the microscopic samples are acquired using an SEM equipped with a motor-controlled specimen stage capable of precise translation, rotation, and tilting. After pre-processing of the micrographs by SIFT feature detection/description, followed by RANSAC for matching-outlier removal and stereo rectification, a dense stereo matching methodology is applied that exploits a slanted support window formulation for sub-pixel-accurate stereo matching of the input images. This yields a dense disparity map from which the true depth/elevation of individual surface points is determined. This is a major improvement over previous matching methodologies, which require additional post-processing refinement steps to reduce the negative effects of discrete disparity assignment or blurring artifacts near edge regions. The results demonstrate the strong performance of the slanted support window assumption employed here for surface reconstruction of microscopic samples.
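The stages of this pipeline can be sketched with OpenCV as below; StereoSGBM stands in for the slanted support window matcher, and the file names, tilt angle, and parallax-to-height conversion are hypothetical assumptions, not the authors' implementation.

```python
# Hedged sketch of the described SEM stereo pipeline using OpenCV.
import cv2
import numpy as np

left = cv2.imread("sem_tilt_0deg.png", cv2.IMREAD_GRAYSCALE)   # assumed inputs
right = cv2.imread("sem_tilt_5deg.png", cv2.IMREAD_GRAYSCALE)

# 1) SIFT features + ratio-test matching, 2) RANSAC outlier removal.
sift = cv2.SIFT_create()
k1, d1 = sift.detectAndCompute(left, None)
k2, d2 = sift.detectAndCompute(right, None)
matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
p1 = np.float32([k1[m.queryIdx].pt for m in good])
p2 = np.float32([k2[m.trainIdx].pt for m in good])
F, inliers = cv2.findFundamentalMat(p1, p2, cv2.FM_RANSAC, 1.0)

# 3) Uncalibrated stereo rectification from the fundamental matrix.
h, w = left.shape
_, H1, H2 = cv2.stereoRectifyUncalibrated(p1[inliers.ravel() == 1],
                                          p2[inliers.ravel() == 1], F, (w, h))
rect_l = cv2.warpPerspective(left, H1, (w, h))
rect_r = cv2.warpPerspective(right, H2, (w, h))

# 4) Dense matching (SGBM as a stand-in) -> disparity -> relative height.
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
disparity = sgbm.compute(rect_l, rect_r).astype(np.float32) / 16.0
tilt = np.deg2rad(5.0)                               # assumed tilt step
height = disparity / (2.0 * np.sin(tilt / 2.0))      # eucentric-tilt parallax approx.
```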
4D-Flow magnetic resonance imaging (MRI) has enabled in vivo time-resolved measurement of three-dimensional blood flow velocities in the human vascular system. However, its clinical use has been hampered by two main issues, namely low spatio-temporal resolution and acquisition noise. While patient-specific computational fluid dynamics (CFD) simulations can address the resolution and noise issues, their fidelity is affected by the accuracy of the estimated boundary conditions, model parameters, and vascular geometry, and by the flow model assumptions. In this paper, a scheme to address the limitations of both modalities through data fusion is presented. The solutions of the patient-specific CFD simulation are characterized using proper orthogonal decomposition (POD). Next, projecting the 4D-Flow MRI data onto the POD basis and mapping the projection coefficients using generalized dynamic mode decomposition (DMD) enables simultaneous super-resolution and denoising of 4D-Flow MRI. The method has been tested using numerical phantoms derived from patient-specific aneurysmal geometries and applied to in vivo 4D-Flow MRI data.
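The POD projection step can be sketched as follows; the snapshot data are synthetic, the MRI frame is assumed to be already interpolated onto the CFD grid, and the DMD-based coefficient mapping is omitted for brevity.

```python
# Minimal sketch of the POD projection step: build a basis from CFD snapshots
# and project a noisy measurement onto it.
import numpy as np

rng = np.random.default_rng(0)
n_points, n_snapshots, rank = 5000, 40, 8

# Columns of U_cfd are CFD velocity snapshots (here: random low-rank data).
modes_true = rng.normal(size=(n_points, rank))
U_cfd = modes_true @ rng.normal(size=(rank, n_snapshots))

# POD basis via thin SVD of the snapshot matrix.
phi, s, _ = np.linalg.svd(U_cfd, full_matrices=False)
phi_r = phi[:, :rank]                        # retained POD modes

# A noisy "4D-Flow MRI" frame, assumed interpolated onto the CFD grid.
u_mri = U_cfd[:, 0] + 0.5 * rng.normal(size=n_points)

coeffs = phi_r.T @ u_mri                     # project onto the POD basis
u_denoised = phi_r @ coeffs                  # noise outside the span of the
                                             # modes is discarded
err = np.linalg.norm(u_denoised - U_cfd[:, 0]) / np.linalg.norm(U_cfd[:, 0])
print(f"relative reconstruction error: {err:.3f}")
```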
Acinetobacter spp. have emerged as significant pathogens causing nosocomial infections. Treatment of these pathogens has become a major challenge for clinicians worldwide, owing to their increasing antibiotic resistance. Considerable resources and technology are currently being dedicated to developing novel drugs and antibiotic combinations to combat antimicrobial resistance. To address this issue, we constructed a panel of Acinetobacter spp. strains expressing different antimicrobial resistance determinants, such as narrow-spectrum β-lactamases, extended-spectrum β-lactamases (ESBLs), OXA-type carbapenemases, metallo-β-lactamases, and over-expressed AmpC β-lactamase. Bacterial strains exhibiting different resistance phenotypes were collected between 2008 and 2013 from Severance Hospital, Seoul. Antimicrobial susceptibility was determined according to the CLSI guidelines using the agar dilution method. Selected strains were sequenced using the Ion Torrent PGM system, annotated using the RAST server, and analyzed using Geneious Pro 8.0. Genotypic determinants contributing to the relevant expressed phenotype, such as acquired resistance genes, changes in the expression of efflux pumps, mutations, and porin alterations, were characterized. The characterized determinants included an acquired ESBL gene in isolates expressing an ESBL phenotype, overproduction of the intrinsic AmpC β-lactamase associated with an insertion sequence (IS) element, and carbapenem resistance associated with production of carbapenem-hydrolyzing Ambler class D β-lactamases (OXA-23, OXA-66, OXA-120, and OXA-500) and the metallo-β-lactamase SIM-1. We analyzed the relative expression of the Ade efflux systems and determined the sequences of their regulators to correlate them with phenotypic resistance. Quinolone resistance-determining regions were analyzed to understand fluoroquinolone resistance. Virulence factors responsible for pathogenesis were also identified. Owing to multiple mutations, the acquisition of multiple resistance genes, and transposon insertion, the phenotypic decision scheme for evaluating resistance proved inaccurate, which highlights the urgent need for its modification. This complete description of the mechanisms contributing to specific resistance phenotypes can serve as a target for novel drug development. The panel can also serve as reference strains in the clinical laboratory and for evaluating antibiotic efficacy against specific resistance mechanisms.
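One downstream step of such a workflow, scanning annotated assemblies for putative β-lactamase genes, might look like the Biopython sketch below; the file name and keyword list are illustrative assumptions, and real resistance typing would rely on curated databases rather than annotation keywords.

```python
# Hedged sketch: flag putative beta-lactamase CDS features in an annotated
# genome (e.g., GenBank output from an annotation server) with Biopython.
from Bio import SeqIO

KEYWORDS = ("beta-lactamase", "carbapenem-hydrolyzing", "metallo-beta-lactamase")

hits = []
for record in SeqIO.parse("annotated_genome.gbk", "genbank"):  # assumed file name
    for feat in record.features:
        if feat.type != "CDS":
            continue
        product = feat.qualifiers.get("product", [""])[0].lower()
        if any(k in product for k in KEYWORDS):
            locus = feat.qualifiers.get("locus_tag", ["?"])[0]
            hits.append((record.id, locus, product))

for contig, locus, product in hits:
    print(f"{contig}\t{locus}\t{product}")
```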
Tool sequence selection is an important activity in process planning for milling and has great bearing on the cost of machining. Currently, it is accomplished manually, without a priori consideration of cost factors. Typically, a large tool is selected to quickly generate the rough shape, and a smaller clearing tool is used to generate the net shape. In this paper, we present a new systematic method to select the optimal sequence of tool(s) to machine a 2.5-axis pocket, given the pocket geometry, a database of cutting tools, cutting parameters, and tool holder geometry. Algorithms have been developed to calculate geometric constructs such as accessible areas and pocket decompositions while considering tool holders. A Genetic Algorithm (GA) formulation is used to find the optimal tool sequence. Two types of selection mechanisms, namely "Elitist selection" and the "Roulette method", are tested; the Elitist method is found to converge much faster than the Roulette method. The proposed method is compared to a shortest-path graph formulation developed previously by the authors. The GA formulation is found to generate near-optimal solutions while reducing computation by up to 30% compared to the graph formulation.
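The two selection mechanisms can be contrasted on a toy tool-sequencing problem; the cost model, tool set, and GA parameters below are hypothetical stand-ins, not the paper's machining-cost formulation.

```python
# Hedged GA sketch contrasting elitist and roulette-wheel selection on a toy
# tool-sequencing problem (choose a large-to-small subset of tool diameters).
import random

TOOLS = [12.0, 10.0, 8.0, 6.0, 4.0, 2.0]    # assumed tool diameters [mm]

def cost(seq):
    """Toy cost: setup time per tool, finishing cost, and per-tool wear."""
    if not seq or seq != sorted(seq, reverse=True):
        return float("inf")                  # tools must go large -> small
    return 5.0 * len(seq) + 100.0 / seq[-1] + sum(1.0 / d for d in seq)

def random_individual():
    return sorted(random.sample(TOOLS, random.randint(1, len(TOOLS))),
                  reverse=True)

def crossover(a, b):
    pool = list(set(a) | set(b))
    return sorted(random.sample(pool, random.randint(1, len(pool))),
                  reverse=True)

def roulette(pop):
    fits = [1.0 / (1.0 + cost(p)) for p in pop]
    return random.choices(pop, weights=fits, k=len(pop))

def elitist(pop, elite_frac=0.2):
    elite = sorted(pop, key=cost)[:max(1, int(elite_frac * len(pop)))]
    return [random.choice(elite) for _ in pop]

def evolve(select, generations=60, size=30):
    pop = [random_individual() for _ in range(size)]
    for _ in range(generations):
        parents = select(pop)
        random.shuffle(parents)
        pop = [crossover(parents[i], parents[(i + 1) % size])
               for i in range(size)]
        pop = [random_individual() if random.random() < 0.1 else p
               for p in pop]                 # mutation: resample an individual
    return min(pop, key=cost)

for name, sel in [("elitist", elitist), ("roulette", roulette)]:
    best = evolve(sel)
    print(f"{name}: best sequence {best}, cost {cost(best):.2f}")
```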
Optical Coherence Tomography (OCT) is an emerging technique in the field of biomedical imaging, with applications in ophthalmology, dermatology, coronary imaging, and more. Due to the underlying physics, OCT images usually suffer from a granular pattern, called speckle noise, which hinders interpretation. Here, a sparse-and-low-rank decomposition based method is used for speckle reduction in retinal OCT images. The technique operates on input data consisting of several B-scans of the same location. The images are then batch-aligned using a sparse-and-low-rank decomposition based technique. Finally, the denoised image is created by median filtering of the low-rank component of the processed data. Simultaneous decomposition and alignment of the images results in better performance compared with the simple registration-based methods used in the literature for noise reduction of OCT images.
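The sparse-plus-low-rank step can be sketched with a basic robust PCA (alternating shrinkage) iteration followed by median filtering of the low-rank component; the B-scan stack and thresholds below are illustrative, the joint alignment step is omitted, and this is not a convergence-guaranteed principal component pursuit solver.

```python
# Hedged sketch: split a B-scan stack into low-rank (structure) and sparse
# (speckle) parts, then median-filter the low-rank component.
import numpy as np
from scipy.ndimage import median_filter

def soft(X, tau):
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(D, lam=None, n_iter=100):
    """Split D into low-rank L and sparse S by alternating shrinkage."""
    lam = lam or 1.0 / np.sqrt(max(D.shape))
    L, S = np.zeros_like(D), np.zeros_like(D)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = U @ np.diag(soft(sig, 0.05 * sig[0])) @ Vt   # singular-value shrinkage
        S = soft(D - L, lam)                             # sparse (speckle) part
    return L, S

# Toy stack: 8 pre-registered B-scans of the same slice, vectorized as columns.
rng = np.random.default_rng(0)
clean = rng.uniform(size=(64, 64))
stack = np.stack([clean + 0.3 * rng.normal(size=clean.shape)
                  * (rng.random(clean.shape) < 0.2) for _ in range(8)])
D = stack.reshape(8, -1).T                               # pixels x B-scans

L, S = rpca(D)
low_rank_mean = L.mean(axis=1).reshape(64, 64)
denoised = median_filter(low_rank_mean, size=3)          # final median filtering
```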