Bioremediation was performed in situ at a former military range site to assess the performance of native bacteria in degrading hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) and 2,4-dinitrotoluene (2,4-DNT). The fate of these pollutants in soil and soil pore water was investigated as influenced by waste glycerol amendment to the soil. Following waste glycerol application, an accumulation of organic carbon promoted microbial activity, converting the organic carbon into acetate and propionate, which are intermediate compounds in anaerobic processes. This augmentation of anaerobic activity correlated strongly with a noticeable reduction in RDX concentrations in the amended soil. Changes in RDX concentrations in pore water were similar to those observed in the soil, suggesting that RDX leaching from the soil matrix, together with the waste glycerol treatment, contributed to the enhanced removal of RDX from both water and soil. This was not the case for 2,4-DNT, which was neither found in pore water nor affected by the waste glycerol treatment. Results from saturated-condition and Synthetic Precipitation Leaching Procedure testing, performed to investigate the environmental fate of 2,4-DNT, indicated that the 2,4-DNT found on site was relatively inert and likely to remain in its current state.
Intellectual disability (ID) is characterised by extreme genetic heterogeneity. Several hundred genes have been associated with monogenic forms of ID, considerably complicating molecular diagnostics. Trio-exome sequencing has recently been proposed as a diagnostic approach, yet it remains too costly for general implementation.
We report the alternative strategy of targeted high-throughput sequencing of 217 genes in which mutations had been reported in patients with ID or autism as the major clinical concern. We analysed 106 patients with ID of unknown aetiology following array-CGH analysis and other genetic investigations. Ninety per cent of these patients were male, and 75% were sporadic cases.
We identified 26 causative mutations: 16 in X-linked genes (ATRX, CUL4B, DMD, FMR1, HCFC1, IL1RAPL1, IQSEC2, KDM5C, MAOA, MECP2, SLC9A6, SLC16A2, PHF8) and 10 de novo in autosomal-dominant genes (DYRK1A, GRIN1, MED13L, TCF4, RAI1, SHANK3, SLC2A1, SYNGAP1). We also detected four possibly causative mutations (e.g., in NLGN3) requiring further investigation. We present detailed reasoning for assigning causality for each mutation, together with the associated patients' clinical information. Some genes were hit more than once in our cohort, suggesting that they correspond to more frequent ID-associated conditions (KDM5C, MECP2, DYRK1A, TCF4). We highlight some unexpected genotype-phenotype correlations, with causative mutations identified in genes associated with defined syndromes in patients deviating from the classic phenotype (DMD, TCF4, MECP2). We also provide additional supportive (HCFC1, MED13L) or unsupportive (SHROOM4, SRPX2) evidence for the implication of previously proposed candidate genes or mutations in cognitive disorders.
With a diagnostic yield of 25%, targeted sequencing appears relevant as a first-intention test for the diagnosis of ID; importantly, it will also contribute to a better understanding of the specific contribution of the many genes implicated in ID and autism.
Purpose
The aim of this report is to present a prototype augmented reality (AR) intra-operative brain imaging system. We present our experience of using this new neuronavigation system in neurovascular surgery and discuss the feasibility of this technology for aneurysms, arteriovenous malformations (AVMs), and arteriovenous fistulae (AVFs).
Methods
We developed an augmented reality system that uses an external camera to capture the live view of the patient on the operating room table and to merge this view with pre-operative volume-rendered vessels. We extensively tested the system in the laboratory and have used it in four surgical cases: one aneurysm, two AVMs, and one AVF.
Results
The developed AR neuronavigation system allows for precise patient-to-image registration and calibration of the camera, resulting in a well-aligned augmented reality view. Initial results suggest that augmented reality is useful for tailoring craniotomies, localizing vessels of interest, and planning resection corridors.
Conclusion
Augmented reality is a promising technology for neurovascular surgery. However, for more complex anomalies such as AVMs and AVFs, better visualization techniques that allow one to distinguish between arteries and veins and determine the absolute depth of a vessel of interest are needed.
User interaction has the potential to greatly facilitate the exploration and understanding of 3D medical images for diagnosis and treatment. However, in certain specialized environments such as the operating room (OR), technical and physical constraints, such as the need to enforce strict sterility rules, make interaction challenging. In this paper, we propose to facilitate the intraoperative exploration of angiographic volumes by leveraging the motion of a tracked surgical pointer, a tool that the surgeon already manipulates when using a navigation system in the OR. We designed and implemented three interactive rendering techniques based on this principle. The benefit of each technique is compared to its non-interactive counterpart in a psychophysics experiment in which 20 medical imaging experts were asked to perform a reaching/targeting task while visualizing a 3D volume of angiographic data. The study showed a significant improvement in the appreciation of local vascular structure when using dynamic techniques, with no negative impact on the appreciation of the global structure and only a marginal impact on execution speed. A qualitative evaluation of the different techniques showed a preference for dynamic chroma-depth, in accordance with the objective metrics, but a discrepancy between objective and subjective measures for dynamic aerial perspective and shading.
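To illustrate the chroma-depth encoding mentioned above (depth mapped to hue, with near structures rendered warm and far structures cool), here is a minimal sketch; the linear hue ramp and its red-to-blue endpoints are illustrative assumptions, not the authors' implementation:

```python
import colorsys

def chroma_depth_color(depth, d_min, d_max):
    """Map a depth value to an RGB colour following the chroma-depth
    convention: near structures red, far structures blue."""
    u = (depth - d_min) / (d_max - d_min)   # normalize depth to [0, 1]
    u = min(max(u, 0.0), 1.0)               # clamp out-of-range depths
    hue = u * 2.0 / 3.0                     # 0 = red (near) ... 2/3 = blue (far)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)
```

In a dynamic variant, `d_min`/`d_max` would be recentred on the tracked pointer's depth so the colour ramp highlights vessels near the tool tip.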
Purpose
Neuronavigation systems making use of augmented reality (AR) have been the focus of much research in the last couple of decades. In recent years, there has been considerable interest in using mobile devices for AR in the operating room (OR). We propose a complete system that performs real-time AR video augmentation on a mobile device in the context of image-guided neurosurgery.
Methods
MARIN (mobile augmented reality interactive neuronavigation system) improves upon the state of the art both in performance, enabling real-time augmentation, and in interactivity, allowing users to manipulate the displayed data. The system was tested in a user study with 17 subjects for qualitative and quantitative evaluation in the context of target localization, and was brought into the OR for preliminary feasibility tests, where qualitative feedback from surgeons was obtained.
Results
The results of the user study showed that MARIN performs significantly better in terms of both time (p < 0.0004) and accuracy (p < 0.04) for the task of target localization in comparison with a traditional image-guided neurosurgery (IGNS) navigation system. Further, MARIN AR visualization was found to be more intuitive and allowed users to estimate target depth more easily.
Conclusion
MARIN improves upon previously proposed mobile AR neuronavigation systems through its real-time performance, higher accuracy, full integration into the normal workflow, and greater interactivity and customizability of the displayed information. The improvement in efficiency and usability over previous systems will facilitate bringing AR into the OR.
Purpose
Navigation systems commonly used in neurosurgery suffer from two main drawbacks: (1) their accuracy degrades over the course of the operation and (2) they require the surgeon to mentally map images from the monitor to the patient. In this paper, we introduce the Intraoperative Brain Imaging System (IBIS), an open-source image-guided neurosurgery research platform that implements a novel workflow where navigation accuracy is improved using tracked intraoperative ultrasound (iUS) and the visualization of navigation information is facilitated through the use of augmented reality (AR).
Methods
The IBIS platform allows a surgeon to capture tracked iUS images and use them to automatically update preoperative patient models and plans through fast GPU-based reconstruction and registration methods. Navigation, resection and iUS-based brain shift correction can all be performed using an AR view. IBIS has an intuitive graphical user interface for the calibration of a US probe and a surgical pointer, as well as of video devices used for AR (e.g., a surgical microscope).
Results
The components of IBIS have been validated in the laboratory and evaluated in the operating room. Image-to-patient registration accuracy is on the order of 3.72 ± 1.27 mm and can be improved with iUS to a median target registration error of 2.54 mm. The accuracy of the US probe calibration is between 0.49 and 0.82 mm. The average reprojection error of the AR system is 0.37 ± 0.19 mm. The system has been used in the operating room for various types of surgery, including brain tumor resection, vascular neurosurgery, spine surgery and DBS electrode implantation.
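For context, the target registration error (TRE) reported above is the Euclidean distance between landmarks mapped through the registration and their true positions; a minimal sketch of how such summary statistics are computed (a generic illustration, not IBIS code):

```python
import numpy as np

def tre_stats(registered_pts, reference_pts):
    """Target registration error: per-landmark Euclidean distance between
    landmark positions after registration and their reference positions.
    Returns (median, mean) in the same units as the input coordinates."""
    a = np.asarray(registered_pts, dtype=float)
    b = np.asarray(reference_pts, dtype=float)
    d = np.linalg.norm(a - b, axis=1)   # one distance per landmark
    return float(np.median(d)), float(d.mean())
```

The median is typically reported alongside the mean because a few poorly localized landmarks can dominate the average.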
Conclusions
The IBIS platform is a validated system that allows researchers to quickly bring the results of their work into the operating room for evaluation. It is the first open-source navigation system to provide a complete solution for AR visualization.
Purpose
Accurate and effective registration of the vertebrae is crucial for spine surgical navigation procedures. Patient movement, surgical instrumentation or inadvertent contact with the tracked reference during the intervention may invalidate the registration, requiring a rapid correction of the misalignment. In this paper, we present a framework to rigidly align preoperative computed tomography (CT) with intra-operative ultrasound (iUS) images of a single vertebra.
Methods
We use a single caudo-cranial axial sweep procedure to acquire iUS images, from which the scan trajectory is exploited to initialize the registration transform. To refine the transform, locations of the posterior vertebra surface are first extracted, then used to compute the CT-to-iUS image intensity gradient-based alignment. The approach was validated on a lumbosacral section of a porcine cadaver.
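The core operation in this pipeline is a rigid alignment. As a self-contained illustration of rigid registration in general, here is a least-squares (Kabsch) solution for corresponding point sets; this is a generic sketch, not the authors' trajectory-initialized, intensity gradient-based method:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst via the
    Kabsch algorithm. src, dst: (N, 3) arrays of corresponding points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Demo: recover a known rotation and translation from noiseless points.
rng = np.random.default_rng(0)
pts = rng.normal(size=(50, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
moved = pts @ R_true.T + t_true
R, t = rigid_align(pts, moved)
err = np.abs(pts @ R.T + t - moved).max()
```

In practice, CT-to-iUS registration refines such a transform by optimizing an image similarity measure rather than from known point correspondences.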
Results
We achieved an overall median accuracy of 1.48 mm (success rate of 84.42%) in ∼11 s of computation time, satisfying the clinically accepted accuracy threshold of 2 mm.
Conclusion
Our approach using intra-operative ultrasound to register patient vertebral anatomy to preoperative images matches the clinical needs in terms of accuracy and computation time, facilitating its integration into the surgical workflow.