Peer-reviewed · Open access
  • 3-D Object Tracking in Pano...
    Marshall, M. R.; Hellfeld, D.; Joshi, T. H. Y.; Salathe, M.; Bandstra, M. S.; Bilton, K. J.; Cooper, R. J.; Curtis, J. C.; Negut, V.; Shurley, A. J.; Vetter, K.

    IEEE Transactions on Nuclear Science, 02/2021, Volume: 68, Issue: 2
    Journal Article

    Networked detector systems can be deployed in urban environments to aid in the detection and localization of radiological and/or nuclear material. However, effectively responding to and interpreting a radiological alarm using spectroscopic data alone may be hampered by a lack of situational awareness, particularly in complex environments. This study investigates the use of Light Detection and Ranging (LiDAR) and streaming video to enable real-time object detection and tracking, and the fusion of this tracking information with radiological data for the purposes of enhanced situational awareness and increased detection sensitivity. This work presents an object detection, tracking, and novel source-object attribution analysis that is capable of operating in real time. By implementing this analysis pipeline on a custom-developed system that comprises a static 2 in. × 4 in. × 16 in. NaI(Tl) detector colocated with a 64-beam LiDAR and four monocular cameras, we demonstrate the ability to accurately correlate trajectories from tracked objects to spectroscopic gamma-ray data in real time and use physics-based models to reliably discriminate between source-carrying and nonsource-carrying objects. In this work, we describe our approach in detail and present a quantitative performance assessment that characterizes the source-object attribution capabilities of both video and LiDAR. Additionally, we demonstrate the ability to simultaneously track pedestrians and vehicles in a mock urban environment and use this tracking information to improve both detection sensitivity and situational awareness using our contextual-radiological data fusion methodology.
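
    To make the physics-based attribution step concrete, the following is a minimal Python sketch of one plausible formulation: given a tracked object's distance to the detector over time and the measured gamma-ray count rates, fit an isotropic point-source model (1/r² falloff plus constant background) and use the fit residual as an attribution score. The model choice, function names, and toy trajectories below are assumptions for illustration only, not the authors' actual pipeline.

    import numpy as np

    def expected_rate(distances_m, source_strength, background_cps):
        # Expected count rate (cps) for a candidate point source carried
        # along the tracked trajectory: isotropic 1/r^2 falloff + background.
        # (Illustrative model, not the paper's implementation.)
        return source_strength / distances_m**2 + background_cps

    def attribution_score(observed_cps, distances_m):
        # The model is linear in its parameters: rate = s * (1/r^2) + b,
        # so (s, b) can be fit by ordinary linear least squares.
        design = np.column_stack([1.0 / distances_m**2,
                                  np.ones_like(distances_m)])
        params, *_ = np.linalg.lstsq(design, observed_cps, rcond=None)
        strength, background = params
        rss = float(np.sum((observed_cps - design @ params)**2))
        # Lower residual sum of squares = trajectory better explains the
        # radiological data, so this object is the more likely carrier.
        return rss, strength, background

    # Toy example: two tracked objects passing the detector, one carrying
    # a source (closest approach 3 m) and one bystander staying ~15 m away.
    t = np.linspace(0.0, 20.0, 200)                 # time steps, seconds
    r_carrier = np.hypot(3.0, 1.5 * (t - 10.0))     # carrier distance (m)
    r_bystander = np.hypot(15.0, 1.5 * (t - 10.0))  # bystander distance (m)
    rng = np.random.default_rng(0)
    counts = rng.poisson(expected_rate(r_carrier, 400.0, 50.0)).astype(float)

    for name, r in [("carrier", r_carrier), ("bystander", r_bystander)]:
        rss, s, b = attribution_score(counts, r)
        print(f"{name}: RSS={rss:.0f}, strength={s:.0f}, background={b:.1f}")

    Running this, the carrier trajectory yields a far smaller residual than the bystander's, which is the basic discrimination idea; the paper's real-time pipeline additionally handles multiple simultaneous tracks, spectroscopic (not just gross-count) data, and detector response effects that this sketch omits.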