Peer reviewed · Open access
  • MAT: Motion-aware multi-object tracking
    Han, Shoudong; Huang, Piao; Wang, Hongwei; Yu, En; Liu, Donghaisheng; Pan, Xiaofeng

    Neurocomputing (Amsterdam), 03/2022, Volume: 476
    Journal Article

    Modern multi-object tracking (MOT) systems usually build trajectories by associating per-frame detections. However, in the face of camera motion, fast motion, and occlusion, it is difficult to ensure the quality of long-range tracking, or even tracklet purity, especially for small objects. Most tracking frameworks depend heavily on re-identification (ReID) for data association. Unfortunately, ReID-based association is not only unreliable and time-consuming, it also cannot recover the false negatives for occluded and blurred objects, owing to noisy partial detections, similar appearances, and a lack of temporal-spatial constraints. In this paper, we propose an enhanced MOT paradigm, the Motion-Aware Tracker (MAT). MAT is a plug-and-play solution that focuses on high-performance motion-based prediction, reconnection, and association. First, nonrigid pedestrian motion and rigid camera motion are blended seamlessly in the Integrated Motion Localization (IML) module. Second, the Dynamic Reconnection Context (DRC) module is devised to guarantee robustness for long-range motion-based reconnection; its core ideas are a motion-based dynamic window and a cyclic pseudo-observation trajectory-filling strategy, which smoothly fill in tracking fragments caused by occlusion or blur. Finally, we present the 3D Integral Image (3DII) module, which efficiently cuts off useless track-detection association connections using temporal-spatial constraints. Extensive experiments are conducted on the challenging MOT16 and MOT17 benchmarks. The results demonstrate that MAT achieves superior performance and surpasses other state-of-the-art trackers by a large margin, with high efficiency.
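
    The abstract describes IML only at a high level. As a rough illustration of the general idea of blending nonrigid per-object motion with rigid camera motion, the sketch below pairs a constant-velocity box prediction with a global camera warp estimated via OpenCV's ECC alignment; the helper names and the specific choice of findTransformECC are assumptions for illustration, not the paper's actual formulation.

    ```python
    import cv2
    import numpy as np

    def estimate_camera_warp(prev_gray, curr_gray):
        # Rigid camera motion as a 2x3 Euclidean warp between consecutive grayscale frames.
        warp = np.eye(2, 3, dtype=np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)
        try:
            _, warp = cv2.findTransformECC(prev_gray, curr_gray, warp,
                                           cv2.MOTION_EUCLIDEAN, criteria)
        except cv2.error:
            pass  # keep the identity warp if ECC does not converge
        return warp

    def predict_box(box, velocity):
        # Constant-velocity prediction of a [x1, y1, x2, y2] box (nonrigid object motion).
        return box + np.concatenate([velocity, velocity])

    def compensate_box(box, warp):
        # Re-express the predicted box in the current frame by warping its two corners.
        corners = np.array([[box[0], box[1]], [box[2], box[3]]], dtype=np.float32)
        return cv2.transform(corners[None], warp)[0].reshape(-1)
    ```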
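
    The DRC module's cyclic pseudo-observation trajectory filling is likewise only named here. A minimal stand-in for the general idea of filling an occlusion gap with synthesized observations is linear interpolation between the last box before the gap and the first box after it; the function below is an illustrative placeholder, not the paper's strategy.

    ```python
    import numpy as np

    def fill_fragment(track, t_lost, t_found):
        # track: dict mapping frame index -> np.array([x1, y1, x2, y2]).
        # Insert pseudo-observations for the frames missed between t_lost and t_found.
        box_a, box_b = track[t_lost], track[t_found]
        for t in range(t_lost + 1, t_found):
            alpha = (t - t_lost) / (t_found - t_lost)
            track[t] = (1.0 - alpha) * box_a + alpha * box_b
    ```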
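
    For the 3DII module, the sketch below shows the textbook 3-D integral image (summed-area volume) trick the name suggests: it answers "how many detections fall inside this space-time box?" in constant time, so track-detection pairs outside a track's temporal-spatial neighbourhood can be discarded before any costly affinity is computed. The grid layout and function names are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def build_integral_volume(det_cells, shape):
        # det_cells: iterable of (t, y, x) grid cells of detection centers; shape: (T, H, W).
        occ = np.zeros(shape, dtype=np.int32)
        for t, y, x in det_cells:
            occ[t, y, x] += 1
        iv = occ.cumsum(0).cumsum(1).cumsum(2)
        # Zero-pad the lowest index of each axis so queries need no boundary checks.
        return np.pad(iv, ((1, 0), (1, 0), (1, 0)))

    def count_in_box(iv, t0, t1, y0, y1, x0, x1):
        # Detections inside the half-open cell box [t0,t1) x [y0,y1) x [x0,x1), in O(1)
        # via 3-D inclusion-exclusion on the padded integral volume.
        return (iv[t1, y1, x1] - iv[t0, y1, x1] - iv[t1, y0, x1] - iv[t1, y1, x0]
                + iv[t0, y0, x1] + iv[t0, y1, x0] + iv[t1, y0, x0] - iv[t0, y0, x0])
    ```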