  • Selective memory: Recalling...
    MacTavish, Kirk; Paton, Michael; Barfoot, Timothy D.

    Journal of Field Robotics, December 2018, Volume 35, Issue 8
    Journal Article

    Visual navigation is a key enabling technology for autonomous mobile vehicles. The ability to provide large-scale, long-term navigation using low-cost, low-power vision sensors is appealing for industrial applications. A crucial requirement for long-term navigation systems is the ability to localize in environments whose appearance is constantly changing over time—due to lighting, weather, seasons, and physical changes. This paper presents a multiexperience localization (MEL) system that uses a powerful map representation—storing every visual experience in layers—that makes no assumptions about the underlying appearance modalities and generators. Our localization system provides real-time performance by selecting, online, a subset of experiences against which to localize. We achieve this through a novel experience-triage algorithm based on collaborative filtering, which selects experiences relevant to the live view and outperforms competing techniques. Based on classical memory-based recommender systems, this technique also enables landmark-level recommendations, is entirely online, and requires no training data. We demonstrate the capabilities of the MEL system in the context of long-term autonomous path following in unstructured outdoor environments with a challenging 100-day field experiment through day, night, snow, spring, and summer. We furthermore provide an offline analysis comparing our system to several state-of-the-art alternatives. We show that the combination of the novel methods presented in this paper enables full use of incredibly rich multiexperience maps, opening the door to robust long-term visual localization.
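
    The abstract describes experience triage as a memory-based collaborative-filtering problem: stored experiences play the role of "users," landmarks the role of "items," and the live view's landmark matches form a partial rating vector against which similar experiences are recommended. The sketch below is a generic illustration of that idea, not the paper's actual algorithm; the binary observation matrix, the cosine-similarity scoring, and the function name `triage_experiences` are all assumptions made for the example.

    ```python
    import numpy as np

    def triage_experiences(obs, live, k):
        """Rank stored experiences by cosine similarity to the live view's
        landmark matches and return the indices of the top-k experiences.

        obs:  (n_experiences, n_landmarks) binary matrix; obs[e, l] = 1 if
              experience e observed landmark l (hypothetical representation).
        live: (n_landmarks,) binary vector of landmarks matched so far in
              the live view.
        """
        norms = np.linalg.norm(obs, axis=1) * np.linalg.norm(live)
        # Guard against zero-norm rows to avoid division by zero.
        sims = obs @ live / np.where(norms == 0.0, 1.0, norms)
        return np.argsort(sims)[::-1][:k]

    # Toy example: three stored experiences observing six landmarks.
    obs = np.array([[1, 1, 1, 0, 0, 0],
                    [0, 0, 1, 1, 1, 0],
                    [1, 0, 0, 0, 1, 1]], dtype=float)
    # The live view has matched landmarks 0 and 1.
    live = np.array([1, 1, 0, 0, 0, 0], dtype=float)
    print(triage_experiences(obs, live, k=2))  # → [0 2]
    ```

    Only the k highest-scoring experiences would then be loaded for localization, which is what keeps the multiexperience map usable in real time.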