Article

NeuroBayesSLAM: Neurobiologically inspired Bayesian integration of multisensory information for robot navigation

Journal

NEURAL NETWORKS
Volume 126, Pages 21-35

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2020.02.023

Keywords

Bayesian; Multisensory integration; Attractor dynamics; Head direction cells; Grid cells; Monocular SLAM

Funding

  1. National Key Research and Development Program of China [2016YFC0801808]
  2. Natural Science Foundation of China [51679213]
  3. CAS Pioneer Hundred Talents Program, China [Y8F1160101]
  4. State Key Laboratory of Robotics, China [Y7C120E101]

Abstract

Spatial navigation depends on the combination of multiple sensory cues from idiothetic and allothetic sources. The computational mechanisms by which mammalian brains integrate different sensory modalities under uncertainty are instructive for robot navigation. We propose a Bayesian attractor network model, inspired by the spatial memory systems of mammalian brains, that integrates visual and vestibular inputs. In the model, the pose of the robot is encoded separately by two sub-networks: a head direction network for orientation representation and a grid cell network for position representation, using neural codes similar to those of head direction cells and grid cells observed in mammalian brains. The neural codes in each sub-network are updated in a Bayesian manner by a population of integrator cells that integrate the vestibular cue and a population of calibration cells that calibrate against the visual cue. Conflict between the vestibular and visual cues is resolved by competitive dynamics between the two populations. The model, implemented on a monocular visual simultaneous localization and mapping (SLAM) system termed NeuroBayesSLAM, successfully builds semi-metric topological maps and self-localizes in outdoor and indoor environments with different characteristics, achieving performance comparable to previous neurobiologically inspired navigation systems at much lower computational complexity. The proposed multisensory integration method constitutes a concise yet robust and biologically plausible approach to robot navigation in large environments, and it provides a viable Bayesian mechanism for multisensory integration that may pertain to neural subsystems beyond spatial cognition. (c) 2020 Elsevier Ltd. All rights reserved.
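
The cue-integration scheme described in the abstract can be illustrated with a minimal sketch: a one-dimensional ring attractor encoding head direction, whose activity bump is shifted by a vestibular (angular velocity) input and then nudged toward a visual bearing estimate with a precision-based weight. All names, parameter values, and the specific weighting rule below are illustrative assumptions for exposition; they do not reproduce the NeuroBayesSLAM implementation.

```python
import numpy as np

# Minimal ring-attractor sketch of head-direction cue integration.
# Parameters and the precision-weighted update are illustrative
# assumptions, not the published NeuroBayesSLAM model.

N = 100                                                 # head-direction cells
prefs = np.linspace(0, 2 * np.pi, N, endpoint=False)   # preferred directions

def bump(center, kappa=8.0):
    """Von Mises-shaped activity bump centered at `center` (radians)."""
    a = np.exp(kappa * np.cos(prefs - center))
    return a / a.sum()

def decode(activity):
    """Population-vector decoding of the bump's angular position."""
    return np.angle(np.sum(activity * np.exp(1j * prefs)))

def integrate_step(activity, ang_vel, dt, visual_angle=None,
                   sigma_vest=0.05, sigma_vis=0.02):
    """One update: vestibular path integration plus optional visual calibration.

    The bump is first shifted by the integrated angular velocity
    (integrator-cell role), then pulled toward the visual estimate with a
    precision weight w = sigma_vest^2 / (sigma_vest^2 + sigma_vis^2)
    (calibration-cell role).
    """
    theta = decode(activity) + ang_vel * dt             # idiothetic update
    if visual_angle is not None:                        # allothetic calibration
        w = sigma_vest**2 / (sigma_vest**2 + sigma_vis**2)
        err = np.angle(np.exp(1j * (visual_angle - theta)))  # wrapped error
        theta = theta + w * err
    return bump(theta)

# Example: heading drifts under noisy vestibular input and is corrected
# by an occasional visual fix at the true heading.
activity = bump(0.0)
dt = 0.05
for t in range(200):
    noisy_vel = 0.1 + np.random.normal(0, 0.02)
    visual = 0.1 * dt * (t + 1) if t % 50 == 49 else None
    activity = integrate_step(activity, noisy_vel, dt, visual_angle=visual)
print("decoded heading (rad):", decode(activity))
```

In this sketch the reliability-weighted correction stands in for the competition between the integrator and calibration populations: the less noisy cue dominates the bump's final position, which is the qualitative behavior the paper attributes to its competitive dynamics.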
