Aleke Nolte, BCCN Berlin / TU Berlin

Hebbian RatSLAM with deep-learned features

In visual SLAM, RatSLAM, a rat-brain-inspired localization and mapping algorithm, has made a name for itself by mapping a whole suburb using only the input from a single webcam. The algorithm performs well but relies on a hand-engineered image-processing procedure to recognize previously visited places. While quite robust overall, this procedure still needs to be adjusted manually for new environments. To address this, and to work towards making RatSLAM adaptable to unknown environments, we extend RatSLAM with two components that can adapt to their inputs. Specifically, following the idea of bio-inspiration, we investigate 1) whether the hand-engineered image processing can be replaced by features extracted by a pre-trained deep neural network, and 2) whether a variant of Hebbian learning can be used to associate feature representations with places. To evaluate the feasibility of the proposed Hebbian extension, and to compare the impact of the deep-learned features against RatSLAM's original image-processing procedure, we perform a number of simplified mapping experiments in a simulated but visually complex environment. Evaluating the mapping performance, we find that the Hebbian extension to RatSLAM allows constructing good maps of the environment, provided that the feature representations used encode the visual inputs sensibly. Further, when the viewpoint of the on-board camera varies on revisits to a place, the results show that the proposed deep-learned features yield better maps than RatSLAM's original image processing, pointing towards the potential of deep feature learning for localization and mapping tasks.
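The Hebbian association of feature representations with places mentioned above can be illustrated with a minimal outer-product Hebbian rule. This is a generic sketch of the idea, not the specific formulation used in the thesis; the dimensions and learning rate are arbitrary.

```python
import numpy as np

def hebbian_update(W, features, place, lr=0.1):
    """One Hebbian step: strengthen weights between co-active
    feature units and place units (outer-product rule)."""
    return W + lr * np.outer(place, features)

# Toy example: a 4-dim feature vector associated with 3 place units.
rng = np.random.default_rng(0)
W = np.zeros((3, 4))           # place-to-feature association weights
feat = rng.random(4)           # deep-learned feature vector (stand-in)
place = np.array([0.0, 1.0, 0.0])  # place unit 1 is currently active

W = hebbian_update(W, feat, place)

# Recall: presenting the same features reactivates the associated place.
recalled = W @ feat
best_place = int(np.argmax(recalled))
```

After a single update, only the row of `W` belonging to the active place unit is nonzero, so presenting the same (or a similar) feature vector recalls that place most strongly.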

Additional Information

MSc defence in the International Master Program Computational Neuroscience

Organized by

Verena Hafner / Robert Martin
