Abstract

In this paper we propose an affordable solution to self-localization, which utilizes visual odometry and road maps as the only inputs. To this end, we present a probabilistic model as well as an efficient approximate inference algorithm, which is able to utilize distributed computation to meet the real-time requirements of autonomous systems. Because of the probabilistic nature of the model, we are able to cope with uncertainty due to noisy visual odometry and inherent ambiguities in the map (e.g., in a Manhattan world). By exploiting freely available, community-developed maps and visual odometry measurements, we are able to localize a vehicle to within 3 m after only a few seconds of driving, on maps which contain more than 2,150 km of drivable roads.
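To make the idea concrete, here is a minimal illustrative sketch (not the authors' actual model or inference algorithm) of how map geometry plus odometry alone can resolve an initially uniform position belief: a histogram filter over discrete road positions, where each position's likelihood is weighted by how well the map's local turn angle matches the heading change reported by odometry. The toy map, the one-cell-per-step motion model, and the noise parameter are all assumptions made for this example.

```python
import math

# Toy "map": heading change (degrees) at each discrete road position.
# A left turn at cell 3 and a right turn at cell 8 break the ambiguity.
MAP_TURNS = [0, 0, 0, 90, 0, 0, 0, 0, -90, 0, 0, 0]

def normalize(b):
    z = sum(b)
    return [p / z for p in b]

def step(belief, obs_turn, turn_sigma=15.0):
    """One predict/update cycle of a discrete Bayes (histogram) filter."""
    n = len(belief)
    # Predict: assume the vehicle advances exactly one cell per step
    # (a simplification; real odometry would give a noisy displacement).
    pred = [belief[(i - 1) % n] for i in range(n)]
    # Update: weight each position by the agreement between the map's
    # turn angle there and the heading change measured by odometry.
    post = [p * math.exp(-0.5 * ((MAP_TURNS[i] - obs_turn) / turn_sigma) ** 2)
            for i, p in enumerate(pred)]
    return normalize(post)

belief = normalize([1.0] * len(MAP_TURNS))  # start completely lost
for obs in [0, 0, 92, 0]:  # drive straight, straight, turn left, straight
    belief = step(belief, obs)

mode = max(range(len(belief)), key=lambda i: belief[i])
print(mode)  # belief collapses onto cell 4 (one cell past the left turn)
```

The key behavior this toy example shares with the paper's setting is that no GPS or appearance-based observation is used: the belief collapses purely because only one place in the map is consistent with the driven trajectory. The paper's actual model operates on real road graphs and handles far larger maps with approximate inference.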

Keywords

Odometry, Visual odometry, Probabilistic logic, Computer science, Artificial intelligence, Computer vision, Computation, Simultaneous localization and mapping, Inference, Statistical model, Mobile robot, Robot, Algorithm

Publication Info

Year: 2013
Type: article
Pages: 3057-3064
Citations: 146
Access: Closed

Citation Metrics

146 citations (OpenAlex)

Cite This

Marcus A. Brubaker, Andreas Geiger, Raquel Urtasun (2013). Lost! Leveraging the Crowd for Probabilistic Visual Self-Localization. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 3057-3064. https://doi.org/10.1109/cvpr.2013.393

Identifiers

DOI
10.1109/cvpr.2013.393