Abstract

In this paper, we present a novel benchmark for the evaluation of RGB-D SLAM systems. We recorded a large set of image sequences from a Microsoft Kinect with highly accurate and time-synchronized ground-truth camera poses from a motion-capture system. The sequences contain both the color and depth images in full sensor resolution (640 × 480) at video frame rate (30 Hz). The ground-truth trajectory was obtained from a motion-capture system with eight high-speed tracking cameras (100 Hz). The dataset consists of 39 sequences that were recorded in an office environment and an industrial hall. The dataset covers a large variety of scenes and camera motions. We provide sequences for debugging with slow motions as well as longer trajectories with and without loop closures. Most sequences were recorded from a handheld Kinect with unconstrained 6-DOF motions, but we also provide sequences from a Kinect mounted on a Pioneer 3 robot that was manually navigated through a cluttered indoor environment. To stimulate the comparison of different approaches, we provide automatic evaluation tools both for the evaluation of drift of visual odometry systems and the global pose error of SLAM systems. The benchmark website [1] contains all data, detailed descriptions of the scenes, specifications of the data formats, sample code, and evaluation tools.
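The global pose error mentioned above is commonly reported as the absolute trajectory error (ATE): the estimated trajectory is rigidly aligned to the ground truth by a least-squares fit, and the RMSE of the remaining translational residuals is taken. As a minimal sketch of that idea (not the benchmark's official evaluation script; function and variable names are illustrative), assuming the timestamps of the two trajectories have already been associated:

```python
import numpy as np

def absolute_trajectory_error(gt, est):
    """RMSE of the absolute trajectory error (ATE) between two
    time-associated 3-D position sequences. The estimate is first
    aligned to the ground truth with a least-squares rigid-body
    rotation and translation (Kabsch/Umeyama), since a SLAM estimate
    is only defined up to a global rigid motion.

    gt, est : (N, 3) arrays of corresponding camera positions.
    """
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)

    # Center both trajectories on their centroids.
    gt_mean, est_mean = gt.mean(axis=0), est.mean(axis=0)
    gt_c, est_c = gt - gt_mean, est - est_mean

    # Optimal rotation R with R @ est_c[i] ~= gt_c[i] via SVD (Kabsch).
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Apply the alignment and measure the per-pose translational error.
    est_aligned = est_c @ R.T + gt_mean
    residuals = gt - est_aligned
    return float(np.sqrt(np.mean(np.sum(residuals**2, axis=1))))
```

The drift metric for visual odometry (relative pose error, RPE) instead compares relative motions over a fixed time interval, so local drift is measured without any global alignment.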

Keywords

Computer vision, Computer science, Artificial intelligence, Benchmark (surveying), RGB color model, Ground truth, Visual odometry, Simultaneous localization and mapping, Trajectory, Tracing, Odometry, Pixel, Frame (networking), Motion capture, Frame rate, Pose, Mobile robot, Robot, Motion (physics)

Publication Info

Year: 2012
Type: article
Pages: 573-580
Citations: 3714
Access: Closed

Citation Metrics

OpenAlex: 3714
Influential: 687
CrossRef: 3027

Cite This

Jürgen Sturm, Nikolas Engelhard, Felix Endres et al. (2012). A benchmark for the evaluation of RGB-D SLAM systems. 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, 573-580. https://doi.org/10.1109/iros.2012.6385773

Identifiers

DOI
10.1109/iros.2012.6385773
