Robotics

Enhancing RGB-D SLAM Performances Considering Sensor Specifications for Indoor Localization

Published in IEEE Sensors Journal

Authors: Imad El Bouazzaoui, Sergio Alberto Rodriguez Florez, Abdelhafid El Ouardi

Simultaneous Localization and Mapping (SLAM) has been studied for more than a decade to meet the need of robots to navigate in unknown environments. SLAM is an essential perception functionality in many applications, especially in robotics and autonomous vehicles. RGB-D cameras are among the sensors commonly used with recent SLAM algorithms. They provide an RGB image and the associated depth map, making it possible to resolve scale drift with less complexity and to create a dense 3D representation of the environment. Many RGB-D SLAM algorithms have been studied and evaluated on publicly available datasets without considering sensor specifications or image acquisition modes that could improve or degrade localization accuracy. In this work, we address indoor localization while taking sensor specifications into account. In this context, our contribution is twofold: an in-depth experimental study highlighting the impact of sensor acquisition modes on localization accuracy, and a parametric optimization protocol for precise localization in a given environment. Furthermore, we apply the proposed protocol to optimize a depth-related parameter of the SLAM algorithm. The study is based on a publicly available indoor dataset acquired with a depth sensor. The analysis of the reconstruction results is based on several metrics involving translational and rotational errors. These errors are compared with those obtained with a state-of-the-art stereo vision-based SLAM algorithm.
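The abstract refers to metrics involving translational errors; a commonly used instance in SLAM evaluation is the root-mean-square translational error between an estimated trajectory and ground truth. The sketch below is illustrative only (the paper does not specify its evaluation code) and assumes the two trajectories are already time-synchronized and expressed in the same frame:

```python
import numpy as np

def translational_rmse(gt, est):
    """Root-mean-square translational error between two time-aligned
    trajectories, each given as an (N, 3) array of x, y, z positions."""
    diff = gt - est
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# Toy example: estimated trajectory offset from ground truth
# by 0.1 m along the x axis at every pose.
gt = np.array([[0.0, 0.0, 0.0],
               [1.0, 0.0, 0.0],
               [2.0, 0.0, 0.0]])
est = gt + np.array([0.1, 0.0, 0.0])
print(translational_rmse(gt, est))  # ≈ 0.1
```

In practice, benchmark tools first align the estimated trajectory to the ground truth (e.g., with a rigid-body least-squares fit) before computing such errors, so that a global pose offset does not dominate the metric.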