
An Adaptive Augmented Vision-based Ellipsoidal SLAM for Indoor Environments

Date created
2019-06-21
Authors/Contributors
Lahemer & Rad
Abstract
In this paper, the problem of Simultaneous Localization And Mapping (SLAM) is addressed via a novel augmented landmark vision-based ellipsoidal SLAM. The algorithm is implemented on a NAO humanoid robot and is tested in an indoor environment. The main feature of the system is the implementation of SLAM with a monocular vision system. Distinct landmarks, referred to as NAOmarks, are employed to localize the robot via its monocular vision system. We introduce the notion of robotic augmented reality (RAR) and present a monocular Extended Kalman Filter (EKF)/ellipsoidal SLAM that improves performance, alleviates the computational effort, provides landmark identification, and simplifies the data association problem. The proposed SLAM algorithm is implemented in real time to further calibrate the ellipsoidal SLAM parameters and noise bounds and to improve its overall accuracy. The augmented EKF/ellipsoidal SLAM algorithms are compared with the regular EKF/ellipsoidal SLAM methods, and the merits of each algorithm are also discussed in the paper. The real-time experimental and simulation studies suggest that the adaptive augmented ellipsoidal SLAM is more accurate than the conventional EKF/ellipsoidal SLAMs.
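As context for the abstract, the sketch below shows a standard EKF-SLAM measurement update for a single, uniquely identified point landmark; unique markers such as NAOmarks simplify exactly this step by removing the data-association search. It is a minimal illustration under assumed conventions (planar robot pose [x, y, theta], range-bearing observations, NumPy), not the authors' implementation and not the paper's ellipsoidal formulation.

```python
# Minimal EKF-SLAM landmark update sketch (illustrative only, not the authors' code).
# Assumed state layout: [x, y, theta, l1x, l1y, l2x, l2y, ...]; the landmark index
# is known from its marker ID, so no data-association search is needed.
import numpy as np

def ekf_landmark_update(mu, Sigma, z, landmark_idx, R):
    """One EKF measurement update for a single identified landmark.

    mu          : state mean vector
    Sigma       : state covariance, shape (len(mu), len(mu))
    z           : observed [range, bearing] to the landmark
    landmark_idx: 0-based landmark index, known from its marker ID
    R           : 2x2 observation noise covariance
    """
    x, y, theta = mu[0], mu[1], mu[2]
    j = 3 + 2 * landmark_idx
    lx, ly = mu[j], mu[j + 1]

    dx, dy = lx - x, ly - y
    q = dx**2 + dy**2
    r = np.sqrt(q)

    # Predicted range-bearing measurement from the current state estimate.
    z_hat = np.array([r, np.arctan2(dy, dx) - theta])

    # Measurement Jacobian wrt the full state (sparse: robot pose + one landmark).
    H = np.zeros((2, len(mu)))
    H[:, 0:3] = np.array([[-dx / r, -dy / r,  0.0],
                          [ dy / q, -dx / q, -1.0]])
    H[:, j:j + 2] = np.array([[ dx / r,  dy / r],
                              [-dy / q,  dx / q]])

    # Standard EKF correction.
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    innov = z - z_hat
    innov[1] = (innov[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing to [-pi, pi)
    mu_new = mu + K @ innov
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_new, Sigma_new
```

The ellipsoidal variant discussed in the paper replaces this probabilistic correction with set-membership (bounded-noise) updates, but the landmark-indexing structure shown here is the part that unique visual markers make straightforward.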
Document
Published as
Lahemer & Rad (2019). An Adaptive Augmented Vision-Based Ellipsoidal SLAM for Indoor Environments. Sensors, 19(12), 2795. https://doi.org/10.3390/s19122795
Publication title
Sensors
Document title
An Adaptive Augmented Vision-Based Ellipsoidal SLAM for Indoor Environments
Date
2019
Volume
19
Issue
12
Article number
2795
Publisher DOI
10.3390/s19122795
Copyright statement
Copyright is held by the author(s).
Peer reviewed?
Yes
Language
English
Download file
sensors-19-02795-v3.pdf (6.25 MB)
