Author: Zou, Yajing
Title: RGB-D SLAM for indoor mobile platform based on hybrid feature fusion and wheel odometer integration
Advisors: Chen, Wu (LSGI)
Degree: Ph.D.
Year: 2022
Subject: Wireless localization
Mappings (Mathematics)
Hong Kong Polytechnic University -- Dissertations
Department: Department of Land Surveying and Geo-Informatics
Pages: xii, 150 pages : color illustrations
Language: English
Abstract: Self-tracking and scene reconstruction are crucial for mobile platform navigation in unknown indoor environments. The Red-Green-Blue-Depth (RGB-D) camera is an ideal onboard sensor for a mobile platform because of its small size, light weight, and low price. However, the performance of RGB-D simultaneous localization and mapping (SLAM) degrades for two main reasons: (a) the SLAM system is prone to tracking loss in low-textured scenes, and (b) the accuracy is inadequate for mobile platform navigation because of accumulating drift. This thesis addresses the first problem by hybrid feature fusion and the second by wheel odometer integration, thereby improving the continuity and accuracy of the RGB-D SLAM system on the mobile platform.
Firstly, a new RGB-D SLAM method fusing point and line features is proposed. While previous line-based methods use either 3D-3D or 3D-2D line correspondences, the new method combines both and can therefore exploit more line information. It is evaluated on the Technical University of Munich (TUM) RGB-D datasets and in real-world experiments. Experimental results show that the proposed method yields better continuity than state-of-the-art (SOTA) methods. In addition, it improves the localization accuracy of the method using 3D line features by 22.5% and the mapping accuracy by 10.2%; the improvements over the method using 2D line features are 25.8% and 14.7% in localization and mapping accuracy, respectively.
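To illustrate how 3D-3D and 3D-2D line correspondences can both contribute residuals to a SLAM cost function, the following minimal sketch is provided. It is not taken from the thesis; the endpoint parameterization, function names, and the normalized image-line representation are assumptions made for illustration only.

    import numpy as np

    def transform(T, p):
        """Apply a 4x4 rigid transform T to a 3D point p."""
        return T[:3, :3] @ p + T[:3, 3]

    def residual_3d3d(T, p_a, p_b, q_a, q_b):
        """3D-3D line term: distances of the transformed endpoints (p_a, p_b)
        from the matched 3D line passing through (q_a, q_b)."""
        d = (q_b - q_a) / np.linalg.norm(q_b - q_a)   # unit line direction
        def dist(p):
            v = transform(T, p) - q_a
            return np.linalg.norm(v - (v @ d) * d)    # orthogonal component
        return np.array([dist(p_a), dist(p_b)])

    def residual_3d2d(T, K, p_a, p_b, line_2d):
        """3D-2D line term: signed distances of the projected endpoints from the
        image line l = (a, b, c), normalized so that a^2 + b^2 = 1."""
        def proj(p):
            x = K @ transform(T, p)
            return np.array([x[0] / x[2], x[1] / x[2], 1.0])
        return np.array([line_2d @ proj(p_a), line_2d @ proj(p_b)])

Stacking both kinds of residuals in one least-squares problem is what allows lines observed with and without reliable depth to constrain the camera pose together.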
Secondly, a new RGB-D SLAM method fusing point and plane features is proposed. While previous plane-based methods assign empirical weights to plane features, the new method derives analytical covariances through plane fitting and covariance propagation. Point and plane features are optimally combined in a cost function weighted by the derived covariances. Furthermore, a new representation of plane features is developed based on the parallel and perpendicular relationships among planes; it encodes the structural regularity of indoor scenes and is further exploited in factor graph optimization. Experiments on the TUM RGB-D datasets show that the proposed method yields better continuity than feature point-based methods. In the lab room experiment, the proposed method improves the localization accuracy by 23.6% using the analytical covariances and by 27.6% using the new representation. In the corridor experiment, the improvements in mapping accuracy are 11.5% with the analytical covariances and 8.8% with the new representation.
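To make the weighting idea concrete, the sketch below fits a plane to a patch of depth points and produces a rough covariance for its parameters, so the plane term can be weighted by an information matrix rather than a hand-tuned weight. This is only a first-order approximation under assumed isotropic noise, not the thesis's analytical derivation; the cross terms between normal and offset are neglected.

    import numpy as np

    def fit_plane_with_covariance(points):
        """points: (N, 3) array. Returns unit normal n, offset d (n·x + d = 0),
        and an approximate 4x4 covariance of (n, d) from the fit residuals."""
        c = points.mean(axis=0)
        Q = points - c
        _, s, vt = np.linalg.svd(Q, full_matrices=False)
        n = vt[-1]                                # direction of least scatter
        d = -n @ c
        sigma2 = (Q @ n).var()                    # point-to-plane noise level
        # Normal uncertainty grows as the in-plane spread (singular values) shrinks.
        J = np.diag([1.0 / max(s[0]**2, 1e-9),
                     1.0 / max(s[1]**2, 1e-9),
                     0.0])
        cov_n = sigma2 * vt.T @ J @ vt
        cov = np.zeros((4, 4))
        cov[:3, :3] = cov_n
        cov[3, 3] = sigma2 / len(points) + c @ cov_n @ c   # rough offset variance
        return n, d, cov

A small, noisy patch thus contributes a weak plane constraint, while a large, clean wall contributes a strong one, which is the effect the analytical covariances are meant to capture.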
Thirdly, a new localization and mapping method that tightly couples the RGB-D camera and the wheel odometer is proposed. Previous methods assume the platform moves on a perfectly flat floor, which is impractical and may lead to non-optimal estimation results. To avoid this drawback, the new method adopts a soft assumption that the platform moves with small perturbations caused by uneven terrain and develops a two-step strategy to handle them: (a) a Mahalanobis distance test is applied to examine the motion assumption, and (b) the ground plane is detected to constrain the mobile platform. Moreover, the visual and wheel odometer constraints are tightly coupled in a new factor graph. The proposed method is evaluated in two real-world experiments, in a lab room and a corridor, respectively. Compared with the previous loosely coupled method using a hard planar motion assumption, it improves the localization accuracy by 40.7% and the mapping accuracy by 33.8%.
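A minimal sketch of such a Mahalanobis gate is shown below, assuming the planar motion increment is parameterized as (dx, dy, dyaw); the function name, parameterization, and chi-square threshold are illustrative assumptions, not the thesis implementation.

    import numpy as np
    from scipy.stats import chi2

    def passes_planar_check(delta_visual, delta_wheel, cov, alpha=0.05):
        """Gate the planar motion assumption: compare the visually estimated and
        wheel-odometer motion increments under their difference covariance."""
        r = np.asarray(delta_visual) - np.asarray(delta_wheel)
        m2 = r @ np.linalg.solve(cov, r)              # squared Mahalanobis distance
        return m2 < chi2.ppf(1.0 - alpha, df=len(r))  # accept if within the gate

When the test fails, the increment is treated as perturbed, and the detected ground plane constraint takes over instead of the hard planar assumption.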
Finally, based on the algorithms developed in this study, a comprehensive real-time RGB-D SLAM system is developed for mobile platform navigation. Point, line, and plane features are fused simultaneously, and the hybrid features are combined with the wheel odometer under the soft planar motion assumption. In real-world experiments, compared with the feature point-based system, the proposed system improves the localization and mapping accuracies by 70.1% and 75.9%, respectively; combining the wheel odometer improves these accuracies by 66.3% and 72.1%, fusing points, lines, and planes improves them by 57.2% and 62.6%, fusing plane features improves them by 53.8% and 55.6%, and the smallest improvements, 33.6% and 39.1%, come from fusing line features alone.
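Conceptually, the comprehensive system minimizes one joint cost over all factor types. The sketch below uses hypothetical helper names (residual functions and information matrices are placeholders) purely to show how the weighted terms combine; it is not the system's actual optimizer.

    def joint_cost(T, point_terms, line_terms, plane_terms, odom_terms):
        """Each term is a (residual_fn, information) pair; the odometer terms are
        included only after passing the planar-motion gate."""
        total = 0.0
        for residual_fn, info in point_terms + line_terms + plane_terms + odom_terms:
            r = residual_fn(T)
            total += float(r @ info @ r)              # weighted squared residual
        return total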
Rights: All rights reserved
Access: open access

Files in This Item:
File: 6399.pdf
Description: For All Users
Size: 4.04 MB
Format: Adobe PDF



Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/12020