Full metadata record
DC Field | Value | Language
dc.contributor | Department of Land Surveying and Geo-Informatics | en_US
dc.contributor.advisor | Chen, Wu (LSGI) | -
dc.contributor.advisor | Wu, Bo (LSGI) | -
dc.creator | Darwish, Walid Abdallah Aboumandour | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/9662 | -
dc.language | English | en_US
dc.publisher | Hong Kong Polytechnic University | -
dc.rights | All rights reserved | en_US
dc.title | Precise reconstruction of indoor environments using RGB-depth sensors | en_US
dcterms.abstract | Commercial RGB-D cameras (e.g., Kinect) have been widely used in the gaming industry as touch-free remote controllers. These cameras are designed for applications within a range of about three meters, where geometric fidelity is not of the utmost importance. Recently, the Structure Sensor was released as the first mobile RGB-D camera on the commercial market. Because this promising camera has great potential for indoor navigation and 3D modelling, its depth information, working range, and geometric sensor parameters must be precisely calibrated. In this study, we propose a novel calibration method for structured-light (SL) RGB-D cameras. The method uses a novel distortion model for the captured depth images that accounts for the distortion effects of both IR sensors, and it calibrates the geometric parameters of each camera lens. Moreover, the method models the systematic depth bias resulting from the imaging conditions and the IR sensors' baseline, and can therefore calibrate the full range of an SL RGB-D camera independently of that baseline. The calibration procedure is standardized and designed to be automatic. The proposed method calibrates the sensor's full range and achieves a relative error of 0.8%, whereas ordinary calibration methods can cover only 34% of the sensor's range and achieve a relative error of 4.0%. Because indoor scenes extend beyond a single view, many RGB-D frames were collected and registered together to form a complete colored 3D model, using the Simultaneous Localization and Mapping (SLAM) technique to track the RGB-D camera. The scene structure, the depth range, and the feature types are the dominant factors affecting registration accuracy and thus SLAM performance; they can easily cause severe drift or terminate tracking altogether (lost tracking).
Current SLAM systems use matched visual point features to compute the camera pose and therefore suffer from lost tracking and inevitable drift. To reduce the probability of both, strong features (lines and planes) were added to the SLAM tracking core. In this context, a new procedure to detect, extract, describe, and match these 3D features is proposed: line features are extracted from the RGB and depth images, while plane features are extracted from the depth image. The procedure uses a novel descriptor that combines visual and depth information to describe the 3D features for subsequent matching. A new RGB-D SLAM system, the Fully Constrained RGB-D SLAM (FC RGB-D SLAM), is proposed to exploit these matched 3D features. It minimizes the combined geometric distance of the 2D and 3D matched features to estimate the camera pose; then, to enhance the quality of the 3D model, it applies a global refinement stage that refines the estimated camera poses using indoor geometric constraints. The system also adopts graph-based optimization to correct the closure error whenever a loop closure is detected. The results show that, compared with visual RGB-D SLAM systems, FC RGB-D SLAM achieves significant improvements in 3D model accuracy both with and without loop-closure constraints. | en_US
dcterms.extent | xix, 119 pages : color illustrations | en_US
dcterms.isPartOf | PolyU Electronic Theses | en_US
dcterms.issued | 2018 | en_US
dcterms.educationalLevel | Ph.D. | en_US
dcterms.educationalLevel | All Doctorate | en_US
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US
dcterms.LCSH | Computer vision | en_US
dcterms.LCSH | Depth perception | en_US
dcterms.accessRights | open access | en_US

Files in This Item:
File | Description | Size | Format
991022165759203411.pdf | For All Users | 3.12 MB | Adobe PDF


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/9662