Author: Duan, Ran
Title: Visual smart navigation for UAV mission-oriented flight
Advisors: Lu, Peng (AAE)
Wen, Chih-yung (AAE)
Degree: Ph.D.
Year: 2022
Subject: Drone aircraft -- Piloting
Navigation (Astronautics)
Hong Kong Polytechnic University -- Dissertations
Department: Department of Aeronautical and Aviation Engineering
Pages: xxi, 100 pages : color illustrations
Language: English
Abstract: This thesis addresses the accurate and robust localization problems in UAV visual navigation, covering the localization of both the UAV itself and its destination during a mission-oriented flight. Self-localization is carried out by visual odometry (VO), while the destination is localized by object tracking. Both are fundamental yet challenging tasks in computer vision and robotics.
To achieve accurate and robust self-localization, our work investigates an inherent problem of long-term VO, namely why the camera pose estimation occasionally produces a relatively large error even when the residuals of the reprojection errors are well controlled. We demonstrate that the long-term VO process suffers from a biased error distribution of the estimated poses and present a stereo orientation prior (SOP) method that compensates for this bias in each frame. Using the stereo camera extrinsic parameters as the baseline, the SOP measures the bias of each dimension of the 6-DoF pose for every 2D-3D geometric correspondence. Unlike commonly used error metrics that compute the total error of an inlier group, our measurement is based on semidefinite programming over the quadratic polynomials reformulated from the 2D-3D point projection system. This allows us to evaluate whether the error mainly comes from orientation or from translation. The proposed system can therefore refine the inlier group by rejecting points with a large error bias in orientation, acting like a "soft IMU". We show that the proposed visual odometry system achieves competitive accuracy and robustness even compared with IMU-aided state-of-the-art methods.
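
The sketch below is not code from the thesis. It only illustrates, under simplifying assumptions, the underlying idea of attributing each correspondence's reprojection residual to orientation versus translation and rejecting orientation-dominated points; the thesis instead formulates this through semidefinite programming over quadratic polynomials. All function names, the first-order Jacobian split, and the rejection threshold are assumptions.

    import numpy as np

    def rotvec_to_mat(w):
        """Rodrigues' formula: rotation vector -> rotation matrix."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        a = w / theta
        A = np.array([[0.0, -a[2], a[1]],
                      [a[2], 0.0, -a[0]],
                      [-a[1], a[0], 0.0]])
        return np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * (A @ A)

    def project(K, R, t, X):
        """Pinhole projection of a 3D point X under camera pose (R, t) and intrinsics K."""
        x_img = K @ (R @ X + t)
        return x_img[:2] / x_img[2]

    def orientation_bias_scores(K, R, t, pts3d, pts2d, eps=1e-6):
        """For each 2D-3D correspondence, estimate how much of the reprojection
        residual is explained by a small rotation perturbation vs. a translation one.
        Note: radians and metres are not directly comparable; a real system would
        normalise these scales (the thesis uses an SDP formulation instead)."""
        scores = []
        for X, u in zip(pts3d, pts2d):
            u_hat = project(K, R, t, X)
            r = u - u_hat                              # 2-vector reprojection residual
            J_rot = np.zeros((2, 3))
            J_trs = np.zeros((2, 3))
            for j in range(3):
                dw = np.zeros(3); dw[j] = eps          # numerical Jacobian w.r.t. rotation
                J_rot[:, j] = (project(K, rotvec_to_mat(dw) @ R, t, X) - u_hat) / eps
                dt = np.zeros(3); dt[j] = eps          # numerical Jacobian w.r.t. translation
                J_trs[:, j] = (project(K, R, t + dt, X) - u_hat) / eps
            # Minimal perturbation each block would need to absorb the residual.
            rot_effort = np.linalg.norm(np.linalg.lstsq(J_rot, r, rcond=None)[0])
            trs_effort = np.linalg.norm(np.linalg.lstsq(J_trs, r, rcond=None)[0])
            scores.append(rot_effort / (rot_effort + trs_effort + 1e-12))
        return np.array(scores)

    # Hypothetical usage: keep only points whose residual is not dominated by orientation.
    # keep_mask = orientation_bias_scores(K, R, t, pts3d, pts2d) < 0.7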
To automatically localize the destination for the UAV, we present a deep-learning-based tracker. It rebuilds a discriminative target appearance model by autonomously selecting representative convolutional neural network (CNN) layers and feature maps, and then extracts a sub-network to detect the tracked target. To show the versatility of the proposed method, we implement it on VGG-19 and YOLO v3, respectively. The results demonstrate that the proposed tracker is competitive with state-of-the-art CNN-based trackers in terms of accuracy, scale adaptation, robustness, and efficiency for UAV-related applications.
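
The following sketch is likewise not code from the thesis. It shows one plausible way to score the feature maps of a pretrained VGG-19 backbone by how strongly they respond on the target box versus the background, keeping only the most discriminative channels; the chosen layer index, box format, scoring rule, and keep ratio are all assumptions.

    import torch
    import torchvision.models as models
    import torchvision.transforms.functional as TF

    def select_discriminative_channels(frame, box, layer_idx=28, keep_ratio=0.25):
        """frame: HxWx3 uint8 image; box: (x0, y0, x1, y1) in image coordinates.
        Returns indices of feature maps that respond mostly on the target."""
        vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
        x = TF.to_tensor(frame).unsqueeze(0)               # 1x3xHxW in [0, 1]
        with torch.no_grad():
            feat = vgg[:layer_idx + 1](x)                  # 1xCxhxw feature maps
        _, C, h, w = feat.shape
        # Map the image-space box onto the feature-map grid.
        sx, sy = w / frame.shape[1], h / frame.shape[0]
        x0, y0, x1, y1 = box
        fx0, fy0 = int(x0 * sx), int(y0 * sy)
        fx1, fy1 = max(fx0 + 1, int(x1 * sx)), max(fy0 + 1, int(y1 * sy))
        mask = torch.zeros(h, w)
        mask[fy0:fy1, fx0:fx1] = 1.0
        # Score each channel: mean activation on the target vs. the background.
        act = feat[0].abs()
        target_mean = (act * mask).sum(dim=(1, 2)) / mask.sum()
        bg_mean = (act * (1 - mask)).sum(dim=(1, 2)) / (1 - mask).sum().clamp(min=1)
        scores = target_mean / (bg_mean + 1e-6)
        return scores.topk(int(C * keep_ratio)).indices    # retained feature-map indices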
Finally, we integrate the visual odometry and object tracking into the UAV's onboard vision system. Using stereo vision and the current UAV pose from visual odometry, the targets tracked in the 2D image can be converted to 3D positions in the local odometry map for UAV navigation. This allows the UAV to perform mission-oriented flights, such as object inspection or goods delivery, with full autonomy.
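
As a minimal sketch of this last step (again not the thesis' implementation), a tracked pixel can be lifted to a 3D point in the odometry frame given a rectified stereo disparity and the world-from-camera pose reported by the visual odometry; the symbol names and frame conventions below are assumptions.

    import numpy as np

    def pixel_to_odometry_frame(u, v, disparity, fx, fy, cx, cy, baseline, R_wc, t_wc):
        """Back-project pixel (u, v) with stereo disparity into the odometry (world) frame."""
        z = fx * baseline / disparity        # depth from the rectified stereo pair
        x = (u - cx) * z / fx                # back-projection into the camera frame
        y = (v - cy) * z / fy
        p_cam = np.array([x, y, z])
        return R_wc @ p_cam + t_wc           # express the target in the odometry frame

    # Hypothetical usage: the tracker supplies the target's pixel centre and disparity,
    # visual odometry supplies (R_wc, t_wc), and the result is a 3D waypoint for navigation.
    # target_w = pixel_to_odometry_frame(u, v, d, fx, fy, cx, cy, b, R_wc, t_wc)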
Rights: All rights reserved
Access: open access

Files in This Item:
File: 6272.pdf | Description: For All Users | Size: 10.14 MB | Format: Adobe PDF


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/11787