Full metadata record
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.creator | Zhang, Kaihua | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/7373 | -
dc.language | English | en_US
dc.publisher | Hong Kong Polytechnic University | -
dc.rights | All rights reserved | en_US
dc.title | Real-time and robust visual tracking | en_US
dcterms.abstract | Visual tracking has been extensively studied because of its importance in practical applications such as visual surveillance, human-computer interaction, and traffic monitoring. Despite extensive research on this topic with demonstrated success, it remains very challenging to build a robust and efficient tracking system that can handle appearance changes caused by pose variation, illumination change, shape deformation, and abrupt motion. In this thesis, we address these challenges by building several robust appearance models for visual tracking. To select informative features for building an appearance model, we first present an online boosting feature selection approach that optimizes the Fisher information criterion. Recently, the multiple instance learning (MIL) method has been introduced into tracking to address the sample ambiguity problem. The MIL tracker puts the positive and negative samples into bags and then selects features with an online boosting method by maximizing the bag likelihood function. However, the features selected by the MIL tracker are not the most informative ones for separating the target from the background. To solve this problem, motivated by active learning, we propose an active feature selection approach that selects more informative features than the MIL tracker by using the Fisher information criterion to measure the uncertainty of the classification model, thereby yielding more robust and efficient real-time object tracking. We further show that it is unnecessary to use the bag likelihood loss function for feature selection as proposed in the MIL tracker. Instead, features can be selected directly at the instance level with a supervised learning method that is more efficient and robust than the MIL tracker. The MIL tracker does not exploit the important prior information of instance labels or of the most important positive instance (i.e., the tracking result in the current frame). We show that integrating such prior information into a supervised learning algorithm handles visual drift more effectively and efficiently than the MIL tracker, and we present an online discriminative feature selection algorithm that directly couples the classifier score with the importance of samples, leading to a more robust and efficient tracker. Different from the above methods, which select features via online boosting to design appearance models, we then propose an appearance model built from features extracted from a multiscale image feature space with random projections. A very sparse measurement matrix is constructed to extract the features efficiently, and the tracking task is formulated as a binary classification problem solved by a naive Bayes classifier with online update in the compressed domain. Finally, we present a simple yet very fast and robust algorithm that exploits the spatio-temporal context for visual tracking. Our approach formulates the spatio-temporal relationship between the object of interest and its local context in a Bayesian framework, modeling the spatio-temporal statistical correlation between the low-level features (i.e., image intensity and position) of the target and its surrounding regions. The tracking problem is then cast as computing a confidence map, and the best target location is obtained by maximizing an object location likelihood function. The fast Fourier transform (FFT) is adopted for extremely fast learning and detection. Implemented in MATLAB, the proposed tracker runs at 350 frames per second on an i7 machine. | en_US
dcterms.extent | xviii, 149 p. : ill. ; 30 cm. | en_US
dcterms.isPartOf | PolyU Electronic Theses | en_US
dcterms.issued | 2013 | en_US
dcterms.educationalLevel | All Doctorate | en_US
dcterms.educationalLevel | Ph.D. | en_US
dcterms.LCSH | Computer vision | en_US
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US
dcterms.accessRights | open access | en_US
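
A minimal sketch of the compressed-domain appearance model summarized in the abstract above: a very sparse random measurement matrix projects multiscale image features into a low-dimensional space, and a naive Bayes classifier with per-dimension Gaussian models is updated online. This is an illustrative Python/NumPy sketch under stated assumptions, not the thesis's MATLAB implementation; the function names, the sparsity parameter s, and the learning rate lr are choices made here for clarity.

import numpy as np

def sparse_measurement_matrix(n_compressed, n_features, s=None, seed=None):
    # Very sparse random projection R (n_compressed x n_features) whose entries are
    # sqrt(s) * {+1, 0, -1} with probabilities 1/(2s), 1 - 1/s, 1/(2s).
    # The default s = n_features // 4 is an illustrative choice.
    rng = np.random.default_rng(seed)
    s = s if s is not None else max(2, n_features // 4)
    p = [1.0 / (2 * s), 1.0 - 1.0 / s, 1.0 / (2 * s)]
    return rng.choice([np.sqrt(s), 0.0, -np.sqrt(s)],
                      size=(n_compressed, n_features), p=p)

class NaiveBayesTracker:
    # Binary naive Bayes classifier on compressed features; each dimension is modelled
    # by one Gaussian for the target class and one for the background class.
    def __init__(self, n_compressed, lr=0.85):
        self.lr = lr  # learning rate for the online update (illustrative value)
        self.mu_pos = np.zeros(n_compressed); self.sig_pos = np.ones(n_compressed)
        self.mu_neg = np.zeros(n_compressed); self.sig_neg = np.ones(n_compressed)

    def update(self, v_pos, v_neg):
        # Online update of the Gaussian parameters from compressed positive samples
        # v_pos and negative samples v_neg, each of shape (n_samples, n_compressed).
        for mu, sig, v in ((self.mu_pos, self.sig_pos, v_pos),
                           (self.mu_neg, self.sig_neg, v_neg)):
            m, s2 = v.mean(axis=0), v.var(axis=0)
            new_sig = np.sqrt(self.lr * sig ** 2 + (1.0 - self.lr) * s2
                              + self.lr * (1.0 - self.lr) * (mu - m) ** 2)
            mu[:] = self.lr * mu + (1.0 - self.lr) * m
            sig[:] = np.maximum(new_sig, 1e-3)  # keep variances away from zero

    def score(self, v):
        # Log-likelihood ratio sum_i [log p(v_i | target) - log p(v_i | background)];
        # the candidate window with the largest score is taken as the new location.
        def log_gauss(x, mu, sig):
            return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)
        return (log_gauss(v, self.mu_pos, self.sig_pos)
                - log_gauss(v, self.mu_neg, self.sig_neg)).sum(axis=-1)

In use, each candidate window around the previous target location would be described by a high-dimensional rectangle-feature vector, compressed with v = features @ R.T, scored with tracker.score(v), and the highest-scoring window kept before calling tracker.update on freshly sampled positive and negative windows. This is a sketch of the general compressive-sensing tracking recipe described in the abstract rather than the exact procedure or parameter settings used in the thesis.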

Files in This Item:
File | Description | Size | Format
b26818152.pdf | For All Users | 9.73 MB | Adobe PDF




Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/7373