Author: Kanhangad, Vivek
Title: Biometric identification using contact-free 3D hand scans
Degree: Ph.D.
Year: 2010
Subject: Hong Kong Polytechnic University -- Dissertations
Subject: Biometric identification
Subject: Three-dimensional imaging in biology
Department: Department of Computing
Pages: xvii, 178 p. : ill. ; 30 cm.
Language: English
Abstract: The hand identification problem has been extensively studied in the biometrics literature. Commercially available identification systems based on hand geometry features have gained high user acceptance and found wide-ranging applications in personal verification tasks. Nevertheless, several critical issues remain to be addressed in order to make hand identification systems more robust and user-friendly. Major limitations of current two-dimensional, image-based hand identification include high vulnerability to spoof attacks; the inconvenience of the constrained imaging setup, especially for the elderly and for people with limited dexterity; and hygiene concerns among users arising from placement of the hand on the imaging platform. Obviating the need for hand-position-restricting pegs and the imaging platform, however, introduces the highly challenging problem of handling hand pose variations in three-dimensional (3D) space. This dissertation explores the use of 3D contact-free hand scans and the possibility of integrating 3D shape and intensity information in order to overcome the above limitations. A two-step, fully automatic approach for hand matching that handles large changes in pose is developed. In the first step, the acquired 3D hand scan is used to robustly estimate the hand's orientation from a single detected point on the hand. The estimated orientation is then used to normalize the pose of the 3D hand along with its texture. In the second step, multimodal hand features extracted from the pose-corrected range and intensity images are used to perform identification. The extracted palmprint and finger geometry features are combined using a new dynamic fusion strategy. It is shown that the dynamic fusion approach performs significantly better than straightforward fusion using a weighted combination rule.
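The weighted combination rule that the dynamic fusion strategy is compared against can be sketched as follows. This is a minimal illustration of score-level fusion, not the dissertation's method; the function names, score values, and weights are all illustrative assumptions.

```python
# Sketch of score-level fusion with a weighted-sum (combination) rule,
# the baseline the abstract compares the dynamic strategy against.
# All names, weights, and scores here are illustrative, not from the thesis.

def min_max_normalize(score, lo, hi):
    """Map a raw matcher score into [0, 1] before fusion."""
    return (score - lo) / (hi - lo)

def fuse_scores(palmprint_score, finger_score, w_palm=0.6, w_finger=0.4):
    """Combine two normalized match scores with a fixed weighted-sum rule."""
    assert abs(w_palm + w_finger - 1.0) < 1e-9, "weights should sum to 1"
    return w_palm * palmprint_score + w_finger * finger_score

# Example: a comparison with strong palmprint evidence.
palm = min_max_normalize(82.0, 0.0, 100.0)    # 0.82
finger = min_max_normalize(55.0, 0.0, 100.0)  # 0.55
fused = fuse_scores(palm, finger)             # 0.6*0.82 + 0.4*0.55 = 0.712
```

A dynamic strategy, by contrast, would adjust how the two modalities are combined per comparison (for example, according to estimated sample quality) rather than using fixed weights as above.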
In order to extract discriminative features from the palmprint region of the 3D hand, two approaches that exploit local surface details are developed. The proposed 3D palmprint matcher is shown to be more robust against spoof attacks. For 3D finger matching, two representations that characterize the 3D finger surface are extracted from the range images. The matching metrics proposed for the two finger geometry features effectively handle limited pose variations and perform partial feature matching to improve performance. Finally, an adaptive fusion framework based on hybrid particle swarm optimization (PSO) is developed that chooses the optimal fusion rule and weight parameters for a desired level of security. Experiments are performed on synthetic as well as real biometric matching scores to demonstrate that the proposed fusion approach consistently outperforms the existing framework based on decision-level fusion.
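The idea of tuning fusion weights with PSO can be sketched with a basic (non-hybrid) PSO searching a single fusion weight that minimizes the equal error rate (EER) on a set of match scores. This is a simplified illustration under stated assumptions, not the dissertation's hybrid PSO framework; the score lists, PSO coefficients, and helper names are all hypothetical.

```python
# Sketch: basic PSO choosing one fusion weight w in [0, 1] so that the
# fused score w*a + (1-w)*b minimizes equal error rate (EER).
# Simplified stand-in for the hybrid PSO framework described above;
# all parameters and data are illustrative.
import random

def eer(genuine, impostor):
    """Rough EER: over candidate thresholds, minimize max(FAR, FRR)."""
    best = 1.0
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)
        frr = sum(s < t for s in genuine) / len(genuine)
        best = min(best, max(far, frr))
    return best

def pso_fusion_weight(gen_a, imp_a, gen_b, imp_b,
                      n_particles=20, iters=50, seed=0):
    """Return (weight, eer) found by a plain PSO over one dimension."""
    rng = random.Random(seed)

    def objective(w):
        gen = [w * a + (1 - w) * b for a, b in zip(gen_a, gen_b)]
        imp = [w * a + (1 - w) * b for a, b in zip(imp_a, imp_b)]
        return eer(gen, imp)

    pos = [rng.random() for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest, pbest_val = pos[:], [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # Standard velocity update: inertia + cognitive + social terms.
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i], val
    return gbest, gbest_val
```

On synthetic scores where matcher A separates genuine and impostor comparisons well and matcher B does not, the search leans the weight toward A and the fused EER does not exceed B's EER alone. The full framework additionally selects the fusion rule itself, not just the weights.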
Rights: All rights reserved
Access: open access

Files in This Item:
File: b23517165.pdf (For All Users, 4.94 MB, Adobe PDF)

Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.

