Author: Yu, Xinbo
Title: Practical algorithms for vision-based human activity recognition and human action evaluation
Advisors: Chan, C. C. Keith (COMP)
Degree: Ph.D.
Year: 2020
Subject: Human activity recognition
Machine learning
Hong Kong Polytechnic University -- Dissertations
Department: Department of Computing
Pages: iii, vii, 109 pages : color illustrations
Language: English
Abstract: Human Activity Recognition (HAR) and Human Action Evaluation (HAE) are the two main tasks of human activity analysis addressed in this thesis. They can be applied in many domains, such as healthcare and physical rehabilitation, interactive entertainment, and video surveillance, and such applications could help alleviate the increasingly serious problem of population aging by improving people's quality of life. Existing HAR methods use various sensors, including vision, wearable, and ambient sensors. After a comprehensive review of these sensors, this thesis focuses on vision-based HAR. To test the effectiveness of existing methods, we collect a small real-world Activities of Daily Living (ADLs) dataset and implement several representative skeleton-based methods. We also propose an HAR framework, HARELCARE, for developing practical HAR algorithms. Within the HARELCARE framework, two effective HAR algorithms are developed and tested on the collected ADLs dataset: one based on feature extraction and the other on transfer learning. The results show that both methods significantly outperform existing methods on our real-world ADLs dataset. Beyond tackling small datasets, we also propose a Model-based Multimodal Network (MMNet) to handle HAR on the increasingly large public datasets. Since most public datasets are collected with Kinect sensors, multiple data modalities, such as skeleton and RGB video, are available; however, there remains a lack of effective multimodal methods that can further improve on existing ones. Our MMNet fuses the different data modalities at the feature level. Extensive experiments show that MMNet is effective and achieves state-of-the-art performance on three public datasets: NTU-RGB+D, PKU-MMD, and Northwestern-UCLA Multiview. These results demonstrate the potential of our HAR algorithms for a wide range of applications.
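The feature-level fusion described for MMNet can be illustrated with a minimal sketch. Everything below is a hypothetical NumPy illustration with made-up feature dimensions and random (untrained) weights, not the thesis's actual architecture, which uses learned skeleton and RGB backbones trained end-to-end:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-sample features produced by two modality backbones
# (in MMNet these would come from a skeleton model and an RGB model).
skeleton_feat = rng.standard_normal((4, 64))   # batch of 4, 64-dim skeleton features
rgb_feat = rng.standard_normal((4, 128))       # batch of 4, 128-dim RGB features

# Feature-level fusion: concatenate per-sample feature vectors before
# the classifier, rather than averaging per-modality class scores.
fused = np.concatenate([skeleton_feat, rgb_feat], axis=1)  # shape (4, 192)

# A single linear classification head over the fused features
# (weights are random here; in practice they are trained).
num_classes = 10
W = rng.standard_normal((192, num_classes))
logits = fused @ W                  # shape (4, 10)
pred = logits.argmax(axis=1)        # one predicted action class per sample
```

The contrast is with decision-level fusion, where each modality is classified separately and only the scores are combined; fusing at the feature level lets the classifier exploit cross-modal correlations.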
Unlike HAR, which focuses on activity classification, HAE is concerned with judging the abnormality, and even the quality, of human actions. If performed effectively, skeleton-based HAE can be used to monitor the outcomes of behavioural therapies for Alzheimer's disease (AD). To this end, we propose a two-task Graph Convolutional Network (2T-GCN) that represents skeleton data for both HAE tasks: abnormality detection and quality evaluation. It is first evaluated on the UI-PRMD dataset and found to perform well on abnormality detection. For quality evaluation, in addition to the laboratory-collected UI-PRMD, we test it on real exercise data collected from AD patients. Experimental results show that the numerical scores for some exercises performed by AD patients are consistent with the AD severity levels assigned by clinical staff. This demonstrates the potential of our approach for monitoring AD and other neurodegenerative diseases.
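The two-task idea behind 2T-GCN, a single skeleton representation serving both abnormality detection and quality scoring, can be sketched as a shared feature extractor with two output heads. The following is a hypothetical NumPy illustration with random weights and invented dimensions, not the network from the thesis, which uses graph convolutions over the skeleton joints:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical features from a shared backbone, one vector per
# action sequence (in 2T-GCN these come from graph convolutions).
feat = rng.standard_normal((4, 32))  # batch of 4 sequences

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Head 1: abnormality detection (binary logit -> probability).
w_abn = rng.standard_normal(32)
abnormal_prob = sigmoid(feat @ w_abn)   # shape (4,), each in (0, 1)

# Head 2: quality evaluation (scalar score squashed to (0, 1),
# e.g. higher = closer to a correctly executed exercise).
w_q = rng.standard_normal(32)
quality_score = sigmoid(feat @ w_q)     # shape (4,), each in (0, 1)
```

Sharing the backbone means both tasks are learned from one skeleton representation; only the lightweight heads differ.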
Rights: All rights reserved
Access: open access

Files in This Item:
File: 5515.pdf
Description: For All Users
Size: 2.35 MB
Format: Adobe PDF



Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/11050