Full metadata record
DC Field | Value | Language
dc.contributor | Department of Computing | en_US
dc.contributor.advisor | Leong, Hong Va (COMP) | -
dc.creator | Fu, Yujun | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/10024 | -
dc.language | English | en_US
dc.publisher | Hong Kong Polytechnic University | -
dc.rights | All rights reserved | en_US
dc.title | Understanding human intention via non-intrusive capturing of interaction and body signals | en_US
dcterms.abstract | Computers are commonplace, and their ability to understand human intention opens up the possibility of providing useful assistance to people in many applications, including human social interactions and daily human-computer interactions. In this thesis, I investigate these interactions to understand human intention. For human social interactions, I focus on a common type of social interaction in real life, namely the human fight. Prior studies on fight detection are limited by several constraints, including reliance on costly high-level feature recognition (such as human gestures, actions, or visual words) and reliance on simulated fight events owing to limited dataset availability. I propose two sets of motion-analysis-based features for building real human fight detection models, without recognizing human gestures or visual words. For evaluation, I collect my own datasets of real human fights. Experiments demonstrate that my models outperform state-of-the-art counterparts in real human fight detection. I further extend the investigation to distinguish fights driven by real fight intention from simulated fights. The findings suggest that there are fundamental differences between real and simulated human fights, and that my motion-analysis-based features can effectively distinguish them. State-of-the-art data-driven approaches, such as deep learning, are constrained by the limited amount of spontaneous real human fight data available. To address this, I propose an ensemble-based method for cross-species fight detection that adapts knowledge from real animal fight events. Experiments demonstrate that it is feasible to build well-performing real human fight detection models via cross-species learning. | en_US
dcterms.abstract | For daily human-computer interaction tasks, I study the problem of predicting user intention. The challenges include the limited scope of prescribed tasks, the lack of full modalities, and the need for expensive intrusive devices to capture interaction and body signals, especially physiological signals. First, to overcome the limitation of studying only simple prescribed tasks, I conduct a study on a common but more complex and open-ended daily computer interaction task: the web search task. Second, I propose two feature representations to encode users' interaction and body signals, including mouse, gaze, head, and body motion signals. I combine these signal features with historical activity sequences to build effective multimodal user intention prediction models. Experiments indicate that the proposed features successfully encode these signals and that the model achieves encouraging performance. I further extend this work to the application of detecting user slips. Experiments provide evidence of the feasibility of building useful intention-based user slip detection models. Third, I capture interaction and body signals with non-intrusive devices, as opposed to contemporary physiological signal measurements that require expensive intrusive devices. Since physiological signals can indicate human emotions and even intentions, measuring them in a non-intrusive, low-cost manner would benefit the understanding of human emotion and intention. I propose a physiological mouse and build a prototype that non-intrusively measures human heart rate and respiratory rate. Experiments illustrate that the mouse achieves promising performance in measuring these two physiological signals. Further experiments also suggest that it is feasible to correlate the signals measured by the physiological mouse prototype with human emotions. | en_US
dcterms.extent | xviii, 170 pages : color illustrations | en_US
dcterms.isPartOf | PolyU Electronic Theses | en_US
dcterms.issued | 2019 | en_US
dcterms.educationalLevel | Ph.D. | en_US
dcterms.educationalLevel | All Doctorate | en_US
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US
dcterms.LCSH | Human-computer interaction | en_US
dcterms.accessRights | open access | en_US

Files in This Item:
File | Description | Size | Format
991022232427803411.pdf | For All Users | 6.34 MB | Adobe PDF


Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/10024