Author: Yang, Yanni
Title: Smart wireless sensing for human activity monitoring and object detection
Advisors: Cao, Jiannong (COMP)
Degree: Ph.D.
Year: 2021
Subject: Wireless sensor networks
Signal processing -- Digital techniques
Hong Kong Polytechnic University -- Dissertations
Department: Department of Computing
Pages: xix, 183 pages : color illustrations
Language: English
Abstract: The rapid proliferation of wireless technologies has created a new paradigm for ubiquitous and non-intrusive sensing of human activities and objects. Various wireless signals, such as radio frequency (RF) signals (e.g., RFID and WiFi) and acoustic signals, are used to sense different kinds of human activities and objects. In recent years, a growing number of important wireless sensing applications have emerged, including human health monitoring and food safety detection. However, because wireless signals are vulnerable to various noise sources, existing work mainly realizes activity monitoring and object detection in simplified and restricted scenarios, e.g., single-person, single-activity monitoring in a static environment, and usually requires costly, specialized devices. These restrictions have impeded the practical, wide-scale application of wireless sensing techniques. In this thesis, we study how to realize wireless human activity monitoring and object detection in more complex scenarios at low cost. Achieving this goal, however, raises new challenges. First, in complex scenarios, noise caused by line-of-sight and multipath wireless signals from non-target activities and subjects can lead to inaccurate sensing results. Second, the hardware diversity and imperfection of low-cost wireless devices can significantly distort the measured wireless signals and degrade sensing performance. To tackle these challenges, we investigate and develop new methods and apply them to two critical applications, respiration monitoring (RM) and liquid fraud detection, as summarized below. First, we relax the restriction on RFID-based RM from static environments to dynamic environments. In a dynamic environment with other people moving around, the multipath RFID signals reflected by the moving people can mix with the respiration signal of the monitored person.
This makes the respiration pattern unclear and leads to inaccurate RM results. To achieve accurate RM in dynamic environments, we propose a signal denoising method that removes the effect of the multipath signals caused by the movements of surrounding people. Specifically, we leverage the intrinsic pattern difference between the regular respiration activity and the random, irregular surrounding movements to create a dedicated matched filter for denoising the respiration signals.
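The matched-filter idea can be illustrated with a minimal sketch: correlating the noisy measurement with a periodic respiration-shaped template reinforces the regular breathing component while the irregular motion noise averages out. This is a simplified illustration under assumed parameters (a fixed sinusoidal template at a typical 15 breaths/min), not the thesis's exact filter, which is built from the measured respiration pattern itself.

```python
import numpy as np

def matched_filter_denoise(signal, fs=50.0, resp_freq=0.25, template_cycles=3):
    """Suppress irregular motion noise by correlating with a periodic
    respiration template. Parameters are illustrative assumptions:
    fs (Hz) sampling rate, resp_freq (Hz) assumed breathing frequency."""
    # Template: a few cycles of a sinusoid at the respiration frequency.
    n = int(template_cycles * fs / resp_freq)
    t = np.arange(n) / fs
    template = np.sin(2 * np.pi * resp_freq * t)
    template -= template.mean()
    # Matched filtering = correlation with the time-reversed template.
    out = np.convolve(signal - np.mean(signal), template[::-1], mode="same")
    return out / np.max(np.abs(out))  # normalize amplitude

# Synthetic example: regular breathing plus sparse, random motion spikes.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)
rng = np.random.default_rng(0)
noise = rng.normal(0, 1.0, t.size) * (rng.random(t.size) < 0.05)
clean = matched_filter_denoise(resp + noise, fs=fs)
```

Because the template is coherent with breathing over many samples but not with the random spikes, the filter output is dominated by the respiration frequency.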
Second, we relax the restriction on RFID-based RM from respiration-only monitoring to joint respiration-and-exercise monitoring. To extract the respiration and exercise patterns, we attach RFID tags to the human chest and limbs, respectively. However, the large-scale exercise movements can overwhelm the tiny respiration signal, making the respiration pattern difficult to extract, so the effect of exercise movements on the respiration pattern must be removed. To achieve this, we first design a signal amplification method that exploits the human respiration mechanism to enlarge the tiny respiration pattern. We then propose a signal fusion method to remove the effect of the large-scale exercise movements. Third, we relax the restriction on UWB-based RM from the single-person to the multi-person scenario. UWB radar can measure signal traveling distance precisely thanks to its wide bandwidth, so we can separate multiple persons' respiration signals according to their different signal traveling distances in the air. However, the UWB radar signal is quite sensitive to small movements in the environment: when multiple persons are present, their movements blur one another's signals. Therefore, instead of using the noisy temporal radar signal directly, we propose a signal transformation method that converts the spatial-temporal information of the radar signal into another modality, i.e., an image, which enables accurate estimation of the respiration states of multiple persons. Finally, we investigate using commodity acoustic devices for liquid detection. When an acoustic signal travels through a liquid, the received signal carries information about the liquid's absorption of the acoustic signal across different frequencies, which can be used to detect fake liquids.
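The UWB range-based separation can be sketched in simplified form: arrange the radar returns as a range-time matrix, pick the range bin corresponding to each person's distance, and read each person's breathing rate from the dominant slow-time frequency of that bin. This frequency-domain sketch is an assumption-laden simplification (it presumes persons sit at distinct, known range bins) and is not the image-based transformation method the thesis proposes.

```python
import numpy as np

def respiration_from_range_time(rt_matrix, fs_slow, person_bins):
    """Estimate per-person respiration rate (breaths/min) from a UWB
    range-time matrix.

    rt_matrix   -- shape (slow_time_frames, range_bins); each column is the
                   reflected amplitude at one distance over time
    fs_slow     -- frame rate along slow time (Hz)
    person_bins -- hypothetical mapping of person id -> range-bin index,
                   assuming persons are at distinct distances
    """
    rates = {}
    for pid, b in person_bins.items():
        sig = rt_matrix[:, b] - rt_matrix[:, b].mean()
        spec = np.abs(np.fft.rfft(sig))
        spec[0] = 0  # ignore any residual DC
        freqs = np.fft.rfftfreq(sig.size, d=1 / fs_slow)
        rates[pid] = freqs[spec.argmax()] * 60
    return rates

# Synthetic example: two persons breathing at 12 and 18 breaths/min,
# reflecting from range bins 20 and 45 respectively.
fs_slow = 20.0
t = np.arange(0, 60, 1 / fs_slow)
rt = np.zeros((t.size, 64))
rt[:, 20] = 1 + 0.3 * np.sin(2 * np.pi * 0.2 * t)   # 12 bpm
rt[:, 45] = 1 + 0.3 * np.sin(2 * np.pi * 0.3 * t)   # 18 bpm
rates = respiration_from_range_time(rt, fs_slow, {"A": 20, "B": 45})
```

In practice the columns are not this clean, which is exactly the motivation for the thesis's transformation of the spatial-temporal signal into an image.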
In practice, however, due to the diversity and imperfection of acoustic devices, the measured absorption feature can deviate significantly, resulting in inaccurate liquid detection. We therefore propose a signal calibration method that calibrates the measured feature against a dedicated reference signal. In addition, since the acoustic devices can be placed at various positions around the liquid, changes in the multipath acoustic signal can cause variations in the liquid's absorption feature. To remove this effect, we adopt a data augmentation technique that makes detection robust to those variations, enabling accurate liquid detection.
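The core of reference-based calibration can be sketched as a spectral normalization: if the same device coloration multiplies both the reference path and the through-liquid path, dividing the two spectra cancels the device response and leaves the liquid's absorption. All names and the synthetic responses below are illustrative assumptions, not the thesis's actual calibration procedure.

```python
import numpy as np

def calibrate_absorption(measured_spec, reference_spec, eps=1e-12):
    """Cancel the device's frequency response by normalizing the spectrum
    measured through the liquid with a reference spectrum measured without
    it (simplified sketch; names are illustrative)."""
    return measured_spec / (reference_spec + eps)

# The same unknown device response distorts both paths, so it divides out.
freqs = np.linspace(1000, 20000, 50)       # Hz, hypothetical band
device = 1 + 0.5 * np.sin(freqs / 3000)    # unknown device coloration
absorption = np.exp(-freqs / 15000)        # liquid's true absorption profile
reference = device                          # reference path, no liquid
measured = device * absorption              # path through the liquid
recovered = calibrate_absorption(measured, reference)
```

The recovered curve matches the true absorption profile because the multiplicative device term appears in both numerator and denominator.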
Rights: All rights reserved
Access: open access

Files in This Item:
File: 5840.pdf
Description: For All Users
Size: 10.61 MB
Format: Adobe PDF


