Author: Cheng, Haiming
Title: Push the limit of acoustic sensing techniques for finer HCI
Advisors: Lou, Wei (COMP)
Degree: Ph.D.
Year: 2024
Subject: Human-computer interaction
Signal processing
Hong Kong Polytechnic University -- Dissertations
Department: Department of Computing
Pages: xv, 156 pages : color illustrations
Language: English
Abstract: With rapidly iterating electronic devices and increasing life expectancy, human-computer interaction (HCI) and ubiquitous sensing technologies continue to evolve at a rapid pace. Although many sensing and interaction solutions built on various carrier signals have been proposed to support a wide range of emerging applications, their use is often constrained by dedicated hardware or sensors, complicated deployment, line-of-sight conditions, or other prerequisites. Free from these constraints, acoustic signals, offering both fine-grained and long-range sensing, have been widely adopted and have stimulated many promising applications. As microphones and speakers on diverse devices become increasingly pervasive and available, acoustic sensing has drawn significant attention from both academia and industry.
Despite these advances, the sensing resolution of most existing work remains bottlenecked by limited sampling rates and narrow bandwidth, restricting and inconveniencing applications. To push this resolution limit, and thereby enable more fine-grained and delicate sensing work that extends HCI approaches, covers missing scenarios, and further benefits people's lives, this thesis first bridges the fundamental resolution barrier by improving device-free sensing to the submillimetre level; building on this, we then propose a novel eye-blink-based HCI method to fill the gap of interaction contexts that existing HCI systems cannot cover. At the heart of this work lies an original phase-difference-based approach that leverages Frequency-Modulated Continuous Wave (FMCW) signals to derive the reflected time delay and thus precisely infer absolute distance, catering to interaction needs for finer perception at lower latency. Our underlying insight is that the phase-difference-based approach offers more views of the received signal in the time domain, which in turn opens up new challenges, new solutions, and new applications for finer sensing and HCI research.
To break through the resolution limit imposed by sampling rate and bandwidth while directly deriving accurate absolute distance for device-free tracking on commercial mobile devices, the first work proposes PD-FMCW, a device-free motion tracking scheme leveraging ultrasonic FMCW signals transmitted by smartphones. Specifically, we use a novel phase subtraction method to derive the time delay of the reflected FMCW signal without time-frequency transforms or cross-correlation. By resorting to phase, the distance measurement resolution is no longer limited by the signal's bandwidth or sampling rate; instead, it reaches a tiny theoretical resolution determined by the speed of object movement and the chirp duration. Further, we devise a suite of novel denoising methods to effectively remove various interferences, and we implement a prototype with all procedures realized in the time domain. Results show that PD-FMCW achieves higher theoretical and practical tracking accuracy, recognizing micro-movements of 2 mm or even less, which paves the way for more delicate sensing work.
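The core idea of phase subtraction can be illustrated with a minimal numeric sketch: the instantaneous phase difference between the transmitted chirp and its echo is linear in time with a slope proportional to the round-trip delay, so the delay can be recovered by a line fit entirely in the time domain. Note that all parameters below (sampling rate, chirp band, distances) are illustrative assumptions, not the thesis's actual configuration, and a real system would recover the echo's phase from the analytic signal of the recorded audio rather than from an ideal model.

```python
import numpy as np

# Assumed parameters for illustration only.
FS = 48_000                 # microphone sampling rate (Hz)
F0, B = 17_000, 6_000       # chirp start frequency and bandwidth (Hz)
T = 0.01                    # chirp duration (s)
C = 343.0                   # speed of sound (m/s)

t = np.arange(int(FS * T)) / FS
k = B / T                   # chirp slope (Hz/s)

def phase(delay):
    """Unwrapped instantaneous phase of an ideal chirp delayed by
    `delay` seconds. In practice this would come from the analytic
    signal (e.g. a Hilbert transform) of the recorded echo."""
    td = t - delay
    return 2 * np.pi * (F0 * td + 0.5 * k * td ** 2)

d = 0.300                   # object at 30 cm
tau = 2 * d / C             # round-trip delay (s)

# Phase subtraction: phi_tx(t) - phi_rx(t) = 2*pi*(F0*tau + k*tau*t - k*tau**2/2)
# is linear in t with slope 2*pi*k*tau, so a least-squares line fit
# recovers tau directly -- no FFT, cross-correlation, or bandwidth-
# limited range bins involved.
dphi = phase(0.0) - phase(tau)
slope = np.polyfit(t, dphi, 1)[0]
tau_hat = slope / (2 * np.pi * k)
d_hat = C * tau_hat / 2     # estimated absolute distance (m)
```

Because the estimate comes from a continuous phase slope rather than a discrete frequency bin, its granularity is not tied to the bandwidth `B`, which is the property that lets the approach reach sub-bin (submillimetre-scale) resolution.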
Given the pushed sensing resolution and improved ability to detect subtle movements, and in light of the changes the epidemic has brought to people's lives in recent years, the second work presents TwinkleTwinkle, a novel HCI technique that leverages eye blinks for interaction, complementing scenarios where people have difficulty speaking or using their fingers, hands, or arms, or are wearing masks, glasses, or gloves. Specifically, TwinkleTwinkle senses and recognizes eye blink patterns in a contact-free and training-free manner using ultrasound signals. It first applies the phase-difference-based approach to depict candidate eye blink motion profiles without removing any noise, then models the intrinsic characteristics of blink motions through adaptive constraints to separate tiny patterns from interference, even though blink habits and involuntary movements vary between individuals. A vote-based approach then produces the final patterns, which are designed to map to number combinations, either self-defined or based on carriers such as ASCII code and Morse code, so that interaction embeds seamlessly into familiar language systems. We implement TwinkleTwinkle on smartphones with all methods realized in the time domain and conduct extensive evaluations. Results show that TwinkleTwinkle achieves about 91% accuracy in recognizing 23 blink patterns across different people.
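To make the "carrier" mapping concrete, the sketch below shows one hypothetical way recognized blink patterns could be translated into text via Morse code, as the abstract suggests. The thresholds, the blink representation as (duration, pause) pairs, and the partial Morse table are all assumptions for illustration; the thesis's actual pattern-to-code mapping is not specified here.

```python
# Partial Morse table for the demonstration (hypothetical subset).
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "...": "S", "---": "O",
}

def blinks_to_text(blinks, threshold=0.35, gap=1.0):
    """Translate (duration, pause) pairs into text: a blink shorter
    than `threshold` seconds counts as a dot, longer as a dash; a
    pause of at least `gap` seconds ends the current letter."""
    text, letter = [], ""
    for duration, pause in blinks:
        letter += "." if duration < threshold else "-"
        if pause >= gap:
            text.append(MORSE.get(letter, "?"))
            letter = ""
    if letter:                       # flush any trailing letter
        text.append(MORSE.get(letter, "?"))
    return "".join(text)

# Three short blinks, pause, three long, pause, three short.
sos = [(0.2, 0.3), (0.2, 0.3), (0.2, 1.2),
       (0.6, 0.3), (0.6, 0.3), (0.6, 1.2),
       (0.2, 0.3), (0.2, 0.3), (0.2, 0.3)]
message = blinks_to_text(sos)
```

Riding on a well-known code like Morse means users need not memorize a bespoke vocabulary, which is the "seamlessly embedded with familiar language systems" property the abstract describes.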
To conclude, this thesis explores acoustic sensing in both methodology and application to complement and innovate HCI techniques, inspiring and enabling more fine-grained and delicate sensing applications. Furthermore, rather than simply removing noise, it raises the question of whether we can employ techniques to change people's attitudes toward such noise. We hope the proposed research will continue to benefit and facilitate people's daily lives, both physically and mentally.
Rights: All rights reserved
Access: open access

Files in This Item:
File: 7248.pdf
Description: For All Users
Size: 16.53 MB
Format: Adobe PDF


