Author: Gao, Zhenke
Title: Unet-DenseNet for robust far-field speaker verification
Advisors: Mak, Man-wai (EIE)
Degree: M.Sc.
Year: 2022
Subject: Automatic speech recognition
Speech perception
Hong Kong Polytechnic University -- Dissertations
Department: Department of Electronic and Information Engineering
Pages: iv, 41 pages : color illustrations
Language: English
Abstract: Far-field speaker verification (SV) is critical but challenging. Data augmentation is commonly used to overcome the problems arising from far-field microphones, such as high background noise levels and reverberation. On top of data augmentation, this dissertation tackles these problems by introducing a Unet-based speech enhancement (SE) module as a front-end processor for the speaker embedding module.
To prevent the SE module from distorting speaker information, we propose two improvements to the speech enhancement–speaker embedding pipeline: (1) a Unet-DenseNet based SE-SV joint training pipeline is designed to remove the noise before the enhanced signal is fed to the speaker embedding network; and (2) a semi-joint training strategy is proposed to prevent the Unet from over-fitting when training the Unet-DenseNet.
To evaluate the proposed model, we conducted extensive experiments on a noisy version of the VoxCeleb1 dataset. To verify generalization to unseen noise, we also conducted experiments on the VOiCES Challenge 2019 evaluation set.
The results show that the joint training model can reduce the average equal error rate (EER) by 2.5% when the test utterances have SNRs ranging from 5 dB to 20 dB. In particular, at an SNR of –5 dB, the relative reductions in EER and minimum decision cost are 7.2% and 7.5%, respectively.
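The semi-joint training idea summarized above, jointly optimizing the SE and SV objectives while limiting how long the Unet front-end is updated, could be sketched as follows. This is a minimal illustration only: the loss weighting `alpha`, the `freeze_after` schedule, and all function names are assumptions for exposition, not details taken from the dissertation.

```python
def combined_loss(se_loss: float, sv_loss: float, alpha: float = 0.5) -> float:
    """Hypothetical joint objective: a weighted sum of the speech-enhancement
    (SE) loss and the speaker-verification (SV) loss. The weight `alpha` is
    an assumed hyperparameter, not a value from the dissertation."""
    return alpha * se_loss + (1.0 - alpha) * sv_loss


def unet_trainable(epoch: int, freeze_after: int = 10) -> bool:
    """Hypothetical semi-joint schedule: keep the Unet (SE front-end)
    trainable only for the first `freeze_after` epochs, after which only the
    DenseNet speaker embedding network is updated. Freezing the Unet early
    is one plausible way to limit its over-fitting during joint training."""
    return epoch < freeze_after
```

In a real training loop, `unet_trainable(epoch)` would gate whether gradients flow into the Unet's parameters, while `combined_loss` would be back-propagated through whichever modules remain trainable.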
Rights: All rights reserved
Access: restricted access

Files in This Item:
File: 6521.pdf | Description: For All Users (off-campus access for PolyU Staff & Students only) | Size: 1.42 MB | Format: Adobe PDF

Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.

