Full metadata record
dc.contributor: Department of Electrical and Electronic Engineering (en_US)
dc.contributor.advisor: Law, Ngai-fong Bonnie (EEE) (en_US)
dc.creator: Pang, You
dc.identifier.uri: https://theses.lib.polyu.edu.hk/handle/200/13814
dc.language: English (en_US)
dc.publisher: Hong Kong Polytechnic University (en_US)
dc.rights: All rights reserved (en_US)
dc.title: Cataract surgical phase recognition using deep learning techniques (en_US)
dcterms.abstract: Understanding surgical phases is pivotal for advancing smart operating room technology. While significant strides have been made in automating surgical phase recognition, prevailing methodologies are constrained by three primary challenges. (1) The extremely imbalanced distribution of phases makes the network difficult to train efficiently, yielding models that are prone to overfitting and generalize poorly. (2) Traditional 2D networks, which rely solely on single-image input, lack temporal information; this limitation hinders their ability both to discern distinct visual attributes in individual frames and to process motion-related information effectively. (3) Recognizing frames in isolation often reduces accuracy, an issue referred to as phase shaking and characterized by inconsistent predictions within each phase. (en_US)
dcterms.abstract: In this thesis, we present two novel deep-learning approaches to address these challenges. First, to exploit the temporal information in surgical videos effectively, we propose a frame-sequence-based method that adopts a frame sequence as the network input. The network also employs a Focal Loss and a Dropout strategy to mitigate the data-imbalance problem (see the focal-loss sketch after the abstract). (en_US)
dcterms.abstract: Second, to alleviate the phase-shaking problem, we propose a Surgical Phase Localization Network (SurgPLAN), which achieves more accurate and stable surgical phase recognition based on the principle of temporal detection. We develop a novel Pyramid Slow-Fast (PSF) architecture as the core visual backbone; its dual-branch design, operating at different frame sampling rates, captures multi-scale spatial and temporal characteristics. We further introduce the Temporal Phase Localization (TPL) module, which produces phase predictions by generating temporal region proposals, thereby ensuring both accuracy and consistency in the prediction process (see the sampling sketch after the abstract). (en_US)
dcterms.abstract: Extensive experiments on both methods confirm the significant advantages of our proposed approaches: they effectively mitigate the problems described above, leading to more accurate and stable surgical phase recognition. (en_US)
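
Below is a minimal PyTorch sketch of the standard focal loss (Lin et al., 2017) referenced in the abstract. The thesis does not spell out its exact loss configuration, so the function name, the default gamma and alpha values, and the tensor shapes are illustrative assumptions, not the thesis's implementation.

    import torch
    import torch.nn.functional as F

    def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
        # Focal loss down-weights well-classified (easy) frames so that
        # rare surgical phases contribute more to the gradient, which is
        # one way to counter a heavily imbalanced phase distribution.
        log_probs = F.log_softmax(logits, dim=-1)                      # [N, C]
        log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # [N]
        pt = log_pt.exp()                                              # p_t per sample
        return (-alpha * (1.0 - pt) ** gamma * log_pt).mean()

    # Example: 8 frames scored over 10 surgical phases.
    logits = torch.randn(8, 10)
    targets = torch.randint(0, 10, (8,))
    print(focal_loss(logits, targets))

With gamma = 0 and alpha = 1 this reduces to ordinary cross-entropy, which makes the down-weighting effect easy to verify.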
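The PSF backbone is described only at a high level, so the following sketch merely illustrates the underlying slow/fast principle (two pathways fed at different frame sampling rates, as in SlowFast-style video networks); the strides, clip shape, and function name are assumptions rather than the actual PSF parameters.

    import torch

    def dual_rate_sample(clip, slow_stride=8, fast_stride=2):
        # clip: [T, C, H, W] video tensor.
        # The slow branch sees sparse frames (spatial detail);
        # the fast branch sees dense frames (motion cues).
        return clip[::slow_stride], clip[::fast_stride]

    # Example: a 64-frame clip of 224x224 RGB frames.
    clip = torch.randn(64, 3, 224, 224)
    slow, fast = dual_rate_sample(clip)
    print(slow.shape, fast.shape)  # [8, 3, 224, 224] and [32, 3, 224, 224]

In a SlowFast-style design, each pathway feeds its own convolutional branch and the two feature streams are fused downstream before prediction.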
dcterms.extent: x, 47 pages : color illustrations (en_US)
dcterms.isPartOf: PolyU Electronic Theses (en_US)
dcterms.issued: 2023 (en_US)
dcterms.educationalLevel: M.Sc. (en_US)
dcterms.educationalLevel: All Master (en_US)
dcterms.LCSH: Cataract -- Surgery (en_US)
dcterms.LCSH: Surgery, Operative (en_US)
dcterms.LCSH: Diagnostic imaging (en_US)
dcterms.LCSH: Deep learning (Machine learning) (en_US)
dcterms.LCSH: Hong Kong Polytechnic University -- Dissertations (en_US)
dcterms.accessRights: restricted access (en_US)

Files in This Item:
File: 8265.pdf
Description: For All Users (off-campus access for PolyU Staff & Students only)
Size: 2.34 MB
Format: Adobe PDF



Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/13814