Full metadata record
dc.contributor: Department of Electrical and Electronic Engineering (en_US)
dc.contributor.advisor: Wang, Yi (EEE) (en_US)
dc.creator: Ren, Jianping
dc.identifier.uri: https://theses.lib.polyu.edu.hk/handle/200/14051
dc.language: English (en_US)
dc.publisher: Hong Kong Polytechnic University (en_US)
dc.rights: All rights reserved (en_US)
dc.title: Exploring two-stage approaches for object detection and interaction recognition in egocentric views (en_US)
dcterms.abstract: In this thesis, we focus on the task of Human-Object Interaction (HOI) detection in first-person (egocentric) views. The proposed method consists of two main parts: an object detection part and an interaction detection part. For object detection, we employ both two-stage detectors (e.g., Faster R-CNN) and single-stage detectors (e.g., CenterNet) to accurately localize people and objects in images. We trained and evaluated these detectors on standard datasets with standard evaluation metrics to ensure stable performance. For interaction detection, we use Bootstrapping Language-Image Pre-training 2 (BLIP2), a multimodal vision-language pre-trained model, to infer interactions between the detected people and objects. By combining an object detector with a fine-tuned BLIP2, we aim to capture more complex and nuanced interactions by exploiting semantic information from both visual and textual patterns. We conducted extensive experiments to evaluate the effectiveness of the proposed method. The results show that, after fine-tuning, the BLIP2 model can accomplish the HOI detection task in conjunction with the object detector, and with data augmentation it can approach the performance of traditional two-stage HOI detectors. (en_US)
dcterms.extent: vi, 42 pages : color illustrations (en_US)
dcterms.isPartOf: PolyU Electronic Theses (en_US)
dcterms.issued: 2024 (en_US)
dcterms.educationalLevel: M.Sc. (en_US)
dcterms.educationalLevel: All Master (en_US)
dcterms.accessRights: restricted access (en_US)
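The abstract describes a two-stage pipeline: an object detector first localizes people (in egocentric views, typically the camera-wearer's hands) and objects, then a fine-tuned BLIP2 infers a verb for each person-object pair. The following is a minimal structural sketch of that pipeline only; the stub functions stand in for the real detector and BLIP2 stages, and all names are hypothetical, not taken from the thesis:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str
    box: Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixels
    score: float

def detect_objects(image) -> List[Detection]:
    # Stage 1 stub: a real system would run Faster R-CNN or CenterNet here
    # and return all person/hand and object boxes above a score threshold.
    return [
        Detection("hand", (10, 10, 60, 60), 0.95),
        Detection("cup", (40, 20, 90, 80), 0.88),
    ]

def describe_interaction(image, person: Detection, obj: Detection) -> str:
    # Stage 2 stub: a real system would crop the union of both boxes and
    # prompt a fine-tuned BLIP2 model, e.g. "What is the person doing
    # with the cup?", then parse the generated answer.
    return f"hold {obj.label}"

def detect_hoi(image) -> List[Tuple[str, str, str]]:
    # Pair every detected person/hand with every detected object and
    # query the interaction model for each pair.
    dets = detect_objects(image)
    people = [d for d in dets if d.label == "hand"]
    objects = [d for d in dets if d.label != "hand"]
    triplets = []
    for p in people:
        for o in objects:
            verb = describe_interaction(image, p, o).split()[0]
            triplets.append((p.label, verb, o.label))
    return triplets
```

In a full implementation, `detect_objects` would wrap a trained detector and `describe_interaction` would call BLIP2's visual question answering interface; the sketch only shows how the two stages compose into (person, verb, object) triplets.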

Files in This Item:
File: 8718.pdf — For All Users (off-campus access for PolyU Staff & Students only) — 4.94 MB — Adobe PDF


Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/14051