Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Electrical and Electronic Engineering | en_US |
dc.contributor.advisor | Hu, Haibo (EEE) | en_US |
dc.creator | Chen, Zheyu | - |
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/13893 | - |
dc.language | English | en_US |
dc.publisher | Hong Kong Polytechnic University | en_US |
dc.rights | All rights reserved | en_US |
dc.title | Don't lose yourself : boosting multimodal recommendation via reducing node-neighbor discrepancy in graph convolutional network | en_US |
dcterms.abstract | The rapid expansion of multimedia content has led to the emergence of multimodal recommendation systems. These systems have attracted increasing attention because their full utilization of data from different modalities alleviates the persistent data sparsity problem. As such, multimodal recommendation models can learn personalized information about nodes from both visual and textual modalities. To further alleviate the data sparsity problem, some previous works have introduced graph convolutional networks (GCNs) into multimodal recommendation systems to enhance the semantic representations of users and items by capturing the potential relationships between them. | en_US |
dcterms.abstract | However, adopting GCNs inevitably introduces the over-smoothing problem, which makes node representations too similar. Unfortunately, incorporating multimodal information exacerbates this challenge, because nodes that become too similar lose the personalized information learned from the multimodal data. To address this problem, we propose a novel model that retains the personalized information of ego nodes during feature aggregation by Reducing Node-neighbor Discrepancy (RedNnD). Extensive experiments on three public datasets show that RedNnD achieves state-of-the-art accuracy and robustness, with significant improvements over existing GCN-based multimodal frameworks. | en_US |
dcterms.extent | vi, 67 pages : color illustrations | en_US |
dcterms.isPartOf | PolyU Electronic Theses | en_US |
dcterms.issued | 2024 | en_US |
dcterms.educationalLevel | M.Sc. | en_US |
dcterms.educationalLevel | All Master | en_US |
dcterms.accessRights | restricted access | en_US |
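The abstract above describes GCN feature aggregation, the over-smoothing it causes, and the idea of keeping ego-node information while reducing node-neighbor discrepancy. The following is only a minimal illustrative sketch of those general concepts, not the thesis's actual RedNnD model: the blending coefficient `alpha`, the function names, and the toy user-item graph are assumptions introduced here for illustration.

```python
import numpy as np

def normalized_adjacency(adj):
    """Symmetrically normalize an adjacency matrix: D^-1/2 (A + I) D^-1/2."""
    adj_hat = adj + np.eye(adj.shape[0])          # add self-loops
    deg = adj_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ adj_hat @ d_inv_sqrt

def gcn_layer_with_ego_retention(adj, features, alpha=0.5):
    """One aggregation step that blends each node's own (ego) features back in,
    so neighborhood averaging does not wash out personalized information."""
    a_norm = normalized_adjacency(adj)
    aggregated = a_norm @ features                # neighborhood smoothing
    return alpha * features + (1.0 - alpha) * aggregated

def node_neighbor_discrepancy(adj, features):
    """Mean L2 distance between each node and the average of its neighbors;
    monitoring (or regularizing) this quantity is one way to gauge over-smoothing."""
    deg = np.clip(adj.sum(axis=1, keepdims=True), 1.0, None)
    neighbor_mean = (adj @ features) / deg
    return float(np.mean(np.linalg.norm(features - neighbor_mean, axis=1)))

# Toy bipartite user-item interaction graph (3 users, 3 items)
adj = np.array([[0, 0, 0, 1, 1, 0],
                [0, 0, 0, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 0, 0, 0],
                [0, 1, 1, 0, 0, 0]], dtype=float)
feats = np.random.rand(6, 8)                      # e.g. fused visual/textual embeddings
smoothed = gcn_layer_with_ego_retention(adj, feats)
print(node_neighbor_discrepancy(adj, feats), node_neighbor_discrepancy(adj, smoothed))
```

In this sketch, keeping a fraction of the ego embedding at every aggregation step is one common way to stop repeated smoothing from collapsing node representations toward their neighborhoods; the discrepancy value printed after aggregation is smaller than before, which is the effect the abstract's method aims to control rather than eliminate.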
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
8300.pdf | For All Users (off-campus access for PolyU Staff & Students only) | 820.46 kB | Adobe PDF |