Author: | Fang, Changsheng |
Title: | Deep learning methods for low-light images enhancement |
Degree: | M.Sc. |
Year: | 2023 |
Department: | Department of Electrical and Electronic Engineering |
Pages: | v, 54 pages : color illustrations |
Language: | English |
Abstract: | Low-light enhancement aims to improve the quality and visibility of images captured under low-light or low-illumination conditions. Deep learning approaches underpin the state of the art in this field. This thesis selects three of the most influential deep learning models for investigation, covering supervised learning, unsupervised learning, and recent zero-reference data-driven approaches: RetinexNet, EnlightenGAN, and Zero-DCE. The strengths of these models are analyzed in terms of network structure, loss function, and feature design, and the models are then trained and compared on the same dataset in the experimental phase. In addition, a U-Net enhancement network based on RetinexNet, incorporating channel and spatial attention mechanisms, is proposed to carry out the illumination-enhancement step of Retinex theory (an illustrative sketch of such an attention block follows below). Experiments show that this modification of RetinexNet yields a clear improvement in enhancement quality. |
Rights: | All rights reserved |
Access: | restricted access |
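The abstract mentions adding a channel and spatial attention mechanism to a U-Net enhancement network. Since the thesis itself is restricted access, the thesis's actual code is not available; below is a minimal, purely illustrative PyTorch sketch of a CBAM-style block (channel attention followed by spatial attention), which is the standard form of that mechanism. All class names, the reduction ratio, and the kernel size are assumptions, not details taken from the thesis.

```python
# Hedged sketch of a CBAM-style channel + spatial attention block.
# Not the thesis's implementation; names and hyperparameters are illustrative.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Pool spatial dims to a per-channel descriptor, then reweight channels."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling branch
        weights = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * weights


class SpatialAttention(nn.Module):
    """Pool across channels, then learn a per-pixel attention map."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)     # channel-wise max map
        attn = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * attn


class CBAMBlock(nn.Module):
    """Channel attention followed by spatial attention (CBAM ordering)."""
    def __init__(self, channels: int):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.sa(self.ca(x))


if __name__ == "__main__":
    # E.g., applied to a 64-channel feature map inside a U-Net decoder stage.
    feats = torch.randn(1, 64, 128, 128)
    print(CBAMBlock(64)(feats).shape)  # torch.Size([1, 64, 128, 128])
```

In a Retinex-based pipeline of the kind the abstract describes, such a block would typically be inserted between U-Net encoder/decoder stages of the illumination-enhancement branch, letting the network reweight features by both channel importance and spatial location.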
Files in This Item:
File | Description | Size | Format
---|---|---|---
8279.pdf | For All Users (off-campus access for PolyU Staff & Students only) | 1.8 MB | Adobe PDF
Please use this identifier to cite or link to this item:
https://theses.lib.polyu.edu.hk/handle/200/13874