Author: Ye, Qiuliang
Title: Robust phase retrieval using optimization and deep learning techniques
Advisors: Lun, P. K. Daniel (EIE)
Degree: Ph.D.
Year: 2023
Subject: Signal processing
Image reconstruction
Hong Kong Polytechnic University -- Dissertations
Department: Department of Electronic and Information Engineering
Pages: 174 pages, color illustrations
Language: English
Abstract: Phase retrieval aims to recover a signal from the intensity measurement of its Fourier transform. It is a key problem in crystallography, optical imaging, astronomy, and X-ray and electronic imaging. Because the problem is ill-posed, optical masks, which act as an extra constraint on the optimization process, are often adopted in optical phase retrieval systems. In recent years, various random masking schemes have been proposed. However, they ignore the non-bandlimited property of random masks, which introduces many high-frequency components into the intensity measurement and leads to degraded performance. In this thesis, we propose two novel masking schemes to solve the problem. First, we propose a binary green noise optical masking scheme that concentrates the energy of the masks in a ring-shaped mid-frequency region while preserving their randomness. However, since green noise masks have only binary values, there can be many sharp changes between adjacent pixels, which inevitably introduce high-frequency components into the mask. We therefore propose a second masking scheme, named OptMask. OptMask is a multi-level random phase mask that smooths out the sharp pixel-value changes of binary masks. It is designed with a two-stage optimization algorithm that offers the flexibility to adjust the cut-off frequencies and quantization levels. Experimental results show that the amplitude and phase images retrieved using the proposed OptMask are of significantly higher quality, both qualitatively and quantitatively, than those obtained with traditional masking schemes.
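The Fourier-magnitude constraint described above is typically handled by alternating projections. The sketch below is the classical error-reduction (Gerchberg-Saxton-style) baseline in 1D, not the thesis's masking scheme; real systems are 2D with coded masks, and all names and values here are illustrative.

```python
# Illustrative 1D error-reduction phase retrieval (classical baseline,
# not the thesis's method; all names and test values are illustrative).
import cmath

def dft(x):
    """Naive discrete Fourier transform of a sequence."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n))
            for f in range(n)]

def idft(X):
    """Naive inverse DFT."""
    n = len(X)
    return [sum(X[f] * cmath.exp(2j * cmath.pi * f * t / n) for f in range(n)) / n
            for t in range(n)]

def fourier_mag_err(x, mags):
    """Total mismatch between |DFT(x)| and the measured magnitudes."""
    return sum(abs(abs(v) - m) for v, m in zip(dft(x), mags))

def error_reduction(mags, support, iters=100):
    """Alternate between the measured Fourier magnitudes and a real,
    nonnegative, support-limited estimate in the signal domain."""
    x = [1.0 if s else 0.0 for s in support]  # flat initial guess
    for _ in range(iters):
        X = dft(x)
        # Fourier domain: keep current phases, impose measured magnitudes.
        X = [m * cmath.exp(1j * cmath.phase(v)) for m, v in zip(mags, X)]
        x = idft(X)
        # Signal domain: real, nonnegative, zero outside the known support.
        x = [max(v.real, 0.0) if s else 0.0 for v, s in zip(x, support)]
    return x

if __name__ == "__main__":
    truth = [0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0]
    mags = [abs(v) for v in dft(truth)]           # intensity-only measurement
    support = [i in (1, 2, 3) for i in range(8)]  # known nonzero region
    rec = error_reduction(mags, support)
    print(round(fourier_mag_err(rec, mags), 4))
```

The signal-domain projection is where a mask-based system would add its extra constraint; the thesis's point is that the spectral shape of that mask matters.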
Although random masking schemes are effective, they require multiple intensity measurements for each reconstruction, which increases the system cost and data acquisition time. To address this problem, we propose two deep neural network (DNN) based phase retrieval approaches that require only a single maskless measurement per reconstruction. The first DNN has a feedforward structure that allows fast inference. It is equipped with a new feature extraction unit, implemented with a multi-layer perceptron, to effectively exploit the spectral information and capture global representations. A residual learning structure with a self-attention mechanism is introduced to promote global correlation in the reconstructed images. While the first DNN focuses on a simple architecture and fast inference, the second proposed DNN, dubbed PPRNet, has a physics-driven multi-scale structure that optimizes performance. PPRNet has a non-iterative feedforward structure and can effectively use the intensity measurement to guide the image reconstruction. It is enabled by novel Hybrid Unwinding Blocks, which separately process the global and local information of the feature maps with the aid of the intensity measurement. The effectiveness and practicality of PPRNet have been verified by a series of simulations and experiments performed on an optical platform designed for this research. PPRNet significantly outperforms state-of-the-art deep learning-based phase retrieval methods, making it a promising solution for practical phase retrieval applications.
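The self-attention mechanism mentioned above is the standard scaled dot-product form. A minimal single-head sketch in pure Python is shown below; it is a generic illustration, not the thesis's actual network, and every name and dimension here is illustrative.

```python
# Minimal single-head scaled dot-product self-attention (generic
# illustration, not the thesis's network; all names are illustrative).
import math

def softmax(row):
    """Numerically stable softmax over one list of scores."""
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [v / s for v in exps]

def matmul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention(X, Wq, Wk, Wv):
    """Attend over a sequence X of row vectors; returns output and weights."""
    Q, K, V = matmul(X, Wq), matmul(X, Wk), matmul(X, Wv)
    d = len(Q[0])
    scores = [[sum(q * k for q, k in zip(qr, kr)) / math.sqrt(d) for kr in K]
              for qr in Q]
    weights = [softmax(row) for row in scores]  # each row sums to 1
    return matmul(weights, V), weights

if __name__ == "__main__":
    X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # 3 tokens, dimension 2
    I = [[1.0, 0.0], [0.0, 1.0]]              # identity projections for demo
    out, w = self_attention(X, I, I, I)
    print([round(sum(r), 3) for r in w])      # attention rows sum to 1
```

Because every output token is a weighted sum over all input tokens, attention propagates global correlations across the image, which is what motivates its use in the first DNN's residual structure.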
We believe that the research findings and results of this thesis make a substantial contribution to the field and will generate considerable interest in the optical and digital imaging industries.
Rights: All rights reserved
Access: open access

Files in This Item:
File: 6891.pdf (For All Users), 5.86 MB, Adobe PDF


