Author: Shi, Yue
Title: On the lasso regression and asymmetric Laplace distribution with applications
Advisors: Yiu, Ka-fai (AMA)
Degree: Ph.D.
Year: 2018
Subject: Hong Kong Polytechnic University -- Dissertations
Mathematical optimization
Department: Department of Applied Mathematics
Pages: xxii, 175 pages : color illustrations
Language: English
Abstract: In this thesis, we consider four classes of optimization models. The first class is LAD Generalized Lasso models: we develop a descent algorithm for LAD-Lasso and a new active zero set descent algorithm for LAD Generalized Lasso under nonsmooth optimality conditions. The second class is constrained LAD-Lasso models: we extend the descent algorithm to handle the constraints as well, and study an application to Mean Absolute Deviation (MAD) Lasso portfolio selection. The third class is the selection of the penalty parameter for compressive sensing: we carry out tests using several selection criteria. The fourth class is optimization under Asymmetric Laplace Distributions, namely a robust mixture linear regression model and portfolio selection.

We first consider LAD Generalized Lasso models. Under dynamic nonsmooth optimality conditions, we develop a descent algorithm that selects the fastest descent directions for LAD-Lasso regression. We then derive a new active zero set descent algorithm for LAD Generalized Lasso regression; the algorithm updates the zero set and the basis search directions recursively until the optimality conditions are satisfied, and it is shown to converge in finitely many steps.

We then consider constrained LAD-Lasso models. We develop a descent algorithm that updates descent directions selected from a basis directional set for nonsmooth optimization problems and apply it to a MAD-Lasso portfolio selection strategy; extensive real-data analyses are provided to evaluate the out-of-sample performance.

We next consider the selection of the penalty parameter. For the compressive-sensing-based signal recovery model, we apply regularized least squares for sparse reconstruction, since it can reconstruct a speech signal from a noisy observation, and propose a two-level optimization strategy that incorporates the quality design attributes into the sparse solution for compressive speech enhancement by hyper-parameterizing the tuning parameter. The first level compresses the big data, and the second level optimizes the tuning parameter using different optimization criteria, such as the Gini index, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). The set of solutions can then be measured against the desired design attributes to achieve the best trade-off between suppression and distortion.

Finally, we study two models under Asymmetric Laplace Distributions. We first present an efficient two-level latent EM algorithm for parameter estimation of mixture linear regression models, with the group label as the first-level latent variable and the Laplace intermediate variable as the second-level latent variable. Explicit updating formulas for each iteration are derived, so the computational complexity can be reduced significantly. We then consider a robust portfolio selection model and derive the Expectation-Maximization (EM) algorithm for parameter estimation of the Asymmetric Laplace distribution; an efficient frontier analysis is provided to evaluate its performance.

(Illustrative code sketches for these four model classes are given after the record fields below.)
Rights: All rights reserved
Access: open access
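
The LAD-Lasso problem, minimize over b of sum_i |y_i - x_i'b| + lambda * sum_j |b_j|, can be rewritten as a linear program by introducing nonnegative variables that bound the absolute residuals and coefficients. The following is a minimal Python sketch of that reformulation using SciPy's linprog; the function name lad_lasso, the penalty level and the synthetic data are illustrative, and this is a reference solver rather than the descent algorithm developed in the thesis.

import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_b sum_i |y_i - x_i'b| + lam * sum_j |b_j| as a linear program."""
    n, p = X.shape
    # variables z = [b (p), u (n), v (p)]; u and v bound the absolute values
    c = np.concatenate([np.zeros(p), np.ones(n), lam * np.ones(p)])
    A_ub = np.vstack([
        np.hstack([-X, -np.eye(n), np.zeros((n, p))]),          #  y - Xb <= u
        np.hstack([ X, -np.eye(n), np.zeros((n, p))]),          # -(y - Xb) <= u
        np.hstack([ np.eye(p), np.zeros((p, n)), -np.eye(p)]),  #  b <= v
        np.hstack([-np.eye(p), np.zeros((p, n)), -np.eye(p)]),  # -b <= v
    ])
    b_ub = np.concatenate([-y, y, np.zeros(2 * p)])
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

# small synthetic check: a sparse coefficient vector with heavy-tailed noise
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.0]) + rng.laplace(scale=0.5, size=50)
print(lad_lasso(X, y, lam=5.0))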
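
Likewise, the MAD portfolio objective with an L1 penalty on the weights is a linear program. The sketch below assumes a fully invested portfolio with a minimum target mean return and short selling allowed; mad_lasso_portfolio, lam and r_target are illustrative names and values, and the constrained descent algorithm of the thesis is not reproduced here.

import numpy as np
from scipy.optimize import linprog

def mad_lasso_portfolio(R, lam, r_target):
    """Minimise mean absolute deviation of portfolio return plus lam * ||w||_1,
    subject to sum(w) = 1 and mean return >= r_target (short selling allowed)."""
    T, p = R.shape
    mu = R.mean(axis=0)
    D = R - mu                                   # deviations of returns from their means
    # variables z = [w (p), u (T), v (p)]; u bounds |D w|, v bounds |w|
    c = np.concatenate([np.zeros(p), np.ones(T) / T, lam * np.ones(p)])
    A_ub = np.vstack([
        np.hstack([ D, -np.eye(T), np.zeros((T, p))]),          #  D w <= u
        np.hstack([-D, -np.eye(T), np.zeros((T, p))]),          # -D w <= u
        np.hstack([ np.eye(p), np.zeros((p, T)), -np.eye(p)]),  #  w <= v
        np.hstack([-np.eye(p), np.zeros((p, T)), -np.eye(p)]),  # -w <= v
        np.concatenate([-mu, np.zeros(T + p)]).reshape(1, -1),  #  mu'w >= r_target
    ])
    b_ub = np.concatenate([np.zeros(2 * T + 2 * p), [-r_target]])
    A_eq = np.concatenate([np.ones(p), np.zeros(T + p)]).reshape(1, -1)  # budget constraint
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] * p + [(0, None)] * (T + p), method="highs")
    return res.x[:p]

# illustrative run on simulated daily returns for 4 assets
rng = np.random.default_rng(1)
R = 0.0005 + 0.01 * rng.standard_normal((250, 4))
print(mad_lasso_portfolio(R, lam=0.001, r_target=0.0004))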
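
For penalty-parameter selection, one standard recipe is to solve the sparse reconstruction over a grid of tuning parameters and score each solution with AIC, BIC and the Gini sparsity index. The sketch below uses scikit-learn's Lasso as a stand-in solver, takes the degrees of freedom as the number of nonzero coefficients, and uses the Hurley-Rickard form of the Gini index; these concrete choices, and the names gini_index and score_penalties, are assumptions for illustration rather than the exact criterion forms used in the thesis.

import numpy as np
from sklearn.linear_model import Lasso

def gini_index(coef):
    """Gini sparsity index of a coefficient vector (Hurley-Rickard form); higher = sparser."""
    a = np.sort(np.abs(coef))
    total = a.sum()
    if total == 0:
        return 0.0
    N = a.size
    k = np.arange(1, N + 1)
    return 1.0 - 2.0 * np.sum((a / total) * (N - k + 0.5) / N)

def score_penalties(X, y, lambdas):
    """Fit a Lasso for each penalty and report AIC, BIC and the Gini index of the solution."""
    n = len(y)
    rows = []
    for lam in lambdas:
        coef = Lasso(alpha=lam, max_iter=10000).fit(X, y).coef_
        rss = np.sum((y - X @ coef) ** 2)
        df = max(np.count_nonzero(coef), 1)      # crude degrees-of-freedom proxy
        aic = n * np.log(rss / n) + 2 * df
        bic = n * np.log(rss / n) + np.log(n) * df
        rows.append((lam, aic, bic, gini_index(coef)))
    return rows

# pick the penalty that minimises BIC on synthetic sparse data
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 30))
y = X[:, :3] @ np.array([3.0, -2.0, 1.5]) + rng.standard_normal(200)
scores = score_penalties(X, y, lambdas=np.logspace(-3, 0, 10))
print(min(scores, key=lambda row: row[2]))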
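
Finally, parameter estimation for the Asymmetric Laplace distribution can be sanity-checked against SciPy's built-in laplace_asymmetric law, whose generic numerical maximum-likelihood fit is shown below. This is not the EM algorithm derived in the thesis, and the simulated parameter values are arbitrary.

import numpy as np
from scipy.stats import laplace_asymmetric

# simulate skewed, heavy-tailed "returns" from an asymmetric Laplace law
rng = np.random.default_rng(3)
sample = laplace_asymmetric.rvs(kappa=1.5, loc=0.001, scale=0.02,
                                size=2000, random_state=rng)

# SciPy's generic numerical maximum-likelihood fit returns (kappa, loc, scale)
kappa_hat, loc_hat, scale_hat = laplace_asymmetric.fit(sample)
print(kappa_hat, loc_hat, scale_hat)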

Files in This Item:
991022165758103411.pdf (For All Users, 3.42 MB, Adobe PDF)




Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/9671