Proximal algorithms with extrapolation for nonconvex nonsmooth optimization problems
Chen, Xiaojun (AMA)
Hong Kong Polytechnic University -- Dissertations
Department of Applied Mathematics
xvi, 91 pages : color illustrations
In this thesis, we consider proximal algorithms with extrapolation for solving nonconvex nonsmooth optimization problems. This class of problems arises in many areas of engineering, computer science, and economics; see [18, 20, 23, 27, 40, 54, 64]. Owing to the importance of these problems, many algorithms have been proposed for solving them. This thesis studies two classes of proximal algorithms with extrapolation for differently structured nonconvex nonsmooth optimization problems, as follows.

1. We first consider the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function. Using the error bound condition studied in the literature for analyzing the convergence properties of the proximal gradient algorithm, we show that there exists a threshold such that, if the extrapolation coefficients are chosen below it, the generated sequence converges R-linearly to a stationary point of the problem; the corresponding sequence of objective values is also R-linearly convergent. In addition, the threshold reduces to 1 for convex problems, and as a consequence we obtain R-linear convergence of the sequence generated by FISTA with fixed restart. Again for convex problems, we show that the successive changes of the iterates vanish for many choices of extrapolation coefficient sequences that approach the threshold; in particular, we prove that this conclusion also holds for the sequence generated by FISTA. Finally, we present numerical experiments to illustrate our results.

2. Difference-of-convex (DC) optimization problems have attracted much attention in recent years, and many numerical algorithms have been proposed for solving them. Among these algorithms, the difference-of-convex algorithm (DCA) is a fundamental and classical one.
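The extrapolated proximal gradient scheme analyzed in part 1 can be sketched as follows. This is an illustrative implementation, not the thesis's code: the FISTA-style coefficients beta_k = (t_{k-1} - 1)/t_k with a fixed restart every `restart` iterations keep the extrapolation below the convex-case threshold of 1, and the LASSO instance (matrix `A`, vector `b`, parameter `lam`) is a hypothetical example chosen only to exercise the iteration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_extrapolated(grad_f, prox_g, L, x0, iters=500, restart=100):
    """Proximal gradient with FISTA-style extrapolation and fixed restart.
    Minimizes f + g, where grad_f is the gradient of the smooth part f
    (L-Lipschitz) and prox_g(v, t) is the proximal map of t*g."""
    x_prev = x0.copy()
    x = x0.copy()
    t_prev, t = 1.0, 1.0
    for k in range(iters):
        if k % restart == 0:           # fixed restart: reset the momentum
            t_prev, t = 1.0, 1.0
        beta = (t_prev - 1.0) / t      # extrapolation coefficient, < 1
        y = x + beta * (x - x_prev)    # extrapolated point
        x_prev = x
        x = prox_g(y - grad_f(y) / L, 1.0 / L)
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    return x

# Hypothetical instance: LASSO, min 0.5||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
lam = 0.1
Lip = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of A^T(Ax - b)
x_hat = prox_grad_extrapolated(
    lambda x: A.T @ (A @ x - b),
    lambda v, t: soft_threshold(v, lam * t),
    Lip, np.zeros(100))
```

Without the restart, t_k grows without bound and beta_k approaches 1, which matches the abstract's observation that the successive changes of the iterates vanish as the coefficients approach the threshold.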
We consider a class of DC optimization problems whose objective is level-bounded and is the sum of a smooth convex function with Lipschitz gradient, a proper closed convex function, and a continuous concave function. Such DC problems can be solved by the aforementioned DCA; however, a direct application of DCA may lead to difficult subproblems. To overcome this difficulty, the proximal DCA has been proposed. While its subproblems are simpler, the proximal DCA may be slow in practice, because it reduces to the proximal gradient algorithm when the concave part of the objective is void. In this thesis, motivated by the extrapolation techniques for accelerating the proximal gradient algorithm in the convex setting, we consider a proximal difference-of-convex algorithm with extrapolation to possibly accelerate the proximal DCA. We show that any accumulation point of the sequence generated by our algorithm is a stationary point of the DC optimization problem for a fairly general choice of extrapolation parameters; in particular, the parameters can be chosen as in FISTA with fixed restart. Moreover, assuming the Kurdyka-Łojasiewicz property of an auxiliary function and the differentiability of the concave part, we establish global convergence of the sequence generated by our algorithm and analyze its convergence rate. In our numerical experiments on two difference-of-convex regularized least squares models, the proximal difference-of-convex algorithm with extrapolation usually outperforms the proximal DCA with nonmonotone line search.
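A proximal DCA step with extrapolation can be sketched as below: at each iteration the concave part h is linearized via a subgradient of the convex function -h at the current iterate, and the proximal gradient step is taken from the extrapolated point. This is a minimal sketch, not the thesis's implementation; the l1-minus-l2 regularized least squares instance (names `A`, `b`, `lam`) is a hypothetical example of the required structure, with h(x) = -lam*||x||_2 continuous and concave.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdca_e(grad_f, prox_g, subgrad_neg_h, L, x0, iters=500, restart=100):
    """Sketch of a proximal DCA with extrapolation.
    Objective: f + g + h, with f smooth convex (L-Lipschitz gradient),
    g proper closed convex, h continuous concave.  subgrad_neg_h(x)
    returns a subgradient of the convex function -h at x."""
    x_prev = x0.copy()
    x = x0.copy()
    t_prev, t = 1.0, 1.0
    for k in range(iters):
        if k % restart == 0:           # FISTA-type coefficients, fixed restart
            t_prev, t = 1.0, 1.0
        beta = (t_prev - 1.0) / t
        y = x + beta * (x - x_prev)    # extrapolated point
        xi = subgrad_neg_h(x)          # linearize the concave part at x_k
        x_prev = x
        x = prox_g(y - (grad_f(y) - xi) / L, 1.0 / L)
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    return x

# Hypothetical instance: min 0.5||Ax - b||^2 + lam*(||x||_1 - ||x||_2).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
lam = 0.1
Lip = np.linalg.norm(A, 2) ** 2

def subgrad_l2(x):
    # Subgradient of the convex function lam*||x||_2 (0 is valid at x = 0).
    n = np.linalg.norm(x)
    return lam * x / n if n > 0 else np.zeros_like(x)

x_hat = pdca_e(
    lambda x: A.T @ (A @ x - b),
    lambda v, t: soft_threshold(v, lam * t),
    subgrad_l2,
    Lip, np.zeros(100))
```

When h is identically zero, `subgrad_neg_h` returns 0 and the iteration collapses to the extrapolated proximal gradient method, mirroring the relationship between the proximal DCA and the proximal gradient algorithm noted above.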
All rights reserved