Author:  Wang, Hong 
Title:  Second-order methods for nonconvex optimization : theory and complexity analysis 
Degree:  Ph.D. 
Year:  2016 
Subject:  Hong Kong Polytechnic University -- Dissertations ; Mathematical optimization. 
Department:  Dept. of Applied Mathematics 
Pages:  xv, 139 pages : illustrations 
Language:  English 
InnoPac Record:  http://library.polyu.edu.hk/record=b2935053 
URI:  http://theses.lib.polyu.edu.hk/handle/200/8818 
Abstract:  We consider second-order methods for solving two classes of nonconvex minimization problems arising from diverse applications of optimization. By second-order methods, we refer to methods that use second-order information of the objective function. The first class is the so-called Affine Rank Minimization Problem (ARMP), whose aim is to minimize the rank of a matrix over a given affine set. The other is the Partially Separable Minimization Problem (PSMP), which minimizes an objective function with a partially separable structure over a given convex set. This thesis is accordingly divided into two parts. In the first part, we study the ARMP through its matrix factorization reformulation. In certain situations, we show that the corresponding factorization models have the property that all second-order stationary points are global minimizers. Assuming this property holds, we propose an algorithmic framework that outputs the global solution of the ARMP after solving a series of its factorization models with different ranks to second-order necessary optimality. Finally, we put forward a conjecture that the reduction between the global minima of the low-rank approximations at consecutive ranks decreases monotonically as the rank increases. If this conjecture holds, we can accelerate the estimation of the optimal rank by an adaptive technique, and hence significantly improve the overall efficiency of our framework in solving the ARMP. In the second part of this thesis, we mainly study the PSMP over a convex constraint set. We first propose an adaptive regularization algorithm for solving the PSMP, in which the expense of using high-order models is mitigated by exploiting the partially separable structure. 
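The rank-incrementing idea behind the ARMP framework described above can be illustrated with a minimal sketch: for each trial rank r, solve the factorization model to (approximate) stationarity and stop at the first rank whose model attains a zero residual. This is a hedged toy illustration using plain gradient descent on a matrix-completion-style affine map, not the thesis' algorithm; all function names and parameter choices here are assumptions.

```python
import numpy as np

def armp_completion(M_obs, mask, r_max=5, tol=1e-6, iters=3000, lr=0.05, seed=0):
    """Toy rank-incrementing scheme for an ARMP instance (matrix completion).

    For r = 1, 2, ..., solve the factorization model
        min_{U,V} 0.5 * || mask * (U V^T - M_obs) ||_F^2
    by gradient descent, and return at the first rank whose residual
    (numerically) vanishes -- mimicking the case where every second-order
    stationary point of the model is a global minimizer.
    """
    rng = np.random.default_rng(seed)
    n1, n2 = M_obs.shape
    for r in range(1, r_max + 1):
        U = 0.5 * rng.standard_normal((n1, r))   # random (not tiny) init
        V = 0.5 * rng.standard_normal((n2, r))
        for _ in range(iters):
            R = mask * (U @ V.T - M_obs)          # residual on observed entries
            U, V = U - lr * (R @ V), V - lr * (R.T @ U)
            if np.linalg.norm(mask * (U @ V.T - M_obs)) < tol:
                break
        if np.linalg.norm(mask * (U @ V.T - M_obs)) < tol:
            return U @ V.T, r                     # feasible at rank r
    return U @ V.T, r_max                         # best effort at max rank
```

On a matrix of true rank 2, the rank-1 model cannot drive the residual to zero (its best residual equals the second singular value), so the scheme moves on and terminates at r = 2.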
We then show that the algorithm using an order-p model needs at most O(ε^(-(p+1)/p)) evaluations of the objective function and its derivatives to arrive at an ε-approximate first-order stationary point. The complexity in terms of ε is unaffected by the use of structure. An extension of the main idea is also presented for the case where the objective function may be non-Lipschitz continuous. We apply the algorithm with an adaptive cubic regularization term to a data-fitting problem involving the ℓq quasi-norm for q ∈ (0, 1), and it turns out that even in the non-Lipschitz case the complexity bound O(ε^(-3/2)) can be retained. 
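For the p = 2 case mentioned above (adaptive cubic regularization), one outer iteration minimizes a cubic model m(s) = f(x) + gᵀs + ½ sᵀHs + (σ/3)‖s‖³ and adapts σ by a ratio test; the O(ε^(-3/2)) bound applies to this setting. The following is a hedged generic sketch of that iteration, with a crude inner model solver and illustrative parameter values, not the thesis' partially separable algorithm.

```python
import numpy as np

def arc_minimize(f, grad, hess, x0, sigma0=1.0, eps=1e-6, max_iter=200):
    """Generic adaptive cubic-regularization (p = 2) sketch.

    At each iterate, approximately minimize the cubic model
        m(s) = f(x) + g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    (here crudely, by a few gradient steps on m), then accept or reject
    the step and update sigma via the usual actual/predicted ratio test.
    """
    x, sigma = np.asarray(x0, float), sigma0
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) <= eps:           # eps-approximate 1st-order point
            break
        s = -g / (np.linalg.norm(H) + sigma)   # scaled steepest-descent start
        for _ in range(50):                    # crude inner solve of the model
            gm = g + H @ s + sigma * np.linalg.norm(s) * s
            s = s - 0.1 / (np.linalg.norm(H) + sigma + 1.0) * gm
        m_dec = -(g @ s + 0.5 * s @ H @ s + sigma / 3 * np.linalg.norm(s) ** 3)
        rho = (f(x) - f(x + s)) / max(m_dec, 1e-16)
        if rho > 0.1:                          # sufficient decrease: accept
            x = x + s
        sigma = 0.5 * sigma if rho > 0.9 else (sigma if rho > 0.1 else 2.0 * sigma)
    return x
```

On the nonconvex function f(x) = ¼x⁴ - ½x², whose minimizers are x = ±1, the iteration converges to a second-order point from x₀ = 2 even though the Hessian is indefinite near the origin.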
Files  Size  Format 
b29350530.pdf  1.766Mb  PDF 