|Title:||Optimality conditions via exact penalty functions|
Hong Kong Polytechnic University -- Dissertations
|Department:||Department of Applied Mathematics|
|Pages:||x, 135 p. : ill. ; 30 cm.|
|Abstract:||The purpose of this thesis is to study optimality conditions for constrained optimization problems in finite-dimensional spaces from the viewpoint of exact penalty functions. The tools we use come mainly from modern variational analysis as popularized by Rockafellar and Wets' classical book. The problem models we focus on are nonlinear programming and mathematical programs with complementarity constraints. We aim to develop a unified framework and provide a detailed exposition of optimality conditions derived from the exactness of penalty functions. In this connection, we intend to answer two questions: when are penalty functions exact, and how can optimality conditions of the original constrained problems be inherited from those of exact penalty functions? We study sufficient conditions for penalty terms to possess local error bounds, which guarantee exactness of penalty functions. We characterize a stronger version of the local error bound property in terms of strong slopes, subderivatives, and regular subgradients at points outside the referenced set. In particular, we give full characterizations of the local error bound property for the elementary max function of a finite collection of smooth functions. With the aid of these characterizations, we show that the quasinormality constraint qualification implies the existence of a local error bound. We also study necessary and sufficient conditions for the existence of local error bounds by virtue of various limits defined on the boundary of the referenced set. We study first- and second-order necessary and sufficient conditions for penalty functions to be exact. These conditions are expressed by subderivatives, second-order subderivatives, and parabolic subderivatives, notions that have been used to formulate tight optimality conditions for optimization problems.
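A minimal sketch of the framework the abstract refers to, under assumed notation broadly following Rockafellar and Wets (the specific penalty term $P$ and constants below are illustrative, not taken from the thesis):

```latex
% Constrained problem and its penalization (assumed notation):
%   minimize f over a feasible set S described by a penalty term P >= 0.
\min_{x \in S} f(x), \qquad
S = \{\, x \in \mathbb{R}^n : P(x) = 0 \,\}, \qquad P \ge 0,
\qquad
f_\rho(x) := f(x) + \rho\, P(x).

% A typical penalty term for inequality constraints g_1,\dots,g_m is the
% elementary max function mentioned in the abstract:
P(x) = \max\{\, g_1(x), \dots, g_m(x), 0 \,\}.

% Local error bound for P at \bar x \in S: there exist c > 0 and a
% neighborhood U of \bar x such that
d(x, S) \le c\, P(x) \quad \text{for all } x \in U.

% Exactness (standard consequence, assuming f is Lipschitz near \bar x
% with modulus L): for \rho > cL, a local minimizer \bar x of the
% constrained problem is a local minimizer of the unconstrained f_\rho.
```

The chain "error bound $\Rightarrow$ exactness of $f_\rho$" is the link the abstract exploits: optimality conditions for the unconstrained function $f_\rho$ can then be pulled back to the constrained problem.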
In our investigation, the kernels of these derivatives, representing directions at which the derivatives vanish, play a key role. In particular, we prove an auxiliary result asserting that the polar cone of the subderivative kernel of an extended real-valued function at a local minimum equals the positive hull of its regular subgradients at the same point. We show how Karush-Kuhn-Tucker conditions and second-order necessary conditions in nonlinear programming, as well as strong and Mordukhovich stationarity in mathematical programs with complementarity constraints, can be derived from exactness of penalty functions under additional conditions on the constraint functions. In presenting these additional conditions, the kernels of (parabolic) subderivatives of penalty terms turn out to be crucial. By virtue of these kernels and a variational description of regular subgradients, we establish the necessity and sufficiency of these additional conditions. We also present conditions in terms of the original data by applying (generalized) Taylor expansions to calculate these kernels.|
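The auxiliary result on the subderivative kernel can be written as follows, again under assumed notation ($df(\bar x)$ the subderivative, $\hat\partial f(\bar x)$ the regular subgradient set); this is a transcription of the statement in the abstract, not an independent claim:

```latex
% At a local minimum \bar x of an extended real-valued f, one has
% df(\bar x)(w) \ge 0 for all directions w, so the kernel
K := \{\, w \in \mathbb{R}^n : df(\bar x)(w) = 0 \,\}

% collects the directions at which the subderivative vanishes.
% The asserted identity equates its polar cone with the positive hull
% of the regular subgradients at \bar x:
K^{\circ} = \operatorname{pos}\, \hat\partial f(\bar x),
\qquad
K^{\circ} := \{\, v : \langle v, w \rangle \le 0 \ \forall\, w \in K \,\}.
```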
|Rights:||All rights reserved|