Title: Learning with centered reproducing kernels
Subject: Hong Kong Polytechnic University -- Dissertations
Pages: x, 66 pages
Abstract: Over the past twenty years, reproducing kernels and kernel-based learning algorithms have been widely and successfully applied across scientific research and industry, and have been studied extensively. Many of these algorithms take the form of an optimization problem whose objective function consists of a fidelity term that fits the observations and a regularization term that prevents over-fitting. Examples include support vector machines for classification and regularized least squares for regression. In many regression problems, however, the constant component of the regression function should be treated differently from the rest, and existing kernel methods do not model this distinction well; score-based ranking function regression is one such example. In this thesis, we study a class of Centered Reproducing Kernels (CRKs), which separate the constant component from the reproducing kernel Hilbert space. We provide a non-asymptotic convergence analysis of empirical CRK-based regularized least squares.
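The centered-kernel regularized least squares scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's exact construction: the RBF kernel, the regularization parameter `lam`, and the centering Kc = H K H (the standard empirical kernel centering, which removes the constant functions from the span of the kernel sections) are assumptions chosen for the sketch; the constant component is fitted separately as the sample mean of the targets.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def centered_krr_fit(X, y, lam=1e-2, gamma=1.0):
    # Regularized least squares with an empirically centered kernel:
    # Kc = H K H, H = I - (1/n) 11^T, which removes the constant
    # component from the hypothesis space. The constant is fitted
    # separately as the mean of y. `lam` scales the ridge term.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    alpha = np.linalg.solve(Kc + lam * n * np.eye(n), y - y.mean())
    return alpha, y.mean()

def centered_krr_predict(X_train, X_test, alpha, b, gamma=1.0):
    # f(x) = sum_j alpha_j * Kc(x, x_j) + b, where the test-train kernel
    # block is centered consistently with the training kernel:
    # Kc(x, x_j) = k(x, x_j) - mean_i k(x, x_i)
    #              - mean_i k(x_i, x_j) + mean_{i,i'} k(x_i, x_{i'}).
    K = rbf_kernel(X_train, X_train, gamma)
    K_tt = rbf_kernel(X_test, X_train, gamma)
    Kc_tt = (K_tt
             - K_tt.mean(axis=1, keepdims=True)
             - K.mean(axis=0, keepdims=True)
             + K.mean())
    return Kc_tt @ alpha + b
```

With X_test = X_train the centered cross-kernel block reduces exactly to H K H, so training-set predictions use the same centered kernel the fit was computed with.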
Files in This Item:
991022141358603411.pdf | For All Users | 491.67 kB | Adobe PDF