New approaches to solve the local minimum problem and improve the classification ability of learning algorithms in multi-layer feed-forward neural networks

Pao Yue-kong Library Electronic Theses Database

Author: Xu, Shensheng.
Title: New approaches to solve the local minimum problem and improve the classification ability of learning algorithms in multi-layer feed-forward neural networks
Degree: M.Phil.
Year: 2016
Subject: Neural networks (Computer science)
Computer algorithms.
Hong Kong Polytechnic University -- Dissertations
Department: Dept. of Electronic and Information Engineering
Pages: 1 online resource (ix, 78 pages) : illustrations
Language: English
InnoPac Record: http://library.polyu.edu.hk/record=b2890631
URI: http://theses.lib.polyu.edu.hk/handle/200/8492
Abstract: This thesis proposes two new algorithms: Wrong Output Modification (WOM) and Threshold of Output Differences (TOD). WOM addresses the local minimum problem in training a multi-layer feed-forward neural network, while TOD improves the classification ability of such a network. When the search for a global minimum is trapped in a local minimum, the weight changes become zero or extremely small; the mean square error can no longer be reduced, yet it remains too large for the search to reach a global minimum. In this situation, WOM locates the wrong output values and moves them closer to their corresponding target output values. The neuron weights are modified accordingly, allowing the search to escape from the local minimum. WOM can be applied to different learning algorithms. Our performance investigation shows that learning with WOM can always escape from local minima and converge to a global minimum, and that it yields better classification ability after training. TOD monitors the difference between each output value and its corresponding target output value; all these differences together are used to determine whether the search has found a global minimum. TOD can also be applied to different learning algorithms. Our performance investigation shows that with TOD, a multi-layer feed-forward neural network can be trained so that its classification ability improves. This improvement is especially significant when all features in the testing data also appear in the training data.
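The WOM idea described in the abstract can be sketched in a few lines: detect that training has stalled (the mean square error barely changes but is still large), then nudge the "wrong" output values toward their targets so the subsequent weight update perturbs the search out of the flat region. The thresholds, the move factor, and the function names below are illustrative assumptions, not the thesis's actual formulation:

```python
import numpy as np

def wom_modify_outputs(y_out, y_target, wrong_tol=0.3, move_factor=0.5):
    """Illustrative Wrong Output Modification step (assumed parameters).

    Outputs whose absolute error exceeds wrong_tol are treated as
    "wrong" and moved a fraction of the way toward their targets.
    The modified outputs can then serve as temporary targets for an
    extra backpropagation update that perturbs the stalled search.
    """
    y_mod = np.asarray(y_out, dtype=float).copy()
    y_target = np.asarray(y_target, dtype=float)
    wrong = np.abs(y_target - y_mod) > wrong_tol
    y_mod[wrong] += move_factor * (y_target[wrong] - y_mod[wrong])
    return y_mod, wrong

def stalled(mse_history, window=10, eps=1e-6, mse_floor=0.05):
    """Heuristic stall test: MSE barely changes over a window,
    yet is still too large to count as a global minimum."""
    if len(mse_history) < window:
        return False
    recent = mse_history[-window:]
    return (max(recent) - min(recent) < eps) and (recent[-1] > mse_floor)
```

In a training loop, one would check `stalled(...)` after each epoch and, when it fires, run one update against the modified outputs before resuming normal training. How the thesis actually identifies wrong outputs and propagates the modification is specified in the full text, not here.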

Files in this item

File: b28906317.pdf | Size: 829.1 Kb | Format: PDF