Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Electronic and Information Engineering | en_US |
dc.contributor.advisor | Cheung, Lawrence Chi-chung (EIE) | - |
dc.creator | Xu, Shensheng. | - |
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/8492 | - |
dc.language | English | en_US |
dc.publisher | Hong Kong Polytechnic University | - |
dc.rights | All rights reserved | en_US |
dc.title | New approaches to solve the local minimum problem and improve the classification ability of learning algorithms in multi-layer feed-forward neural networks | en_US |
dcterms.abstract | This thesis proposes two new algorithms: Wrong Output Modification (WOM) and Threshold of Output Differences (TOD). WOM solves the local minimum problem in training a multi-layer feed-forward network, while TOD improves the classification ability of such a network. When the search for a global minimum is trapped in a local minimum, the weight changes become zero or extremely small; the mean square error therefore cannot be reduced further, yet it remains too large for the search to reach the global minimum. In this circumstance, WOM locates the wrong output values and moves them closer to their corresponding target output values. The neuron weights are modified accordingly, so the search can escape from the local minimum. WOM can be applied to different learning algorithms. Our performance investigation shows that learning with WOM always escapes from local minima and converges to a global minimum, and that it yields better classification ability after training. TOD monitors the difference between each output value and its corresponding target output value; all of these differences together are used to decide whether the search has found a global minimum. TOD can also be applied to different learning algorithms. Our performance investigation shows that with TOD, a multi-layer feed-forward neural network can be trained in a better way, so that its classification ability improves. This improvement is especially significant when all features in the testing data can be found in the training data. | en_US |
dcterms.extent | 1 online resource (ix, 78 pages) : illustrations | en_US |
dcterms.extent | ix, 78 pages : illustrations | en_US |
dcterms.isPartOf | PolyU Electronic Theses | en_US |
dcterms.issued | 2016 | en_US |
dcterms.educationalLevel | All Master | en_US |
dcterms.educationalLevel | M.Phil. | en_US |
dcterms.LCSH | Neural networks (Computer science) | en_US |
dcterms.LCSH | Computer algorithms. | en_US |
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US |
dcterms.accessRights | open access | en_US |
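The WOM and TOD ideas described in the abstract can be sketched as follows. This is a minimal illustrative interpretation of the abstract only, not the thesis's actual algorithms: the function names, the thresholds (`wrong_tol`, `move`, `flat_tol`, `big_tol`, `tod`), and the plateau-detection heuristic are all assumptions introduced here for illustration.

```python
import numpy as np

def stuck_in_local_minimum(mse_history, window=5, flat_tol=1e-6, big_tol=1e-2):
    """Heuristic trap detector (assumed, not from the thesis): the search is
    considered trapped when the MSE has stopped decreasing over the last
    `window` epochs but is still too large to be a global minimum."""
    if len(mse_history) < window:
        return False
    recent = mse_history[-window:]
    flat = max(recent) - min(recent) < flat_tol
    return flat and recent[-1] > big_tol

def wom_adjust(outputs, targets, move=0.5, wrong_tol=0.3):
    """Wrong Output Modification step (illustrative): outputs whose absolute
    error exceeds `wrong_tol` are treated as 'wrong' and moved a fraction
    `move` of the way toward their targets. Weights would then be updated
    from these adjusted outputs so the search can leave the local minimum."""
    outputs = np.asarray(outputs, dtype=float)
    targets = np.asarray(targets, dtype=float)
    err = targets - outputs
    wrong = np.abs(err) > wrong_tol          # which outputs count as wrong
    adjusted = outputs.copy()
    adjusted[wrong] += move * err[wrong]     # move wrong outputs toward targets
    return adjusted, wrong

def tod_converged(outputs, targets, tod=0.1):
    """Threshold of Output Differences (illustrative): declare the search
    successful only when every per-output difference is below `tod`,
    rather than relying on the aggregate MSE alone."""
    diffs = np.abs(np.asarray(targets, dtype=float) - np.asarray(outputs, dtype=float))
    return bool(np.all(diffs < tod))
```

As a usage sketch: if training yields outputs `[0.1, 0.9]` against targets `[1.0, 1.0]` and the MSE has plateaued, `wom_adjust` flags the first output as wrong and nudges it to `0.55`; `tod_converged` would keep reporting failure until every output sits within `tod` of its target.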
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
b28906317.pdf | For All Users | 809.72 kB | Adobe PDF | View/Open |
Copyright Undertaking
As a bona fide Library user, I declare that:
- I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
- I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
- I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.
By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.