Full metadata record
DC Field | Value | Language
dc.contributor | Department of Electronic and Information Engineering | en_US
dc.contributor.advisor | Cheung, Lawrence Chi-chung (EIE) | -
dc.creator | Xu, Shensheng. | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/8492 | -
dc.language | English | en_US
dc.publisher | Hong Kong Polytechnic University | -
dc.rights | All rights reserved | en_US
dc.title | New approaches to solve the local minimum problem and improve the classification ability of learning algorithms in multi-layer feed-forward neural networks | en_US
dcterms.abstract | This thesis proposes two new algorithms: Wrong Output Modification (WOM) and Threshold of Output Differences (TOD). WOM solves the local minimum problem in training a multi-layer feed-forward network, while TOD improves the classification ability of such networks. When the search for a global minimum is trapped in a local minimum, the weight changes can be zero or extremely small, so the mean square error cannot be reduced further even though it remains too large for the search to reach the global minimum. In this circumstance, WOM locates the wrong output values and moves them closer to their corresponding target output values; the neuron weights are modified accordingly, and the search can escape from the local minimum. WOM can be applied in different learning algorithms. Our performance investigation shows that learning with WOM can always escape from local minima and converge to a global minimum, and that it yields better classification ability after training. TOD monitors the difference between each output value and its corresponding target output value; all of these differences together are used to decide whether the search has found a global minimum. TOD can also be applied in different learning algorithms. Our performance investigation shows that with TOD, a multi-layer feed-forward neural network can be trained so that its classification ability improves. The improvement is most significant when all features in the testing data can be found in the training data. | en_US
dcterms.extent | 1 online resource (ix, 78 pages) : illustrations | en_US
dcterms.extent | ix, 78 pages : illustrations | en_US
dcterms.isPartOf | PolyU Electronic Theses | en_US
dcterms.issued | 2016 | en_US
dcterms.educationalLevel | All Master | en_US
dcterms.educationalLevel | M.Phil. | en_US
dcterms.LCSH | Neural networks (Computer science) | en_US
dcterms.LCSH | Computer algorithms. | en_US
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US
dcterms.accessRights | open access | en_US
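The abstract describes WOM as nudging wrong output values toward their targets when training stalls, and TOD as testing every output-target difference against a threshold to decide whether the search has reached a global minimum. A minimal sketch of those two steps follows; the function names, thresholds, and step sizes are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def wom_adjust(outputs, targets, wrong_tol=0.5, pull=0.5):
    """Wrong Output Modification (sketch): move each 'wrong' output
    (one whose error exceeds wrong_tol, a hypothetical threshold) a
    fraction `pull` of the way toward its target, giving the stalled
    search a fresh error signal to escape the local minimum."""
    outputs = np.asarray(outputs, dtype=float)
    targets = np.asarray(targets, dtype=float)
    adjusted = outputs.copy()
    wrong = np.abs(outputs - targets) > wrong_tol
    adjusted[wrong] += pull * (targets[wrong] - outputs[wrong])
    return adjusted

def tod_converged(outputs, targets, tau=0.1):
    """Threshold of Output Differences (sketch): declare convergence to
    a global minimum only if every individual output lies within tau of
    its target, rather than relying on the aggregate mean square error."""
    diffs = np.abs(np.asarray(outputs, dtype=float) - np.asarray(targets, dtype=float))
    return bool(np.all(diffs < tau))
```

In a training loop, `wom_adjust` would be invoked only when the mean square error plateaus while remaining large, with the adjusted outputs then driving the usual weight update; `tod_converged` would replace a plain MSE stopping criterion.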

Files in This Item:
File | Description | Size | Format
b28906317.pdf | For All Users | 809.72 kB | Adobe PDF



