Full metadata record
DC Field: Value (Language)
dc.contributor: Multi-disciplinary Studies (en_US)
dc.creator: Chang, Wing-fai
dc.identifier.uri: https://theses.lib.polyu.edu.hk/handle/200/85
dc.language: English (en_US)
dc.publisher: Hong Kong Polytechnic University
dc.rights: All rights reserved (en_US)
dc.title: A study of second order learning algorithms for recurrent neural networks (en_US)
dcterms.abstract: Backpropagation (BP) of error gradients has proven useful for training feedforward neural networks on a wide range of classification and function-mapping problems. However, the method suffers from several serious problems during training. The user must select three parameters: the learning rate, the momentum, and the number of hidden nodes. An unfortunate choice can cause slow convergence. Moreover, the network can become trapped in a local minimum of the error function, settling on an unacceptable solution when a much better one exists. Also, the large number of learning iterations needed to adjust the weights optimally is prohibitive for online applications. Numerical optimization theory offers a rich and robust set of techniques that can be applied to improve the learning speed of neural networks. These techniques use not only the local gradient of the function but also its second derivative. The first derivative of the error measures the slope of the error surface at a point, while the second derivative measures the curvature of the error surface at the same point; this information is very useful in determining the optimal update direction. There is a variety of second-order methods; in particular, the conjugate gradient method is commonly used in BP networks because of its speed and simplicity. It has been shown that a much shorter training time is required when a feedforward network is trained with second-order methods. Recurrent networks, which include feedback loops (connections through which a node's earlier output influences its later output), can process temporal patterns, accepting sequences as inputs and producing sequences as outputs. Recurrent networks can also be trained with backpropagation; however, such training requires a great deal of computation and memory, and suffers from the same problems as in feedforward networks.
In this dissertation, the conjugate gradient method is applied to train a fully connected recurrent neural network. The result is compared with that of another learning algorithm, the real-time recurrent learning (RTRL) algorithm, which uses only the first derivative of the error and is often used to train recurrent neural networks. The recurrent network, implemented with each of these two learning algorithms, is used to simulate a linear system (a second-order Butterworth low-pass filter). In addition, the recurrent network is applied to a speaker recognition problem. This application shows that the recurrent network can learn temporally encoded sequences and decide whether or not a speech sample corresponds to a particular speaker. (en_US)
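The conjugate gradient idea summarized in the abstract can be illustrated on a toy quadratic error surface. The sketch below is a generic linear conjugate gradient solver (Fletcher-Reeves coefficient) for minimizing 0.5 x^T A x - b^T x with a symmetric positive-definite A; it is an illustrative textbook variant, not code from the thesis, and the problem data is made up.

```python
import numpy as np

def linear_cg(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize 0.5 x^T A x - b^T x (i.e. solve A x = b) for SPD A."""
    x = x0.astype(float)
    r = b - A @ x                 # residual = negative gradient
    d = r.copy()                  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)         # exact step length along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves coefficient
        d = r_new + beta * d               # next direction, conjugate to d
        r = r_new
    return x

# Toy 2-D problem: CG on an n-dimensional quadratic converges in at most
# n iterations (in exact arithmetic), versus many steps for plain descent.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = linear_cg(A, b, np.zeros(2))
```

The key contrast with first-order gradient descent is that each new direction is made conjugate to the previous ones via the beta term, which is where curvature information enters.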
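The first-order comparison algorithm, real-time recurrent learning, can likewise be sketched. The following is a minimal generic RTRL implementation for a small fully connected recurrent network with tanh units; the network size, learning rate, and the task (predicting the next sample of a sine wave online) are illustrative assumptions, not the thesis's own experiment or code.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 1                            # recurrent units, external inputs
n_in = n + m + 1                       # recurrent + input + bias
W = rng.normal(0.0, 0.3, (n, n_in))    # one weight matrix for all connections
y = np.zeros(n)                        # current unit activations
p = np.zeros((n, n, n_in))             # p[k, i, j] = d y_k / d W[i, j]
lr = 0.02

def rtrl_step(x_t, target):
    """One forward step plus an online first-order weight update."""
    global W, y, p
    z = np.concatenate([y, x_t, [1.0]])      # recurrent state, input, bias
    y_new = np.tanh(W @ z)
    fprime = 1.0 - y_new ** 2                # tanh'(s) = 1 - tanh(s)^2
    # Sensitivity recursion:
    # p_new[k,i,j] = f'(s_k) * (delta_{ki} * z_j + sum_l W[k,l] * p[l,i,j])
    p_new = np.einsum('kl,lij->kij', W[:, :n], p)
    p_new[np.arange(n), np.arange(n), :] += z
    p_new *= fprime[:, None, None]
    # Unit 0 is the output; descend the instantaneous squared error.
    err = y_new[0] - target
    W = W - lr * err * p_new[0]
    y, p = y_new, p_new
    return 0.5 * err ** 2

# Online training: predict the next sample of a sine wave.
losses = [rtrl_step(np.array([np.sin(0.2 * t)]), np.sin(0.2 * (t + 1)))
          for t in range(2000)]
```

The sensitivity tensor p is what makes RTRL expensive: it holds one derivative per (unit, weight) pair, giving O(n^4) storage and update cost in the number of units, which is the computational burden the abstract alludes to.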
dcterms.extent: vi, 73 leaves : ill. ; 30 cm (en_US)
dcterms.isPartOf: PolyU Electronic Theses (en_US)
dcterms.issued: 1995 (en_US)
dcterms.educationalLevel: All Master (en_US)
dcterms.educationalLevel: M.Sc. (en_US)
dcterms.LCSH: Neural networks (Computer science) (en_US)
dcterms.LCSH: Algorithms (en_US)
dcterms.LCSH: Mathematical optimization (en_US)
dcterms.LCSH: Hong Kong Polytechnic University -- Dissertations (en_US)
dcterms.accessRights: restricted access (en_US)

Files in This Item:
File: b12050155.pdf
Description: For All Users (off-campus access for PolyU Staff & Students only)
Size: 2.32 MB
Format: Adobe PDF


Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/85