Full metadata record
DC Field | Value | Language
dc.contributor | Multi-disciplinary Studies | en_US
dc.contributor | Department of Computing | en_US
dc.creator | Lam, Chi-chung | -
dc.identifier.uri | https://theses.lib.polyu.edu.hk/handle/200/671 | -
dc.language | English | en_US
dc.publisher | Hong Kong Polytechnic University | -
dc.rights | All rights reserved | en_US
dc.title | Financial time series forecasting by neural network using conjugate gradient method | en_US
dcterms.abstract | A three-layer Multilayer Perceptron (MLP) neural network is used for short-term stock price time series forecasting. The prediction is made from a Technical Analysis point of view: a number of technical indicators over an n-day period form the network input, and the network output is the change in stock price on day (n+1). To overcome the problem of determining the network parameters (learning rate, momentum rate) and the slow convergence of the basic Backpropagation learning algorithm, the Conjugate Gradient method with a restart procedure is used for network training. Random weight initialization with a uniform distribution is a popular technique for network initialization; different schemes have been proposed, but they are experimental and problem-dependent. In this project, the possibility of weight initialization using Multiple Linear Regression is discussed. So that the nonlinear network can be fitted to the linear regression model during the initial weight calculation, a network linearization method is adopted. Daily trading data of listed companies from the Shanghai Stock Exchange is used for network learning and validation, comparing the Conjugate Gradient method with regression weight initialization against the Backpropagation method with a momentum term and random weight initialization. The performance of these methods is compared and evaluated. The results indicate that the network can model the stock price time series satisfactorily. The proposed Conjugate Gradient method with linear regression weight initialization requires significantly fewer iterations than Backpropagation. In addition, the sensitivity of both Conjugate Gradient and Backpropagation to regression initialization has been investigated. Conjugate Gradient learning works better with regression initialization than with uniform random initialization, requiring fewer iterations; however, a similar conclusion cannot be drawn for Backpropagation learning. | en_US
dcterms.extent | [v], 67 leaves : ill. ; 30 cm | en_US
dcterms.isPartOf | PolyU Electronic Theses | en_US
dcterms.issued | 1998 | en_US
dcterms.educationalLevel | All Master | en_US
dcterms.educationalLevel | M.Sc. | en_US
dcterms.LCSH | Stock price forecasting | en_US
dcterms.LCSH | Neural networks (Computer science) | en_US
dcterms.LCSH | Conjugate gradient methods -- Data processing | en_US
dcterms.LCSH | Hong Kong Polytechnic University -- Dissertations | en_US
dcterms.accessRights | restricted access | en_US
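
The training scheme summarized in the abstract above is described only at a high level, so the following Python sketch illustrates one plausible reading of it: a three-layer MLP whose weights are initialized with the help of multiple linear regression and then trained by a nonlinear conjugate gradient method with a restart procedure. The thesis's exact linearization method, dataset, and layer sizes are not given in this record; everything below (synthetic data, tanh hidden units, a Fletcher-Reeves variant with an Armijo line search) is an illustrative assumption, not the author's actual implementation.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: X holds hypothetical n-day technical indicators,
# y the next-day price change.  The thesis's Shanghai Stock Exchange data is
# not reproduced here.
n_samples, n_inputs, n_hidden = 200, 6, 5
X = rng.normal(size=(n_samples, n_inputs))
y = np.tanh(X @ rng.normal(size=n_inputs)) + 0.1 * rng.normal(size=n_samples)

def unpack(w):
    """Split the flat parameter vector into (W1, b1, W2, b2)."""
    i = 0
    W1 = w[i:i + n_inputs * n_hidden].reshape(n_inputs, n_hidden); i += W1.size
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden]; i += n_hidden
    b2 = w[i]
    return W1, b1, W2, b2

def loss_and_grad(w, X, y):
    """Half mean squared error and its gradient via backpropagation."""
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                 # hidden layer activations
    out = H @ W2 + b2                        # linear output (price change)
    err = out - y
    loss = 0.5 * np.mean(err ** 2)
    d_out = err / len(y)
    gW2 = H.T @ d_out
    gb2 = d_out.sum()
    dH = np.outer(d_out, W2) * (1.0 - H ** 2)
    gW1 = X.T @ dH
    gb1 = dH.sum(axis=0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2, [gb2]])

def regression_init():
    """Assumed regression initialization: small random input-layer weights keep
    tanh near its linear region, then the output-layer weights are obtained by
    multiple linear regression (least squares) on the hidden activations."""
    W1 = rng.normal(scale=0.1, size=(n_inputs, n_hidden))
    b1 = np.zeros(n_hidden)
    H = np.tanh(X @ W1 + b1)
    A = np.column_stack([H, np.ones(n_samples)])      # design matrix with bias
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    W2, b2 = coef[:-1], coef[-1]
    return np.concatenate([W1.ravel(), b1, W2, [b2]])

def cg_train(w, X, y, max_iter=500, tol=1e-6):
    """Fletcher-Reeves nonlinear conjugate gradient with a periodic restart
    (every len(w) iterations) and a backtracking Armijo line search."""
    loss, g = loss_and_grad(w, X, y)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        step = 1.0
        while True:                                    # backtracking line search
            new_loss, new_g = loss_and_grad(w + step * d, X, y)
            if new_loss <= loss + 1e-4 * step * (g @ d) or step < 1e-12:
                break
            step *= 0.5
        w = w + step * d
        beta = (new_g @ new_g) / (g @ g)               # Fletcher-Reeves beta
        d = -new_g + beta * d
        if (k + 1) % len(w) == 0 or new_g @ d >= 0:    # restart procedure
            d = -new_g
        loss, g = new_loss, new_g
    return w, loss

w0 = regression_init()
w_final, final_loss = cg_train(w0, X, y)
print(f"loss after regression init: {loss_and_grad(w0, X, y)[0]:.4f}, "
      f"after CG training: {final_loss:.4f}")

Fletcher-Reeves with a periodic restart is only one standard member of the nonlinear conjugate gradient family; the thesis may use a different beta formula or restart criterion.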

Files in This Item:
File | Description | Size | Format
b14258808.pdf | For All Users (off-campus access for PolyU Staff & Students only) | 2.49 MB | Adobe PDF


Copyright Undertaking

As a bona fide Library user, I declare that:

  1. I will abide by the rules and legal ordinances governing copyright regarding the use of the Database.
  2. I will use the Database for the purpose of my research or private study only and not for circulation or further reproduction or any other purpose.
  3. I agree to indemnify and hold the University harmless from and against any loss, damage, cost, liability or expenses arising from copyright infringement or unauthorized usage.

By downloading any item(s) listed above, you acknowledge that you have read and understood the copyright undertaking as stated above, and agree to be bound by all of its terms.


Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/671