Author: Lam, Chi-chung
Title: Financial time series forecasting by neural network using conjugate gradient method
Degree: M.Sc.
Year: 1998
Subject: Stock price forecasting
Neural networks (Computer science)
Conjugate gradient methods -- Data processing
Hong Kong Polytechnic University -- Dissertations
Department: Multi-disciplinary Studies
Department of Computing
Pages: [v], 67 leaves : ill. ; 30 cm
Language: English
Abstract: A three-layer Multilayer Perceptron (MLP) neural network is used for short-term time series forecasting of stock prices. The prediction is approached from a Technical Analysis point of view: a number of technical indicators over an n-day period are constructed as the network input, and the network output is the change in stock price on the (n+1)-th day. To overcome the difficulty of determining the network parameters (learning rate and momentum rate) and the slow convergence of the basic Backpropagation learning algorithm, the Conjugate Gradient method with a restart procedure is used for network training. Random weight initialization with a uniform distribution is a popular technique for network initialization; various schemes have been proposed, but they are experimental and problem-dependent. In this project, the possibility of weight initialization using Multiple Linear Regression is discussed. So that the nonlinear network can be fitted to the linear regression model during the initial weight calculation, a network linearization method is adopted. Daily trading data of companies listed on the Shanghai Stock Exchange is used for network learning and validation, comparing the Conjugate Gradient method with regression weight initialization against the Backpropagation method with a momentum term and uniform random weight initialization. The performance of these methods is compared and evaluated. The results indicate that the network can model the stock price time series satisfactorily. The proposed Conjugate Gradient method with linear regression weight initialization requires significantly fewer iterations than Backpropagation. In addition, the sensitivity of both Conjugate Gradient and Backpropagation learning to the regression initialization has been investigated. Conjugate Gradient learning works better with regression initialization than with uniform random initialization, requiring fewer iterations; however, a similar conclusion cannot be drawn for Backpropagation learning.
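
Note: the abstract describes weight initialization by Multiple Linear Regression after a network linearization step, but the record does not give the exact formulation. The following is a minimal sketch of one such regression-based initialization, assuming a three-layer sigmoid MLP where the hidden layer is given small random weights (so its sigmoids stay near their linear region) and the output weights are obtained by ordinary least squares on targets passed through the inverse sigmoid; all names and parameter choices here are illustrative, not the thesis's method.

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def regression_init(X, y, n_hidden, eps=1e-3, seed=0):
    """Regression-based weight initialization for a 3-layer sigmoid MLP (sketch).

    Assumed linearization: targets are rescaled into (eps, 1-eps) and mapped
    through the inverse sigmoid (logit), so the output layer becomes linear
    in the hidden activations and multiple linear regression applies.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_in = X.shape

    # Small random input-to-hidden weights keep the sigmoids near linearity.
    W1 = rng.uniform(-0.1, 0.1, size=(n_in + 1, n_hidden))
    X_aug = np.hstack([X, np.ones((n_samples, 1))])       # bias column
    H = sigmoid(X_aug @ W1)                                # hidden activations

    # Linearize the output unit: rescale y into (eps, 1-eps), apply logit.
    y = y.reshape(-1, 1).astype(float)
    y_scaled = eps + (1 - 2 * eps) * (y - y.min()) / (y.max() - y.min())
    z = np.log(y_scaled / (1 - y_scaled))                  # inverse sigmoid

    # Multiple linear regression: least-squares fit of z on the hidden layer.
    H_aug = np.hstack([H, np.ones((n_samples, 1))])
    W2, *_ = np.linalg.lstsq(H_aug, z, rcond=None)
    return W1, W2

# Example with synthetic data: 10 technical indicators over 500 trading days,
# target being the next-day price change (purely illustrative values).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 10))
    y = rng.standard_normal(500)
    W1, W2 = regression_init(X, y, n_hidden=8)
    print(W1.shape, W2.shape)                              # (11, 8) (9, 1)

The initialized weights would then be refined by Conjugate Gradient (or Backpropagation) training; that training loop is omitted here.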
Rights: All rights reserved
Access: restricted access

Files in This Item:
b14258808.pdf (2.49 MB, Adobe PDF): For All Users (off-campus access for PolyU Staff & Students only)




Please use this identifier to cite or link to this item: https://theses.lib.polyu.edu.hk/handle/200/671