|Author:||Yam, Kin Yi|
|Title:||Video object detection and parameterization|
|Subjects:||Pattern recognition systems.|
|Image processing -- Digital techniques.|
|Hong Kong Polytechnic University -- Dissertations|
|Department:||Department of Electronic and Information Engineering|
|Pages:||94 leaves : ill. (chiefly col.) ; 30 cm.|
|Abstract:||In this thesis, a robust background extraction method and a novel object detection method are proposed, which combine filtering operations to detect non-background objects in a monitoring scene. Conventionally, a statistical background model is extracted from a training sequence containing no foreground objects, and the model parameters are updated continuously to adapt to changes in the scene. However, it is not always possible to keep the monitoring scene static. Furthermore, "static objects" in the scene may be adapted into the background, and problems arise when such objects start to move again: the conventional method detects a "hole" at their former position and therefore produces false alarms. In our proposed algorithm, two background models are constructed using an N-bin histogram method to capture short-term and long-term changes in the monitoring scene. Background subtraction is applied between the current frame and each model to obtain two error frames, which are combined for object detection and classification. Object states are classified as active, static or re-active to serve different kinds of real-time applications. Extensive experimental work has been done, the results of which show that the present approach provides a better solution than the conventional approach, including resolving the problem of re-active objects. In this research, an object-based bi-directional counting system is also proposed, comprising an advanced object detection algorithm and a tracking algorithm to count people flow in the monitoring scene. Conventionally, an overhead camera is used for object counting. However, such a setup is difficult to install in some places, especially in buildings with low ceilings. Another problem is image distortion:
an overhead camera needs a wide-angle lens to capture more of the monitoring scene unless it is mounted at a greater height. Moreover, customers would need to install an extra overhead camera solely for people counting, which increases the cost of the system. Our system can instead act as a plug-in software module for an existing surveillance system, whose cameras are usually installed at an angle of 45° to the monitoring scene. For this monitoring angle, recent researchers usually count the number of objects present in a predefined detection region, which lacks directional flow information. Our investigation resolves these problems and provides an efficient, easy-to-implement approach for real-time monitoring systems. Scene parameters can be calibrated to adapt to different monitoring environments. Applying our advanced object detection system, with its Initial Object Detector (IOD) and Existing Object Detector (EOD) models, objects can be tracked and counted together with directional people flow. Real-time experiments have been conducted, the results of which show that the present approach achieves about 90% accuracy for bi-directional people counting at an angle of 45° to the scene.|
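The dual-model scheme the abstract describes can be illustrated with a minimal sketch. The thesis's actual N, thresholds, and update rules are not given in this record, so the bin count, difference threshold, and the per-pixel modal-bin background estimate below are all assumptions for illustration; the state labels follow the abstract's active/static/re-active taxonomy.

```python
import numpy as np

N_BINS = 16        # assumed histogram resolution (the thesis's N is not stated here)
DIFF_THRESH = 30   # assumed grey-level difference threshold

def histogram_mode_background(frames, n_bins=N_BINS):
    """Estimate a background image as the centre of the most frequent
    N-bin grey-level histogram bin at each pixel, over a frame buffer."""
    stack = np.stack(frames).astype(np.uint8)            # shape (T, H, W)
    bins = (stack.astype(np.uint16) * n_bins) // 256     # bin index per pixel, per frame
    h, w = stack.shape[1:]
    counts = np.zeros((n_bins, h, w), dtype=np.int32)
    for b in range(n_bins):
        counts[b] = (bins == b).sum(axis=0)              # votes for bin b at each pixel
    mode = counts.argmax(axis=0)                         # most frequent bin per pixel
    return ((mode + 0.5) * 256 / n_bins).astype(np.uint8)

def classify(frame, bg_short, bg_long, thresh=DIFF_THRESH):
    """Combine the two error frames into per-pixel state labels:
    0 = background, 1 = active (differs from both models),
    2 = static (already absorbed into the short-term model only),
    3 = re-active "hole" (differs from the short-term model only)."""
    e_short = np.abs(frame.astype(np.int16) - bg_short.astype(np.int16)) > thresh
    e_long = np.abs(frame.astype(np.int16) - bg_long.astype(np.int16)) > thresh
    state = np.zeros(frame.shape, dtype=np.uint8)
    state[e_short & e_long] = 1    # active: foreground in both error frames
    state[~e_short & e_long] = 2   # static: short-term model has adapted to it
    state[e_short & ~e_long] = 3   # re-active: the "hole" left by a departed object
    return state
```

Keeping the long-term model slow to adapt is what distinguishes a re-active "hole" (short-term error only) from a genuinely new object (error in both frames), which is how the false-alarm problem of the conventional single-model approach is avoided.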
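The bi-directional counting stage can likewise be sketched. The IOD/EOD detectors and the calibration procedure are specific to the thesis and not detailed in this record, so the sketch below assumes only their output: per-object centroid trajectories from a tracker, counted against a hypothetical virtual line across the scene.

```python
def count_bidirectional(tracks, line_y):
    """Count objects crossing a horizontal virtual line in each direction.

    `tracks` maps an object id to its sequence of centroid y-coordinates,
    one per frame, as produced by a tracking algorithm.  An object is
    counted at most once, in the direction of its first crossing."""
    inward = outward = 0
    for ys in tracks.values():
        for prev, cur in zip(ys, ys[1:]):
            if prev < line_y <= cur:      # moved downward across the line
                inward += 1
                break
            if prev >= line_y > cur:      # moved upward across the line
                outward += 1
                break
    return inward, outward
```

Counting crossings of a line, rather than objects present in a region, is what recovers the directional flow information that region-based counting lacks.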
|Rights:||All rights reserved|
Files in This Item:
|b26158802.pdf||For All Users||21.13 MB||Adobe PDF||View/Open|