
Graduate Student: Hung-ling Lu
Thesis Title: Object Tracking under a Moving Camera: Using a Particle Filter Embedded Three-Step Search Tracking Algorithm
Advisor: Nai-jian Wang
Committee Members: Chang-huan Liu, Shun-ping Chung, Chia-yu Yao, Li-der Yao
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Electrical Engineering
Year of Publication: 2008
Graduation Academic Year: 96 (ROC calendar)
Language: Chinese
Pages: 65
Keywords: Object tracking, Particle Filter, Three-Step Search

Moving-object tracking is at the core of today's human-machine vision systems for entertainment and of security surveillance systems, and with the growth of computing power in recent years it has received increasing attention. Among model-based tracking algorithms, the Particle Filter and the Mean Shift algorithm are easy to implement, so many object-tracking applications adopt these two methods. However, the Mean Shift algorithm accumulates error noticeably, while the Particle Filter, depending on the number of samples, tends to make the tracked object jitter.

This thesis therefore targets the Particle Filter's jitter problem and proposes a Particle Filter embedded Three-Step Search tracking algorithm, further weighting each color in the object to make the whole tracker more robust. We build the target object model from color features, excluding colors whose center of gravity shifts too much. Color weights are determined from the color distribution of the object template and its neighboring pixels together with the displacement of each color's center of gravity, and the importance of each color to the target object is taken into account when computing similarity. After the Particle Filter locates the object, a Three-Step Search resolves the jitter problem.

Experiments show that our method indeed reduces the Particle Filter's jitter and tracks objects accurately with only a small number of samples, and the color-weight concept further raises the tracking success rate against cluttered backgrounds. Using 30 samples, our Particle Filter combined with the Three-Step Search still tracked the moving object 100% of the time over 100 trials, whereas the plain Particle Filter succeeded only 73% of the time under the same conditions. In terms of computation time, the 30-sample Particle Filter with Three-Step Search is 29.621 times faster than the Full Search algorithm and the plain Particle Filter is 31.416 times faster, so the two differ little in running time.
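The similarity measure underlying both the particle weighting and the template comparison is the Bhattacharyya coefficient between normalized color histograms. A minimal sketch (function and variable names are ours, not from the thesis):

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Similarity between two normalized color histograms p and q.
    Returns a value in [0, 1]; 1 means identical distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

# Identical histograms give a coefficient of exactly 1.0.
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.5, 0.3, 0.2])
print(bhattacharyya_coefficient(p, p))  # -> 1.0
print(bhattacharyya_coefficient(p, q))  # < 1.0, since the bins differ
```

The thesis weights this comparison per color bin by the learned color weights; the plain form above is the unweighted baseline.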


Object tracking systems are currently at the core of human-machine vision applications for entertainment as well as surveillance systems. Object tracking is an important topic but demands considerable computing resources; with the speed-up of computing hardware in the past few years, it has come into wide use. Many object tracking systems adopt the Particle Filter and Mean Shift since they are easy to implement. However, Mean Shift suffers from conspicuous cumulative errors, and the Particle Filter has a serious drifting problem when the number of samples is small. Accordingly, we propose a new algorithm, the Particle Filter Embedded Three-Step Search (PFETSS), to solve the drifting problem. By also considering color weights, our algorithm obtains a robust result. Color features are used to generate the target object model, but colors with a large center-of-gravity movement are excluded from the model. Our algorithm derives the color weights from the object template, the color distribution of its adjacent pixels, and the center of gravity of each color. When calculating similarity, we take the importance of each color to the target object into consideration. After the Particle Filter obtains an approximate object position, a Three-Step Search is applied to remove the drift. The experimental results show that the proposed algorithm not only solves the drifting problem of the Particle Filter but also requires fewer samples, and considering color weights greatly improves the probability of successful tracking against a cluttered background. We use 30 samples in our PFETSS; over 100 experiments, our method tracked the object successfully 100% of the time, while the Particle Filter alone, under the same conditions, succeeded 73% of the time. As for computation time, the 30-sample PFETSS is 29.621 times faster than the Full Search algorithm, and the Particle Filter is 31.416 times faster, so the computation times of PFETSS and the plain Particle Filter do not differ significantly.
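The Three-Step Search refinement described above probes a 3x3 grid of offsets around the current best position, halving the step size each round (4, 2, 1). A minimal sketch, assuming `score` is the similarity being maximized (names are illustrative, not taken from the thesis):

```python
def three_step_search(score, center, step=4):
    """Three-Step Search: refine a position by probing a 3x3 grid of
    offsets around the current best, halving the step each round.
    score(pos) returns a similarity to maximize; center is (x, y)."""
    best = tuple(center)
    while step >= 1:
        candidates = [(best[0] + dx * step, best[1] + dy * step)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        best = max(candidates, key=score)
        step //= 2
    return best

# Toy similarity peaked at (10, 7); starting from (5, 5), TSS climbs to it.
score = lambda p: -((p[0] - 10) ** 2 + (p[1] - 7) ** 2)
print(three_step_search(score, (5, 5)))  # -> (10, 7)
```

Because each round checks at most nine positions, the refinement adds only a constant, small cost on top of the Particle Filter, which is consistent with the near-identical timing figures reported above.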

Table of Contents
Chinese Abstract
Abstract
Acknowledgments
List of Figures and Tables
Chapter 1  Introduction
  1.1 Research Motivation
  1.2 Research Background and Methods
  1.3 Thesis Organization
Chapter 2  Object Tracking System Architecture, Model Construction, and Measurement
  2.1 Moving Object Tracking System Architecture
  2.2 Moving Object Description
    2.2.1 Kernel Function
    2.2.2 Color Distribution Density Function of the Target Object Image
    2.2.3 Color Distribution Density Function of the Moving Object Candidate Image
    2.2.4 Similarity Measurement: the Bhattacharyya Coefficient
Chapter 3  Tracking of Moving Objects
  3.1 Mean Shift Algorithm
  3.2 Particle Filter
  3.3 Particle Filter Embedded Three-Step Search Tracking Algorithm
    3.3.1 Moving Object Model Construction
    3.3.2 Similarity Calculation
    3.3.3 Three-Step Search Algorithm
    3.3.4 PFETSS Tracking Flow and Steps
Chapter 4  Experimental Results and Performance Analysis
  4.1 Experimental Results
  4.2 Experimental Statistics
Chapter 5  Conclusion and Future Work
  5.1 Conclusion
  5.2 Future Work
References
About the Author
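The bootstrap particle filter loop covered in Chapter 3 follows a predict, weight, resample cycle. A minimal sketch, not the thesis implementation: the thesis's color-weighted histogram likelihood is replaced here by a toy Gaussian peak, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, likelihood, motion_std=2.0):
    """One predict-weight-resample cycle of a bootstrap particle filter.
    particles: (N, 2) candidate positions; likelihood(p) scores position p.
    Returns the resampled particle set and the weighted state estimate."""
    n = len(particles)
    # Predict: diffuse each particle with Gaussian motion noise.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Weight: evaluate the observation likelihood (in the thesis, a
    # color-weighted histogram similarity; here, a stand-in).
    weights = np.array([likelihood(p) for p in particles])
    weights /= weights.sum()
    # Estimate: weighted mean of the particles.
    estimate = weights @ particles
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], estimate

# Toy likelihood peaked at (10, 10); 30 particles (the thesis's sample
# count) converge on the peak after a few iterations.
likelihood = lambda p: np.exp(-np.sum((p - np.array([10.0, 10.0])) ** 2) / 50.0)
particles = rng.normal(0.0, 5.0, (30, 2))
for _ in range(20):
    particles, estimate = particle_filter_step(particles, likelihood)
print(estimate)
```

In the PFETSS pipeline, the `estimate` produced by this cycle is only approximate; the Three-Step Search then refines it locally to suppress the jitter that a 30-sample filter alone exhibits.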

Full text available from 2013/07/07 (campus network only)
Full text not authorized for public release (off-campus network)
Full text not authorized for public release (National Central Library: Taiwan NDLTD system)