
Author: Chih-Wei Huang (黃致瑋)
Thesis Title: A real-time gesture tracking and recognition system based on particle filtering and AdaBoosting techniques (一個基於粒子濾除與適應性提昇效能技術的即時手勢追蹤與辨識系統)
Advisor: Chin-Shyurng Fahn (范欽雄)
Committee Members: Hong-Yuan Liao (廖弘源), Kuo-Chin Fan (范國清), Chung-Lin Huang (黃仲陵), Chi-Yu Lin (林其禹)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering (電資學院 - 資訊工程系)
Publication Year: 2006
Graduation Academic Year: 95
Language: English
Number of Pages: 73
Chinese Keywords: hand tracking (手部追蹤), gesture recognition (手勢辨識), human-computer interface (人機介面), particle filter (粒子濾除器), wavelet transform (小波轉換)
Foreign Keywords: AdaBoost, hand tracking, gesture recognition, human computer interface, wavelet transform
    In recent years, research on hand gesture tracking and recognition has attracted growing attention in the field of computer vision. Gesture analysis is an indispensable technique in human-computer interfaces, and hand tracking plays a vital role within gesture analysis. In this thesis, we present a real-time gesture tracking and recognition system based on particle filtering and AdaBoosting techniques. The particle filter, a flexible method capable of handling non-linear tracking problems, is used for hand tracking. To avoid interference from other exposed parts of the body or skin-colored objects in the background, our system employs motion information as an additional hand feature besides skin color. Compared with conventional particle filters, our method achieves effective sampling with only a small number of particles, which not only lowers the computational cost but also leaves ample time for the subsequent gesture recognition stage. The recognition stage adopts the AdaBoost algorithm, whose fast convergence during training makes it easy to update information or add new gestures to the database. Experimental results show that our hand tracking system is fast, accurate, and highly robust; moreover, the system maintains a good gesture recognition rate under complicated environmental conditions.
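    As a concrete illustration of the tracking stage described above, the following is a minimal sketch, assuming Python with NumPy, of a particle filter whose particle weights combine skin-color and motion evidence. The Gaussian likelihood maps, particle count, noise model, and random-walk dynamics are illustrative placeholders, not the thesis implementation.

```python
# Minimal particle-filter sketch: particles over (x, y) hand positions are
# weighted by the product of a skin-color likelihood map and a motion
# likelihood map, then resampled. Both maps are synthetic stand-ins here.
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, skin_map, motion_map, noise_std=5.0):
    """One predict-weight-resample cycle over (x, y) particles."""
    h, w = skin_map.shape
    # Predict: random-walk dynamics model.
    particles = particles + rng.normal(0.0, noise_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, w - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, h - 1)
    # Weight: combine skin-color and motion cues (product of likelihoods).
    xs = particles[:, 0].astype(int)
    ys = particles[:, 1].astype(int)
    weights = skin_map[ys, xs] * motion_map[ys, xs] + 1e-12
    weights /= weights.sum()
    # Resample: systematic resampling keeps the particle count small.
    positions = (np.arange(len(weights)) + rng.random()) / len(weights)
    particles = particles[np.searchsorted(np.cumsum(weights), positions)]
    estimate = particles.mean(axis=0)  # posterior mean as the hand location
    return particles, estimate

# Toy usage: a 240x320 frame whose skin/motion evidence peaks at (200, 120).
h, w = 240, 320
yy, xx = np.mgrid[0:h, 0:w]
skin_map = np.exp(-((xx - 200) ** 2 + (yy - 120) ** 2) / (2 * 20.0 ** 2))
motion_map = skin_map.copy()  # pretend the moving region coincides with the skin blob
particles = rng.uniform([0, 0], [w, h], size=(100, 2))
for _ in range(20):
    particles, estimate = particle_filter_step(particles, skin_map, motion_map)
print("estimated hand position:", estimate)
```

    Because the combined skin-and-motion likelihood concentrates the weights around the hand region, a small particle set already suffices in this toy setting, which is in line with the abstract's claim that fewer particles are needed than in a color-only filter.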


    In recent years, researchers in the field of computer vision have devoted considerable effort to the tracking and recognition of hand gestures. Gesture analysis is a crucial technology for human-computer interfaces, and hand tracking plays a significant role in gesture analysis. A real-time gesture tracking and recognition system based on particle filtering and AdaBoosting techniques is presented in this thesis. The particle filter, a flexible simulation-based method well suited to non-linear tracking problems, is adopted to achieve robust hand tracking. To avoid the influence of other exposed skin regions of the body and skin-colored objects in the background, our system further uses motion information, in addition to skin color, as the features of a hand. Compared with conventional particle filters, our method achieves more efficient sampling and requires fewer particles, which lowers the computational cost and leaves ample time for the subsequent gesture recognition. The gesture recognition stage uses features derived from the wavelet transform and employs an AdaBoost algorithm, whose fast convergence during training makes it easy to update information and expand the gesture archive. The experimental results reveal that our system is fast, accurate, and robust in hand tracking; moreover, it performs well in gesture recognition under complicated environments.
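    In the same spirit, a rough sketch of the recognition stage, assuming PyWavelets and scikit-learn are available: a single-level Haar wavelet transform of a hand image supplies the feature vector, and an AdaBoost ensemble of decision stumps is trained on such features. The image size, wavelet choice, and the synthetic two-class "gesture" data below are illustrative assumptions, not the thesis configuration.

```python
# Sketch of wavelet-feature extraction followed by AdaBoost classification.
import numpy as np
import pywt
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)

def wavelet_features(image):
    """Single-level 2D Haar DWT; the approximation subband is the feature vector."""
    cA, (cH, cV, cD) = pywt.dwt2(image, "haar")
    return cA.ravel()

def make_sample(label):
    """Synthetic 'gesture': class 0 has a bright left half, class 1 a bright right half."""
    img = rng.random((32, 32)) * 0.2
    if label == 0:
        img[:, :16] += 1.0
    else:
        img[:, 16:] += 1.0
    return wavelet_features(img)

X = np.array([make_sample(k % 2) for k in range(200)])
y = np.array([k % 2 for k in range(200)])

# AdaBoost with its default decision-stump weak learner; fast to train,
# so the gesture set can be extended and retrained with little cost.
clf = AdaBoostClassifier(n_estimators=50)
clf.fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```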

    Abstract
    Chinese Abstract (中文摘要)
    Contents
    List of Figures
    List of Tables
    Chapter 1 Introduction
      1.1 Overview
      1.2 Background
      1.3 Motivation
      1.4 Thesis organization
    Chapter 2 Related Works
      2.1 Reviews of hand detection and tracking
      2.2 Reviews of gesture recognition
    Chapter 3 Our Proposed Method
      3.1 Moving target detection and blob extraction
        3.1.1 Morphological image processing
        3.1.2 The motion analysis method
      3.2 Skin color segmentation
      3.3 Connected components
      3.4 Locating hand regions
    Chapter 4 The Hand Tracking Procedure
      4.1 The tracking scheme
      4.2 The Kalman filter
      4.3 The particle filter
      4.4 Our hand tracking method
    Chapter 5 The Gesture Recognition Procedure
      5.1 Feature extraction
        5.1.1 Preliminary processing
        5.1.2 The discrete wavelet transform
        5.1.3 Wavelet features extraction
      5.2 Training of gestures
        5.2.1 AdaBoost
        5.2.2 The weak classifier
      5.3 Gesture recognition strategies
    Chapter 6 Experimental Results and Discussions
      6.1 Performance evaluation of gesture tracking
      6.2 Gesture tracking results and discussions
      6.3 Performance evaluation of gesture recognition
      6.4 Gesture recognition results and discussions
    Chapter 7 Conclusions and Future Works
    References

