
Author: 楊孟璇 (Meng-Shuang Yang)
Thesis title: 應用膚色過濾技術之即時手勢辨識系統
(A Real-Time Hand Gesture Recognition System Using Skin-Color Filtering)
Advisor: 李永輝 (Yung-Hui Lee)
Committee members: 蔡超人, 王孔政
Degree: Master
Department: Department of Industrial Management, College of Management
Year of publication: 2007
Graduating academic year: 95 (2006-2007)
Language: Chinese
Number of pages: 66
Chinese keywords: 手勢辨識, 特徵擷取, 規則式邏輯
English keywords: hand gesture recognition, feature extraction, rule-based logic

This study aims to establish a set of rule-based logic as the foundation for hand gesture recognition, using a single camera to capture gesture images so that recognition can be performed in real time. The method derives feature values related to skin-color regions and inter-region distances from natural hand gestures; the ten digit gestures were selected as the gesture set. Thirteen subjects participated, divided into a training group (3 subjects) and a test group (10 subjects): the training group was used to build the gesture database, which was then validated with the test group's data. In the experiment, a single camera captured skin-color images of each subject's hand. After the gesture images passed through the image-processing procedure, the planar coordinates of the fingers were obtained and feature values were extracted; the key features for gesture classification were identified by observation. From these, the Finger Gesture Recognition System (FGRS) was developed to complete gesture recognition.
Because the system captures images with only a single camera, gesture images are susceptible to occlusion effects, so image-capture conditions were constrained during the experiments. The results show that the system's average overall recognition rate exceeded 90%, with only gesture six falling short at 84%; individual recognition rates also exceeded 80% for all gestures except six and eight. Analysis divided the misrecognitions into two classes: reasonable errors (52%), in which the discriminating feature values fell into the ambiguous region near the decision thresholds, and unreasonable errors (48%), which were caused mainly by the image-capture environment. The study therefore concludes that adding steps such as labeling each finger individually in the image-processing procedure should lower the misrecognition rate and bring overall recognition performance to its best.
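The skin-color filtering stage the abstract describes can be illustrated with a minimal sketch. The thesis does not publish its color space or threshold values, so the BT.601 YCbCr conversion and the Cb/Cr ranges below are assumptions drawn from common skin-detection practice (compare reference 17 on chrominance spaces), not the thesis's calibrated filter:

```python
import numpy as np

def skin_mask(rgb):
    """Boolean skin mask from fixed Cb/Cr thresholds in YCbCr space.

    The BT.601 conversion is standard; the threshold ranges
    (Cb in [77, 127], Cr in [133, 173]) are illustrative assumptions,
    not the thesis's values.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> chroma (Cb, Cr); luma is ignored for the skin test
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

# A skin-like pixel passes; a saturated green pixel does not.
pixels = np.array([[[200, 120, 90], [0, 255, 0]]], dtype=np.uint8)
print(skin_mask(pixels))  # [[ True False]]
```

Thresholding on chroma alone makes the filter fairly robust to brightness changes, which matters under the uncontrolled lighting the thesis identifies as a source of "unreasonable" errors.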


In this study we establish a rule-based hand gesture recognition system that uses a single camera to capture hand gestures for real-time recognition. Natural gestures, namely the ten number gestures from zero to nine, are used to derive characteristic values from skin-color blocks and the distances between them. Thirteen subjects participated: three formed the training group and the other ten the test group. The hand gesture database was developed from the training group and tested and verified with the ten test subjects. In the experiment, one camera captures the skin-color image of the subject's hand; after the gesture images are processed, the 2D coordinates of the fingers are obtained, the important features for gesture classification are identified by observation, and from these the Finger Gesture Recognition System (FGRS) is developed.
Because only one camera captures the hand gestures, the images are easily affected by occlusion, so image-capture constraints were imposed during the experiments. The results show that the average overall recognition rates are all above 90%, except for gesture six, which reached only 84%; individual recognition rates exceed 80% for all gestures except six and eight. Analysis divides the misrecognitions into two types: reasonable errors (52%), which occur when feature values fall into the ambiguous region near the threshold, and unreasonable errors (48%), which are caused mainly by the image-capture environment. We therefore expect that marking each finger individually during image processing would reduce the misrecognition rate and bring overall recognition performance close to optimal.
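The rule-based classification both abstracts describe can be sketched as an ordered rule table: each rule tests extracted feature values, and the first rule that fires assigns the digit. The feature names (`fingers`, `thumb`, `pinky`) and the rules below are hypothetical stand-ins; the thesis's actual rule table (Table 3.5.1) is not reproduced in the abstract:

```python
def classify_digit(features):
    """Ordered rule table: the first predicate that holds decides the digit.

    `features` is a dict of measurements extracted from the hand image.
    These features and rules are illustrative assumptions only.
    """
    rules = [
        (lambda f: f["fingers"] == 0, 0),                                # closed fist
        (lambda f: f["fingers"] == 2 and f["thumb"] and f["pinky"], 6),  # thumb + pinky
        (lambda f: f["fingers"] == 1 and not f["thumb"], 1),             # index only
        (lambda f: f["fingers"] == 5, 5),                                # open hand
    ]
    for predicate, digit in rules:
        if predicate(features):
            return digit
    return None  # no rule fired: reject as unknown (near-threshold ambiguity)

print(classify_digit({"fingers": 2, "thumb": True, "pinky": True}))  # 6
```

A rejection path like the `None` case is one way to surface the "reasonable errors" the study reports, where a feature value sits in the ambiguous band near a threshold instead of cleanly matching a rule.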

Table of Contents

Abstract
Acknowledgements
Contents
List of Tables
List of Figures
Chapter 1 Introduction
  1.1 Background and Motivation
  1.2 Research Objectives
  1.3 Scope and Method
  1.4 Limitations
  1.5 Research Framework
Chapter 2 Literature Review
  2.1 Gesture Definition
    2.1.1 Basic Components for Defining Gestures
    2.1.2 Determining Static Hand Position
    2.1.3 Defining Dynamic Motion Trajectories
  2.2 Image Processing Techniques
    2.2.1 Skin-Color Filtering
    2.2.2 Image Quality Improvement
    2.2.3 Morphology
    2.2.4 Moving-Object Detection
    2.2.5 Connected-Component Labeling
  2.3 Feature Extraction
  2.4 Gesture Recognition Methods
  2.5 Rule-Based Logic
Chapter 3 Research Method
  3.1 Gesture Recognition System
  3.2 Moving-Edge Detection
  3.3 Skin-Color Filtering
  3.4 Finger Localization
  3.5 Gesture Feature Detection
    3.5.1 Connected-Component Labeling
    3.5.2 Rule-Based Logic
  3.6 Experimental Design and Planning
    3.6.1 Environment and Hardware Specifications
    3.6.2 Experimental Method Design
  3.7 Gesture Data Collection
Chapter 4 System Implementation and Performance Testing
  4.1 System Implementation
  4.2 System Performance Tests
    4.2.1 Recognition Results
    4.2.2 Analysis and Discussion of Results
  4.3 Link to the 3D Gesture Database
Chapter 5 Conclusions and Recommendations
  5.1 Conclusions
  5.2 Recommendations

List of Tables
  Table 3.5.1 Rule table
  Table 4.2.1 Recognition results of the test group
  Table 4.2.2 Cumulative counts of misrecognized gestures

List of Figures
  Figure 1.1.1 Hand skeletal structure and degrees of freedom of each joint
  Figure 1.3.1 FGRS system architecture
  Figure 1.5.1 Research framework
  Figure 2.1.1 The five basic gesture components (source: Ng, 2002)
  Figure 2.1.2 Commands defined by combinations of the five basic components
  Figure 2.1.3 Sensor positions of the 5DT data glove
  Figure 2.1.4 Masks used for smoothing
  Figure 2.1.5 Histogram
  Figure 2.1.6 Effect of the erosion operation
  Figure 2.1.7 Implementation of the erosion operation
  Figure 2.1.8 Effect of the dilation operation
  Figure 2.1.9 Implementation of the dilation operation
  Figure 2.1.10 Connected-component labeling
  Figure 2.1.11 4-connectivity and 8-connectivity
  Figure 2.1.12 Feature-selection flowchart
  Figure 3.1.1 Gesture recognition system architecture
  Figure 3.1.2 Moving-object detection flowchart
  Figure 3.1.3 Skin-color filtering flowchart
  Figure 3.1.4 Finger localization flowchart
  Figure 3.1.5 Feature detection flowchart
  Figure 3.2.1 Moving-edge detection: grayscale conversion and edge extraction
  Figure 3.2.2 Moving-edge detection: image difference after smoothing
  Figure 3.2.3 Moving-edge detection: image after the AND operation
  Figure 3.4.1 Comparison of finger extension and flexion
  Figure 3.4.2 Finger region detection
  Figure 3.5.1 Image blocks
  Figure 3.5.2 Rule-logic structure for gestures
  Figure 3.6.1 Experimental environment
  Figure 3.6.2 Experimental procedure
  Figure 3.6.3 The ten basic sign-language gestures
  Figure 3.7.1 Data planning and recognition application
  Figure 4.1.1 Gesture recognition system results
  Figure 4.1.2 System-interface recognition results for the ten gestures
  Figure 4.2.1 Individual recognition rates
  Figure 4.2.2 Overall recognition rates of the test group
  Figure 4.2.3 Misrecognized gestures under rule 2
  Figure 4.2.4 Misrecognized gestures under rules 5, 7, and 9
  Figure 4.2.5 Unreasonable misrecognitions
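The connected-component labeling step in the outline groups filtered skin pixels into hand and finger blobs before features are measured. A minimal 4-connected flood-fill sketch of that technique follows; this is a generic illustration, not the thesis's implementation:

```python
from collections import deque

def label_components(mask):
    """Label 4-connected components in a binary mask (list of row lists).

    Returns (labels, count): `labels` mirrors `mask` with integer labels
    (0 = background). A BFS flood-fill sketch for illustration only.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and labels[y][x] == 0:
                count += 1              # start a new component here
                labels[y][x] = count
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    # visit the 4-connected neighbours
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count
```

Once each skin blob carries a distinct label, per-blob properties such as area and centroid distance can feed the rule-based classifier.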

1. A. Erol, G. Bebis, M. Nicolescu, R.D. Boyle, and X. Twombly, "A Review on Vision-Based Full DOF Hand Motion Estimation," Proceedings of the IEEE Workshop on Vision for Human-Computer Interaction (V4HCI), in conjunction with the IEEE Conference on Computer Vision and Pattern Recognition, San Diego, CA, June 2005.
2. A. Azarbayejani, C. Wren, and A. Pentland, "Real-time 3-D tracking of the human body," Proceedings of IMAGE'COM 96, Bordeaux, France, 1996.
3. C.R. Wren, A. Azarbayejani, T. Darrell, and A. Pentland, "Pfinder: Real-time Tracking of the Human Body," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 780-785, 1997.
4. C.S. Chua, H.Y. Guan, and Y.K. Ho, "Model-based finger posture estimation," Proceedings of ACCV 2000, 2000.
5. C.S. Fuh and P. Maragos, "Region-Based Optical Flow Estimation," Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, San Diego, CA, pp. 130-133, 1989.
6. D.J. Sturman and D. Zeltzer, "A Survey of Glove-based Input," IEEE Computer Graphics & Applications, 1994.
7. D.P. Huttenlocher, G.A. Klanderman, and W.J. Rucklidge, "Comparing images using the Hausdorff distance," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, no. 9, pp. 850-863, 1993.
8. E. Holden, "Visual Recognition of Hand Motion," PhD thesis, Department of Computer Science, University of Western Australia, 1997.
9. E. Oyama, A. Agah, Karl F. M., T. Maeda, and S. Tachi, "A modular neural network architecture for inverse kinematics model learning," Neurocomputing, 40, 797-805, 2001.
10. E.-J. Holden, R. Owens, and G.G. Roy, "An Adaptive Fuzzy Expert System for 3D Hand Motion Understanding," Computer Science and Software Engineering, 2000.
11. G. Ye, J.J. Corso, and G.D. Hager, "Gesture Recognition Using 3D Appearance and Motion Features," CVPR Workshop on RTV4HCI, 2004.
12. J.-Y. Hervé, "Visual Hand Posture Tracking in a Gripper Guiding Application," International Conference on Robotics & Automation, 2, 1688-1694, 2000.
13. http://homepages.inf.ed.ac.uk/rbf/HIPR2/morops.htm
14. I. Haritaoglu, D. Harwood, and L.S. Davis, "Hydra: Multiple People Detection and Tracking Using Silhouettes," Proceedings of the Second IEEE Workshop on Visual Surveillance, pp. 6-13, June 1999.
15. I. Haritaoglu, D. Harwood, and L. Davis, "W4: Who, When, Where, What: A Real-time System for Detecting and Tracking People," IEEE International Conference on Automatic Face and Gesture Recognition, pp. 222-227, April 1998.
16. Y. Iwai, K. Watanabe, Y. Yagi, and M. Yachida, "Gesture Recognition by Using Colored Gloves," IEEE International Conference on Systems, Man, and Cybernetics, 1:10, 76-81, 1996.
17. J.C. Terrillon and S. Akamatsu, "Comparative Performance of Different Chrominance Spaces for Color Segmentation and Detection of Human Faces in Complex Scene Images," Proceedings of the Vision Interface '99 Conference, Canada, 19-21 May 1999.
18. J. Lee and T.L. Kunii, "Model-based analysis of hand posture," IEEE Computer Graphics and Applications, vol. 15, no. 5, pp. 77-86, September 1995.
19. J.P. Serra, Image Analysis and Mathematical Morphology, Academic Press, pp. 115-130, 1982.
20. J. Yang and A. Waibel, "A Real-Time Face Tracker," Proceedings of the 3rd IEEE Workshop on Applications of Computer Vision, pp. 142-147, 1996.
21. J. Yang and A. Waibel, "Tracking Human Faces in Real-Time," CMU CS Technical Report CMU-CS-95-210, 1995.
22. B.K.P. Horn and B.G. Schunck, "Determining Optical Flow," Artificial Intelligence, vol. 17, pp. 185-203, 1981.
23. L. Di Stefano and A. Bulgarelli, "A Simple and Efficient Connected Components Labeling Algorithm," Proceedings of the 10th International Conference on Image Analysis and Processing, pp. 322-327, 1999.
24. L.G. Shapiro and G.C. Stockman, Computer Vision, Prentice Hall, pp. 65-68, 2001.
25. J. Lee and T.L. Kunii, "Constraint-based hand animation," Models and Techniques in Computer Animation, Tokyo: Springer-Verlag, pp. 110-127, 1993.
26. F. Liu, Y. Zhuang, F. Wu, and Y. Pan, "3D motion retrieval with motion index tree," Computer Vision and Image Understanding, 92, 265-284, 2003.
27. M. Hunke and A. Waibel, "Face Locating and Tracking for Human-Computer Interaction," Conference Record of the Twenty-Eighth Asilomar Conference on Signals, Systems and Computers, Vol. 2, pp. 1277-1281, 1994.
28. M.-C. Su, "A Fuzzy Rule-Based Approach to Spatio-Temporal Hand Gesture Recognition," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 30, no. 2, May 2000.
29. M.S. Lee, "Detecting People in Cluttered Indoor Scenes," Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Vol. COM-31, No. 4, pp. 532-540, 1983.
30. C. Ng and S. Ranganath, "Real-time gesture recognition system and application," Image and Vision Computing, 20, 993-1007, 2002.
31. N. Shimada and Y. Shirai, "3-D hand pose estimation and shape model refinement from a monocular image sequence," Proceedings of VSMM '96, pp. 423-428, 1996.
32. R.G. O'Hagan, A. Zelinsky, and S. Rougeaux, "Visual gesture interfaces for virtual environments," Interacting with Computers, 14, 231-250, 2002.
33. S.W. Park, Y. Seo, and K.S. Hong, "Real-time camera calibration for virtual studio," Real-Time Imaging, 6, 433-448, 2000.
34. R.L. Hsu and A.K. Jain, "Face Detection in Color Images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 5, pp. 696-706, 2002.
35. R. O'Hagan, A. Zelinsky, and S. Rougeaux, "Visual gesture interfaces for virtual environments," Interacting with Computers, 14(3), 231-250, 2002.
36. Y. Sato, Y. Kobayashi, and H. Koike, "Fast tracking of hands and fingertips in infrared images for augmented desk interface," IEEE Automatic Face and Gesture Recognition (FG 2000), pp. 462-467, 2000.
37. J. Segen and S. Kumar, "Look ma, no mouse!," Communications of the ACM, 43(7), 103-109, 2000.
38. S. Khan and M. Shah, "Tracking People in Presence of Occlusion," Proceedings of the Asian Conference on Computer Vision, Taipei, Taiwan, pp. 1132-1137, January 2000.
39. T.L. Hwang and J.J. Clark, "On Local Detection of Moving Edges," Proceedings of IEEE International Conference on Pattern Recognition, Vol. 1, pp. 180-184, 1990.
40. V. Pavlovic, R. Sharma, and T.S. Huang, "Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 677-695, July 1997.
41. X. Wang, "A behavior-based inverse kinematics algorithm to predict arm prehension postures for computer-aided ergonomic evaluation," Journal of Biomechanics, 32, 453-460, 1999.
42. Y. Wu, J.Y. Lin, and T.S. Huang, "Capturing natural hand articulation," Proceedings of the IEEE International Conference on Computer Vision, Vol. 2, pp. 426-432, 2001.
43. Y. Wu and T.S. Huang, "Hand modeling, analysis and recognition," IEEE Signal Processing Magazine, vol. 18, no. 3, pp. 51-60, May 2001.
44. Y. Yasumuro, Q. Chen, and K. Chihara, "3D modeling of human hand with motion constraints," Proceedings of the International Conference on 3-D Digital Imaging and Modeling, pp. 275-282, May 1997.
45. Y. Wang and M. Lu, "Moving Edges Detection in Time-Varying Image Sequences," IEEE International Conference on Circuits and Systems, Vol. 1, pp. 317-320, 1991.
46. S. Kawato and J. Ohya, "Two-step Approach for Real-Time Eye Tracking with a New Filtering Technique," IEEE International Conference on Systems, Man, and Cybernetics, Vol. 2, pp. 1366-1371, 2000.
47. 王天培, "A Study of Eigenhands and Neural Network Classifiers for Hand Gesture Recognition," Master's thesis, National Sun Yat-sen University, 1999.
48. 王國榮, "A Comprehensive Study of Intelligent Gesture Recognition Based on Data Gloves," Master's thesis, National Taiwan University of Science and Technology, 2001.
49. 林佑政, "Design and Implementation of a Vision-Based Fingertip Handwriting Interface," Master's thesis, National Dong Hwa University, 2005.
50. 翁家雯, "Static Gesture Recognition Using Rule-Based Logic on 3D Light-Spot Coordinates," Master's thesis, National Taiwan University of Science and Technology, 2006.
51. 許宏昌, "Application of Principal Component Analysis to Fingertip Light-Spot Gesture Recognition," Master's thesis, Institute of Undersea Technology, National Sun Yat-sen University, 2003.
52. 張良國, "Gesture Recognition Based on the Hausdorff Distance," Journal of Image and Graphics, vol. 7, no. 11, pp. 1144-1150, 2002.
53. 陳延聖, "Dynamic Gesture Recognition Using 3D Hand-Joint Coordinates as Hidden Markov Model Inputs," Master's thesis, National Taiwan University of Science and Technology, 2006.
54. 陳偉政, "An Improved Feature Extraction and Matching Method for Gesture Recognition," Master's thesis, Chung Hua University, 2003.
55. 簡隆至, "A Real-Time Moving Object Detection and Automatic Tracking System," Master's thesis, National Taiwan University of Science and Technology, 2004.
56. 蘇木春 and 張孝德, Machine Learning: Neural Networks, Fuzzy Systems, and Genetic Algorithms, Chuan Hwa Book Co., 2002.
57. 蘇哲煌, "Implementation of a Standalone Remote Video Surveillance Module," Master's thesis, National Taiwan University of Science and Technology, 2005.

Full-text availability: not authorized for public release (off-campus access)
Full-text availability: not authorized for public release (National Central Library: Taiwan NDLTD system)