
Student: Dimas Herjuno
Thesis Title: Intuitive Hand Gesture Design for Indoor Unmanned Aerial Vehicle Control
Advisor: Chang-Hong Lin (林昌鴻)
Committee Members: Jung-Chun Kao, Chung-An Shen, Shanq-Jang Ruan
Degree: Master
Department: Department of Electronic and Computer Engineering, College of Electrical Engineering and Computer Science
Publication Year: 2017
Graduation Academic Year: 105
Language: English
Number of Pages: 89
Keywords: naive Bayes classifier, k-nearest neighbors, teleoperation, dynamic time warping


Unmanned Aerial Vehicles (UAVs) play an increasingly important role in teleoperation.
Many tasks across various domains can be carried out by UAVs, from intelligence gathering to the exploration of remote areas.
UAVs are attractive because no on-board pilot is needed, which reduces the risk to pilots' lives and lowers operating costs.
In this thesis, we propose a hand gesture interaction system for directly manipulating an Unmanned Aerial Vehicle using a commercial hand gesture sensor, the Leap Motion controller.
We design two types of gesture recognition systems: the first for static hand gestures and the second for dynamic hand gestures.
Static hand gestures are divided into three styles: first-person-view gestures, direct-manipulation gestures, and American Sign Language gestures.
For these schemes, we collected eight static sign databases and classified them using a naive Bayes classifier, k-nearest neighbors, and a support vector machine.
For dynamic hand gestures, we utilized six dynamic sign trajectory databases.
The hand movement trajectories are classified using dynamic time warping.
The experimental results evaluate the recognition rate and performance of the proposed methods.
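As a loose illustration of the two recognition pipelines summarized above, the Python sketch below shows a k-nearest-neighbors vote over static gesture feature vectors and dynamic-time-warping matching of a hand trajectory against labelled templates. This is a minimal sketch under assumed data shapes, not the thesis's actual implementation; every function and variable name here is invented for illustration.

import numpy as np

def knn_classify(query, train_feats, train_labels, k=3):
    # Majority vote among the k training feature vectors closest to the
    # query (e.g. fingertip distances/angles extracted from one frame).
    # train_feats is assumed to be an (N, d) NumPy array.
    order = np.argsort(np.linalg.norm(train_feats - query, axis=1))
    votes = [train_labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

def dtw_distance(seq_a, seq_b):
    # Dynamic time warping cost between two (n, 3) palm-position
    # trajectories sampled over time; lengths may differ.
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # stretch seq_a
                                 cost[i, j - 1],      # stretch seq_b
                                 cost[i - 1, j - 1])  # advance both
    return cost[n, m]

def classify_trajectory(query, templates):
    # templates: list of (label, trajectory) pairs; return the label of
    # the template with the smallest DTW distance to the query.
    label, _ = min(templates, key=lambda t: dtw_distance(query, t[1]))
    return label

A query gesture would then be labelled with, for instance, classify_trajectory(recorded_path, [("circle", circle_template), ("swipe", swipe_template)]), where the templates are prerecorded trajectories.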

Recommendation Letter
Approval Letter
Abstract in English
Contents
List of Figures
List of Tables
1 Introduction
1.1 Background and Motivation
1.2 Goal
1.3 Organization
2 Literature Review
2.1 Related Works in Static Gesture Recognition
2.2 Related Works in Dynamic Gesture Recognition
2.3 Leap Motion Controller
2.3.1 Leap Motion System Architecture
2.3.2 Leap Motion API
2.4 Quadcopter
2.4.1 Basic Quadcopter Mechanics
2.4.2 The Parrot ARDrone 2
3 Proposed Methods
3.1 Preprocessing
3.1.1 Filtering
3.1.2 Features Extraction
3.2 Static Gesture Recognition
3.2.1 Classification of Static Gesture
3.2.2 Static Gesture Dataset
3.3 Dynamic Gesture Recognition
3.3.1 Classification of Dynamic Gesture
3.3.2 Dynamic Gesture Dataset
4 Experimental Results
4.1 Developing Platform
4.2 Static Gesture Recognition Experiment
4.2.1 First Person View Gesture
4.2.2 Direct Manipulation Gesture
4.2.3 American Sign Language Gesture
4.2.4 Static Gesture Recognition Summary
4.3 Dynamic Gesture Recognition Experiment
4.3.1 Dynamic Gesture 1
4.3.2 Dynamic Gesture 2
4.3.3 Dynamic Gesture 3
4.3.4 Dynamic Gesture 4
4.3.5 Dynamic Gesture 5
4.3.6 Dynamic Gesture 6
4.3.7 Dynamic Gesture Recognition Summary
5 Conclusions
5.1 Future Work
References
Letter of Authority

Full-Text Release Date: 2022/01/26 (campus network)
Full-Text Release Date: not authorized for public release (off-campus network)
Full-Text Release Date: not authorized for public release (National Central Library: Taiwan NDLTD system)