
Graduate Student: Xanno Kharis Sigalingging
Thesis Title: A Human-Computer Interface Classification Algorithm using EMG-Based Facial Gesture Recognition
Advisors: Jiann-Liang Chen (陳俊良), Jenq-Shiou Leu (呂政修)
Committee Members: Chang Hong Lin (林昌鴻), Tzu-Chieh Tsai (蔡子傑)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Publication Year: 2017
Graduation Academic Year: 105
Language: English
Number of Pages: 34
Chinese Keywords: human interface device (HID), electromyograph (EMG), hidden Markov model (HMM), assistive technology (AT)
Foreign Keywords: human interface device (HID), electromyograph (EMG), hidden Markov model (HMM), assistive technology (AT)
Access Count: Views: 222, Downloads: 18
  • An assistive technology (AT) in the form of a novel human interface device (HID) using an electromyography (EMG)-based gesture recognition scheme is proposed in this thesis. The system uses the MYO device from Thalmic Labs, together with custom software, as the signal capturing tool. The EMG electrodes are placed at positions on the face corresponding to the locations of the major muscles that govern certain facial gestures. The signals are then processed with a hidden Markov model (HMM) algorithm to classify the gesture, and a custom command can be assigned to each recognized gesture, so a wide range of commands is possible. The system achieves 94.4% accuracy when classifying 5 gestures.
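
  The abstract gives only a high-level description of the classification step. The sketch below illustrates the general approach under stated assumptions (the hmmlearn library, Gaussian emission HMMs with four hidden states, and EMG recordings already reduced to sequences of windowed feature vectors such as per-channel RMS values; none of these details come from the record itself): one HMM is trained per gesture with Baum-Welch, and a new sequence is assigned to the gesture whose model yields the highest log-likelihood.

      # Minimal sketch of a per-gesture HMM classifier (illustrative, not the thesis code).
      # Assumptions: hmmlearn's GaussianHMM (trained with Baum-Welch/EM), and EMG data
      # already segmented into feature sequences of shape (T_i, n_features).
      import numpy as np
      from hmmlearn import hmm

      def train_gesture_models(train_data, n_states=4):
          """train_data: dict mapping gesture label -> list of (T_i, n_features) arrays."""
          models = {}
          for label, sequences in train_data.items():
              X = np.concatenate(sequences)              # stack all observations for this gesture
              lengths = [len(seq) for seq in sequences]  # per-sequence lengths for hmmlearn
              model = hmm.GaussianHMM(n_components=n_states,
                                      covariance_type="diag",
                                      n_iter=100)        # Baum-Welch (EM) iterations
              model.fit(X, lengths)
              models[label] = model
          return models

      def classify(models, sequence):
          """Return the gesture whose HMM assigns the highest log-likelihood to the sequence."""
          return max(models, key=lambda label: models[label].score(sequence))

  A label returned by such a classifier can then be mapped to an arbitrary user-defined command, as the abstract describes.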



    Abstract i
    Acknowledgement ii
    Contents iii
    List of Figures v
    List of Tables vi
    Chapter 1 Introduction 1
    Chapter 2 Related Works 4
        2.1 Assistive Technology 4
            2.1.1 Physical Motion-Based Assistive Technology 5
            2.1.2 Voice Recognition-Based Assistive Technology 5
            2.1.3 Motion Tracking-Based Assistive Technology 5
            2.1.4 Electromagnetic Sensor-Based Assistive Technology 5
            2.1.5 Physiological Signal-Based Assistive Technology 6
        2.2 Gesture Recognition 6
    Chapter 3 Electromyography 8
        3.1 Muscle and Action Potential 8
        3.2 Electromyograph Device 8
            3.2.1 Analog Circuit 9
            3.2.2 Digital Circuit 9
        3.3 Electrodes 10
        3.4 Facial Action Coding System 12
    Chapter 4 Hidden Markov Model 14
        4.1 Definition 14
        4.2 Baum-Welch Algorithm 17
    Chapter 5 Proposed Method 20
        5.1 Signal Acquisition 20
        5.2 Signal Preprocessing 21
        5.3 Hidden Markov Model Classification 22
    Chapter 6 Evaluation 24
        6.1 Experiment Setup 24
        6.2 Experiment Results 25
    Chapter 7 Conclusion 28
    Bibliography 29

