
Graduate Student: Chun-Chia Huang (黃俊家)
Thesis Title: Development and Application of a Face Robot with Simplified Facial Expression Mechanism (低自由度機器頭顱表情機構之研發與應用)
Advisor: Chyi-Yeu Lin (林其禹)
Committee Members: Han-Pang Huang (黃漢邦), Wen-June Wang (王文俊), Shih-Hsuan Chiu (邱士軒), Chi-Ying Lin (林紀穎)
Degree: Doctoral
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2013
Academic Year of Graduation: 101 (ROC calendar)
Language: Chinese
Pages: 157
Chinese Keywords: 划拳機器人, 機械手臂, 機械頭顱, 臉部表情, 人形機器人
Foreign Keywords: finger gaming robot, robot arm, face robot, facial expression, humanoid
    In Chinese dining culture, drinking has a long and important history, and the drinking games played over wine are among the distinctive customs of Chinese society. To give diners an entertaining dining experience and to offer the restaurant industry a brand-new high-tech entertainment option, this thesis proposes the development of a humanoid finger-guessing robot capable of displaying facial expressions. A robotic head that meets cost requirements, together with a robot arm and palm that meet the speed and durability demands of the finger-guessing game, are the key technical components of a successful finger-guessing robot. For the robotic head, after analyzing and summarizing common approaches in existing robotic heads, this thesis proposes a low-degree-of-freedom expression-generation device that uses fewer actuators to achieve highly expressive capability while retaining the ability to form mouth shapes for speech, offering a more cost- and energy-efficient direction for the development of expressive robotic heads. For the robot arm and palm, after observing and summarizing players' motions, the robot's arm was designed with only one degree of freedom and plays the game with a fixed reciprocating motion that matches typical players' movements; the palm has five degrees of freedom, can form the six hand gestures from 0 to 5, and completes each gesture in less time than a human would need to react and cheat.


    In Chinese dining culture, drinking is a crucial element with a long history, and the drinking games played during meals are among the unique customs of Chinese society. This study proposes developing a finger gaming robot with facial expressions in order to bring the dining industry a revolutionary high-tech entertainment option and to give customers a fun and interesting dining experience. A face robot that meets cost requirements, and a robot arm and palm that meet speed and durability requirements, are all essential components of a successful finger gaming robot. For the face robot, this study developed a low-degree-of-freedom facial expression mechanism that generates a wide variety of facial expressions while retaining basic mouth-shape variation. This new design offers a direction for face robot development that reduces cost and saves energy. For the robot arm and palm, after observing and analyzing the motions of human players, the arm was designed with one degree of freedom and follows the same routine reciprocating movement used by most players in the finger-guessing game. The palm was designed with five degrees of freedom to generate six hand gestures (0–5), and the robot changes gestures faster than human reaction speed.
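    The gesture scheme described above can be illustrated with a minimal sketch. This is not the thesis's implementation: the finger-to-actuator ordering and the 200 ms human reaction-time budget are assumptions introduced here for illustration (simple human reaction time is commonly quoted around 200–250 ms), but it shows how six gestures (0–5) map naturally onto a five-degree-of-freedom palm with one actuator per finger.

```python
# Hypothetical sketch of the six-gesture, five-DOF palm scheme.
# Assumptions (not from the thesis): one actuator per finger, fingers
# extended in this fixed order, and a ~200 ms human reaction-time budget.

FINGERS = ("thumb", "index", "middle", "ring", "little")

def gesture_to_finger_states(count: int) -> dict:
    """Return which fingers are extended for a gesture showing `count` (0-5)."""
    if not 0 <= count <= 5:
        raise ValueError("gesture must be between 0 and 5")
    # Extend the first `count` fingers; the rest stay curled.
    return {finger: (i < count) for i, finger in enumerate(FINGERS)}

def beats_human_reaction(actuation_time_s: float,
                         human_reaction_s: float = 0.2) -> bool:
    """A gesture change is cheat-proof if it completes before a typical
    human simple reaction time (the 0.2 s default is an assumed figure)."""
    return actuation_time_s < human_reaction_s
```

    Under this scheme, a gesture change only ever toggles individual fingers between two states, so the mechanical requirement reduces to making each finger's stroke, and hence `beats_human_reaction`, fast enough that an opponent cannot see the robot's hand forming and adjust their own gesture in time.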

    Table of Contents:
    Chinese Abstract
    English Abstract
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    1. Introduction
       1.1 Research Motivation
       1.2 Research Objectives
       1.3 Literature Review
       1.4 Expected Contributions
       1.5 Thesis Structure
    2. Hardware Design of Lifelike Robotic Heads
       2.1 Distribution of Facial Expression Muscles
       2.2 Facial Action Coding System
       2.3 Facial Deformation Methods and Expression Control-Point Arrangement
       2.4 Fabrication of the Face Skin
       2.5 Evolution of the Lifelike Robotic Heads
           2.5.1 First-Generation Lifelike Robotic Head: Casper
           2.5.2 Second-Generation Lifelike Robotic Head: Janet
           2.5.3 Third-Generation Lifelike Robotic Heads: Thomas & Eve
           2.5.4 Fourth-Generation Lifelike Robotic Head: Judy
    3. Low-Degree-of-Freedom Expression-Generation Mechanism
       3.1 Expression Analysis
           3.1.1 Classification of Expression Motions
           3.1.2 Simplification of Expression Motions
       3.2 Design of a Lifelike Robotic Head with the Low-DOF Expression-Generation Device
           3.2.1 Eye Module
           3.2.2 Expression-Generation Module
           3.2.3 Face-Skin Fabrication
       3.3 System Architecture
       3.4 Expression Demonstration
       3.5 Other Low-DOF Robotic Head Designs
           3.5.1 Simplified Robotic Head: Eason II
           3.5.2 Simplified Robotic Head: Nicole
    4. Development of the Finger-Guessing Robot
       4.1 Rules of the Finger-Guessing Game
       4.2 Robot Dimensions and Peripheral Hardware Planning
       4.3 Robot Arm Design
           4.3.1 Arm Motion Analysis During the Game
           4.3.2 Arm Mechanism Design
       4.4 Robot Palm Design
           4.4.1 Palm Motion Analysis
           4.4.2 Palm Mechanism Design
       4.5 Control System Architecture
       4.6 Developed Functions of the Finger-Guessing Robot
           4.6.1 Hand Gesture Recognition System
           4.6.2 Speech Recognition System
           4.6.3 Finger-Guessing Game System
    5. Experiments and Evaluation
       5.1 Simplified Lifelike Robotic Head
           5.1.1 Expression Recognition Rate Analysis
           5.1.2 Deformation Analysis
           5.1.3 Output Power Comparison
       5.2 Finger-Guessing Robot Arm and Palm
           5.2.1 Speed Analysis
           5.2.2 Fatigue Analysis
       5.3 Results and Discussion
    6. Conclusions and Future Work
       6.1 Conclusions
       6.2 Future Work and Suggestions
    References
    Appendix

