Student: 鄭欽元 (Chin-Yuan Cheng)
Title: 具表情人臉機械頭顱之設計製作 (Design of a Robot Head with Facial Expressions)
Advisor: 林其禹 (Chyi-Yeu Lin)
Committee members: 李維楨 (Wei-Chen Lee), 郭重顯 (Chung-Hsien Kuo)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of publication: 2007
Graduating academic year: 95 (2006–2007)
Language: Chinese
Pages: 58
Keywords (Chinese): 機械頭顱, 表情, 控制點
Keywords (English): Robot Head, Expression, Control Point
The goal of this thesis is to design and build an intelligent robot head capable of displaying a variety of facial expressions. The study first considers the action units (AUs) of human facial movement and the muscle structure of the human face in order to define control points on the robot's facial skin, which is made of silicone rubber. Servo motors substitute for muscles: one end of a thin steel wire is anchored to a control point on the back of the skin, and the other end is fixed to a servo motor, so that when the motor rotates and pulls on the control point, the skin is displaced. Once a desired facial expression has been defined in the control software, the controller can command up to 31 servo motors inside the head, following the sequence of instructions sent from the computer, to display various facial expressions.
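The actuation scheme described in the abstract (an expression is defined in software as a set of control-point targets, then issued as angle commands to up to 31 servo motors) can be sketched roughly as follows. This is a minimal illustration only: the expression names, servo channel numbers, and angle values are hypothetical and are not taken from the thesis.

```python
NUM_SERVOS = 31  # the head drives up to 31 servo motors (per the abstract)

# Hypothetical expression table: servo channel -> target angle in degrees.
# Each entry models a wire-pulled control point on the silicone skin.
EXPRESSIONS = {
    "smile": {3: 40, 4: 40, 10: 25},
    "surprise": {0: 60, 1: 60, 14: 70},
}

def servo_frame(expression, neutral=90, lo=0, hi=180):
    """Build a full 31-channel list of target angles for one expression.

    Channels not moved by the expression stay at the neutral angle, and
    every target is clamped to the servo's mechanical range [lo, hi].
    """
    targets = EXPRESSIONS[expression]
    frame = []
    for ch in range(NUM_SERVOS):
        angle = targets.get(ch, neutral)
        frame.append(max(lo, min(hi, angle)))
    return frame

frame = servo_frame("smile")
print(len(frame), frame[3], frame[0])  # prints: 31 40 90
```

In the real system the controller would transmit such a frame to the motor driver in sequence; here the sketch stops at building the command frame, since the thesis does not specify the wire protocol.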