
Graduate Student: Sung-run Lin (林松潤)
Thesis Title: Facial Expression Simulation and Verification Techniques of Robotic Artificial Facial Skin (仿真機器人臉皮之表情模擬與驗證技術)
Advisor: Chyi-Yeu Lin (林其禹)
Committee Members: Wei-Chen Lee (李維楨), Chi-Ying Lin (林紀穎)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2010
Graduation Academic Year: 99 (ROC calendar)
Language: Chinese
Number of Pages: 72
Chinese Keywords: 機器頭顱, 仿真機器人, 臉部表情, 人工皮膚, 有限元素分析
English Keywords: Face Robot, Android Robot, Facial Expression, Artificial Skin, FEM Analysis
In recent years, more and more researchers have devoted themselves to developing facial expression motions for lifelike humanoid robotic heads, yet the realism of the resulting expressions still falls short of genuine human expressions. In most studies of facial expression motion, the displacement behavior of the control points that define each expression is specified with reference to Ekman's Facial Action Coding System or to the facial expression muscles described in human anatomy, while the displacement magnitudes and directions of the control points are obtained by measuring the corresponding motions on real human faces; the facial skin is then pulled accordingly to display the different expressions.
This thesis investigates the techniques and procedures for simulating facial expression motions with the ANSYS Workbench analysis software. Before the simulation, the facial skin model and the skull model must be built; the control point locations are then defined, followed by the material properties of the artificial skin, the constraint conditions, and the displacement magnitudes and directions.
To verify whether the boundary conditions assumed in the expression simulations are correct, this study designs and builds a facial expression mechanism that reproduces the boundary conditions defined in the software, and compares the physical expression motions with the computer simulations to confirm their agreement. The simulation techniques developed in this thesis can be used to design the control point locations and pulling directions so as to produce a robotic head whose expressions most closely resemble those of a human face.


In recent years, increasing effort has been devoted to the development of facial expression motions for android faces, but the resulting expressions still differ noticeably from those of real human faces. Most studies refer to Ekman's Facial Action Coding System or to the muscle distribution described in anatomy to define the various facial expressions and their motions through a set of control points; the displacements and pulling directions for each expression are based on measurements of real human faces.
In this thesis, the ANSYS Workbench software is used to simulate facial expressions. The facial skin and skull geometric models must be established before the simulation; the positions of the control points, the material properties, the boundary conditions, and the displacements and pulling directions are the pre-processing details that must then be properly defined.
To verify the boundary condition settings, a mechanical device matching the simulated face model is built and its expressions are compared with the simulation results. The techniques developed in this thesis can be used to assist in designing the positions of the control points and the pulling directions so as to generate facial expressions that most closely resemble those of a human face.
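The pre-processing described above needs hyperelastic constants for the artificial skin, and Chapter 3 obtains the raw data from ASTM D412 tensile tests. How the constants were fitted is not spelled out in this record, so the snippet below is only a minimal sketch, assuming a two-parameter Mooney-Rivlin model and a hypothetical CSV export of engineering strain and stress from the tensile test; the file name and column layout are illustrative, not the thesis's actual data.

```python
# Minimal sketch: fit two-parameter Mooney-Rivlin constants (C10, C01)
# to uniaxial tensile data of a hyperelastic artificial skin.
# For an incompressible Mooney-Rivlin solid under uniaxial tension the
# nominal (engineering) stress is
#     P(lam) = 2 * (lam - lam**-2) * (C10 + C01 / lam)
# so P / (2 * (lam - lam**-2)) is linear in 1/lam and can be fitted by
# ordinary least squares.
import numpy as np

# Hypothetical export: engineering strain [-], engineering stress [MPa].
strain, stress = np.loadtxt("skin_tensile_astm_d412.csv",
                            delimiter=",", unpack=True)

lam = 1.0 + strain                       # stretch ratio
keep = lam > 1.01                        # avoid the singularity at lam = 1
lam, stress = lam[keep], stress[keep]

reduced = stress / (2.0 * (lam - lam**-2))
A = np.column_stack([np.ones_like(lam), 1.0 / lam])
(c10, c01), *_ = np.linalg.lstsq(A, reduced, rcond=None)

print(f"C10 = {c10:.4f} MPa, C01 = {c01:.4f} MPa")
# These constants are the kind of input a hyperelastic material model in
# an FEM package such as ANSYS Workbench expects.
```

ANSYS Workbench also offers built-in hyperelastic curve fitting from test data, so an external fit like this is only meant to make the link between the tensile test of Chapter 3 and the material settings of Chapter 4 explicit.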

Table of Contents:
Chinese Abstract
English Abstract
Acknowledgments
Table of Contents
List of Figures
List of Tables
Chapter 1  Introduction
  1.1  Research Objectives
  1.2  Research Motivation
  1.3  Background and Literature Review
  1.4  Thesis Organization
Chapter 2  Human Facial Expressions
  2.1  Physiological Structure of the Face
    2.1.1  Physiological Definitions
    2.1.2  Muscle Properties
    2.1.3  Muscles of Facial Expression
  2.2  Facial Motion Mechanisms of Expression
Chapter 3  Mechanical Properties of the Artificial Facial Skin Material
  3.1  Mechanical Properties of Skin and Muscle
    3.1.1  Skin
    3.1.2  Muscle
  3.2  Rubber
    3.2.1  Polyurethane Rubber (PU)
  3.3  Tensile Testing of Hyperelastic Materials
    3.3.1  ASTM D412-06 Standard
    3.3.2  Tensile Test
    3.3.3  Experimental Data
Chapter 4  Finite Element Simulation and Analysis
  4.1  Model Construction
    4.1.1  Reverse Engineering with Non-Contact 3D Scanning
    4.1.2  Model Reconstruction
    4.1.3  Placement of the Pulling/Control Regions
  4.2  Pre-Processing
    4.2.1  Material Settings
    4.2.2  Patch Topology Repair
    4.2.3  Meshing
  4.3  Boundary Condition Settings
    4.3.1  Constraints
    4.3.2  Displacements
  4.4  Analysis Results
    4.4.1  Smile
    4.4.2  Anger
    4.4.3  Sadness
    4.4.4  Surprise
  4.5  Post-Processing
    4.5.1  Neutral
    4.5.2  Smile
    4.5.3  Anger
    4.5.4  Sadness
    4.5.5  Surprise
Chapter 5  Simulation Verification Experiments
  5.1  Design and Fabrication of a Robotic Head with Facial Expressions
    5.1.1  Fabrication of the Artificial Skin
    5.1.2  Fabrication of the Skull Shell
    5.1.3  Design of the Verification Mechanism
    5.1.4  Servo Motor Selection
    5.1.5  Servo Motor Control Chip
    5.1.6  Connection between the Control Points and the Mechanism
    5.1.7  Hardware Assembly of the Expressive Robotic Head
    5.1.8  Control Software Interface of the Expressive Robotic Head
  5.2  Facial Expression Simulation Verification Method
    5.2.1  Expression Results of the Robotic Head
    5.2.2  3D Modeling of the Physical Robotic Head Expression Images
    5.2.3  Surface Deviation Comparison Software
    5.3.1  Neutral
    5.3.2  Smile
    5.3.3  Anger
    5.3.4  Sadness
    5.3.5  Surprise
    5.3.6  Results and Discussion
Chapter 6  Conclusions and Future Work
  6.1  Conclusions
  6.2  Future Work
References
Appendix
Biography
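Chapter 5 validates the simulations by scanning the expressions of the physical robotic head and comparing the reconstructed surfaces against the simulated deformed skin with a surface deviation comparison package. The details of that software are not given in this record, so the following is only a minimal sketch of the underlying idea, assuming both surfaces are available as point clouds already registered in a common coordinate frame; the file names and the millimetre units are hypothetical.

```python
# Minimal sketch of a surface deviation comparison between a simulated
# deformed skin surface and a 3D scan of the physical robotic head,
# both sampled as point clouds (one x, y, z triple per row).
import numpy as np
from scipy.spatial import cKDTree

simulated = np.loadtxt("simulated_smile_points.xyz")   # shape (N, 3)
scanned = np.loadtxt("scanned_smile_points.xyz")       # shape (M, 3)

# For every simulated point, the distance to the closest scanned point.
tree = cKDTree(scanned)
deviation, _ = tree.query(simulated)

print(f"mean deviation: {deviation.mean():.3f} mm")
print(f"max deviation : {deviation.max():.3f} mm")
print(f"RMS deviation : {np.sqrt(np.mean(deviation ** 2)):.3f} mm")
```

A one-sided nearest-neighbour distance like this underestimates deviation wherever the scan has holes; dedicated comparison tools usually report signed point-to-surface distances, which is presumably closer to what the thesis's comparison software does.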

References:
[1] 林其禹, 曾昌國, and 鄭立傑, "Technology and Realization of 'Robot Theater' Performances," Special Issue of the 2009 Taipei International Robot Show, pp. 52-58, 2009 (in Chinese).
[2] Chyi-Yeu Lin, Chang-Kuo Tseng, Wei-Chung Teng, Wei-Chen Lee, Chung-Hsien Kuo, Hung-Yan Gu, Kuo-Liang Chung, and Chin-Shyurng Fahn, "The Realization of Robot Theater: Humanoid Robots and Theatric Performance," The 14th International Conference on Advanced Robotics (ICAR 2009), Munich, Germany, June 22-26, 2009.
[3] Chyi-Yeu Lin, Li-Chieh Cheng, and Chang-Kuo Tseng, "The Development of Mimetic Engineering for Theatric Android Head," International Conference on Service and Interactive Robotics (SIRCon 2009), Taipei, Taiwan, August 6-7, 2009.
[4] NTUST Robot (Eva) - An Android Head (2009). Available at: http://www.youtube.com/watch?v=zDkgJIqR7n4 (Accessed: 1 June 2010).
[5] Cynthia L. Breazeal, Sociable Machines: Expressive Social Exchange between Humans and Robots, MIT Press, Cambridge, MA, 2000.
[6] Hiroshi Kobayashi and Fumio Hara, "Study on Face Robot for Active Human Interface - Mechanisms of Face Robot and Expression of 6 Basic Facial Expressions," Proceedings of the IEEE International Workshop on Robot and Human Communication, pp. 276-281, 1993.
[7] P. Ekman and W. V. Friesen, Manual for the Facial Action Coding System, Consulting Psychologists Press, Palo Alto, CA, 1978.
[8] Takuya Hashimoto, Sachio Hiramatsu, Toshiaki Tsuji, and Hiroshi Kobayashi, "Development of the Face Robot SAYA for Rich Facial Expressions," SICE-ICASE International Joint Conference 2006, Busan, Korea, October 18-21, 2006, pp. 5423-5428.
[9] KOKORO Company Ltd. (2010). Available at: http://www.kokoro-dreams.co.jp/english/ (Accessed: 1 June 2010).
[10] Daisuke Matsui, Takashi Minato, Karl F. MacDorman, and Hiroshi Ishiguro, "Generating Natural Motion in an Android by Mapping Human Motion," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1089-1096, August 2005.
[11] Freerk Pieter Wilbers, Carlos Toshinori Ishi, and Hiroshi Ishiguro, "A Blendshape Model for Mapping Facial Motions to an Android," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 542-547, 2007.
[12] Hanson Robotics (2010). Available at: http://hansonrobotics.wordpress.com/ (Accessed: 1 June 2010).
[13] HUBO Lab. (2010), Albert HUBO. Available at: http://hubolab.co.kr/AlbertHUBO.php (Accessed: 1 June 2010).
[14] Seulgi Lee, Byung-Rok So, and Ho-Gil Lee, "Development of Motion-Marker for an Android," Proceedings of the International Conference on Control, Automation and Systems, pp. 1168-1172, 2008.
[15] Zeno, Robot Hero from Tomorrow (2008). Available at: http://www.youtube.com/watch?v=WLGi4Q9xc24 (Accessed: 1 June 2010).
[16] National Institute of Advanced Industrial Science and Technology (AIST) (2009), Successful Development of a Robot with Appearance and Performance Similar to Humans. Available at: http://www.aist.go.jp/aist_e/latest_research/2009/20090513/20090513.html (Accessed: 1 June 2010).
[17] 西安超人機器人 (Xi'an Chaoren Robot) (2009). Available at: http://www.chaoren.cn/ (Accessed: 1 June 2010).
[18] 自由時報電子報 (Liberty Times online) (2007), "NTUST's Silicone-Faced Robot Shows Rich Expressions" (in Chinese). Available at: http://www.libertytimes.com.tw/2007/new/mar/31/today-life6.htm (Accessed: 1 June 2010).
[19] Simplified Face Robot (NTUST Robot) (2010). Available at: http://www.youtube.com/watch?v=pQyBvkvsfxk (Accessed: 1 June 2010).
[20] 許世昌 (ed.), New Anatomy (新編解剖學), 永大書局, 1999, pp. 168-169, 187-189 (in Chinese).
[21] Van De Graaff, Human Anatomy, McGraw-Hill Companies, 2001, p. 107.
[22] Leonid Bunegin and Jeffery B. Moore, "Simultaneous Spectrophotometric and Mechanical Property Characterization of Skin," Photonic Therapeutics and Diagnostics, Proc. of SPIE, Vol. 6078, 2006.
[23] Emmanuelle Jacquet, Gwendal Josse, Fouad Khatyr, and Camille Garcin, "A New Experimental Method for Measuring Skin's Natural Tension," Skin Research and Technology, Vol. 14, pp. 1-7, 2008.
[24] 陳卓昇, "Skeletal Muscle Physiology" (骨骼肌生理), lecture notes, September 2006 (in Chinese).
[25] Eric J. Chen, Jan Novakofski, W. Kenneth Jenkins, and William D. O'Brien, Jr., "Young's Modulus Measurements of Soft Tissues with Application to Elasticity Imaging," IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, Vol. 43, No. 1, January 1996.
[26] 蔡信行 (ed.), Latest Chemical Engineering Processes and Materials (最新化工製程及材料), 新文京開發, 2005, p. 474 (in Chinese).
[27] 黃崑耀 (ed.), Rubber Industry Handbook (橡膠工業手冊), 台灣區橡膠工業同業公會, 1985, p. 239 (in Chinese).
[28] ASTM International, "Standard Test Methods for Vulcanized Rubber and Thermoplastic Elastomers—Tension," 2006.
[29] 虎門科技股份有限公司, ANSYS Workbench Advanced Structural Analysis Course Notes (ANSYS Workbench 結構分析進階課程講義), 2009 (in Chinese).
[30] 陳文賢 (ed.), Reverse Engineering Software (逆向工程軟體), 全華科技圖書股份有限公司, 2006, pp. 1-2 to 1-4 (in Chinese).
[31] 章明, 姚宏宗, 鄭正元, 林宸生, and 姚文隆 (eds.), Reverse Engineering Technology and Systems (逆向工程技術與系統), 全華科技圖書股份有限公司, 2005, pp. 1-2 to 1-4 and 5-1 to 5-11 (in Chinese).
[32] 龍騰科技股份有限公司, Beauty 3D v3.0 User Manual (使用手冊), 2009, pp. 5-6 (in Chinese).
[33] Tingfan Wu, Nicholas J. Butko, Paul Ruvulo, Marian S. Bartlett, and Javier R. Movellan, "Learning to Make Facial Expressions," 2009 IEEE 8th International Conference on Development and Learning.
