
Author: 洪志憲
Jhih-sian Hong
Thesis Title: 利用架構對應區塊之人型機器人動作生成方法
An Approach to Generate Motion Data for Humanoid Robots Utilizing Structural Mapping Zone
Advisor: 鄧惟中
Wei-Chung Teng
Committee Members: 唐政元
Cheng-Yuan Tang
林其禹
Chyi-Yeu Lin
傅立成
Li-Chen Fu
Degree: 碩士
Master
Department: 電資學院 - 資訊工程系
Department of Computer Science and Information Engineering
Year of Publication: 2007
Graduation Academic Year: 95 (ROC calendar; 2006-2007)
Language: Chinese
Number of Pages: 46
Chinese Keywords: 人型機器人、機器人、對應方式
Foreign Keywords: Humanoid Robot, Robot, Mapping Function
    A humanoid robot is a robotic device with a human-like appearance and the ability to perform human-like motions. To reproduce human motions, the structure of a humanoid robot tends to be designed with a joint configuration similar to that of the human body. Because the structures of humanoid robots usually differ from one another, the same human motion performed on different humanoid robots results in motions that are only approximately alike. Considering how the joints of the human skeleton move in coordination, this thesis proposes an approach based on "structural mapping zones" to solve the problem of mapping the same motion onto different structures. A structural mapping zone is a set of joints that the human skeleton and a humanoid robot skeleton have in common, where the joints within the same zone share the same or similar linkage behavior. The joint correspondences within a zone can be further analyzed into four zone mapping methods, namely 1-1, 1-n, n-1, and m-n, and these methods are used to generate the required motion data. Finally, we constructed a humanoid robot motion model, implemented the 1-1 mapping method, and animated the generated motion data on the humanoid robot in simulation software. The experimental results show that the proposed approach can effectively produce humanoid robot motions that closely resemble the original human motions.


    Humanoid robots are robotic devices that have a human-like appearance and the ability to imitate human motions. In order to perform human motions, their joint configurations tend to be designed to resemble the human body. However, since the skeleton of each humanoid robot usually varies, the same human motion performed by different humanoid robots usually differs slightly from robot to robot. In this thesis, considering the joint linkage characteristics of the human skeleton in motion, we propose an approach based on the "structural mapping zone" to solve the problem of mapping the same motion among different skeletons. A structural mapping zone is a set of joints that exists both in the human skeleton and in the humanoid robot skeleton, and the joints within a mapping zone share the same or similar motion characteristics. The mappings between two zones can be categorized into four zone mapping methods, namely 1-1, 1-n, n-1, and m-n, which are used to generate the motion data. Finally, we constructed a humanoid robot motion model and implemented the 1-1 mapping method to perform the transformed humanoid robot motions in robot simulation software. The results show that our method can effectively transform human motions into humanoid robot motions with high fidelity.
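
    The record itself contains no code, but the zone-based idea described in the abstract can be illustrated with a short sketch. The following minimal Python example is not taken from the thesis; all names (Joint, MappingZone, map_one_to_one) are hypothetical. It shows one way a structural mapping zone could be represented and how the 1-1 case might copy a joint angle from the human skeleton to the robot skeleton.

from dataclasses import dataclass
from typing import List

# Minimal sketch (hypothetical names, not the thesis implementation):
# a "structural mapping zone" groups joints that the human skeleton and
# the robot skeleton have in common and that move in a coordinated way.

@dataclass
class Joint:
    name: str
    angle: float = 0.0          # joint rotation, in radians

@dataclass
class MappingZone:
    human_joints: List[Joint]   # joints of the zone on the human side
    robot_joints: List[Joint]   # corresponding joints on the robot side

    def kind(self) -> str:
        """Classify the zone pairing as 1-1, 1-n, n-1, or m-n."""
        h, r = len(self.human_joints), len(self.robot_joints)
        if h == 1 and r == 1:
            return "1-1"
        if h == 1:
            return "1-n"
        if r == 1:
            return "n-1"
        return "m-n"

def map_one_to_one(zone: MappingZone) -> None:
    """1-1 zone mapping: copy the human joint angle onto the robot joint."""
    if zone.kind() != "1-1":
        raise ValueError("map_one_to_one only handles 1-1 zones")
    zone.robot_joints[0].angle = zone.human_joints[0].angle

# Example: an elbow zone with a single joint on each side, mapped 1-1.
elbow = MappingZone([Joint("human_elbow", 1.2)], [Joint("robot_elbow")])
map_one_to_one(elbow)
print(elbow.kind(), elbow.robot_joints[0].angle)   # -> 1-1 1.2

    In such a sketch, the resulting joint angles would then be sent to a simulator (the thesis uses Webots, per reference [15]) to animate the robot model.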

    Abstract (Chinese)
    Abstract (English)
    Acknowledgements
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1  Preface
      1.2  Motivation
      1.3  Objectives
      1.4  Thesis Organization
    Chapter 2  Literature Review
      2.1  Classification of Human Motions
      2.2  The H-Anim Humanoid Animation Specification
        2.2.1  Design Goals of H-Anim
        2.2.2  Contents of the H-Anim Specification
        2.2.3  Applications of H-Anim
      2.3  Methods for Describing Robot Structures
        2.3.1  Denavit-Hartenberg Parameters (D-H Parameters)
        2.3.2  OpenHRP (Open Humanoid Robot Project)
      2.4  Motion Mapping Methods between Humanoid Robots and Humans
    Chapter 3  Methodology
      3.1  Research Procedure
      3.2  Definition of the Structure Storage Format
        3.2.1  Design of the Structure Storage Format
        3.2.2  Definition of the Structure Storage Data Fields
      3.3  Storage of an Idealized Humanoid Robot Structure
      3.4  Verifiability and Extensibility of Structure Files
      3.5  Skeleton Mapping Method
        3.5.1  Partitioning the Robot Structure into Zones
        3.5.2  Matching Method between Zones
        3.5.3  Motion Mapping Method between Zones
    Chapter 4  Results
      4.1  Research Tools
      4.2  Acquisition and Construction of Experimental Data
        4.2.1  Acquisition of Human Structure Data
        4.2.2  Construction of the Humanoid Robot Motion Model
      4.3  Motion Control Program Development and Joint Motion Computation
      4.4  Presentation of Humanoid Robot Motions
    Chapter 5  Conclusions and Future Work
      5.1  Conclusions
      5.2  Future Work

    [1] Humanoid Animation (H-Anim) website: http://h-anim.org/
    [2] Higgins, J., Human Movement: An Integrated Approach, The C. V. Mosby Co., St. Louis, MO, 1997.
    [3] Manseur, R., Robot Modeling and Kinematics, Charles River Media, 2006.
    [4] OpenHRP website: http://www.is.aist.go.jp/humanoid/openhrp/Japanese/
    [5] 晉茂林, 機器人學 (Robotics), 五南圖書公司 (Wu-Nan Books), Taipei, 2000.
    [6] Pollard, N. S., Hodgins, J. K., Riley, M. J., and Atkeson, C. G., "Adapting Human Motion for the Control of a Humanoid Robot," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '02), Vol. 2, pp. 1390-1397, 2002.
    [7] Nakazawa, A., Nakaoka, S., Ikeuchi, K., and Yokoi, K., "Imitating Human Dance Motions through Motion Structure," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, pp. 2539-2544, 2002.
    [8] Nakaoka, S., Nakazawa, A., Yokoi, K., and Ikeuchi, K., "Leg Motion Primitives for a Dancing Humanoid Robot," Institute of Industrial Science, The University of Tokyo, Japan, May 2004.
    [9] Matsui, D., Minato, T., MacDorman, K. F., and Ishiguro, H., "Generating Natural Motion in an Android by Mapping Human Motion," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3301-3308, 2005.
    [10] Kaneko, K., Kanehiro, F., Kajita, S., Hirukawa, H., Kawasaki, T., Hirata, M., Akachi, K., and Isozumi, T., "Humanoid Robot HRP-2," Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '04), Vol. 2, pp. 1083-1090, 2004.
    [11] Omer, A. M. M., Ogura, Y., Kondo, H., Morishima, A., Carbone, G., Ceccarelli, M., Lim, H.-O., and Takanishi, A., "Development of a Humanoid Robot Having 2-DOF Waist and 2-DOF Trunk," Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots, pp. 333-338, 2005.
    [12] Anthropometric Database of Workers in Taiwan, website: http://www.iosh.gov.tw/ergo.htm
    [13] Kang, S. B. and Ikeuchi, K., "Toward Automatic Robot Instruction from Perception: Mapping Human Grasps to Manipulator Grasps," IEEE Transactions on Robotics and Automation, Vol. 13, No. 1, pp. 81-95, 1997.
    [14] 刘杰, 孙汉旭, 张玉茹, "Implementation of Fingertip Motion Mapping from the Human Hand to a Dexterous Hand" (人手到灵巧手指尖运动映射的实现), Robot Center, Beijing Research Institute of Automation for Machinery Industry; School of Automation, Beijing University of Posts and Telecommunications; Robotics Institute, Beihang University, 2003.
    [15] Webots Reference Manual, Release 5.1.9, Cyberbotics Ltd.
