
Graduate Student: Chi-Chang Tung (董其昌)
Thesis Title: Limb Vector Paralleling: A General-purpose Approach to Translate Human Posture to Humanoid Robots (利用具泛用性的肢段向量平行法轉換人體姿勢至人型機器人)
Advisor: Wei-Chung Teng (鄧惟中)
Committee Members: Shih-Hsuan Chiu (邱士軒), Tien-Ruey Hsiang (項天瑞)
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering
Year of Publication: 2014
Graduating Academic Year: 102 (ROC calendar; 2013-2014)
Language: Chinese
Pages: 73
Keywords (Chinese): robot, human motion, posture translation, imitation, motion-sensing controller
Keywords (English): human, imitate

Most humanoid robots have 16 or more degrees of freedom (DOF), and building an intuitive user interface that lets a person control such a high-DOF robot in real time remains a challenge. This thesis proposes a computational method that takes the human joint positions reported by a motion capture system and derives the rotation angle of every joint of a humanoid robot, so that the robot strikes the same posture as the user in real time. The method applies to nearly all present-day humanoid robots. It was implemented on the RoboBuilder, a small humanoid robot, and the experiments confirm that the RoboBuilder can indeed reproduce the user's posture in real time; the same method was also verified on every humanoid robot provided by the Webots 7.2.4 simulator. A companion program was also implemented that saves the motion captured by the tracking system to a BVH file, which can later be edited, played back on screen, or used to drive a humanoid robot.
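The core of limb vector paralleling is easy to state: form the vector from a proximal joint to its distal neighbor (for example, shoulder to elbow), express it in a torso-attached frame, and solve for the joint angles that rotate the robot's corresponding limb segment from its home direction onto that vector. The Python sketch below is a minimal illustration of this idea, not the thesis's actual derivation (Chapter 3): it assumes the points are already in the torso frame, that the robot's upper arm hangs along (0, -1, 0) in the home posture, and that the shoulder's two DOF are a pitch about x followed by a roll about z.

```python
import numpy as np

def limb_direction(proximal, distal):
    """Unit vector from a proximal joint (e.g. shoulder) to a distal joint
    (e.g. elbow); both points are assumed to already be in the torso frame."""
    v = np.asarray(distal, dtype=float) - np.asarray(proximal, dtype=float)
    n = np.linalg.norm(v)
    if n < 1e-9:
        raise ValueError("degenerate limb segment")
    return v / n

def shoulder_angles(v):
    """Angles (pitch about x, then roll about z) satisfying
    Rz(roll) @ Rx(pitch) @ (0, -1, 0) == v, which makes the robot's upper
    arm parallel to the human's. The axis convention is hypothetical."""
    pitch = np.arcsin(np.clip(-v[2], -1.0, 1.0))
    roll = np.arctan2(v[0], -v[1])
    return pitch, roll

# Example: human elbow slightly out and forward of the shoulder.
v = limb_direction(proximal=(0.0, 0.0, 0.0), distal=(0.10, -0.20, -0.10))
pitch, roll = shoulder_angles(v)
```

Under this convention, v = (1, 0, 0) (arm raised sideways) gives pitch 0 and roll pi/2, while near v = (0, 0, ±1) the roll becomes indeterminate, a singularity that any two-angle parameterization of a direction shares.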
The method places one requirement on the motion capture system: it must provide 3D position coordinates for 13 body points, namely the torso, right shoulder, right elbow, right hand, left shoulder, left elbow, left hand, right hip, right knee, right ankle, left hip, left knee, and left ankle. The humanoid robot, in turn, must have one body, two arms, and two legs; each arm must have a shoulder joint and an elbow joint, each leg a hip joint and a knee joint, and every one of these joints must offer two degrees of freedom.
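In code, the capture-side requirement amounts to a fixed contract: every frame must map thirteen joint names to 3D points. A tiny validation sketch under that assumption (the names are illustrative, not the thesis's identifiers):

```python
# The 13 body points the method requires from the capture system.
REQUIRED_JOINTS = (
    "torso",
    "right_shoulder", "right_elbow", "right_hand",
    "left_shoulder",  "left_elbow",  "left_hand",
    "right_hip", "right_knee", "right_ankle",
    "left_hip",  "left_knee",  "left_ankle",
)

def validate_frame(frame):
    """Ensure one capture frame (a dict of name -> (x, y, z)) carries a
    3D position for every required joint."""
    missing = [name for name in REQUIRED_JOINTS if name not in frame]
    if missing:
        raise ValueError(f"capture frame is missing joints: {missing}")
    for name in REQUIRED_JOINTS:
        if len(frame[name]) != 3:
            raise ValueError(f"{name} must be a 3D position")
```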
The method does not address the balance problem, so during the experiments the robot's knee and ankle joints were adjusted to make standing easier. Tests with the NAO and HOAP2 humanoid robots provided by Webots 7.2.4 showed that when the operator performed conducting gestures, push-ups, and arm poses at various angles, the simulated robots assumed postures similar to the operator's.


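The abstract above mentions exporting captured motion to a BVH file. A BVH file is plain text: a HIERARCHY section declares the skeleton (joint names, offsets, and rotation channels), and a MOTION section lists one line of channel values per frame. The sketch below writes a deliberately tiny two-joint skeleton; it illustrates the format only and is not the thesis's full-body exporter.

```python
def write_minimal_bvh(path, frames, frame_time=1 / 30):
    """Write a BVH file with a single root ('Hips') and one child joint
    ('Chest'). `frames` is a list of per-frame channel value lists:
    6 root channels + 3 joint channels = 9 floats each."""
    header = """HIERARCHY
ROOT Hips
{
\tOFFSET 0.0 0.0 0.0
\tCHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
\tJOINT Chest
\t{
\t\tOFFSET 0.0 10.0 0.0
\t\tCHANNELS 3 Zrotation Xrotation Yrotation
\t\tEnd Site
\t\t{
\t\t\tOFFSET 0.0 10.0 0.0
\t\t}
\t}
}
MOTION
"""
    with open(path, "w") as f:
        f.write(header)
        f.write(f"Frames: {len(frames)}\n")
        f.write(f"Frame Time: {frame_time:.6f}\n")
        for values in frames:
            f.write(" ".join(f"{v:.4f}" for v in values) + "\n")

# One neutral frame, then a 45-degree chest twist about the y axis.
write_minimal_bvh("demo.bvh", [[0] * 9, [0, 0, 0, 0, 0, 0, 0, 0, 45]])
```

Any BVH viewer should load the resulting demo.bvh and show the chest joint rotating 45 degrees on the second frame.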

Table of Contents:
Abstract (Chinese)
Abstract (English)
Acknowledgements
List of Figures and Tables
Chapter 1 Introduction
  1.1 Preface
  1.2 Motivation and Objectives
  1.3 Research Method
  1.4 Thesis Organization
Chapter 2 Literature Review
Chapter 3 Limb Vector Paralleling
  3.1 Motion Capture System Requirements
  3.2 Humanoid Robot Mechanical Structure Requirements
  3.3 Joint Degrees of Freedom in Various Humanoid Robots
  3.4 Humanoid Robot Home Posture Requirements
  3.5 Preliminaries of Limb Vector Paralleling
  3.6 Limb Vector Paralleling: Computing the Right Shoulder Joint
  3.7 Limb Vector Paralleling: Computing the Right Elbow Joint
  3.8 Limb Vector Paralleling: Computing the Left Shoulder Joint
  3.9 Limb Vector Paralleling: Computing the Left Elbow Joint
  3.10 Limb Vector Paralleling: Computing a Right Elbow Joint with Insufficient DOF
  3.11 Limb Vector Paralleling: Computing the Leg Joints
  3.12 Joint Computation for Robots with Special Home Postures
Chapter 4 Verification and Applications
  4.1 Verification Goals and Results
  4.2 System Architecture
  4.3 Overview of How the Kinect Works
  4.4 The BVH File Format
  4.5 RoboBuilder Overview
  4.6 Defining Pose Matrices and Rotation Matrices
  4.7 Operations on Pose Matrices and Rotation Matrices
  4.8 Converting Kinect Data to BVH Files
  4.9 Converting BVH to the RoboBuilder Robot
  4.10 One User Controlling One Robot
    4.10.1 System Architecture
    4.10.2 Program Architecture
    4.10.3 Operating Procedure
    4.10.4 Experimental Results
  4.11 Two Users Each Controlling a Robot
    4.11.1 System Architecture
    4.11.2 Program Architecture
    4.11.3 Operating Procedure
    4.11.4 Experimental Results
  4.12 One User Controlling Seven Robots
    4.12.1 Description
    4.12.2 System Architecture
    4.12.3 Program Architecture
    4.12.4 Operating Procedure
    4.12.5 Experimental Results
  4.13 Verification with the Webots Simulator
Chapter 5 Conclusion
  5.1 Analysis and Comparison
  5.2 Conclusion

