
Author: Shih-lin Chen (陳詩霖)
Title: A Method for Translating Human Motions to Humanoid Robots with Non-orthogonal Limb Axes (一個轉換人體動作至具有非正交轉軸肢體之人形機器人的方法)
Advisor: Wei-Chung Teng (鄧惟中)
Committee Members: Shih-Hsuan Chiu (邱士軒), Wei-Chen Lee (李維楨), Chih-Yuan Yao (姚智原)
Degree: Master
Department: Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science
Publication Year: 2015
Graduating Academic Year: 103 (ROC calendar, i.e., 2014-2015)
Language: Chinese
Pages: 49
Keywords: Non-orthogonal Axes (非正交轉軸), Humanoid Robots (人形機器人), Motion Translation (動作轉換), Limb Vectors (肢體向量)

Abstract (Chinese, translated):

Translating human motions directly onto a humanoid robot is a fast and intuitive way to input robot motions. This work implements such a motion translation method: a Microsoft Kinect captures the human motion, vectors in 3D space are computed from the sensed posture, and from these vectors the rotation angle of each joint of the humanoid robot is derived, so that ordinary users can easily make the robot perform the same motions with their own bodies. In addition, some humanoid robots have limbs with non-orthogonal rotation axes; rotating about such an axis changes the limb's pitch, roll, and yaw simultaneously, which makes the computation more complex than for orthogonal axes, so past studies often discarded the data of these axes. This thesis proposes a simple general solution that adds the degrees of freedom of non-orthogonal axes to the translation result, producing motions closer to the human's.
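As a concrete illustration of the vector computation described above, here is a minimal Python sketch, not the thesis's exact formulas: it derives pitch and roll angles for one shoulder from a single limb vector. The joint positions, the torso-frame convention (x forward, y left, z up), and the function name are all assumptions made for illustration.

```python
import numpy as np

def shoulder_angles_from_limb_vector(shoulder, elbow):
    """Hypothetical sketch: pitch/roll for one shoulder from a limb vector.

    Assumes Kinect-style 3D joint positions already expressed in the
    torso's local frame (x forward, y left, z up), with the arm hanging
    straight down, i.e. limb vector (0, 0, -1), as the zero pose.
    """
    v = np.asarray(elbow, dtype=float) - np.asarray(shoulder, dtype=float)
    v /= np.linalg.norm(v)  # unit upper-arm limb vector

    # Pitch: forward/backward swing about the lateral (y) axis,
    # measured from the downward rest direction.
    pitch = np.arctan2(v[0], -v[2])

    # Roll: sideways lift of the arm out of the sagittal (x-z) plane.
    roll = np.arcsin(np.clip(v[1], -1.0, 1.0))
    return pitch, roll

# Example: arm raised toward the front and slightly to the side.
print(shoulder_angles_from_limb_vector([0.0, 0.2, 1.4], [0.3, 0.3, 1.3]))
```

For a full skeleton, the same idea would be applied limb by limb, with each limb vector first expressed in the local frame of its parent joint.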

Based on this translation method, the thesis implements a system that converts human motions captured by the Kinect into rotation angles for the humanoid robot's joints, and simulates them on an Aldebaran NAO in the Cyberbotics Webots robot simulator, confirming that the simulated humanoid robot can perform the same motions as the user.
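The abstract does not detail the simulator interface, so the following is only a hedged sketch of how the computed angles could be fed to the simulated NAO through the Webots Python controller API; get_translated_angles() is a hypothetical stand-in for the Kinect translation pipeline, and the joint device names follow the Webots NAO model.

```python
from controller import Robot  # Webots Python controller API

# A few joint device names from the Webots NAO model.
JOINTS = ["LShoulderPitch", "LShoulderRoll", "RShoulderPitch", "RShoulderRoll"]

def get_translated_angles():
    # Hypothetical stand-in for the Kinect-to-robot translation pipeline;
    # here it just returns a fixed pose (in radians) for demonstration.
    return {"LShoulderPitch": 1.2, "LShoulderRoll": 0.3,
            "RShoulderPitch": 1.2, "RShoulderRoll": -0.3}

robot = Robot()
timestep = int(robot.getBasicTimeStep())
# getDevice() is the current API; very old Webots releases used getMotor().
motors = {name: robot.getDevice(name) for name in JOINTS}

while robot.step(timestep) != -1:
    for name, angle in get_translated_angles().items():
        motors[name].setPosition(angle)  # position-controlled rotational motor
```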


Abstract (English):

Translating human motions directly into robot motions is a fast and intuitive way to create motions for humanoid robots. This work implements a motion translation system driven by input from a Microsoft Kinect. A sensed human posture is decomposed into vectors in 3D space, and these vectors are then used to calculate the rotation angles of the joints of the target humanoid robot; in this way, users can control a robot that acts as their copycat. However, some humanoid robots have non-orthogonal limb axes, and past studies ignored these degrees of freedom because they were considered complicated to utilize: a rotation about such an axis changes pitch, roll, and yaw at the same time. In this research, a set of general formulas is developed to include these rotation angles in the translation process. We validated the method by implementing it for the Aldebaran NAO robot in the Cyberbotics Webots robot simulator, using several user motions: in-place dribbling, boxing moves, squatting, and stooping. The results show that the method successfully makes the NAO robot mimic the user's motions.
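To make the non-orthogonal axis problem concrete, consider a joint whose axis is tilted between the body axes, such as NAO's HipYawPitch joint, whose axis lies roughly 45° between the yaw and pitch axes. The sketch below (assumed conventions, not the thesis's derivation) builds the rotation matrix for such an axis with Rodrigues' formula and decomposes it into Z-Y-X Euler angles, showing that a single joint rotation produces non-zero yaw, pitch, and roll simultaneously.

```python
import numpy as np

def rodrigues(axis, theta):
    """Rotation matrix for angle theta about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def euler_zyx(R):
    """Decompose R into (yaw, pitch, roll) for R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    pitch = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return yaw, pitch, roll

# Assumed axis: a NAO-like HipYawPitch axis, tilted 45 deg between y and z.
axis = [0.0, np.sqrt(0.5), -np.sqrt(0.5)]
yaw, pitch, roll = euler_zyx(rodrigues(axis, 0.5))
print(yaw, pitch, roll)  # all three are non-zero for this single joint rotation
```

The general formulas mentioned in the abstract would invert this relationship: given the pitch, roll, and yaw implied by the human posture, solve for the rotation angle about the fixed non-orthogonal axis that best reproduces them.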

Table of Contents

Abstract (Chinese)
Abstract (English)
Table of Contents
List of Figures
List of Tables
1 Introduction
1.1 Overview
1.2 Motivation and Objectives
1.3 Thesis Organization
2 Background and Literature Review
2.1 Rotation Matrices and Euler's Rotation Theorem
2.1.1 Rotation Matrices
2.1.2 Euler's Rotation Theorem
2.2 Zero Moment Point
2.3 Source Motion Generation
2.3.1 Motion Capture Techniques
2.3.1.1 Wearable Sensors and Optical High-Speed Cameras
2.3.1.2 Microsoft Kinect
2.3.1.3 Image Recognition
2.3.2 Motion Editing
2.3.2.1 Angle Input
2.3.2.2 Directly Posing the Robot
2.3.2.3 Dragging a 3D Model
2.4 Physics Engines and Robot Simulators
2.5 Related Work
3 Motion Translation and Balancing Methods
3.1 Correspondence between Human and Humanoid Robot Skeletons
3.1.1 World and Local Coordinate Systems
3.1.2 Mapping Human Joints to the Humanoid Robot
3.1.2.1 Degree-of-Freedom Mapping
3.1.3 Robot Skeleton Requirements and Initial-Pose Correspondence
3.1.3.1 Humanoid Robot Skeleton Requirements
3.1.3.2 Initial-Pose Correspondence
3.2 Translating Source Motions to the Humanoid Robot
3.2.1 Notation and Computation Methods
3.2.2 Arm Motion Translation
3.2.3 Leg Motion Translation
3.2.4 Waist Motion Translation
3.2.5 Non-orthogonal Axis Translation
3.3 Balancing Methods
3.3.1 Zero Moment Point Calculation
3.3.2 Ankle Balancing Method
3.3.3 Waist Balancing Method
3.4 Implementation and Validation
3.4.1 Experimental Environment and System Architecture
3.4.2 Experimental Results
3.4.3 Comparison after Adding Non-orthogonal Axes
4 Conclusions and Future Work
4.1 Conclusions
4.2 Future Work
References

