
Graduate Student: 李胤鋒 (Yin-Feng Li)
Thesis Title: 人類動作轉換至自由度不足之小型人形機器人之研究
A Study of Human Motion Translation to Small-sized Humanoid Robots of Insufficient Degrees of Freedom
Advisor: 鄧惟中 (Wei-Chung Teng)
Committee Members: 鮑興國 (Hsing-Kuo Pao), 項天瑞, 謝仁偉
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Publication Year: 2014
Graduation Academic Year: 102 (ROC calendar)
Language: Chinese
Pages: 70
Chinese Keywords: 小型人形機器人, 動作轉換
English Keywords: Small-sized humanoid robots, Motion translation
Views: 209; Downloads: 2
Technology advances rapidly: robotics is highly developed, and humanoid robots in particular are maturing. Among them, small-sized humanoid robots serve educational purposes while also offering entertainment value, as the wide variety of small-sized humanoids on the market shows. Compared with medium- and large-sized humanoid robots, however, their designs provide fewer degrees of freedom. How to make such robots, whose limbs lack sufficient degrees of freedom, perform motions as lively, smooth, and stable as a human's is a problem worth investigating.
    This study addresses whole-body translation of human motion to small-sized humanoid robots, covering the implementation of the upper body and waist as well as lower-body balance. In the upper body, the shoulder joint drives the elbow and wrist; its insufficient degrees of freedom cause the shoulder segment to point in a direction different from the source motion, directly shifting the elbow and wrist positions, so that the resulting posture deviates considerably from the source posture. The arm translation process in this study reduces the distance errors at the elbow and wrist, preserving a measurable degree of similarity. Waist motion is a key element of body balance, yet the robot lacks this joint; its absence not only affects overall balance but also leaves the robot's upper body looking stiff. This study implements a waist rotation angle so that the robot can perform more everyday motions, such as bending over, squatting, and shooting a basketball free throw, faithfully reproducing the human appearance; this result differs from other studies that only reproduce simple upper-body arm-waving. The insufficient degrees of freedom at the hip directly affect the knee's position and posture, which in turn affects the robot's balance; this study corrects that position, greatly reducing the positional error relative to the source motion. As for balance at the foot sole, the human skeleton moves the ankle and toe joints together, whereas the robot has motors only at the ankle; this structural difference means that a balanced human motion does not guarantee balance in the robot's performance. This study therefore proposes a solution that not only considers the interaction between the foot sole and the ground angle but also treats balance as a whole.
    Finally, this study validates the translation with several everyday motions performed on the robot, using both numerical evaluation and assessment by human observers. The numerical results show that the translated motions keep relatively low error with respect to the source motions, and the observers' assessments confirm that the processed motions convey the content of the source motions more effectively.
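The foot-sole balance idea above can be illustrated with a toy planar model. The abstract only states that the ankle must compensate for the robot's missing toe joint and that the sole-to-ground angle matters; the joint names and the zero-sum compensation rule below are illustrative assumptions, not the thesis' actual formulation:

```python
import math

# Hypothetical sketch: a human ankle and toes act together, but the robot has
# only ankle motors, so translated leg angles must be corrected to keep the
# sole flat on the ground. In a planar (sagittal) leg chain the sole stays
# parallel to the ground when the pitch angles along the chain sum to zero.

def ankle_pitch_for_flat_sole(hip_pitch, knee_pitch):
    """Ankle pitch (radians) that cancels hip+knee rotation so the foot
    sole stays parallel to the ground in the sagittal plane."""
    return -(hip_pitch + knee_pitch)

def sole_ground_angle(hip_pitch, knee_pitch, ankle_pitch):
    """Angle between the foot sole and the ground for a planar leg chain."""
    return hip_pitch + knee_pitch + ankle_pitch

# Example: a squat-like pose taken from a source motion (angles assumed).
hip, knee = math.radians(-40), math.radians(70)
ankle = ankle_pitch_for_flat_sole(hip, knee)
```

Under this toy model, whatever hip and knee angles the translation produces, recomputing the ankle pitch this way keeps the sole flat; the thesis additionally treats balance holistically rather than joint by joint.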


    Humanoid robots, especially small-sized ones, are considered mature products these days. Small-sized humanoids are adequate for teaching and entertainment purposes. However, small-sized humanoids usually have fewer degrees of freedom (DOF) than life-sized humanoids, not to mention humans. Thus, making small-sized humanoids perform human-like, smooth motions with their insufficient DOF becomes an issue.
    This study focuses on translating human motions into humanoid commands, including motion-similarity improvement and lower-body balancing. There are three parts in this study. For arm motions, naive translation usually generates different wrist positions due to the insufficient number of DOF and the movable-range limit of each motor. The first part discusses a similarity measure based on the positions of the elbow and wrist, which lets us fine-tune the generated arm motions. The second part explains how the absence of a waist degree of freedom may degrade motion similarity, and how to compensate for this on small-sized humanoids when waist rotation is not available. Finally, the third part introduces the differences between the structure of human legs and humanoid legs. A few tricks are then developed to adjust the hip and heel joints to keep the humanoid robot from falling when performing a translated motion.
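As a rough illustration of the first part, the elbow/wrist similarity measure can be sketched as a distance error minimized over the motor's reachable angles. The planar two-link arm, unit segment lengths, and grid search below are assumptions for illustration; the thesis' actual kinematic model and tuning procedure are not given in this abstract:

```python
import math

def arm_fk(shoulder_pitch, elbow_pitch):
    """Planar two-link forward kinematics with unit-length segments:
    returns (elbow_xy, wrist_xy)."""
    ex, ey = math.cos(shoulder_pitch), math.sin(shoulder_pitch)
    wx = ex + math.cos(shoulder_pitch + elbow_pitch)
    wy = ey + math.sin(shoulder_pitch + elbow_pitch)
    return (ex, ey), (wx, wy)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def similarity_error(angles, src_elbow, src_wrist):
    """Sum of distance errors at the elbow and wrist (lower is better)."""
    elbow, wrist = arm_fk(*angles)
    return dist(elbow, src_elbow) + dist(wrist, src_wrist)

def fine_tune(src_elbow, src_wrist, step_deg=5):
    """Grid search over an assumed reachable range (+/-90 deg, 5-deg motor
    steps) for the angle pair minimizing the elbow+wrist error."""
    grid = [math.radians(d) for d in range(-90, 91, step_deg)]
    return min(((s, e) for s in grid for e in grid),
               key=lambda a: similarity_error(a, src_elbow, src_wrist))

# A source posture the coarse 5-degree motors cannot hit exactly.
src_elbow, src_wrist = arm_fk(math.radians(37), math.radians(22))
angles = fine_tune(src_elbow, src_wrist)
err = similarity_error(angles, src_elbow, src_wrist)
```

The point of the measure is that it scores whole poses by end-point positions rather than by raw joint angles, so a robot with fewer DOF can still pick the command that looks most like the source motion.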
    To verify the above-mentioned techniques, we adopted the CMU motion capture database as input and performed the improved translation on nine selected motions. These motions were then played back in MotionBuilder and on a low-end humanoid, and observed by 13 people. The feedback shows that the fine-tuned motions obtain higher scores than the directly translated versions for all nine motions.

    Abstract (Chinese)
    Abstract (English)
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1 Preface
      1.2 Motivation and Objectives
      1.3 Research Results
      1.4 Thesis Organization
    Chapter 2  Literature Review
      2.1 Humanoid Robots
      2.2 Motion Sources for Robots
        2.2.1 Manual Editing
          2.2.1.1 GUI Parameter Setting
          2.2.1.2 Drag-and-Drop Motion Editing
        2.2.2 Motion Capture Data
          2.2.2.1 Optical High-Speed Cameras
          2.2.2.2 Kinect
          2.2.2.3 Other Capture Devices
      2.3 Human-to-Robot Joint Correspondence
        2.3.1 Joint Correspondence
        2.3.2 Angle Correspondence
    Chapter 3  Robot Motion Translation Method
      3.1 Problems in Motion Translation
        3.1.1 Distortion in Arm-Segment Translation
        3.1.2 Inexpressible Waist Joint Angles
        3.1.3 Distortion in Thigh-Segment Translation
        3.1.4 Basic Balanced Standing
      3.2 Arm Translation
        3.2.1 Arm Joint Structure
        3.2.2 Translation Method
      3.3 Waist Translation
        3.3.1 Waist Joint Structure
        3.3.2 Translation Method
      3.4 Hip Translation
        3.4.1 Hip Joint Structure
        3.4.2 Translation Method
      3.5 Basic Balance Translation
        3.5.1 Foot-Sole Joint Structure
        3.5.2 Translation Method
      3.6 Root Joint Translation
        3.6.1 Root Joint Structure
        3.6.2 Translation Method
      3.7 Overall Translation Pipeline
    Chapter 4  Translation Verification
      4.1 Verification Motions
      4.2 Translation Results
      4.3 Numerical Verification Results
        4.3.1 Arm
        4.3.2 Waist
        4.3.3 Hip
        4.3.4 Basic Balance
      4.4 Observer Evaluation
        4.4.1 Evaluation Design
        4.4.2 Evaluation Results
    Chapter 5  Conclusions and Future Work
      5.1 Analysis and Conclusions
      5.2 Future Work
    References

