
Graduate Student: 陳威誠 (Wei-Cheng Chen)
Thesis Title: 基於工業型機器手臂之人機協同搬運研究 (Investigation of Human-Robot Cooperative Handling Using an Industrial Robot)
Advisor: 陳亮光 (Liang-Kuang Chen)
Committee Members: 林紀穎 (Chi-Ying Lin), 張以全 (I-Tsyuen Chang)
Degree: Master
Department: College of Engineering, Department of Mechanical Engineering
Year of Publication: 2022
Graduation Academic Year: 110
Language: Chinese
Pages: 78
Keywords (Chinese): 工業型機器手臂、人機互動、順應性系統
Keywords (English): industrial robotic arm, human-robot interaction, compliance system
Access: 204 views, 17 downloads

With staffing shortages in the healthcare system, automated equipment is increasingly being adopted to reduce the burden on medical personnel. This thesis therefore uses an industrial robotic arm as an assistive device to perform human-robot cooperative handling of patients. For safety reasons, this study did not conduct handling experiments on real people; instead, a 1:2 scale model was used for experimental simulation.
In this thesis, three different sensors are used to measure the postures of the operator and the patient, and an algorithm is developed so that, under slight changes in the operator's handling behavior, the robotic arm can infer the human's intention and adapt and adjust its motion accordingly. This keeps the posture of the person being carried (the patient) within a small angular error, achieving a human-robot cooperative handling operation that is both safe and highly responsive in real time.
Experimental results show that this control method keeps the patient's posture within 10 degrees, and that this small, stable angular error allows the handling speed to be varied by at least 20%.
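The control behavior summarized above (keep the carried load's tilt small while letting the operator's motion modulate handling speed) can be sketched as a simple proportional loop. This is an illustrative sketch only: the function name, gains, and tolerance below are assumptions, not the controller actually developed in the thesis.

```python
# Illustrative sketch (not the thesis's actual controller): a proportional
# correction steers the load back toward level, and the commanded speed is
# scaled down as the measured tilt approaches a safety tolerance.

def posture_speed_controller(tilt_deg, operator_speed,
                             tilt_limit_deg=10.0, kp=0.1):
    """Return (orientation_correction_deg, commanded_speed).

    tilt_deg: measured tilt of the carried load (e.g. from an IMU), in degrees
    operator_speed: handling speed implied by the operator's motion, in m/s
    tilt_limit_deg, kp: assumed safety tolerance and proportional gain
    """
    # Proportional correction toward zero tilt (level posture).
    correction = -kp * tilt_deg
    # Linearly scale speed down to zero as tilt approaches the limit.
    scale = max(0.0, 1.0 - abs(tilt_deg) / tilt_limit_deg)
    return correction, operator_speed * scale
```

For example, at 2 degrees of tilt and an operator speed of 0.5 m/s, this sketch returns a -0.2 degree correction and a commanded speed of 0.4 m/s; at or beyond the 10-degree tolerance, the commanded speed drops to zero.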

Abstract (Chinese); Abstract (English); Contents; List of Figures; List of Tables
Chapter 1: Introduction
  1.1 Preface and Research Goals
  1.2 Literature Review
    1.2.1 EGM Module Applications
    1.2.2 Kinect Visual Imaging
    1.2.3 Collaborative Robots
  1.3 Research Purpose
  1.4 Thesis Structure
Chapter 2: Hardware Architecture for Cooperative Handling
  2.1 Robotic Arm Background and Communication Equipment
    2.1.1 Real-Time Status Monitoring Communication for the ABB Robot
  2.2 Externally Guided Motion (EGM)
    2.2.1 DeviceNet, the Communication Method for EGM
  2.3 Kinect Image Sensor
    2.3.1 Kinect Communication
    2.3.2 Kinect Error Test Experiment
  2.4 IMU Inertial Sensor
  2.5 Force Sensor
    2.5.1 Force Sensor Communication
  2.6 Human-Machine Interface
Chapter 3: EGM Model Construction
  3.1 EGM Pre-Configuration
  3.2 EGM Response Time
  3.3 EGM Response Gain and Saturation
  3.4 Construction of the EGM Model
    3.4.1 Empirical Model
    3.4.2 Matlab System Identification
Chapter 4: Cooperative Handling Controller Design
  4.1 Scenario Design
    4.1.1 Pre-Configuration
    4.1.2 Subject Scenario Design
  4.2 Controller Design
    4.2.1 Posture Control
    4.2.2 Posture-Velocity Integration
  4.3 EGM Inverse Model
Chapter 5: Experiment Design and Results
  5.1 Experiment Planning and Procedure
  5.2 Experimental Results and Discussion
    5.2.1 Scenario 1 Results
    5.2.2 Scenario 2 Results
    5.2.3 Scenario 3 Results
  5.3 Results and Discussion
Chapter 6: Conclusion and Future Work
  6.1 Research Summary
  6.2 Future Work
References; Appendix A; Appendix B; Appendix C; Appendix D

[1] Stadelmann, L., Sandy, T., Thoma, A. and Buchli, J. (2019). End-effector pose correction for versatile large-scale multi-robotic systems. IEEE Robotics and Automation Letters, 4(2), 546-553.
[2] Gao, J. (2016). Industrial robot motion control for joint tracking in laser welding.
[3] Chen, S. and Wen, J. T. (2019). Industrial Robot Trajectory Tracking Using Multi-Layer Neural Networks Trained by Iterative Learning Control. arXiv preprint arXiv:1903.00082.
[4] Zhang, Q., Yang, S., Liu, H., Xie, C., Cao, Y. and Wang, Y. (2018). Real-time implementation of a joint tracking system in robotic laser welding based on optical camera. Transactions on Intelligent Welding Manufacturing, pp. 99-111. Springer, Singapore.
[5] Karlsson, M., Bagge Carlson, F., De Backer, J., Holmstrand, M., Robertsson, A. and Johansson, R. (2016). Robotic seam tracking for friction stir welding under large contact forces. 7th Swedish Production Symposium (SPS), pp. 25-27.
[6] Mao, Y., Lu, Q. and Xu, Q. (2018). Visual Servoing Control Based on EGM Interface of an ABB Robot. 2018 IEEE Chinese Automation Congress (CAC), pp. 3260-3264.
[7] Mæhre, Ø. (2016). Following Moving Objects Using Externally Guided Motion (EGM), Master's thesis, University of Stavanger, Norway.
[8] Vinh, T. Q. and Tri, N. T. (2015). Hand gesture recognition based on depth image using kinect sensor. 2015 IEEE 2nd National Foundation for Science and Technology Development Conference on Information and Computer Science (NICS), pp. 34-39.
[9] Liu, Y., Dong, M., Bi, S., Gao, D., Jing, Y. and Li, L. (2016). Gesture recognition based on Kinect. 2016 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), pp. 343-347.
[10] Ding, I. J., Chang, C. W. and He, C. J. (2014). A kinect-based gesture command control method for human action imitations of humanoid robots. 2014 IEEE International Conference on Fuzzy Theory and Its Applications, pp. 208-211.
[11] Torres, S. H. M. and Kern, M. J. (2017). 7 DOF industrial robot controlled by hand gestures using Microsoft Kinect v2. 2017 IEEE 3rd Colombian Conference on Automatic Control (CCAC), pp. 1-6.
[12] Ghonge, E. P. and Kulkarni, M. N. (2017). Gesture based control of IRB1520ID using Microsoft's Kinect. 2017 IEEE 2nd International Conference on Communication and Electronics Systems (ICCES), pp. 355-358.
[13] Tian, Y., Meng, X., Tao, D., Liu, D. and Feng, C. (2015). Upper limb motion tracking with the integration of IMU and Kinect. Neurocomputing, 159, 207-218.
[14] Endsley, M. R. and Kiris, E. O. (1995). The out-of-the-loop performance problem and level of control in automation. Human factors, 37(2), 381-394.
[15] Fong, T., Thorpe, C. and Baur, C. (2001). Collaboration, dialogue and human-robot interaction. Proceedings of the 10th International Symposium of Robotics Research.
[16] Scholtz, J. (2003). Theory and evaluation of human robot interactions. Proceedings of the 36th Annual Hawaii International Conference on System Sciences.
[17] Lee, K. Y., Lee, S. Y., Choi, J. H., Lee, S. H. and Han, C. S. (2006). The application of the human-robot cooperative system for construction robot manipulating and installing heavy materials. 2006 IEEE SICE-ICASE International Joint Conference, pp. 4798-4802.
[18] Rahman, S. M., Ikeura, R., Nobe, M. and Sawai, H. (2010). Design guidelines for industrial power assist robots for lifting heavy objects based on human's weight perception for better HRI. 2010 IEEE 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 121-122.
[19] Saida, M., Medina, J. R. and Hirche, S. (2012). Adaptive attitude design with risk-sensitive optimal feedback control in physical human-robot interaction. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, pp. 955-961.
[20] Yang, C., Ganesh, G., Haddadin, S., Parusel, S., Albu-Schaeffer, A. and Burdet, E. (2011). Human-like adaptation of force and impedance in stable and unstable interactions. IEEE transactions on robotics, 27(5), 918-930.
[21] Erden, M. S. and Billard, A. (2014). End-point impedance measurements across dominant and nondominant hands and robotic assistance with directional damping. IEEE transactions on cybernetics, 45(6), 1146-1157.
[22] Erden, M. S. and Billard, A. (2015). Robotic assistance by impedance compensation for hand movements while manual welding. IEEE transactions on cybernetics, 46(11), 2459-2472.
[23] Sebanz, N., Bekkering, H. and Knoblich, G. (2006). Joint action: bodies and minds moving together. Trends in cognitive sciences, 10(2), 70-76.
[24] Sakita, K., Ogawara, K., Murakami, S., Kawamura, K. and Ikeuchi, K. (2004). Flexible cooperation between human and robot by interpreting human intention from gaze information. 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)(IEEE Cat. No. 04CH37566), Vol. 1, pp. 846-851.
[25] Carlson, T. and Demiris, Y. (2012). Collaborative control for a robotic wheelchair: evaluation of performance, attention, and workload. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 42(3), 876-888.
[26] ABB (2016). Product manual IRB 1200.
[27] ABB (2018). Application manual DeviceNet Master/Slave.
[28] ICP DAS (2016). I-7565-DNM USB / DeviceNet Master Converter User's Manual.
[29] ABB (2018). Application manual Controller software IRC5.
[30] ROBOTIQ (2016). Robotiq Force Torque Sensor FT 150/300 Instruction Manual.
