
Graduate Student: Yu-Sheng Lai (賴宥升)
Thesis Title: Development of Autonomous Mobile Robot with Barrier Free Elevator Accessing Functions (具備使用無障礙電梯功能之自主移動機器人開發)
Advisor: Chung-Hsien Kuo (郭重顯)
Committee Members: Chin-Sheng Chen (陳金聖), Sheng-Luen Chung (鍾聖倫), Meng-Kun Liu (劉孟昆)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2021
Academic Year of Graduation: 109
Language: English
Number of Pages: 81
Chinese Keywords: image-based visual servo control, differential kinematics, SLAM navigation, deep learning
English Keywords: Image-based visual servo, Differential kinematics, SLAM navigation, Deep learning

Chinese Abstract:
Owing to the rapid development of cross-floor navigation for autonomous mobile robots (AMRs), this study aims to extend the mobility of the robot in buildings equipped with barrier-free elevators. With this elevator-accessing capability, the AMR can provide services on different floors. Panel button pressing (PBP) is the key technology for an AMR to access a barrier-free elevator. In general, PBP is a visual servo tracking task that consists of image-based button detection and eye-in-hand (EIH) visual servo tracking with a mobile manipulator. The cross-floor PBP task in this study is divided into three stages: in the first stage, the AMR navigates to the outside of the elevator cabin and presses the up/down button; in the second stage, it navigates into the elevator cabin before the door closes; in the third stage, the AMR presses the target floor button within the extended door-closing time. With the advancement of novel deep learning technologies, button detection can be easily developed with open-source code. In addition, image-based visual servo (IBVS) techniques for PBP tasks are already well developed in the literature. Nevertheless, the challenge of realizing PBP lies not only in completing the whole task but also in keeping the IBVS execution time within the operating-time constraints of the barrier-free elevator specification, especially in stages 2 and 3 of the PBP task. From measurements in a real elevator environment, the minimum time from pressing the up/down button to the elevator door closing is 16 seconds. This scenario occurs when the elevator cabin is standing by on the same floor as the robot. The door-closing extension granted while the AMR passes through the elevator door is 6.5 seconds; therefore, in the worst case, the AMR must press the floor button within 22.5 seconds. To meet this operating-time limitation, this study uses the pose information from LiDAR simultaneous localization and mapping (SLAM) to provide a rough initial pose for the EIH camera. Moreover, we propose an improved IBVS controller that adopts an XY/Z-partitioned IBVS solution combined with an adaptive variable-gain control law to accelerate image tracking. After the preliminary procedure is finished, the IBVS is operated in two modes: IBVS with pure manipulator operation (PMO) and IBVS with combined wheel and manipulator operation (CWMO). Experimental results show that the average PBP execution time is 20.25 seconds in the PMO mode and 19.11 seconds in the CWMO mode. Both control modes complete the PBP task within the worst-case limit of 22.5 seconds.
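For reference, the classic IBVS velocity command and an exponential adaptive-gain schedule take the following general form; this is a standard formulation from the visual-servo literature, and the exact gain law and constants used in the thesis are not stated in the abstract:

\[
\mathbf{e} = \mathbf{s} - \mathbf{s}^{*}, \qquad
\mathbf{v}_{c} = -\lambda(\mathbf{e})\,\hat{\mathbf{L}}_{\mathbf{s}}^{+}\,\mathbf{e}, \qquad
\lambda(\mathbf{e}) = \lambda_{\infty} + \left(\lambda_{0}-\lambda_{\infty}\right)
e^{-\kappa\,\lVert\mathbf{e}\rVert/(\lambda_{0}-\lambda_{\infty})}
\]

where \(\mathbf{s}\) and \(\mathbf{s}^{*}\) are the current and desired image features, \(\hat{\mathbf{L}}_{\mathbf{s}}^{+}\) is the pseudoinverse of the estimated image interaction (Jacobian) matrix, \(\mathbf{v}_{c}\) is the commanded camera velocity, \(\lambda_{0}\) is the gain at zero error, \(\lambda_{\infty}\) is the gain far from the target, and \(\kappa\) tunes the transition. A gain that grows as the error shrinks speeds up the final convergence without demanding large motions at the start of tracking, which is what the time-constrained PBP stages require.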


English Abstract:
Owing to the rapid development of cross-floor navigation for autonomous mobile robots (AMRs), this study aims to extend the mobility of an AMR across different floors through barrier-free elevators. The key technology for accessing a barrier-free elevator is panel button pressing (PBP). The PBP task in our study is divided into three stages: external panel button pressing, elevator cabin accessing, and floor button pressing. With the advancement of novel deep learning technologies, button detection can be easily developed from open-source code. However, the real challenge of the PBP task is to meet the timing constraints of an actual barrier-free elevator in stages 2 and 3. From practical measurements of the barrier-free elevator used in the experiments of this study, the minimum time from pressing the external panel button to door closing is 16 seconds. This scenario occurs when the elevator cabin is standing by on the same floor as the robot. In addition, the door-closing extension granted while the AMR passes through the cabin door is 6.5 seconds. Therefore, in the worst case, the system must complete the second and third stages within 22.5 seconds. To meet this operating-time limitation, this study uses pose information from LiDAR simultaneous localization and mapping (SLAM) to provide a rough initial pose for the eye-in-hand (EIH) camera. Moreover, we propose an improved image-based visual servo (IBVS) controller that adopts an XY/Z-partitioned IBVS solution combined with an adaptive variable-gain control law to accelerate image tracking. When the preliminary procedure is finished, the IBVS is operated in two scenarios: IBVS with pure manipulator operation (PMO) and IBVS with combined wheel and manipulator operation (CWMO). The experimental results show that the average execution time is 20.25 seconds in the PMO mode and 19.11 seconds in the CWMO mode. Both control modes complete the PBP task within the worst-case limit of 22.5 seconds.
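To make the XY/Z-partitioned idea concrete, the following is a minimal Python sketch of a single IBVS step: the image-plane (XY) error drives the lateral camera velocity through the pseudoinverse of the translational block of the point interaction matrix, while a separate image-size cue drives the approach (Z) velocity, each scaled by the adaptive gain. The function names (`adaptive_gain`, `ibvs_step_xy_z`), the numeric constants, and the use of apparent feature area as the Z-axis cue are illustrative assumptions, not the thesis's exact implementation.

```python
import numpy as np

def adaptive_gain(err_norm, lam0=1.2, lam_inf=0.2, slope=1.0):
    """Exponential gain schedule: high gain near convergence, bounded far away.
    (Assumed general form; the thesis's exact law and constants are not given here.)"""
    return lam_inf + (lam0 - lam_inf) * np.exp(-slope * err_norm / (lam0 - lam_inf))

def ibvs_step_xy_z(s, s_star, depth, area, area_star):
    """One step of a simplified XY/Z-partitioned IBVS law (illustrative only).

    s, s_star : (2,) current / desired normalized image coordinates of the button
    depth     : rough depth of the panel, e.g. seeded from the LiDAR SLAM pose
    area      : current / desired apparent feature area, used as the Z-axis cue
    Returns the commanded camera-frame translational velocity (vx, vy, vz).
    """
    e_xy = np.asarray(s, dtype=float) - np.asarray(s_star, dtype=float)
    # XY part: translational 2x2 block of the point interaction matrix.
    L_xy = np.array([[-1.0 / depth, 0.0],
                     [0.0, -1.0 / depth]])
    v_xy = -adaptive_gain(np.linalg.norm(e_xy)) * np.linalg.pinv(L_xy) @ e_xy
    # Z part: log of the area ratio is positive when the camera is too far,
    # so a positive command approaches the panel (decoupled from the XY motion).
    e_z = np.log(area_star / area)
    v_z = adaptive_gain(abs(e_z)) * e_z
    return np.array([v_xy[0], v_xy[1], v_z])

# Example: button slightly up-left of the image center, camera roughly 0.4 m away.
print(ibvs_step_xy_z(s=[-0.05, -0.03], s_star=[0.0, 0.0],
                     depth=0.4, area=900.0, area_star=1600.0))
```

Decoupling the Z axis in this way keeps the approach motion toward the button panel from being disturbed by the lateral alignment, which is one motivation the partitioned-IBVS literature gives for the scheme.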

Table of Contents
Recommendation Letter from the Advisor
Qualification Form by the Oral Defense Committee
Acknowledgements
Abstract (Chinese)
Abstract (English)
List of Tables
List of Figures
Nomenclature
Chapter 1 Introduction
  1.1 Background and Motivation
  1.2 System Introduction
  1.3 Chapter Review
  1.4 Literature Review
    1.4.1 Related Papers on AMR with Elevator Accessing Function
    1.4.2 Related Papers on Improving IBVS System Structure
    1.4.3 Related Papers on Improving IBVS Gain Control
Chapter 2 System Architecture and Design
  2.1 System Architecture
  2.2 Hardware Architecture
    2.2.1 AMR Mobile Platform
    2.2.2 4-DOF SCARA Manipulator
  2.3 Robot Operating System
  2.4 Simultaneous Localization and Mapping Technology
  2.5 System Operation Process and Design
Chapter 3 Visual Servo Control System Design
  3.1 Introduction of Visual Servo
  3.2 Image-based Visual Servo Control
    3.2.1 Image Jacobian Matrix
    3.2.2 Jacobian Matrix of Robots
  3.3 Classic IBVS Controller Design
    3.3.1 Coordinate Conversion and Jacobian Matrix Combination
    3.3.2 Controller Design
    3.3.3 Pseudoinverse Matrix Solution and Singularity Analysis
  3.4 Improved IBVS Controller Design
    3.4.1 XY/Z-Partitioned IBVS Scheme
    3.4.2 Adaptive Gain Control Law
    3.4.3 Improved IBVS Controller Design
Chapter 4 Image Feature Extraction and Recognition
  4.1 You Only Look Once (YOLO)
  4.2 Two-layer YOLO Recognition Network Architecture
  4.3 Elevator Button Database Collection
  4.4 Image Extraction and Image Enhancement
Chapter 5 Experimental Results and Analysis
  5.1 Comparison of Traditional and Improved IBVS Controller
  5.2 Cross-Floor PBP Experiment
    5.2.1 Stage 1: External Panel Button Pressing Stage
    5.2.2 Stage 2: AMR Entering Stage
    5.2.3 Stage 3: Floor Button Pressing Stage
    5.2.4 System Stability Test
Chapter 6 Conclusions and Future Work
  6.1 Conclusion
  6.2 Future Work
References


Full-Text Release Date: 2024/07/11 (campus network)
Full-Text Release Date: full text not authorized for public release (off-campus network)
Full-Text Release Date: full text not authorized for public release (National Central Library: Taiwan NDLTD system)