
Graduate Student: 李勇緒 (Yong-Syu Li)
Thesis Title: 基於Kinect相機與卡爾曼濾波器之接球移動機器人之設計
(Design a Ball-catching Mobile Robot with Kinect and Kalman Filter)
Advisor: 施慶隆 (Ching-Long Shih)
Committee Members: 施慶隆 (Ching-Long Shih), 黃志良 (Chih-Lyang Hwang), 李文猶 (Wen-Yo Lee)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2021
Academic Year of Graduation: 109 (2020-2021)
Language: Chinese
Number of Pages: 79
Keywords (Chinese): 移動機器人, Kinect相機, 卡爾曼濾波器, 多項式擬合
Keywords (English): Mobile Robot, Kinect Camera, Kalman Filter, Polynomial Fitting
    This thesis applies a Kalman filter, polynomial fitting, and a Kinect camera to realize a ball-catching robot. The images provided by the Kinect camera are used to detect and localize both the target ball and the robot. The Kinect supplies two types of image information, a color image and a depth image: the color image is used to detect the target ball's pixel position, the depth image is then used to recover the ball's true position in world coordinates, and the robot's actual position is obtained by localizing an ARTag marker. The ball's actual positions are first measured at successive time instants and updated with the Kalman filter, which predicts the positions the ball may reach next; polynomial fitting of these positions yields the equation of the ball's flight path and thus its landing point. Finally, the mobile robot is commanded to move to the landing point to complete the catch.
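    The localization step above (detect the ball in the color image, then use the aligned depth image to recover its world-coordinate position) can be illustrated with a short sketch. This is only a minimal illustration under assumed details, not the thesis's implementation: it assumes the ball is segmented by a fixed HSV color threshold, that the depth image is already registered to the color image (Section 3.2), and it uses placeholder intrinsic parameters FX, FY, CX, CY in place of the calibrated values from Section 3.2.1.

import cv2
import numpy as np

# Placeholder Kinect color-camera intrinsics; real values come from calibration.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def locate_ball(color_bgr, depth_m,
                hsv_low=(5, 120, 120), hsv_high=(25, 255, 255)):
    """Detect the ball's pixel position in the color image, then back-project
    it through the aligned depth image to a 3D point in the camera frame."""
    hsv = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest color blob as the ball and use its enclosing-circle center.
    (u, v), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    z = float(depth_m[int(v), int(u)])      # depth (meters) at the ball pixel
    if z <= 0.0:                            # depth holes read as zero
        return None
    # Pinhole back-projection: pixel (u, v) plus depth z gives (x, y, z).
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])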


    The objective of this thesis is to use a Kalman filter, polynomial fitting, and a Kinect camera to realize a ball-catching robot. The target ball and the robot are detected and located from the images provided by the Kinect camera. The Kinect provides two different types of image information, a color image and a depth image: the color image is used to detect the target ball's pixel position, the depth image is then used to find the ball's real position in world coordinates, and the robot's actual position is located with an ARTag marker. The ball's actual positions are first obtained at different times and then updated by the Kalman filter, which predicts the positions the ball may reach; polynomial fitting then gives the equation of the ball's course and hence the target ball's landing position. Finally, the mobile robot is commanded to move to the landing point to complete the robot's catching action.
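    The prediction pipeline in the abstract (filter the measured ball positions, fit the trajectory with a polynomial, and solve for the landing point) can be sketched as follows. This is a minimal illustration under assumed details rather than the thesis's implementation: each axis is filtered independently by a constant-velocity Kalman filter, the vertical axis is fitted with a quadratic in time, the noise covariances and floor height are arbitrary placeholders, and the RANSAC step of Section 4.2 is omitted.

import numpy as np

def kalman_smooth(meas, dt, q=1e-2, r=1e-2):
    """Constant-velocity Kalman filter over one coordinate axis.
    meas: noisy positions sampled every dt seconds; returns filtered positions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.eye(2)                       # process noise (assumed)
    R = np.array([[r]])                     # measurement noise (assumed)
    x = np.array([[meas[0]], [0.0]])
    P = np.eye(2)
    filtered = []
    for z in meas:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)   # update with the new measurement
        P = (np.eye(2) - K @ H) @ P
        filtered.append(x[0, 0])
    return np.array(filtered)

def predict_landing(t, xs, ys, zs, floor_z=0.0):
    """Fit the filtered trajectory with polynomials in time and return the
    (x, y) point where the fitted height crosses floor_z."""
    px = np.polyfit(t, xs, 1)               # horizontal motion ~ linear in time
    py = np.polyfit(t, ys, 1)
    pz = np.polyfit(t, zs, 2)               # vertical motion ~ parabola (gravity)
    roots = np.roots(pz - np.array([0.0, 0.0, floor_z]))
    real_roots = [rt.real for rt in roots if abs(rt.imag) < 1e-9]
    if not real_roots:
        return None
    t_land = max(real_roots)                # the later crossing of the floor plane
    return np.polyval(px, t_land), np.polyval(py, t_land)

    Calling kalman_smooth on each coordinate sequence and then passing the filtered sequences to predict_landing gives the horizontal target to which the mobile robot would be commanded.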

    Table of Contents:
    Chinese Abstract
    Abstract
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1  Research Motivation and Objectives
      1.2  Literature Review
      1.3  Thesis Outline
    Chapter 2  Mobile Robot System Architecture and Control
      2.1  System Architecture
      2.2  Hardware Overview
      2.3  Stepper Motor Driver
      2.4  Mobile Robot Control
      2.5  Modbus Protocol Communication Format
        2.5.1  Modbus Message Structure
      2.6  Raspberry Pi
      2.7  ROS Robot Operating System
    Chapter 3  Kinect Target Tracking
      3.1  Introduction to Kinect
        3.1.1  Kinect Hardware Specifications
        3.1.2  Software Development Workflow
      3.2  Color and Depth Image Registration
        3.2.1  Camera Intrinsic and Extrinsic Parameters
        3.2.2  Color-to-Depth Image Coordinate Transformation
        3.2.3  Color Image to World Coordinate Transformation
      3.3  Target Trajectory Tracking
        3.3.1  Foreground-Background Separation
        3.3.2  Target Detection
      3.4  Robot Localization
        3.4.1  ARTag Pose Estimation
    Chapter 4  Target Motion Trajectory and Landing Point Prediction
      4.1  Kalman Filter
      4.2  RANSAC Polynomial Fitting
      4.3  Polynomial Fitting
      4.4  Landing Point Prediction
    Chapter 5  Experimental Results and Discussion
      5.1  Camera Image Experiments
        5.1.1  Color-Depth Image Alignment
        5.1.2  Foreground-Background Separation and Target Ball Detection
        5.1.3  World Coordinates of the Target Ball
      5.2  Prediction Model Experiments
        5.2.1  Trajectory Prediction
        5.2.2  Motion Equation
      5.3  Landing Point Prediction
      5.4  Vehicle Localization
    Chapter 6  Conclusions and Suggestions
      6.1  Conclusions
      6.2  Suggestions
    References

