
Author: Ping-Han Liang (梁平翰)
Title: Ceiling-guided Robot Navigation (天花板影像導航之自主移動機器人)
Advisor: Chyi-Yeu Lin (林其禹)
Oral Examination Committee: Chyi-Yeu Lin (林其禹), Shih-Hsuan Chiu (邱士軒), Chung-Hsien Kuo (郭重顯), Po-Ting Lin (林柏廷)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2018
Graduation Academic Year: 106 (ROC calendar)
Language: English
Number of Pages: 53
Chinese Keywords (translated): mobile robot navigation, vision-guided robot, indoor localization system, robot localization, feature matching
English Keywords: Autonomous navigation, Visual navigation of mobile robot, Indoor localization system, Feature based localization, Feature matching
Record Statistics: 289 views, 7 downloads

Chinese Abstract (translated):

    Nowadays, many researchers are devoted to developing indoor localization techniques for autonomous mobile robots. Using different sensors, setups, and algorithms, they have proposed a variety of solutions to meet different requirements. This thesis proposes a ceiling-feature-based navigation system that acquires ceiling images with a 2D camera mounted on top of an autonomous mobile robot, yielding a simplified and general indoor localization method.
    The method obtains ceiling image features through a camera mounted on top of the robot and aimed at the ceiling, from which the robot determines its position. Compared with other localization methods, it requires no complex environment setup and is unaffected by wheel slippage, so no error accumulates even after running for an extended period. For any new environment, the system can be set up quickly after a simple initialization procedure and can then localize and navigate the robot within the designated workspace.
    This thesis mainly uses an image-stitching algorithm to build a panorama of the ceiling and an improved image-matching algorithm to compute the robot's position and orientation on the floor. For ceilings that lack natural features, a laser projection device projects artificial features onto the ceiling, so the robot can still localize and navigate indoors in such environments. Finally, a series of experiments verifies the feasibility of the method.
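
    The thesis's source code is not part of this record; purely as an illustration of the panorama-building step described above, the sketch below stitches a folder of overhead ceiling frames with OpenCV's high-level Stitcher. The directory name, file pattern, and use of SCANS mode are assumptions made for the example, not details taken from the thesis.

# Minimal sketch (assumed OpenCV-based; not the thesis's own code): stitch the
# ceiling frames collected during the initialization run into one panorama that
# later serves as the localization map.
import glob
import cv2

def build_ceiling_panorama(image_dir="ceiling_shots"):  # hypothetical folder name
    paths = sorted(glob.glob(f"{image_dir}/*.jpg"))
    images = [cv2.imread(p) for p in paths]

    # The high-level Stitcher matches features between frames and blends them;
    # SCANS mode assumes a roughly planar scene such as a flat ceiling.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

if __name__ == "__main__":
    cv2.imwrite("ceiling_panorama.jpg", build_ceiling_panorama())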


Abstract:

    Nowadays, many researchers work on indoor localization techniques for autonomous mobile robots. By means of different sensors, setups, and algorithms, they have proposed a variety of solutions to reach diverse goals. Focusing on a simpler setup and a general solution, this thesis proposes a ceiling-feature-based navigation system that uses a 2D camera to capture ceiling images and localize the robot.
    The proposed localization method uses an overhead camera on the robot to capture image features on the ceiling and localize the robot from them. Its advantages are an uncomplicated setup, immunity to wheel slippage (tire skidding), and no accumulated error after running for an extended period. The system can be set up easily and rapidly after a simple initialization process in any environment, and it can then localize and navigate the robot in the required workspace.
    This thesis uses a stitching algorithm to build a panorama of the ceiling and an improved matching algorithm to calculate the robot's position and pose on the floor.
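
    The improved matching algorithm itself is not reproduced in this record. As a minimal sketch only, assuming OpenCV and plain ORB features (cf. reference [23]), the fragment below matches a live ceiling frame against the stitched panorama and fits a rotation-plus-translation transform to recover the frame's pixel position and heading within the panorama; a calibrated camera model would then convert these values to floor coordinates. All names and parameters here are illustrative.

# Locate the current ceiling view inside the panorama by feature matching.
# This is a generic ORB + RANSAC sketch, not the thesis's improved matcher.
import math
import cv2
import numpy as np

def locate_in_panorama(panorama_gray, frame_gray, min_matches=20):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_pano, des_pano = orb.detectAndCompute(panorama_gray, None)
    kp_frame, des_frame = orb.detectAndCompute(frame_gray, None)
    if des_pano is None or des_frame is None:
        return None  # no usable ceiling texture in view

    # Hamming-distance brute-force matching with cross-checking for stability.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_frame, des_pano)
    if len(matches) < min_matches:
        return None

    src = np.float32([kp_frame[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_pano[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Rotation + translation (+ uniform scale) of the frame within the panorama.
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None

    h, w = frame_gray.shape
    cx, cy = M @ np.array([w / 2.0, h / 2.0, 1.0])  # frame centre in panorama pixels
    heading_deg = math.degrees(math.atan2(M[1, 0], M[0, 0]))
    return cx, cy, heading_deg  # map to floor coordinates via the camera model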
    A laser projection device is used in this work to generate artificial landmarks on ceilings that lack features suitable for recognition. With it, the robot can navigate itself in a designated floor space under a featureless ceiling.
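
    The projector's pattern and the detection routine used in the thesis are not given in this record. As a hedged sketch of the general idea, bright projected spots on an otherwise featureless ceiling can be isolated by simple intensity thresholding and their centroids used as artificial landmarks; the threshold and minimum-area values below are placeholders.

# Detect bright projected laser spots in a ceiling image and return their
# centroids, to be used as artificial landmarks (illustrative values only).
import cv2

def detect_laser_spots(ceiling_bgr, brightness_thresh=230, min_area=3.0):
    gray = cv2.cvtColor(ceiling_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_thresh, 255, cv2.THRESH_BINARY)

    # Each connected bright blob above the area threshold counts as one landmark
    # (OpenCV 4.x findContours signature).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    spots = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # discard single-pixel noise
        m = cv2.moments(c)
        spots.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))  # centroid (x, y)
    return spots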
    A number of experiments show the feasibility of the proposed navigation system.

Abstract
Chinese Abstract (中文摘要)
Contents
List of Figures
Chapter 1  Introduction
Chapter 2  Background of Navigation
  2.1  Using Ceiling Landmarks and Infrared Light
  2.2  Using Artificial Landmarks and Natural Features
  2.3  Design and recognition of artificial landmarks
  2.4  Camera Space Particle Filter for Localization
  2.5  Ceiling-Based Visual Positioning
  2.6  Using Lampshade Corners as Landmarks
  2.7  Illumination-Invariant Localization
  2.8  Localization in an Indoor iSpace
  2.9  Robust mapping and localization
  2.10 Localization for Multirobot Formations
  2.11 Efficient Simultaneous Localization and Mapping
  2.12 RGB-D mapping
  2.13 Using multi-hypotheses and a matching algorithm
  2.14 Monocular Vision and Odometry/AHRS Sensors
Chapter 3  Navigation System and Implementation
  3.1  Proposed System
  3.2  Implementation
    3.2.1  Hardware Basics
    3.2.2  Camera calibration
      3.2.2.1  Lens Distortion
      3.2.2.2  Theory of calibration
    3.2.3  Stitching
    3.2.4  Localization
    3.2.5  Pinhole camera model
Chapter 4  Experiments and Results
  4.1  Experimental Setting
  4.2  Stitching and localization result
    4.2.1  Result without laser pattern
    4.2.2  Result with laser pattern
    4.2.3  Precision of localization while moving
Chapter 5  Conclusions and Future work
  5.1  Conclusions
  5.2  Future work
References

[1] Kevin Curran, Eoghan Furey, Tom F. Lunney, Jose Santos, Derek Woods and Aiden J. McCaughey, “An evaluation of indoor location determination technologies,” Journal of Location Based Services, Vol. 5, No. 2, pp. 61-78, June 2011.
[2] Johann Borenstein and Liqiang Feng, “Measurement and correction of systematic odometry errors in mobile robots,” IEEE Transactions on Robotics and Automation, Vol. 12, No. 6, pp. 869-880, 1996.
[3] Agostino Martinelli, “The odometry error of a mobile robot with a synchronous drive system,” IEEE Transactions on Robotics and Automation, Vol. 18, No. 3, pp. 399-405, June 2002.
[4] Soonshin Han, HyungSoo Lim and Jangmyung Lee, “An efficient localization scheme for a differential-driving mobile robot based on RFID system,” IEEE Transactions on Industrial Electronics, Vol. 54, No. 6, pp. 3362-3369, Dec 2007.
[5] Joydeep Biswas and Manuela M. Veloso, “WiFi localization and navigation for autonomous indoor mobile robots,” 2010 IEEE International Conference on Robotics and Automation, May 2010.
[6] Arie Sheinker, Boris Ginzburg, Nizan Salomonski, Lev S. Frumkis and Ben Zion Kaplan, “Localization in 3-D using beacons of low frequency magnetic field,” IEEE Transactions on Instrumentation and Measurement, Vol. 62, No. 12, pp. 3194-3201, 2013.
[7] Hartmut Surmann, Andreas Nüchter and Joachim Hertzberg, “An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments,” Robotics and Autonomous Systems, Vol. 45, No. 3-4, pp. 181-198, Dec 2003.
[8] Joel Vidal and Chyi-Yeu Lin, “Simple and Robust Localization System Using Ceiling Landmarks and Infrared Light,” 12th IEEE International Conference on Control & Automation (ICCA), 2016.
[9] Angga Rusdinar and Sungshin Kim, “Vision-Based Indoor Localization Using Artificial Landmarks and Natural Features on the Ceiling with Optical Flow and a Kalman Filter,” International Journal of Fuzzy Logic and Intelligent Systems, Vol. 13, No. 2, pp. 133-139, June 2013.
[10] Xu Zhong, Yu Zhou and Hanyu Liu, “Design and recognition of artificial landmarks for reliable indoor self-localization of mobile robots,” International Journal of Advanced Robotic Systems, Vol. 14, No. 1, Jan 2017.
[11] Raul Chavez-Romero, Antonio Cardenas, Mauro Maya, Alejandra Sanchez and Davide Piovesan, “Camera Space Particle Filter for the Robust and Precise Indoor Localization of a Wheelchair,” Journal of Sensors, 2016.
[12] De Xu, Liwei Han, Min Tan and You Fu Li, “Ceiling-Based Visual Positioning for an Indoor Mobile Robot With Monocular Vision,” IEEE Transactions on Industrial Electronics, Vol. 56, No. 5, pp. 1617-1628, May 2009.
[13] Xiaohan Chen and Yingmin Jia, “Indoor Localization for Mobile Robots using Lampshade Corners as Landmarks: Visual System Calibration, Feature Extraction and Experiments,” International Journal of Control, Automation, and Systems, Vol. 12, No. 6, pp. 1313-1322, Dec 2014.
[14] Seongsoo Lee, Sukhan Lee and Jason Jeongsuk Yoon, “Illumination-Invariant Localization Based on Upward Looking Scenes for Low-Cost Indoor Robots,” Advanced Robotics, Vol. 26, No. 13, pp. 1443-1469, 2012.
[15] Jorge Rodríguez-Araújo, Juan J. Rodríguez-Andina, José Fariña and Mo-Yuen Chow, “Field-Programmable System-on-Chip for Localization of UGVs in an Indoor iSpace,” IEEE Transactions on Industrial Informatics, Vol. 10, No. 2, pp. 1033-1043, May 2014.
[16] Minkuk Jung and Jae-Bok Song, “Robust mapping and localization in indoor environments,” Intelligent Service Robotics, Vol. 10, No. 1, pp. 55-66, Jan 2017.
[17] Haoyao Chen, Dong Sun, Jie Yang and Jian Chen, “Localization for Multirobot Formations in Indoor Environment,” IEEE/ASME Transactions on Mechatronics, Vol. 15, No. 4, Aug 2010.
[18] Hyukdoo Choi, Dong Yeop Kim, Jae Pil Hwang, Chang-Woo Park and Euntai Kim, “Efficient Simultaneous Localization and Mapping Based on Ceiling-View: Ceiling Boundary Feature Map Approach,” Advanced Robotics, Vol. 26, No. 5-6, pp. 653-671, Apr 2012.
[19] Peter Henry, Michael Krainin, Evan Herbst, Xiaofeng Ren and Dieter Fox, “RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments,” International Journal of Robotics Research, Vol. 31, No. 5, pp. 647-663, Apr 2012.
[20] Simultaneous localization and mapping, https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping
[21] Miguel Pinto, Héber Sobreira, A. Paulo Moreira, Hélio Mendonça and Aníbal Matos, “Self-localisation of indoor mobile robots using multi-hypotheses and a matching algorithm,” Mechatronics, Vol. 23, No. 6, pp. 727-737, Sep 2013.
[22] Kai Wang, Yun-Hui Liu and Luyang Li, “A Simple and Parallel Algorithm for Real-Time Robot Localization by Fusing Monocular Vision and Odometry/AHRS Sensors,” IEEE/ASME Transactions on Mechatronics, Vol. 19, No. 4, Aug 2014.
[23] Ethan Rublee, Vincent Rabaud, Kurt G. Konolige and Gary R. Bradski, “ORB: An efficient alternative to SIFT or SURF,” 2011 IEEE International Conference on Computer Vision, 2011.
[24] Jefri Efendi Mohd Salih, Mohamed Rizon, Sazali Yaacob, Abdul Hamid Adom and Mohd Rozailan Mamat, “Designing Omni-Directional Mobile Robot with Mecanum Wheel,” American Journal of Applied Sciences, Vol. 3, No. 5, pp. 1831-1835, 2006.
[25] Olaf Diegel, Aparna Badve, Glen Bright, Johan Potgieter and Sylvester Tlale, “Improved Mecanum Wheel Design for Omni-directional Robots,” Australasian Conference on Robotics and Automation, Auckland, pp. 27-29, 2002.
[26] Camera calibration with OpenCV, http://docs.opencv.org/2.4/index.html#
[27] Image stitching, https://en.wikipedia.org/wiki/Image_stitching
