
Graduate Student: Ming-Long Deng (鄧明隆)
Thesis Title: Study of Night Time Mobile Robot Navigations Based on ORB-SLAM3 (基於ORB-SLAM3之夜間機器人導航研究)
Advisor: Chung-Hsien Kuo (郭重顯)
Committee Members: 翁慶昌, 蘇順豐, 劉益宏, 郭重顯
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2021
Academic Year of Graduation: 109
Language: English
Number of Pages: 83
Keywords: Visual odometry, Visual SLAM (VSLAM), Autonomous navigation, Controllable infrared illumination array, Vibration tag
Abstract: Vision is a highly intuitive sense, and for this reason visual odometry (VO) has become an important development topic. Compared with higher-cost sensors such as lidar, cameras are inexpensive and easy to deploy, so vision-based algorithms have become a key subject of research. In night-time environments, however, low illumination severely degrades the image, especially when motion blur occurs. To mitigate these effects, this study proposes an automatically adjusted infrared (IR) headlight illumination method and a vibration-tag method that uses an accelerometer to detect ground vibration. The vibration information gathered along the traveled route is added to the Atlas database of ORB-SLAM3 and provides the basis for adjusting the robot's linear velocity during navigation. We also propose a fuzzy-based adaptive threshold mechanism that adjusts the number of FAST corners detected by the ORB feature-extraction algorithm, improving the positioning performance of the visual odometry. In indoor and outdoor environments on the NTUST campus, a differential-wheel mobile robot built in our laboratory served as the experimental platform; it carries a calibrated camera and a calibrated inertial measurement unit. Pure pursuit is used as the tracking method, so the performance of the proposed visual odometry is verified through autonomous motion. In the experimental comparison, integrating the proposed visual navigation methods enables the mobile robot to navigate autonomously to an outdoor target location at night.
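To make the fuzzy threshold-adaptation idea concrete, here is a minimal sketch of a Sugeno-style fuzzy rule base that nudges the FAST threshold of an OpenCV ORB extractor toward a target per-frame keypoint count. It is not the thesis implementation: the target count, membership functions, step sizes, and threshold bounds are illustrative assumptions. The fuzzy-based IR headlight controller in the thesis could be sketched in the same pattern, with mean image brightness as the input and LED duty cycle as the output.

    import cv2
    import numpy as np

    TARGET = 1000  # desired keypoints per frame (illustrative assumption)

    def tri(x, a, b, c):
        # Triangular membership function peaking at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_threshold_step(n_features, threshold):
        # Normalized feature-count error, clamped to [-1, 1].
        e = max(-1.0, min(1.0, (n_features - TARGET) / TARGET))
        # Antecedents: too few / about right / too many features.
        mu_low, mu_ok, mu_high = tri(e, -2, -1, 0), tri(e, -0.5, 0, 0.5), tri(e, 0, 1, 2)
        # Singleton consequents: lower / keep / raise the FAST threshold,
        # combined by weighted-average defuzzification.
        den = mu_low + mu_ok + mu_high
        delta = (mu_low * -4.0 + mu_ok * 0.0 + mu_high * 4.0) / den if den else 0.0
        return int(np.clip(threshold + delta, 5, 60))

    orb = cv2.ORB_create(nfeatures=2000, fastThreshold=20)
    cap = cv2.VideoCapture(0)  # any camera or video file
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        keypoints = orb.detect(gray, None)
        # The threshold computed now takes effect on the next frame.
        orb.setFastThreshold(fuzzy_threshold_step(len(keypoints), orb.getFastThreshold()))

Lowering the FAST threshold admits weaker corners, which is what keeps the feature count up in dark frames; raising it suppresses spurious corners in well-lit ones.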
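The vibration tag can likewise be sketched in simplified form: high-pass-filter the vertical accelerometer channel to remove gravity, take the RMS over a short window, quantize it into a tag level stored with the traveled route, and later scale the commanded linear velocity by that level. The sample rate, filter cutoff, level boundaries, and speed factors below are assumptions, not values from the thesis.

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 100.0  # IMU sample rate in Hz (assumed)
    # Second-order high-pass at 1 Hz to strip gravity and slow drift.
    b, a = butter(2, 1.0 / (FS / 2.0), btype="highpass")

    def vibration_tag(az_window):
        # Quantize the RMS of filtered vertical acceleration into a tag level.
        rms = np.sqrt(np.mean(filtfilt(b, a, az_window) ** 2))
        if rms < 0.3:
            return 0  # smooth ground
        if rms < 1.0:
            return 1  # moderate vibration
        return 2      # rough ground

    SPEED_SCALE = {0: 1.0, 1: 0.6, 2: 0.3}  # linear-velocity factor per tag

    def commanded_speed(v_nominal, tag):
        return v_nominal * SPEED_SCALE[tag]

    # Example: tag one second of vertical acceleration samples (m/s^2).
    window = np.random.normal(0.0, 0.5, size=100)  # placeholder IMU data
    print(commanded_speed(0.5, vibration_tag(window)))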
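Finally, the standard pure-pursuit law used for tracking computes the curvature kappa = 2 * y_l / L_d^2, where y_l is the lateral offset of a look-ahead point at distance L_d in the robot frame, and commands the angular velocity omega = v * kappa. A minimal sketch for a differential-drive robot follows; the thesis's improved pure-pursuit variant is not reproduced here.

    import math

    def pure_pursuit_step(pose, path, lookahead, v):
        # pose = (x, y, yaw); path = ordered list of (x, y) waypoints.
        x, y, yaw = pose
        # Choose the first waypoint at least `lookahead` away; fall back
        # to the final waypoint near the end of the path.
        goal = path[-1]
        for px, py in path:
            if math.hypot(px - x, py - y) >= lookahead:
                goal = (px, py)
                break
        dx, dy = goal[0] - x, goal[1] - y
        ld = math.hypot(dx, dy)  # actual distance to the goal point
        if ld < 1e-6:
            return v, 0.0        # already at the goal; go straight
        yl = -math.sin(yaw) * dx + math.cos(yaw) * dy  # lateral offset, robot frame
        kappa = 2.0 * yl / (ld ** 2)                   # pure-pursuit curvature
        return v, v * kappa                            # (linear, angular) command

    # Example: one control step along a gently curving path.
    print(pure_pursuit_step((0.0, 0.0, 0.0), [(1.0, 0.1), (2.0, 0.2)], 0.8, 0.5))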

Table of Contents:
Advisor Recommendation Letter
Oral Examination Committee Certification
Acknowledgements
Abstract (in Chinese)
Abstract
List of Tables
List of Figures
Nomenclature
Chapter 1 Introduction
  1.1 Motivation and Purpose
  1.2 Literature Review
    1.2.1 Visual Simultaneous Localization and Mapping (VSLAM)
    1.2.2 Controller Design
    1.2.3 Inertial Measurement Unit Signal Processing
  1.3 Organization of the Thesis
Chapter 2 System Architecture and Operation
  2.1 System Hardware Architecture
    2.1.1 Vision Device
    2.1.2 Multi-Sensor
    2.1.3 Core Decision
    2.1.4 IR Headlight Array
    2.1.5 Vehicle Control
  2.2 Software and Operation Flowchart
    2.2.1 First-generation Robot Operating System (ROS)
    2.2.2 Operation Flowchart
Chapter 3 Visual Simultaneous Localization and Mapping System Design
  3.1 ORB-SLAM3 Introduction
    3.1.1 ORB-SLAM3 System Architecture
    3.1.2 Oriented FAST and Rotated BRIEF (ORB)
    3.1.3 Bundle Adjustment (BA)
  3.2 Proposed VO Architecture
  3.3 Adaptive Feature Extraction using Fuzzy Control
  3.4 Fuzzy-based IR Headlight Illumination Control
  3.5 Reducing Motion Blur with Vibration Tags
Chapter 4 Implementation of Autonomous Navigation
  4.1 Tracking Algorithm
    4.1.1 Normal Pure Pursuit
    4.1.2 Improved Pure Pursuit
  4.2 Vehicle Controller Design
Chapter 5 Experiments and Results
  5.1 Map Construction Completeness Experiment
  5.2 Experiment of Fuzzy-based Feature Threshold
  5.3 Experiment of Fuzzy-based Headlights Control
  5.4 Experiment of Vibration Tag Performance
  5.5 Experiment of Cross-building Navigation
Chapter 6 Conclusions and Future Works
References


Full-Text Release Date: 2024/07/10 (campus network)
Full Text: not authorized for public release (off-campus network)
Full Text: not authorized for public release (National Central Library: Taiwan NDLTD system)