
Student: 高銘良 (Ming-Liang Gao)
Thesis title: 應用三維光達里程與地圖建構技術於校園自主無人車導航系統開發
(Applying 3D Lidar Odometry and Mapping Technique for Campus Autonomous Vehicle Navigation System Development)
Advisor: 郭重顯 (Chung-Hsien Kuo)
Committee members: 蕭俊祥 (Jin-Siang Shaw), 蘇順豐 (Shun-Feng Su), 劉孟昆 (Meng-Kun Liu)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Electrical Engineering
Year of publication: 2019
Academic year of graduation: 107 (ROC calendar)
Language: Chinese
Number of pages: 59
Chinese keywords: 自主無人車、同步定位與地圖建構、自動駕駛、路徑規劃
English keywords: autonomous vehicle, SLAM, autopilot, route planning
    The goal of this thesis is to design a campus autonomous vehicle. The work is divided into two parts: the mechanical design of the vehicle and the autopilot system. In the design of the front-wheel steering module, kinematic calculations were used to determine the most suitable parameters with the smallest error, and these served as the basis of the mechanism design. The vehicle adopts a front-wheel-steering, rear-wheel-drive configuration and is equipped with a 3D LiDAR and a 2D laser scanner to perceive the current environment. The autopilot system provides route planning through map information configured in a user interface together with the A* algorithm. For vehicle localization, the LOAM (Lidar Odometry and Mapping) algorithm is applied to achieve real-time localization and map building, while the 2D laser scanner monitors obstacles on the road ahead. Different obstacle-avoidance strategies are adopted for different situations: if there is enough space to pass, the system dynamically plans a temporary avoidance path according to the obstacle position, allowing the vehicle to drive more flexibly on campus. In the end, the concept of a campus bus that departs on schedule and stops at multiple stations is realized. After verification, this campus autonomous vehicle is classified as Level 3 according to the definition of the Society of Automotive Engineers, meaning that under certain conditions it can monitor the road situation and the vehicle can complete part of the driving task.
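
    For reference, the kinematic criterion behind the steering-parameter selection described above is typically the Ackermann condition relating the inner and outer front-wheel steering angles. A minimal formulation is sketched below in LaTeX; the symbols δ_i, δ_o, track width W, and wheelbase L are generic, and the error measure e(δ_i) is an assumed form for illustration, not necessarily the one used in the thesis.

        \cot\delta_o - \cot\delta_i = \frac{W}{L}
        \quad\Rightarrow\quad
        \delta_o^{\mathrm{ideal}}(\delta_i) = \operatorname{arccot}\!\left(\cot\delta_i + \frac{W}{L}\right),
        \qquad
        e(\delta_i) = \delta_o^{\mathrm{actual}}(\delta_i) - \delta_o^{\mathrm{ideal}}(\delta_i).

    Under this reading, evaluating e(δ_i) over the steering range for each candidate linkage dimension combination and choosing the combination with the smallest error is one way to reproduce the parameter-selection step mentioned in the abstract.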


    In this thesis, a campus autonomous vehicle project is proposed. The proposed system is divided into two parts: the autonomous electric vehicle design and the autopilot system. The front-wheel steering and suspension system was designed through a kinematic evaluation of different dimension combinations to minimize the error with respect to the Ackermann steering geometry. The vehicle uses a front-wheel-steering, rear-wheel-drive configuration. To perform autonomous driving, a 3D LiDAR and a 2D LiDAR sensor were used for map creation, localization, and obstacle avoidance. In addition, the autopilot system used the A* algorithm to realize route planning on our campus. Map creation and localization were carried out with LiDAR odometry and mapping (LOAM). Meanwhile, the 2D LiDAR was used to detect obstacles in front of the vehicle so that the autonomous vehicle could properly avoid obstacles and pedestrians by either stopping or changing its driving trajectory according to the obstacle location. This study accomplished a prototype campus bus-route operation system; the autonomous electric vehicle was able to depart on schedule and stop at the desired stops. Finally, the proposed autonomous electric vehicle system corresponds to a Level 3 system as defined by the Society of Automotive Engineers (SAE); that is, the driver may take their hands off the wheel and feet off the pedals in certain specific situations, with a standby driver ready to intervene when needed.
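
    The route-planning step named in both abstracts (A* over map data prepared through the user interface) can be illustrated with a short, self-contained sketch. This is not the author's code; the occupancy-grid representation, the Manhattan heuristic, and the function name astar below are assumptions made purely for illustration.

        import heapq

        def astar(grid, start, goal):
            """Minimal A* on a 2D occupancy grid (0 = free cell, 1 = blocked cell).

            start and goal are (row, col) tuples; returns the cell path or None.
            """
            rows, cols = len(grid), len(grid[0])

            def h(cell):
                # Manhattan distance: admissible for 4-connected, unit-cost moves.
                return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

            open_set = [(h(start), 0, start)]   # entries are (f = g + h, g, cell)
            came_from = {}                      # child cell -> parent cell
            best_g = {start: 0}                 # cheapest known cost to reach each cell

            while open_set:
                _, g, current = heapq.heappop(open_set)
                if current == goal:
                    path = [current]            # walk parent links back to the start
                    while current in came_from:
                        current = came_from[current]
                        path.append(current)
                    return path[::-1]
                if g > best_g.get(current, float("inf")):
                    continue                    # stale queue entry, already superseded
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nxt = (current[0] + dr, current[1] + dc)
                    if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                        ng = g + 1              # uniform cost per grid step
                        if ng < best_g.get(nxt, float("inf")):
                            best_g[nxt] = ng
                            came_from[nxt] = current
                            heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
            return None

        # Example: plan around a small blocked region.
        campus_map = [
            [0, 0, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0],
        ]
        print(astar(campus_map, (0, 0), (2, 3)))

    In the campus system the grid cells would correspond to drivable road segments between bus stops rather than to this toy map, but the search logic is the same.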

    Advisor's Recommendation Letter i
    Oral Examination Committee Approval ii
    Acknowledgements iii
    Abstract (Chinese) iv
    Abstract v
    Table of Contents vi
    List of Figures viii
    List of Tables xi
    Nomenclature xii
    Chapter 1 Introduction 1
    1.1 Research Background and Motivation 1
    1.2 Research Objectives 4
    1.3 Literature Review 5
    1.3.1 Vehicle Localization 5
    1.3.2 Obstacle Detection 8
    1.3.3 Trajectory Tracking 10
    1.4 Thesis Organization 12
    Chapter 2 System Architecture and Methods 13
    2.1 System Organization 13
    2.2 ROS Platform 16
    2.3 Adopting the LOAM Algorithm 17
    2.3.1 Feature Point Extraction 18
    2.3.2 Feature Point Matching 19
    2.3.3 Motion Estimation 21
    2.3.4 Map Updating 23
    2.4 User Scenarios 24
    2.5 System Operation Flow 25
    2.6 User Interface 26
    Chapter 3 Electric Autonomous Vehicle Design 29
    3.1 Front-Wheel Steering Mechanism Design 30
    3.2 Kinematics 32
    3.3 Vehicle Control Architecture 37
    Chapter 4 Vehicle Navigation System 39
    4.1 Path Planning 39
    4.2 Path Tracking 42
    4.3 Obstacle Analysis 44
    4.4 Obstacle Avoidance System 46
    Chapter 5 Experimental Results and Analysis 48
    5.1 Front-Wheel Steering Mechanism Analysis 48
    5.2 Obstacle Avoidance System Experiments 50
    5.3 Autonomous Navigation Localization Repeatability Experiments 52
    5.4 Autonomous Navigation Trajectory Accuracy Experiments 54
    Chapter 6 Conclusions and Future Work 55
    6.1 Conclusions 55
    6.2 Future Research Directions 55
    References 56

    [1] K. Ji, H. Chen, H. Di, J. Gong, G. Xiong, J. Qi and T. Yi, "CPFG-SLAM: a Robust Simultaneous Localization and Mapping based on LIDAR in Off-Road Environment," IEEE Intelligent Vehicles Symposium (IV), pp. 650–655, 2018.
    [2] J. Park, J. Y. Kim, B. C. Kim and S. Kim, "Global Map Generation using LiDAR and Stereo Camera for Initial Positioning of Mobile Robot," International Conference on Information and Communication Technology Robotics (ICT-ROBOT), pp. 1–4, 2018.
    [3] Z. Liu, H. Chen, H. Di, Y. Tao, J. Gong, G. Xiong and J. Qi, "Real-Time 6D Lidar SLAM in Large Scale Natural Terrains for UGV," IEEE Intelligent Vehicles Symposium (IV), pp. 662–667, 2018.
    [4] P. Agrawal, A. Iqbal, B. Russell, M. Hazrati, V. Kashyap and F. Akhbari, "PCE-SLAM: A real-time simultaneous localization and mapping using LiDAR data," IEEE Intelligent Vehicles Symposium (IV), pp. 1752–1757, 2017.
    [5] J. Bao, Y. Gu and S. Kamijo, "Vehicle positioning with the integration of scene understanding and 3D map in urban environment," IEEE Intelligent Vehicles Symposium (IV), pp. 68–73, 2017.
    [6] F. Demim, A. Boucheloukh, A. Nemra, K. Louadj, M. Hamerlain, A. Bazoula and Z. Mehal, "A new adaptive smooth variable structure filter SLAM algorithm for unmanned vehicle," International Conference on Systems and Control (ICSC), pp. 6–13, 2017.
    [7] A. Pfrunder, P. V. K. Borges, A. R. Romero, G. Catt and A. Elfes, "Real-time autonomous ground vehicle navigation in heterogeneous environments using a 3D LiDAR," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2601–2608, 2017.
    [8] E. Javanmardi, M. Javanmardi, Y. Gu and S. Kamijo, "Autonomous vehicle self-localization based on probabilistic planar surface map and multi-channel LiDAR in urban area," IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), pp. 1–8, 2017.
    [9] J. Jeong, Y. Cho and A. Kim, "Road-SLAM: Road marking based SLAM with lane-level accuracy," IEEE Intelligent Vehicles Symposium (IV), pp. 1736–1743, 2017.
    [10] X. Chen, H. Zhang, H. Lu, J. Xiao, Q. Qiu and Y. Li, "Robust SLAM system based on monocular vision and LiDAR for robotic urban search and rescue," IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp. 41–47, 2017.
    [11] D. Cui, J. Xue and N. Zheng, "Real-Time Global Localization of Robotic Cars in Lane Level via Lane Marking Detection and Shape Registration," IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 4, pp. 1039–1050, 2016.
    [12] Y. Gu, Y. Wada, L. Hsu and S. Kamijo, "Vehicle self-localization in urban canyon using 3D map based GPS positioning and vehicle sensors," IEEE International Conference on Connected Vehicles and Expo, pp. 792–798, 2014.
    [13] Y. Tsai, K. Chen, Y. Chen and J. Cheng, "Accurate and fast obstacle detection method for automotive applications based on stereo vision," International Symposium on VLSI Design, Automation and Test (VLSI-DAT), pp. 1–4, 2018.
    [14] N. Morales, J. Toledo, L. Acosta and J. S. Medina, "A Combined Voxel and Particle Filter-Based Approach for Fast Obstacle Detection and Tracking in Automotive Applications," IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 7, pp. 1824–1834, 2017.
    [15] V. D. Nguyen, H. V. Nguyen, D. T. Tran, S. J. Lee and J. W. Jeon, "Learning Framework for Robust Obstacle Detection, Recognition, and Tracking," IEEE Transactions on Intelligent Transportation Systems, vol. 18, no. 7, pp. 1633–1646, 2017.
    [16] H. Mengwen, E. Takeuchi, Y. Ninomiya and S. Kato, "Robust virtual scan for obstacle detection in urban environments," IEEE Intelligent Vehicles Symposium (IV), pp. 683–690, 2016.
    [17] J. H. Aceituno, R. Arnay, J. Toledo and L. Acosta, "Using Kinect on an Autonomous Vehicle for Outdoors Obstacle Detection," IEEE Sensors Journal, vol. 16, no. 10, pp. 3603–3610, 2016.
    [18] Y. Huang and S. Liu, "Multi-class obstacle detection and classification using stereovision and improved active contour models," IET Intelligent Transport Systems, vol. 10, no. 3, pp. 197–205, 2016.
    [19] A. Alam and Z. Jaffery, "A computer vision system for detection and avoidance for automotive vehicles," Annual IEEE India Conference (INDICON), pp. 1–6, 2015.
    [20] D. Cong, C. Liang, Q. Gong, X. Yang and J. Liu, "Path Planning and Following of Omnidirectional Mobile Robot Based on B-spline," Chinese Control and Decision Conference (CCDC), pp. 4931–4936, 2018.
    [21] A. M. Sakti, A. I. Cahyadi and I. Ardiyanto, "Path Planning and Path Following Using Arrival Time Field for Nonholonomic Mobile Robot," International Conference on Advanced Computing and Applications (ACOMP), pp. 143–148, 2017.
    [22] E. Elsheikh, M. El-Bardini and M. Fkirin, "Practical path planning and path following for a non-holonomic mobile robot based on visual servoing," IEEE Information Technology, Networking, Electronic and Automation Control Conference, pp. 401–406, 2016.
    [23] F. Safia and C. Fatima, "Visual path following by an omnidirectional mobile robot using 2D visual servoing," 5th International Conference on Electrical Engineering - Boumerdes (ICEE-B), pp. 1–7, 2017.
    [24] G. Huang, C. Wu, C. Lai, S. Pan, C. Tsai and S. Jue, "Track model regression using genetic expression programming for visual-based path-following of mobile robots," Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), pp. 47–51, 2016.
    [25] J. Zhang and S. Singh, "LOAM: Lidar Odometry and Mapping in Real-time," Robotics: Science and Systems, pp. 1–9, 2014.

    Full text available on the campus network from 2024/02/12.
    Full text not authorized for public access (off-campus network).
    Full text not authorized for public access (National Central Library: Taiwan thesis and dissertation system).