Graduate Student: BUI QUANG HUY
Thesis Title: Indoor Navigation with Smartphone-based Visual SLAM and Bluetooth-connected Wheel-Robot
Advisor: Wei-Wen Kao (高維文)
Committee Members: Gee-Sern Hsu (徐繼聖), Liang-Kuang Chen (陳亮光)
Degree: Master
Department: College of Engineering, Department of Mechanical Engineering
Year of Publication: 2013
Academic Year of Graduation: 101 (ROC calendar)
Language: English
Pages: 74
Keywords: Indoor positioning, smartphone positioning, Visual SLAM
Access Counts: Views: 271, Downloads: 10
Indoor robot navigation is the problem of integrating sensor measurements to determine the locations of the robot and of landmarks, typically using the SLAM technique. For a robot in an unknown indoor environment, navigation can be realized with Visual SLAM, in which image-sensor measurements taken at different locations with respect to various landmarks are combined with other robot motion sensors to determine the robot and landmark positions simultaneously. This is typically implemented with expensive image sensors and dedicated hardware, and requires extensive computing power. In recent years, smartphones have become the portal devices through which consumers access the Internet and cloud-based IT services in mobile computing environments. In current smartphones, only GPS, together with the cellular and Wi-Fi networks, is used to provide position information outdoors and indoors with moderate accuracy. With digital cameras and MEMS-based motion sensors becoming standard features and computing power improving rapidly, smartphones have the potential to play a key role in realizing indoor robot navigation.
This research proposes a smartphone-based positioning system suitable for indoor robot applications, which integrates the sensors available on a smartphone with wheel-odometer feedback from a Bluetooth-connected robot vehicle. In this system, the smartphone is attached to a wheel-robot that performs only two-dimensional planar motions in indoor home or office environments. The planar-motion assumption greatly reduces the complexity of the navigation system equations. Wi-Fi signals and the smartphone's inertial sensors were considered as additional measurements for the positioning calculations, but were found inadequate for the application in this thesis. The odometer sensors in the wheel-robot are used in the robot motion equations. For the phone-camera measurements, the real-time performance of the SIFT, SURF, and ORB feature detection and tracking algorithms was compared, and one algorithm was chosen for implementation on the smartphone. A state convergence analysis was conducted based on the results of a Visual SLAM experiment.
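The planar-motion model with wheel-odometer feedback described above is commonly realized as a differential-drive dead-reckoning update. As a minimal sketch (the function and parameter names below are illustrative, not taken from the thesis), the robot pose can be propagated from the two wheel displacements like this:

```python
import math

def propagate_pose(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive dead-reckoning update for a 2-D planar robot.

    d_left, d_right: distances travelled by each wheel, as reported by
    the wheel odometers. wheel_base: spacing between the two wheels.
    Returns the updated planar pose (x, y, theta).
    """
    d = (d_left + d_right) / 2.0                # forward displacement of the body
    d_theta = (d_right - d_left) / wheel_base   # change in heading
    # Integrate along the mid-arc heading for better accuracy on turns.
    x_new = x + d * math.cos(theta + d_theta / 2.0)
    y_new = y + d * math.sin(theta + d_theta / 2.0)
    return x_new, y_new, theta + d_theta

# Straight-line motion: both wheels advance 1 m, heading unchanged.
pose = propagate_pose(0.0, 0.0, 0.0, 1.0, 1.0, 0.3)
```

In a Visual SLAM filter, an update of this form would serve as the prediction step, with the camera's landmark observations correcting the accumulated odometry drift.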