| Graduate student: | 羅基文 (Chi-Wen Lo) |
|---|---|
| Thesis title: | 人機互動式自主移動平台整合系統設計與開發 (Design and Development of a Human-Machine Interactive Integration System for an Autonomous Mobile Platform) |
| Advisor: | 郭永麟 (Yong-Lin Kuo) |
| Oral examination committee: | 蔡明忠 (Ming-Jong Tsai), 楊振雄 (Cheng-Hsiung Yang), 葉仲基 (Chung-Kee Yeh), 郭永麟 (Yong-Lin Kuo) |
| Degree: | 碩士 (Master) |
| Department: | 工程學院 - 自動化及控制研究所 (Graduate Institute of Automation and Control, College of Engineering) |
| Year of publication: | 2018 |
| Academic year of graduation: | 106 |
| Language: | Chinese |
| Number of pages: | 130 |
| Chinese keywords: | 人機互動 (human-machine interaction), 多點觸控 (multi-touch), 無線通訊 (wireless communication), 自我辨識 (self-recognition), 自主運動 (autonomous movement) |
| English keywords: | mobile platform, smart device, man-machine interaction, wireless, autonomous movement |
The main purpose of this study is to integrate and develop a human-machine interactive mobile platform, built from a microcontroller development board, wireless communication devices, a positioning device, and attitude-sensor modules mounted on the platform. For existing mobile platforms operated by direct human effort, such as hand trucks, baby strollers, wheelchairs, transport carts, shopping carts, and luggage carts, a new drive mode is introduced that changes the contact, human-powered operation into a non-contact, human-machine interactive one. The study examines the feasibility of the following functions:
(1) Multi-touch technology on a smart mobile device lets the operator interact with the platform intuitively on the screen, and the commands are transmitted over a wireless communication interface to directly control the platform's direction of travel.
(2) Sensor modules on the platform and worn or carried by the operator exchange signals to form a non-contact, human-machine interactive self-propelled platform. The transceiver worn by the user sets the motion modes, and the receiver on the platform executes straight, leftward, rightward, forward, and backward motion. The platform also builds self-recognition of its environment: it detects road flatness, increases torque and slows down when going uphill, brakes itself downhill to soften the impact, speeds up or slows down with the user's gait on flat ground, and performs obstacle-avoidance maneuvers. The platform thus switches its motion modes autonomously as the environment and space change, achieving the goal of autonomous movement.
Through practical implementation, the interactive mobile platform of this study demonstrates essentially all of the investigated functions. In human-machine interactive mode, the integrated prototype and its functions can be applied broadly to the operation of existing platforms such as hand trucks, strollers, wheelchairs, agricultural vehicles, shopping carts, ward-round carts, and luggage carts.
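The obstacle-avoidance mode mentioned above is not detailed in this record; a minimal sketch of how a platform with forward-facing range sensors (such as ultrasonic modules) might pick a motion mode from distance readings could look like the following, where the function name and thresholds are illustrative assumptions, not values from the thesis:

```python
def choose_motion(front_cm, left_cm, right_cm, stop_cm=30, slow_cm=80):
    """Pick a motion mode from three range readings in centimeters.

    Thresholds are illustrative: stop_cm is the minimum safe distance,
    slow_cm is where the platform begins to slow down.
    """
    if front_cm >= slow_cm:
        return "forward"
    if front_cm > stop_cm:
        return "forward_slow"
    # Path blocked: turn toward the more open side, or back up.
    if max(left_cm, right_cm) > stop_cm:
        return "turn_left" if left_cm >= right_cm else "turn_right"
    return "reverse"
```

A real implementation would run this in the platform's control loop and feed the chosen mode to the motor driver.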
This study develops a human-machine interactive system by integrating an autonomous mobile platform with a microcontroller development board, wireless communication devices, positioning devices, and attitude-sensor modules. The operation of the mobile platform is designed for trolleys, baby carriages, wheelchairs, transport carts, shopping carts, luggage carts, and similar vehicles. Thus, the contact, human-powered operation mode is changed into a non-contact, human-machine interactive mode. This study investigates the feasibility of performing the following functions:
1. The smart mobile device with multi-touch technology allows the operator to interact intuitively with the platform on the screen, controlling the platform's direction of movement directly through the wireless communication interface.
2. A non-contact human-machine interactive autonomous platform is established using sensor modules mounted on the platform and worn by the operator, which send and receive signals between them. By setting various motion modes on the operator's transceiver, the mobile platform can move straight, leftward, rightward, forward, and backward. In addition, the platform senses its environment, detecting road flatness, uphill and downhill slopes, and obstacles, so that it can increase or decrease its speed or avoid the obstacles. Autonomous movement is therefore fulfilled by self-changing the motion modes in response to the environment.
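The record does not give the control law that turns a touch command into wheel motion. A common choice for a differential-drive platform is arcade-style mixing of a normalized joystick command into left/right wheel speeds, sketched here with hypothetical names and an 8-bit PWM scale:

```python
def mix_to_wheel_speeds(x, y, max_pwm=255):
    """Map a normalized touch-joystick command to wheel PWM values.

    x in [-1, 1] is the turn component (x > 0 turns right);
    y in [-1, 1] is the forward component (y > 0 drives forward).
    Returns (left_pwm, right_pwm); negative values mean reverse.
    """
    left = y + x
    right = y - x
    # Normalize so neither wheel command exceeds full scale.
    m = max(1.0, abs(left), abs(right))
    return round(left / m * max_pwm), round(right / m * max_pwm)
```

On the platform side, the two PWM values would be written to the motor driver's speed inputs, with the sign selecting the H-bridge direction.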
The interactive mobile platform of this study is designed and implemented, and its various functions are tested and demonstrated. The mobile platform can therefore be widely applied to existing human-powered platforms, such as trolleys, strollers, wheelchairs, agricultural vehicles, and shopping carts, according to the actual requirements.
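The uphill and downhill behaviour described in the abstract can be sketched as a pitch-dependent adjustment of the commanded speed, where the pitch angle would come from the platform's attitude sensor. The gains and flat-ground dead band below are illustrative assumptions, not values from the thesis:

```python
def slope_adjusted_speed(base_pwm, pitch_deg,
                         uphill_gain=4.0, downhill_gain=6.0, flat_band=3.0):
    """Adjust the commanded wheel PWM according to the measured pitch.

    pitch_deg > 0 means the platform points uphill. Following the modes
    in the abstract: reduce the speed setpoint uphill, brake harder
    downhill to limit rollout, and leave flat ground unchanged so the
    speed can follow the operator's gait.
    """
    if pitch_deg > flat_band:      # uphill: slow down
        return max(0, round(base_pwm - uphill_gain * (pitch_deg - flat_band)))
    if pitch_deg < -flat_band:     # downhill: brake harder
        return max(0, round(base_pwm - downhill_gain * (-pitch_deg - flat_band)))
    return base_pwm                # flat ground: unchanged
```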