| Student: | Xiang-Min Tu (涂翔敏) |
|---|---|
| Thesis title: | Gaze tracking based communication system for automatic food selection (基於凝視方向偵測之智慧化自動餵食機器人人機溝通系統開發) |
| Advisor: | Chyi-Yeu Lin (林其禹) |
| Committee members: | Gee-Sern Hsu (徐繼聖), I-Tsyuen Chang (張以全) |
| Degree: | Master |
| Department: | College of Engineering - Department of Mechanical Engineering |
| Year of publication: | 2020 |
| Academic year of graduation: | 108 |
| Language: | Chinese |
| Number of pages: | 57 |
| Keywords (Chinese): | 餵食機器人, 瞳孔偵測, 凝視方向偵測 |
| Keywords (English): | feeding robot, pupil detection, gaze tracking |
| Access counts: | Views: 162, Downloads: 0 |
Several kinds of feeding robots on the market have been developed with operation methods tailored to different user groups, such as joystick operation and button operation. However, most users of feeding robots are still unable to operate them smoothly enough to feed themselves, owing to the extent of their injuries or congenital physical limitations. To solve this problem, this study developed a low-cost, intelligent human-machine communication system based on gaze direction detection to replace the operation modules commonly found on the market. The system combines a custom camera initialization system for initializing the camera position, an LED indicator light for confirming the food selected by the user, and a self-developed gaze-direction-detection human-machine communication system to achieve intelligent food selection. Finally, tests conducted with five subjects confirmed the robustness and usability of the system for food selection in front of an automatic feeding machine.
There are several kinds of feeding robots on the market, developed with operation methods corresponding to different groups of users, such as joystick operation and button operation. However, most users of feeding robots are still unable, because of injury or physical limitations, to operate the machine smoothly enough to feed themselves. To solve this problem, this study developed a low-cost intelligent human-machine interaction system based on pupil and face posture detection to replace common operating modules. The system is used together with a camera initialization system, an LED prompt light that confirms the food selected by the user, and a human-machine communication system based on pupil and face posture detection to achieve the goal of intelligent food selection. An experiment involving five testers confirmed the stability of the developed system for food selection operation in front of an automatic feeding machine.
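The selection pipeline the abstract describes (mapping the user's gaze to one of several food items, then confirming the choice via an LED prompt) can be sketched in a few lines. The sketch below is illustrative only, not the thesis's actual implementation: the normalized pupil-offset input, the function and class names, and the dwell-frame confirmation threshold are all assumptions standing in for the system's camera-based detection and LED confirmation steps.

```python
# Hypothetical sketch: gaze-to-food-slot mapping with dwell-time confirmation.
# The pupil offset is assumed to arrive already normalized to [-1, 1]
# (left edge of the eye region to right edge); real input would come
# from a pupil/face-posture detector, which is not modeled here.

def gaze_to_slot(pupil_x, n_slots):
    """Map a normalized horizontal pupil offset in [-1, 1] to a slot index."""
    x = max(-1.0, min(1.0, pupil_x))        # clamp noisy input
    idx = int((x + 1.0) / 2.0 * n_slots)    # uniform bins across the slots
    return min(idx, n_slots - 1)            # x == 1.0 falls into the last slot

class DwellSelector:
    """Confirm a slot only after it is gazed at for `dwell_frames` frames,
    playing the role of the LED confirmation step in the described system."""

    def __init__(self, n_slots, dwell_frames=15):
        self.n_slots = n_slots
        self.dwell_frames = dwell_frames
        self.candidate = None   # slot currently being dwelled on
        self.count = 0          # consecutive frames on that slot

    def update(self, pupil_x):
        slot = gaze_to_slot(pupil_x, self.n_slots)
        if slot == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = slot, 1  # gaze moved; restart dwell
        if self.count >= self.dwell_frames:
            return slot      # selection confirmed (LED would light here)
        return None          # still dwelling, no selection yet

# Example: 4 food slots, gaze held steadily on the right-most slot.
sel = DwellSelector(n_slots=4, dwell_frames=3)
results = [sel.update(0.8) for _ in range(3)]
print(results)  # → [None, None, 3]
```

The dwell-frame threshold serves the same purpose as the LED confirmation in the abstract: it prevents a passing glance from triggering a selection, which matters for users whose gaze control is imprecise.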