
Graduate Student: Xiang-Min Tu (涂翔敏)
Thesis Title: Gaze tracking based communication system for automatic food selection (基於凝視方向偵測之智慧化自動餵食機器人人機溝通系統開發)
Advisor: Chyi-Yeu Lin (林其禹)
Committee Members: Gee-Sern Hsu (徐繼聖), I-Tsyuen Chang (張以全)
Degree: Master
Department: College of Engineering - Department of Mechanical Engineering
Publication Year: 2020
Graduating Academic Year: 108
Language: Chinese
Pages: 57
Chinese Keywords: 餵食機器人 (feeding robot), 瞳孔偵測 (pupil detection), 凝視方向偵測 (gaze direction detection)
Foreign Keywords: feeding robot, pupil detection, gaze tracking
Access Statistics: Views: 162, Downloads: 0

Several kinds of feeding robots on the market have been developed with operation methods tailored to different user groups, such as joystick operation and button operation. However, most users of feeding robots are still unable to operate them smoothly enough to achieve self-feeding, owing to the severity of their injuries or congenital physical limitations. To solve this problem, this study developed a low-cost, intelligent human-machine communication system based on gaze direction detection to replace the operating modules commonly found on the market. The system combines a self-made camera initialization system to position the camera, an LED indicator to confirm the food selected by the user, and a self-developed gaze-direction-detection human-machine communication system to achieve intelligent food selection. Finally, tests with five participants confirmed the robustness and usability of the system for food selection in front of an automatic feeding machine.


There are several kinds of feeding robots on the market, each developed with operation methods for a different group of users, such as joystick operation and button operation. However, most users of feeding robots are still unable to operate the machine smoothly enough to achieve self-feeding because of injuries or physical limitations. To solve this problem, this study developed a low-cost intelligent human-machine interaction system based on pupil and face posture detection to replace common operating modules. This system is used together with a camera initialization system, an LED prompt light to confirm the food selected by the user, and a human-machine communication system based on pupil and face posture detection to achieve the goal of intelligent food selection. An experiment involving five testers showed stable, repeatable results for the developed system in food selection operation in front of an automatic feeding machine.
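The pipeline the abstract describes, reduced to its two core steps, can be sketched as follows. This is not the thesis code: the eye-centre step is a simplified version of the gradient-based localization method of Timm and Barth that the thesis builds on, the Haar-cascade face/eye detection stage is skipped (we start from an already-cropped grayscale eye patch), and the gaze-to-container mapping, the container count, and all function names are illustrative assumptions.

```python
# Minimal sketch: pupil-centre localization on an eye patch, then mapping
# the horizontal pupil position to one of several food containers.
import numpy as np

def eye_center_by_gradients(eye):
    """Estimate the pupil centre of a grayscale eye patch (2-D uint8 array).

    Scores each candidate centre c by how well the displacement vectors from
    c align with the image gradients (which point radially outward at the dark
    pupil's rim), weighted by the darkness of the pixel at c.
    """
    gy, gx = np.gradient(eye.astype(float))
    mag = np.hypot(gx, gy)
    mask = mag > mag.mean()                    # keep only strong-gradient (edge) pixels
    ys, xs = np.nonzero(mask)
    gxn, gyn = gx[mask] / mag[mask], gy[mask] / mag[mask]
    h, w = eye.shape
    best, best_score = (0, 0), -1.0
    for cy in range(h):
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            norm[norm == 0] = 1.0              # ignore the candidate pixel itself
            dot = (dx / norm) * gxn + (dy / norm) * gyn
            score = np.mean(np.maximum(dot, 0.0) ** 2) * (255.0 - eye[cy, cx])
            if score > best_score:
                best_score, best = score, (cx, cy)
    return best                                # (x, y) of the estimated centre

def select_food_slot(pupil_x, left_corner_x, right_corner_x, n_slots=4):
    """Map the pupil's horizontal position between the eye corners to a slot index."""
    t = (pupil_x - left_corner_x) / (right_corner_x - left_corner_x)
    t = min(max(t, 0.0), 1.0)                  # clamp to the corner-to-corner range
    return min(int(t * n_slots), n_slots - 1)

# Demo on a synthetic eye patch: bright background with a dark "pupil" disk
# centred at (25, 18); the estimate should land close to the true centre.
eye = np.full((40, 40), 255, dtype=np.uint8)
yy, xx = np.mgrid[0:40, 0:40]
eye[(xx - 25) ** 2 + (yy - 18) ** 2 <= 36] = 20
cx, cy = eye_center_by_gradients(eye)
```

A real system would run this per frame on eye regions found by a Haar-cascade or Dlib landmark detector, and only commit a selection (e.g. lighting the LED for confirmation) after the gaze dwells in one slot for a fixed time.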

Abstract (Chinese) I
Abstract II
Acknowledgments III
Table of Contents III
List of Figures VI
List of Tables VIII
Chapter 1: Introduction 1
  1-1 Preface 1
    1-1-1 Research Motivation 2
    1-1-2 Research Objectives 3
  1-2 Literature Review 4
  1-3 Thesis Organization 7
Chapter 2: Theoretical Foundations 8
  2-1 Overview of Human-Machine Interaction 8
  2-2 Pupil Center Detection 9
    2-2-1 Haar feature-based cascade classifier 9
    2-2-2 Dlib facial landmark extraction 10
    2-2-3 Perspective transformation matrix 10
    2-2-4 Pupil center localization 12
  2-3 Gaze Direction Recognition 12
Chapter 3: System Architecture 16
  3-1 Automatic Feeding Robot System 16
  3-2 Human-Machine Interaction Module 17
    3-2-1 Camera Initialization 17
    3-2-2 Gaze Position Calibration 19
    3-2-3 Intelligent Food Selection 20
  3-3 Development Environment and Hardware Configuration 22
    3-3-1 Development Environment 22
    3-3-2 Camera Initialization System 23
    3-3-3 LED Module and Voice Output Module 26
    3-3-4 Food Tray and Containers 28
  3-4 System Flowchart 29
Chapter 4: Experimental Equipment and Setup 30
  4-1 Light Source 31
  4-2 Speaker 32
  4-3 Camera Initialization Mechanism 32
  4-4 Food Tray and Containers 33
  4-5 Test Forms 34
Chapter 5: Experimental Results 37
  5-1 Light Source and Camera Parameter Adjustment 38
  5-2 Test Time Analysis 40
  5-3 Human-Machine Interaction Food Selection Test Results 41
Chapter 6: Conclusions and Future Work 44
  6-1 Conclusions 44
  6-2 Future Work 44
References 45


Full text release date: 2025/02/03 (campus network)
Full text release date: 2025/02/03 (off-campus network)
Full text release date: 2025/02/03 (National Central Library: Taiwan NDLTD system)