
Author: Yi-Tseng Lin (林易增)
Title: An Eye-gaze Guided P300 Brain Computer Interface and Its Wheelchair Control Applications (具視覺導引P300腦機介面於輪椅控制之應用)
Advisor: Chung-Hsien Kuo (郭重顯)
Committee: Po-Lei Lee (李柏磊), Yi-Hung Liu (劉益宏), Sheng-Luen Chung (鍾聖倫)
Degree: Master
Department: College of Applied Science - Graduate Institute of Biomedical Engineering
Year of publication: 2016
Academic year of graduation: 104
Language: Chinese
Pages: 74
Chinese keywords: eye-gaze guided brain computer interface, P300 brain computer interface, canonical correlation analysis, support vector machine
English keywords: P300, BCI, eye-gaze guided BCI, CCA, SVM
    Chinese abstract (translated): This thesis proposes an eye-gaze guided P300 brain computer interface (BCI) applied to intelligent wheelchair control. BCIs have been applied successfully to wheelchair control, but in conventional setups the control panel is fixed below the user's line of sight: the user must look down at the stimulus panel while also watching the environment ahead, which easily causes confusion and system misclassification. Likewise, when steering the wheelchair through turns, the user must watch the environment on both sides and cannot keep watching the control panel, so operation is unintuitive. This thesis therefore projects the visual stimulus display onto a translucent screen directly in front of the user and detects the user's gaze direction from eye-movement signals, defining left, middle, and right regions; the region onto which the stimulus display is projected follows the user's gaze, making the BCI more intuitive to operate. A micro projector projects the visual stimulus display onto a transparent screen 35 cm in front of the user to generate the P300 visual-stimulus flickers. To raise the information transfer rate (ITR), this work applies a spatial filter based on canonical correlation analysis (CCA) and uses a support vector machine (SVM) classifier to improve the system's classification rate. Experimental results show that over 100 offline trials, the proposed eye-gaze guided P300 BCI achieved an average accuracy of 88.2% and an average ITR of 22.97 bits/min across ten subjects; in wheelchair-control field tests, compared with a BCI without eye-gaze guidance, the root-mean-square errors on the U-shaped and S-shaped trajectories decreased by 4.71 cm and 4.88 cm, respectively.
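The ITR quoted above is conventionally computed with Wolpaw's formula, B = log2 N + P log2 P + (1 - P) log2((1 - P)/(N - 1)) bits per selection, scaled by the selection rate. A minimal sketch of that calculation; the 6-target layout and selection rate below are illustrative assumptions, not values reported in the thesis:

```python
import math

def wolpaw_itr(num_targets: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    num_targets:        number of selectable P300 flicker targets (N)
    accuracy:           classification accuracy P, with 1/N < P <= 1
    selections_per_min: selections the BCI completes per minute
    """
    n, p = num_targets, accuracy
    if p >= 1.0:
        # Perfect accuracy: each selection carries the full log2(N) bits.
        bits_per_selection = math.log2(n)
    else:
        bits_per_selection = (math.log2(n)
                              + p * math.log2(p)
                              + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_selection * selections_per_min

# Hypothetical example: 6 targets at the reported 88.2% accuracy.
itr = wolpaw_itr(6, 0.882, 12)
```

At 88.2% accuracy a 6-target layout yields about 1.79 bits per selection, so under this formula the reported 22.97 bits/min would correspond to roughly 13 selections per minute.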


    This thesis proposes an eye-gaze guided P300 brain computer interface (BCI) for wheelchair control applications. BCIs provide a promising way for disabled users to control their wheelchairs, but conventional visual stimulus control panels are generally mounted below the user's line of sight. Because users must attend to the environment ahead of the wheelchair and to the stimulus panel at the same time, controlling a wheelchair with a conventional visual stimulus BCI is not intuitive. To overcome this drawback, a translucent interactive visual stimulus panel is set up in front of the wheelchair, and the position of the P300 flickers is adjusted synchronously by an eye-gaze guided approach: based on the user's electrooculography (EOG) signals, the projection field is divided into three districts, left, middle, and right. A micro projector produces the flashing visual stimuli on a display board 35 cm away from the user. To improve the information transfer rate (ITR), a spatial filter based on canonical correlation analysis (CCA) and a support vector machine (SVM) classifier are applied to improve BCI classification performance. Experiments with ten subjects showed an average accuracy of 88.2% and an average ITR of 22.97 bits/min. In ground-truth wheelchair-control experiments, the root-mean-square errors (RMSE) of the eye-gaze guided P300 BCI were 7.40 cm and 10.41 cm on the "U" and "S" trajectories, lower than the 12.11 cm and 15.29 cm of the BCI without eye-gaze guidance.

    Table of Contents
    Acknowledgements I
    Chinese Abstract II
    Abstract III
    Contents IV
    List of Figures VII
    List of Tables X
    List of Symbols XI
    Chapter 1 Introduction 1
      1.1 Background and Motivation 1
      1.2 Objectives 2
      1.3 Thesis Organization 3
    Chapter 2 Literature Review 4
      2.1 P300 Brain Computer Interfaces 4
      2.2 Hybrid Brain Computer Interfaces 6
      2.3 BCI Applications for Wheeled Robots 7
      2.4 Wheelchair Control Based on Eye-movement Signals 7
      2.5 Independent Component Analysis 7
      2.6 CCA-based Spatial Filters 8
      2.7 Support Vector Machine Classifiers 8
      2.8 Summary of the Literature Review 9
    Chapter 3 Eye-gaze Guided Brain Computer Interface 10
      3.1 The Proposed BCI 10
      3.2 Eye-gaze Guidance 13
      3.3 System Architecture of the Eye-gaze Guided BCI 16
      3.4 Digital Filter Design 17
        3.4.1 Adjustable IIR High-pass and Low-pass Filter Design 18
      3.5 CCA-based Spatial Filter and SVM Classifier 21
        3.5.1 CCA-based Spatial Filter 21
        3.5.2 SVM Classifier for the P300 BCI 22
    Chapter 4 Experimental Design 23
      4.1 EEG Electrodes 23
      4.2 EEG Acquisition System 24
      4.3 P300 BCI Experimental Procedure 25
        4.3.1 Visual Stimulus Projection System 25
        4.3.2 P300 Flicker Distance Design 27
        4.3.3 Experimental Protocol 31
        4.3.4 P300 Flicker Randomization Table Design 32
        4.3.5 P300 BCI Classifier Training 33
      4.4 Eye-gaze Guidance Calibration Procedure 34
      4.5 Offline Accuracy Experiment for the Eye-gaze Guided P300 BCI 35
      4.6 Wheelchair Control and Applications 36
    Chapter 5 Experimental Results and Discussion 37
      5.1 P300 BCI Classifier Model Training 38
      5.2 Eye-gaze Guidance 42
      5.3 Offline Accuracy Experiment for the Eye-gaze Guided P300 BCI 46
      5.4 Wheelchair Control and Field Tests 50
    Chapter 6 Conclusions and Future Work 58
      6.1 Conclusions 58
      6.2 Future Work 59
    References 60
    Appendix 63


    Full-text release date: 2021/08/30 (campus network)
    Full-text release date: not authorized for public release (off-campus network)
    Full-text release date: not authorized for public release (National Central Library: Taiwan NDLTD system)