
Graduate Student: Chin-Chen Tsai (蔡沁宸)
Thesis Title: Gaze Detection System Using RGB-D Sensors for Non-Contact Human Interaction (基於RGB-D感測器用於非接觸式人機互動之目光偵測系統)
Advisor: Chyi-Yeu Lin (林其禹)
Committee Members: Shih-Hsuan Chiu (邱士軒), Gee-Sern Jison Hsu (徐繼聖)
Degree: Master
Department: College of Engineering - Department of Mechanical Engineering
Year of Publication: 2016
Academic Year of Graduation: 104
Language: English
Number of Pages: 51
Chinese Keywords: 目光偵測, 人機互動, 頭部方向, 瞳孔偵測, RGBD感應器
English Keywords: gaze detection, human-computer interaction, head orientation, pupil detection, RGBD sensor
Hits: 434; Downloads: 4


In human-computer interaction, gaze orientation is known as an important
and promising source of information for demonstrating a user's attention
and focus. In previous research, satisfactory accuracy in head-pose and
eye-location estimation has been achieved mostly in constrained settings.
To date, however, real-time applications based on gaze orientation remain
limited, owing to the low accuracy of detection or the inconvenience of
head-mounted devices. Moreover, in the presence of non-frontal faces, eye
locators cannot accurately locate the centers of the eyes. In this thesis,
two novel methods are proposed to improve existing gaze-tracking
techniques. The first uses Kinect v2, one of the latest RGBD devices, to
estimate the 3D direction of head movement and gaze; unlike earlier
devices such as Kinect v1 and web cameras, it offers high-accuracy
detection and high-resolution images. The second is a revised pupil search
method with optimization, devised to increase search efficiency and
thereby significantly reduce computation time. Combining the two, this
thesis proposes a hybrid scheme that fuses head-pose and eye-location
information to obtain enhanced gaze estimation.
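The "pupil search method with optimization" mentioned above corresponds to the golden section search of Section 3.2.4.2 in the table of contents. As a rough illustration of why this cuts computation time compared with the brute-force search of Section 3.2.4.1, here is a minimal golden-section search sketch over a one-dimensional objective; the objective function and search interval are illustrative placeholders, not the thesis's actual pupil-scoring function:

```python
import math

def golden_section_minimize(f, lo, hi, tol=1e-3):
    """Find x in [lo, hi] minimizing a unimodal f via golden-section search.

    Each iteration shrinks the interval by a factor of ~0.618 while
    evaluating f only once, versus one evaluation per candidate in a
    brute-force scan.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~= 0.618
    a, b = lo, hi
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            # Minimum lies in [a, d]; reuse the evaluation at c.
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; reuse the evaluation at d.
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Toy objective: its minimum at x = 2 stands in for the best-scoring
# pupil candidate along one image axis.
best = golden_section_minimize(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
print(round(best, 2))  # close to 2.0
```

In a pupil-detection setting, a 2-D variant (or two 1-D passes) over a candidate-center score would trade an exhaustive pixel scan for logarithmically many evaluations, which matches the abstract's claim of significantly reduced calculation time.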

Table of Contents

1 Introduction
  1.1 Thesis Proposal
  1.2 Overview of the Thesis
2 Background
  2.1 Face Tracking
    2.1.1 Face Detection using Haar Cascades
    2.1.2 Active Shape Models (ASM)
    2.1.3 Active Appearance Models (AAM)
  2.2 Pupil Detection
    2.2.1 Feature-Based Methods
    2.2.2 Model-Based Methods
  2.3 Gaze Estimation
    2.3.1 Model-Based Gaze Estimation
    2.3.2 Screen-Based Gaze Estimation
3 Methods and Implementations
  3.1 Proposed Method
  3.2 Implementation
    3.2.1 Hardware Basics
      3.2.1.1 Face Orientation and Location
      3.2.1.2 Coordinate System
    3.2.2 Basic Face Detection
    3.2.3 HD Face Detection
    3.2.4 Pupil Detection
      3.2.4.1 Brute-Force Search
      3.2.4.2 Golden Section Search (GSS)
    3.2.5 Perspective Transformation
    3.2.6 Gaze Detection
    3.2.7 System Calibration
4 Experiments and Results
  4.1 Experimental Settings
  4.2 Statistical Analysis
  4.3 Statistical Results
    4.3.1 Gaze Detection without Head Movement
    4.3.2 Gaze Detection with Head Movement
  4.4 Applications
    4.4.1 Statistical Analysis on Customer's Shopping Tendency
    4.4.2 Non-Contact HCI in Shopping
5 Conclusions
6 Future Work
  6.1 Multi-Camera Detection
  6.2 Reflection Avoidance
  6.3 Advanced Statistical Analysis
Bibliography

