
Author: 林品鐘 (Ping-Jung Lin)
Thesis Title (Chinese): 一個在平面螢幕使用街景導航的眼動追蹤估算方法
Thesis Title (English): An Estimation Approach for Tracking Head Orientation while Navigating Street View on the Flat Screen Monitor
Advisor: 鄧惟中 (Wei-Chung Teng)
Committee Members: 鮑興國 (Hsing-Kuo Pao), 花凱龍 (Kai-Lung Hua)
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering
Year of Publication: 2015
Academic Year of Graduation: 103 (ROC calendar)
Language: Chinese
Number of Pages: 56
Keywords (Chinese): head-mounted display, eye-movement research, estimation method
Keywords (English): head-mounted display, eye tracking, estimation approach
  • This study uses Google Maps, a service people frequently use in daily life, as the test environment. An eye tracker measures participants' eye movements, and we compare both task efficiency and eye-movement behavior between tests performed on a flat screen monitor and on a head-mounted display (Oculus DK2), presenting the results as heat maps. A flat screen monitor can show only part of a Street View panorama, and while an eye tracker records the screen coordinates of the gaze point, those coordinates alone do not reveal which part of the street view is actually being looked at. This study therefore proposes a conversion method that transforms the data measured under both display types into a common unit, so that the data recorded on the flat screen monitor, once converted, also identifies the actual location viewed in the street view and can be visualized in the same way as the eye movements recorded with the head-mounted display. Experiments show that when performing the Google Maps tasks with the head-mounted display, fixation time increased by 75.8%, saccade time decreased by 37.0%, and time spent looking at the map decreased by 17.3%; on all three metrics the head-mounted display outperformed the flat screen monitor.
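  The conversion described above maps an on-screen gaze coordinate back onto the Street View panorama sphere. The thesis' exact formulation (Sections 3.3.1 and 3.3.2 in the table of contents below) is not reproduced in this record; the following is a minimal sketch assuming a pinhole camera model with a known vertical field of view, where `gaze_to_panorama`, the argument names, and the axis conventions are all illustrative.

    import math

    def gaze_to_panorama(gx, gy, screen_w, screen_h,
                         cam_heading_deg, cam_pitch_deg, vfov_deg):
        """Map a gaze point in screen pixels to (heading, pitch) degrees on
        the panorama sphere. Conventions (y up, camera looking along -z,
        compass heading clockwise from north) are assumptions, not the
        thesis' exact formulation."""
        # Screen pixels -> normalized device coordinates in [-1, 1].
        ndc_x = 2.0 * gx / screen_w - 1.0        # +1 at the right edge
        ndc_y = 1.0 - 2.0 * gy / screen_h        # +1 at the top edge

        # Half-extents of the view plane at unit distance from the eye.
        tan_half_v = math.tan(math.radians(vfov_deg) / 2.0)
        tan_half_h = tan_half_v * (screen_w / screen_h)

        # Gaze ray in view space.
        x, y, z = ndc_x * tan_half_h, ndc_y * tan_half_v, -1.0

        # Rotate by the camera's pitch (about the x-axis) ...
        p = math.radians(cam_pitch_deg)
        y, z = y * math.cos(p) - z * math.sin(p), y * math.sin(p) + z * math.cos(p)
        # ... then by its heading (clockwise about the y-axis).
        h = math.radians(cam_heading_deg)
        x, z = x * math.cos(h) - z * math.sin(h), x * math.sin(h) + z * math.cos(h)

        # Heading/pitch of the resulting world-space gaze direction.
        heading = math.degrees(math.atan2(x, -z)) % 360.0
        pitch = math.degrees(math.asin(y / math.sqrt(x * x + y * y + z * z)))
        return heading, pitch

  With a mapping of this kind, flat-screen gaze samples can be accumulated per heading/pitch cell and rendered as the same kind of heat map used for the head-mounted display.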


    In this thesis, we adopted Google Maps as our platform to study the differences in eye movements, as well as the performance of test subjects, when asked to perform several tasks using a traditional flat screen monitor and using a widely known head-mounted virtual reality display, the Oculus DK2. There are several reasons why comparing performance between these two user interfaces is difficult. First, the street views that Google Maps provides are panorama images attached to a spherical 3D model, and only a part of the panorama is shown on a flat screen monitor. Therefore, simply recording the coordinates measured by eye-tracking devices does not tell us where users are really looking. Second, when wearing a head-mounted VR display such as the Oculus DK2, there is no way to track eye movements, as the eyes are covered by the display itself. To overcome these difficulties, we developed a method to transform the data collected from the two user interfaces into one common format. Experimental results show significant differences between the two user interfaces: our test subjects showed a 75.8% increase in fixation time, a 37.0% decrease in saccade time, and a 17.3% decrease in time spent checking maps while wearing the Oculus DK2, all of which are better than using a traditional flat screen monitor.
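    The fixation and saccade times reported above must be derived from raw gaze samples; this record does not state the classification rule used, and the eye tracker's own software may perform that step. Purely as an illustration, a simple velocity-threshold (I-VT) split over converted heading/pitch samples could look like the sketch below; the threshold value, the data layout, and the function name are assumptions.

    import math

    VELOCITY_THRESHOLD_DEG_PER_S = 30.0   # illustrative threshold, not from the thesis

    def fixation_and_saccade_time(samples):
        """Split total viewing time into fixation and saccade time with a
        velocity-threshold (I-VT) rule. `samples` is a list of
        (timestamp_s, heading_deg, pitch_deg) tuples, e.g. the output of a
        screen-to-panorama conversion; the layout is an assumption."""
        fixation_s = saccade_s = 0.0
        for (t0, h0, p0), (t1, h1, p1) in zip(samples, samples[1:]):
            dt = t1 - t0
            if dt <= 0:
                continue
            # Approximate angular distance between consecutive gaze directions;
            # heading differences are wrapped into [-180, 180).
            dh = (h1 - h0 + 180.0) % 360.0 - 180.0
            dist = math.hypot(dh, p1 - p0)
            if dist / dt < VELOCITY_THRESHOLD_DEG_PER_S:
                fixation_s += dt
            else:
                saccade_s += dt
        return fixation_s, saccade_s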

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1 Introduction
      1.1 Preface
      1.2 Research Background
      1.3 Motivation and Objectives
      1.4 Thesis Organization
    Chapter 2 Literature Review
      2.1 Handheld Map Exploration Systems
        2.1.1 Overview of Handheld Map Exploration Systems
        2.1.2 Virtual Reality Mode
      2.2 Virtual Reality
        2.2.1 Overview of Virtual Reality
        2.2.2 Effect of Immersion Level on Performance
      2.3 Eye Trackers
        2.3.1 Eye-Movement Behavior Research
        2.3.2 Eye-Movement Data Analysis
    Chapter 3 Methodology
      3.1 System Architecture
      3.2 Eye-Movement Coordinate Measurement in Virtual Reality
      3.3 Eye-Movement Coordinate Conversion Method for the Flat Screen Monitor
        3.3.1 Field of View
        3.3.2 View Space to World Space
    Chapter 4 Experiments and Results
      4.1 Experimental Platform and Tools
      4.2 Experimental Design
        4.2.1 Coordinate Conversion Validation Experiment
        4.2.2 Map Exploration Experiment
        4.2.3 Map Navigation Experiment
      4.3 Experimental Results
        4.3.1 Analysis of the Coordinate Conversion Validation Experiment
        4.3.2 Analysis of the Map Exploration Experiment
        4.3.3 Analysis of the Map Navigation Experiment
    Chapter 5 Conclusion and Future Work
      5.1 Conclusion
      5.2 Future Work
    References
    Appendix A

