
Author: Wen-Ting Tsai (蔡文亭)
Title: User performance in operating stereoscopic menus in augmented reality (立體擴增環境下使用者操作3D水平選單之績效研究)
Advisor: Chiuh-Siang Lin (林久翔)
Committee members: Bernard Jiang (江行全), Sheau-Farn Liang (梁曉帆)
Degree: Master
Department: Department of Industrial Management, College of Management
Year of publication: 2019
Graduation academic year: 107 (ROC calendar, 2018–2019)
Language: Chinese
Pages: 61
Keywords (Chinese): 擴增實境、立體實境、選單操作、使用者表現績效
Keywords (English): augmented reality, stereoscopic reality, menu operation, user performance
Record statistics: 230 views, 0 downloads
Given that devices for 3D environments such as AR and VR are becoming increasingly common, yet no consistent design principles exist for them, this study reviews current research on stereoscopic reality and examines the effect of three factors on user performance: display-environment design, interaction design, and task difficulty. Experiments were conducted with the horizontal menu, the menu type most commonly used in system operation.
    The display environment was designed around the monocular and binocular cues through which the human eye perceives 3D images; stereoscopic vision arises chiefly from the visual fusion of the images seen by each eye. Using the principle of negative parallax, the menu was rendered between the 3D TV screen and the viewer's eyes. The interaction mimicked the gesture-tracking input commonly used in augmented reality: a Leap Motion tracker monitored the coordinates and movement direction of the index fingertip, simulating the act of pressing a button in the real environment. Task difficulty was varied by changing button size and inter-button distance.
    The results show that the factors with the greatest effect on users were button size and inter-button distance: movement time decreased as buttons grew larger and inter-button distances shortened. Cursor-assisted objective distance measurement effectively reduced the error rate. Because the human eye misjudges distance in stereoscopic reality, performance was relatively better under a slightly negative parallax, though this effect did not reach significance. Button position also affected user performance: the button at the center of gaze had the lowest error rate. These results can be applied to menu design in the peripersonal space of stereoscopic reality.
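The negative-parallax geometry described above can be sketched with similar triangles: with crossed (negative) parallax, the left- and right-eye images swap sides on the screen, so the lines of sight intersect in front of it and the menu appears between screen and viewer. A minimal sketch, assuming symmetric viewing geometry; the function name, units, and example values are illustrative, not taken from the thesis:

```python
def perceived_distance_mm(viewer_to_screen_mm, ipd_mm, parallax_mm):
    """Distance from the eyes to the fused stereoscopic image.

    By similar triangles, with screen parallax p and interpupillary
    distance IPD:  |p| / (V - z) = IPD / z  =>  z = IPD * V / (IPD + |p|).
    Zero parallax (p = 0) places the image exactly on the screen plane;
    negative (crossed) parallax pulls it toward the viewer.
    """
    p = abs(parallax_mm)
    return ipd_mm * viewer_to_screen_mm / (ipd_mm + p)

# With a 65 mm IPD, a viewer 1000 mm from the screen, and -65 mm parallax,
# the menu appears 500 mm from the eyes, i.e. halfway to the screen.
print(perceived_distance_mm(1000, 65, -65))  # -> 500.0
```

The same formula explains why small parallax errors near the viewer translate into the distance-estimation gap the abstract mentions: perceived depth changes steeply with |p| when the image sits well in front of the screen.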


    Although 3D technologies such as AR and VR are becoming popular, there are still no consistent design principles for them. This study therefore reviews current research on stereoscopic reality and identifies several influencing factors: display design, interaction design, and task difficulty. Menus are important in system operation and are often operated by pointing. Since gesture tracking is a common input method in augmented reality, we used the Leap Motion to track users' hands and fingers to operate the system, simulating how users press buttons in a real environment.
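A press event of this kind can be detected by watching the tracked fingertip cross the plane of the virtual button. The sketch below is a hypothetical reconstruction, not the thesis's actual code: it operates on plain per-frame z-coordinates rather than the Leap Motion API, and the function name is invented for illustration:

```python
def find_press_frame(fingertip_z_mm, button_plane_z_mm):
    """Return the first frame index at which the fingertip, moving toward
    the screen (decreasing z), crosses the button plane; None if it never
    does.  fingertip_z_mm holds one z-coordinate per tracked frame."""
    for i in range(1, len(fingertip_z_mm)):
        # A press registers when z passes from in front of the plane
        # to on or behind it between two consecutive frames.
        if fingertip_z_mm[i - 1] > button_plane_z_mm >= fingertip_z_mm[i]:
            return i
    return None

# A fingertip approaching from 80 mm in front of a plane at z = 0:
print(find_press_frame([80, 45, 12, -3], 0))  # -> 3
```

A real implementation would additionally check that the crossing point lies within the button's x/y bounds and filter on movement direction, as the thesis describes for the index fingertip.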
    This study manipulated the display environment, the interaction mode, and the level of task difficulty to examine user performance. People use monocular and binocular cues to perceive 3D images, and stereoscopic images in particular are formed by the visual fusion of the two eyes' views. We compared negative parallax and zero parallax as display environments, interaction with and without a cursor, and several levels of task difficulty.
    The results show that the factors most affecting performance are the size of the menu buttons and the distance between them: as button size increases and inter-button distance decreases, both movement time and error rate decrease. In the cursor-assisted condition, users could judge distance effectively, so the error rate was reduced. Because humans perceive distance incorrectly in stereoscopic displays, the condition with the smaller negative parallax performed better. In addition, button position affected user performance: the center button had the lowest error rate. These results can be applied to the design of stereoscopic menus in peripersonal space.
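The reported size/distance trade-off follows the classic Fitts' law pattern (MacKenzie, 1992, in the reference list below). A small sketch using the Shannon formulation; the regression constants `a` and `b` are placeholders, not the fitted values from the thesis's regression table:

```python
import math

def index_of_difficulty(distance, width):
    # Shannon formulation (MacKenzie, 1992): ID = log2(D / W + 1), in bits.
    # Larger distance or smaller target width raises the difficulty.
    return math.log2(distance / width + 1)

def predicted_movement_time_s(distance, width, a=0.1, b=0.15):
    # Fitts' law: MT = a + b * ID.  a and b are illustrative placeholders
    # that would normally come from fitting observed movement times.
    return a + b * index_of_difficulty(distance, width)

# Larger buttons at the same inter-button distance predict faster pointing:
print(predicted_movement_time_s(120, 20))  # harder: small button
print(predicted_movement_time_s(120, 40))  # easier: large button
```

This matches the abstract's finding that movement time falls as buttons grow and distances shrink, since both changes reduce the index of difficulty.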

    Table of Contents

    Abstract (Chinese) · Abstract (English) · Acknowledgements · Contents · List of Figures · List of Tables

    Chapter 1 Introduction
      1.1 Research Background and Motivation
      1.2 Research Objectives
      1.3 Research Framework
        1.3.1 Problem Identification Stage
        1.3.2 Experiment Stage
      1.4 Research Limitations
    Chapter 2 Literature Review
      2.1 Human-Computer Interaction Methods
        2.1.1 Interaction Compatibility
        2.1.2 Distance Judgment
      2.2 Principles of Human Perception of Stereoscopic 3D Images
      2.3 Development and Applications of AR Stereoscopic Displays
      2.4 Input Methods for Virtual Reality
      2.5 Performance Measures
      2.6 Menu Design
    Chapter 3 Methods
      3.1 Participants
      3.2 Experimental Design and Research Model
        3.2.1 Study Design
        3.2.2 Independent Variables
        3.2.3 Dependent Variables
      3.3 Procedure
      3.4 Experimental Environment and Equipment
    Chapter 4 Results
      4.1 Data Processing
      4.2 Movement Time
      4.3 Error Rate
    Chapter 5 Discussion
    Chapter 6 Conclusions
    Appendix

    List of Figures
    Figure 1.1 Classification of mixed reality, redrawn from (Milgram & Kishino, 1994)
    Figure 1.2 Handheld mobile AR display (source: https://applemagazine.com/will-the-iphone-8-dabble-in-augmented-reality/33356)
    Figure 1.3 Video and spatial AR displays (source: http://www.augment.com/blog/3-consumer-giants-who-used-augmented-reality-for-retail/)
    Figure 1.4 Wearable AR display (source: https://blog.metavision.com/ces-2018-a-first-look-at-the-ar-supercar-you-can-touch-feel)
    Figure 2.1 Illustration of zero, negative, and positive parallax
    Figure 2.2 Zero-parallax region of Panum's fusional area, from (Raynaud, 2016)
    Figure 2.3 Comparison of the positive- and negative-parallax formulas
    Figure 2.4 Principle of stereoscopic imaging, redrawn from (Urey et al., 2011)
    Figure 2.5 Classification of AR sensing technologies, redrawn from (Rabbi & Ullah, 2013)
    Figure 2.6 Internal structure of the Leap Motion controller, from (Weichert et al., 2013)
    Figure 2.7 Effective detection range of the Leap controller (source: https://developer.leapmotion.com/documentation/java/api/Leap.Device.html)
    Figure 2.8 Vertical menu
    Figure 2.9 Horizontal menu
    Figure 2.10 Pie menu (source: https://doc.qt.io/archives/qq/qq09-qt-solutions.html)
    Figure 2.11 Radial menu (source: http://www.soliddna.com/SEHelp/ST5/EN/i_v/radialtb1h.htm)
    Figure 3.1 Research model
    Figure 3.2 Layout of the horizontal menu buttons
    Figure 3.3 Procedure within each experimental condition
    Figure 3.4 Distance between the horizontal menu and the screen
    Figure 3.5 Experimental setup and conditions
    Figure 3.6 Sony 3D TV (source: http://220v.com/product/1506/)
    Figure 3.7 Sony active shutter glasses
    Figure 3.8 Leap Motion device
    Figure 3.9 Experimental environment
    Figure 5.1 Effect of task difficulty on mean movement time
    Figure 5.2 Effect of task difficulty on error rate
    Figure 5.3 Effects of inter-button distance and button size on mean movement time
    Figure 5.4 Mean error rate by task difficulty and menu pop-out distance
    Figure 5.5 Mean error rate by task difficulty and cursor assistance
    Figure 5.6 Illustration of the menu buttons
    Figure 5.7 Mean error rate per button

    List of Tables
    Table 2.1 Distances corresponding to each IPD under the negative-parallax formula
    Table 3.1 Path numbers and distances
    Table 3.2 Cyclic sampling scheme and distance combinations
    Table 3.3 Task difficulty levels
    Table 4.1 Mean movement time of the experimental design, with ANOVA and Tukey HSD analyses
    Table 4.2 ANOVA and Tukey HSD analyses of the effects of button size and distance on mean movement time
    Table 4.3 Regression of mean movement time on button size and inter-button distance
    Table 4.4 Mean error rate of the experimental design, with ANOVA and Tukey HSD analyses
    Table 4.5 Mean error rate per button, with ANOVA

    1. Arnott, Stephen R, & Shedden, Judith M. (2000). Attention switching in depth using random-dot autostereograms: Attention gradient asymmetries. Perception & Psychophysics, 62(7), 1459-1473.
    2. Azuma, Ronald T. (1997). A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4), 355-385.
    3. Bowman, Doug A, Kruijff, Ernst, Jr., Joseph J. LaViola, & Poupyrev, Ivan. (2001). An introduction to 3-D user interface design. Presence: Teleoperators & Virtual Environments, 10(1), 96-108.
    4. Bowman, Doug A, & Wingrave, Chadwick A. (2001). Design and evaluation of menu systems for immersive virtual environments. Paper presented at the Proceedings of the Virtual Reality 2001 Conference.
    5. Brandl, Peter, Leitner, Jakob, Seifried, Thomas, Haller, Michael, Doray, Bernard, & To, Paul. (2009). Occlusion-aware menu design for digital tabletops. Paper presented at the CHI'09 Extended Abstracts on Human Factors in Computing Systems.
    6. Bruder, Gerd, Steinicke, Frank, & Stürzlinger, Wolfgang. (2013a). Effects of visual conflicts on 3D selection task performance in stereoscopic display environments. Paper presented at the Symposium on 3D User Interfaces 2013.
    7. Bruder, Gerd, Steinicke, Frank, & Stürzlinger, Wolfgang. (2013b). To touch or not to touch? Comparing 2D touch and 3D mid-air interaction on stereoscopic tabletop surfaces. Paper presented at the Proceedings of the 1st symposium on Spatial user interaction.
    8. Carmigniani, Julie, Furht, Borko, Anisetti, Marco, Ceravolo, Paolo, Damiani, Ernesto, & Ivkovic, Misa. (2011). Augmented reality technologies, systems and applications. Multimedia tools and applications, 51(1), 341-377.
    9. Colombo, Carlo, Del Bimbo, Alberto, & Valli, Alessandro. (2003). Visual capture and understanding of hand pointing actions in a 3-D environment. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 33(4), 677-686.
    10. Cutting, James E, & Vishton, Peter M. (1995). Perceiving layout and knowing distances: The integration, relative potency, and contextual use of different information about depth. In Perception of space and motion (pp. 69-117): Elsevier.
    11. Grossman, Tovi, & Balakrishnan, Ravin. (2004). Pointing at trivariate targets in 3D environments. Paper presented at the Proceedings of the SIGCHI conference on Human factors in computing systems.
    12. Grossman, Tovi, & Wigdor, Daniel. (2007). Going Deeper: a Taxonomy of 3D on the Tabletop. Paper presented at the Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems.
    13. Grossman, Tovi, Wigdor, Daniel, & Balakrishnan, Ravin. (2004). Multi-finger gestural interaction with 3D volumetric displays. Paper presented at the Proceedings of the 17th annual ACM symposium on User interface software and technology.
    14. Guna, Jože, Jakus, Grega, Pogačnik, Matevž, Tomažič, Sašo, & Sodnik, Jaka. (2014). An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking. Sensors, 14(2), 3702-3720.
    15. Hand, Chris. (1994). A survey of 3-D input devices. Department of Computing Science, De Montfort University, The Gateway, Leicester LE1 9BH.
    16. Hand, Chris. (1997). A survey of 3D interaction techniques. Paper presented at the Computer graphics forum.
    17. Hast, Anders. (2010). Game Engine Gems, Volume One: Jones & Bartlett Learning.
    18. Hinckley, Ken, & Wigdor, Daniel. (2002). The human-computer interaction handbook: fundamentals, evolving technologies and emerging applications.
    19. Hua, Hong, Girardot, Axelle, Gao, Chunyu, & Rolland, Jannick P. (2000). Engineering of head-mounted projective displays. Applied Optics, 39(22), 3814-3824.
    20. ISO. (2000). Ergonomic requirements for office work with visual display terminals (VDTs) -- Part 9: Requirements for non-keyboard input devices. In: International Organization for Standardization.
    21. Jacoby, Richard H, & Ellis, Stephen R. (1992). Using virtual menus in a virtual environment. Paper presented at the Visual Data Interpretation.
    22. Jones, J Adam, Swan II, J Edward, Singh, Gurjot, Kolstad, Eric, & Ellis, Stephen R. (2008). The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception. Paper presented at the Proceedings of the 5th symposium on Applied perception in graphics and visualization.
    23. Kipper, Greg, & Rampolla, Joseph. (2012). Augmented Reality: An Emerging Technologies Guide to AR: Elsevier Science.
    24. Kouroupetroglou, Georgios, Pino, Alexandros, Balmpakakis, Athanasios, Chalastanis, Dimitrios, Golematis, Vasileios, Ioannou, Nikolaos, & Koutsoumpas, Ioannis. (2012). Using Wiimote for 2D and 3D pointing tasks: gesture performance evaluation. Paper presented at the 9th International Gesture Workshop, GW 2011.
    25. Kulshreshth, Arun, Zorn, Chris, & LaViola, Joseph J. (2013). Poster: Real-time markerless kinect based finger tracking and hand gesture recognition for HCI. Paper presented at the Symposium on 3D User Interfaces 2013.
    26. Kyritsis, Markos, Gulliver, Stephen R, & Feredoes, Eva. (2016). Environmental factors and features that influence visual search in a 3D WIMP interface. International Journal of Human-Computer Studies, 92-93, 30-43.
    27. Lambooij, Marc, IJsselsteijn, Wijnand, Fortuin, Marten, & Heynderickx, Ingrid. (2009). Visual discomfort and visual fatigue of stereoscopic displays: A review. Journal of Imaging Science and Technology, 53(3), 30201-30201-30201-30214.
    28. Lee, Yung-Hui, Wu, Shu-Kai, & Liu, Yan-Pin. (2013). Performance of remote target pointing hand movements in a 3D environment. Human movement science, 32(3), 511-526.
    29. Lin, Chiuhsiang Joe, Ho, Sui-Hua, & Chen, Yan-Jyun. (2015). An investigation of pointing postures in a 3D stereoscopic environment. Applied Ergonomics, 48, 154-163.
    30. Lin, Chiuhsiang Joe, Woldegiorgis, Bereket H, & Caesaron, Dino. (2015). Distance estimation of near‐field visual objects in stereoscopic displays. Journal of the Society for Information Display, 22(7), 370-379.
    31. Lin, Chiuhsiang Joe, & Woldegiorgis, Bereket Haile. (2018). Kinematic analysis of direct pointing in projection-based stereoscopic environments. Human movement science, 57, 21-31.
    32. Lipton, Lenny. (1997). Stereographics Developers' Handbook: StereoGraphics Corporation.
    33. Liu, Hongzhe, Xi, Yulong, Song, Wei, Um, Kyhyun, & Cho, Kyungeun. (2013). Gesture-based NUI application for real-time path modification. Paper presented at the 2013 IEEE 11th International Conference on Dependable, Autonomic and Secure Computing.
    34. MacGregor, James, Lee, Eric, & Lam, Newman. (1986). Optimizing the structure of database menu indexes: A decision model of menu search. Human Factors, 28(4), 387-399.
    35. MacKenzie, I Scott. (1992). Fitts' law as a research and design tool in human-computer interaction. Human-computer interaction, 7(1), 91-139.
    36. Milgram, Paul, & Kishino, Fumio. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), 1321-1329.
    37. Nielsen, Michael, Störring, Moritz, Moeslund, Thomas B, & Granum, Erik. (2003). A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction. Paper presented at the Proceedings of the 5th International Gesture Workshop.
    38. Paap, Kenneth R, & Cooke, Nancy J. (1997). Handbook of Human-Computer Interaction (Second Edition). In Design of menus (pp. 533-572): Elsevier.
    39. Pressigout, Muriel, & Marchand, Eric. (2006). Hybrid tracking algorithms for planar and non-planar structures subject to illumination changes. Paper presented at the Mixed and Augmented Reality, 2006. ISMAR 2006. IEEE/ACM International Symposium on.
    40. Rabbi, Ihsan, & Ullah, Sehat. (2013). A survey on augmented reality challenges and tracking. Acta graphica: znanstveni časopis za tiskarstvo i grafičke komunikacije, 24(1-2), 29-46.
    41. Raynaud, Dominique. (2016). Studies on Binocular Vision (Vol. 47). Switzerland: Springer.
    42. Rolland, Jannick P., Baillot, Yohan, & Goon, Alexei A. (2001). A survey of tracking technology for virtual environments.
    43. Seigle, David. (2009). 3rd Dimension: Dimensionalization. Veritas Visus, 4(3), 69-75.
    44. Sherman, William R, & Craig, Alan B. (2002). Understanding Virtual Reality: Interface, Application, and Design. San Francisco, CA, USA: Morgan Kaufmann Inc.
    45. Silva, Robert. (2017). 3D Glasses Overview - Passive Polarized vs Active Shutter. In: Lifewire.
    46. Swan II, J Edward, Jones, Adam, Kolstad, Eric, Livingston, Mark A, & Smallman, Harvey S. (2007). Egocentric depth judgments in optical, see-through augmented reality. IEEE transactions on visualization and computer graphics, 13(3), 429-442.
    47. Thompson, William B, Willemsen, Peter, Gooch, Amy A, Creem-Regehr, Sarah H, Loomis, Jack M, & Beall, Andrew C. (2004). Does the quality of the computer graphics matter when judging distances in visually immersive environments? Presence: Teleoperators & Virtual Environments, 13(5), 560-571.
    48. Urey, Hakan, Chellappan, Kishore V, Erden, Erdem, & Surman, Phil. (2011). State of the art in stereoscopic and autostereoscopic displays. Proceedings of the IEEE, 99(4), 540-555.
    49. Valkov, Dimitar, Steinicke, Frank, Bruder, Gerd, & Hinrichs, Klaus. (2011). 2D touching of 3D stereoscopic objects. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
    50. Weichert, Frank, Bachmann, Daniel, Rudak, Bartholomäus, & Fisseler, Denis. (2013). Analysis of the Accuracy and Robustness of the Leap Motion Controller. Sensors, 13(5), 6380-6393.
    51. Welch, Robert B, Widawski, Mel H, Harrington, Janice, & Warren, David H. (1979). An examination of the relationship between visual capture and prism adaptation. Perception & Psychophysics, 25(2), 126-132.
    52. Zhai, Shumin. (1998). User performance in relation to 3D input device design. ACM Siggraph Computer Graphics, 32(4), 50-54.
    53. Zhai, Shumin, Kong, Jing, & Ren, Xiangshi. (2004). Speed–accuracy tradeoff in Fitts’ law tasks—on the equivalency of actual and nominal pointing precision. International Journal of Human-Computer Studies, 61(6), 823-856.
    54. Zhou, Feng, Duh, Henry Been-Lirn, & Billinghurst, Mark. (2008). Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. Paper presented at the Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.
    55. Institute of Labor, Occupational Safety and Health, Ministry of Labor (勞動部勞動及職業安全衛生研究所). (2016). Introduction to the anthropometric database and key measurements (人體計測資料庫簡介及重要計測值). Retrieved from https://www.ilosh.gov.tw/menu/1188/1201/%E4%BA%BA%E9%AB%94%E8%A8%88%E6%B8%AC%E8%B3%87%E6%96%99%E5%BA%AB/%E4%BA%BA%E9%AB%94%E8%A8%88%E6%B8%AC%E8%B3%87%E6%96%99%E5%BA%AB%E7%B0%A1%E4%BB%8B%E5%8F%8A%E9%87%8D%E8%A6%81%E8%A8%88%E6%B8%AC%E5%80%BC

    Full text available from 2024/01/23 (campus network).
    Full text not authorized for public release (off-campus network).
    Full text not authorized for public release (National Central Library: Taiwan thesis system).