| Field | Value |
|---|---|
| Graduate student | 蔡文亭 Wen-Ting Tsai |
| Thesis title | 立體擴增環境下使用者操作3D水平選單之績效研究 (User Performance in Operating Stereoscopic Menus in Augmented Reality) |
| Advisor | 林久翔 Chiuh-Siang Lin |
| Committee members | 江行全 Bernard Jiang, 梁曉帆 Sheau-Farn Liang |
| Degree | Master |
| Department | 管理學院 - 工業管理系 Department of Industrial Management |
| Year of publication | 2019 |
| Graduating academic year | 107 |
| Language | Chinese |
| Pages | 61 |
| Keywords (Chinese) | 擴增實境、立體實境、選單操作、使用者表現績效 |
| Keywords (English) | augmented reality, stereoscopic reality, menu operation, user performance |
Devices for 3D environments such as AR and VR are becoming increasingly common, yet consistent design principles for them are still lacking. This study therefore reviews current research on stereoscopic reality and examines how three factors affect user performance: the design of the display environment, the design of the interaction principles, and task difficulty. The experiment used the horizontal menu, the most common element in system operation.
The display environment was designed around the monocular and binocular cues the human eye uses to perceive 3D images; stereoscopic depth arises chiefly from the visual fusion of the two monocular images. Using the principle of negative parallax, this study presented the menu between the 3D TV screen and the viewer's eyes. The interaction mimicked the gesture-tracking input commonly used in augmented reality: a Leap gesture-tracking device monitored the coordinates and movement direction of the index fingertip to simulate pressing a button in a real environment. Task difficulty was adjusted by varying button size and inter-button distance.
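The negative-parallax geometry described above can be sketched numerically. Assuming an interocular distance e, a viewing distance D from the eyes to the screen, and a desired virtual object distance d (with d < D placing the object between screen and viewer), similar triangles give an on-screen image separation of p = -e(D - d)/d; the numbers below are illustrative, not the actual parameters used in this study.

```python
def screen_parallax(eye_sep_cm: float, view_dist_cm: float, obj_dist_cm: float) -> float:
    """On-screen horizontal separation of the left/right images (cm).

    A negative result means crossed (negative) parallax: the virtual
    object appears in front of the screen, between it and the viewer.
    Derived from similar triangles; obj_dist is measured from the eyes.
    """
    return -eye_sep_cm * (view_dist_cm - obj_dist_cm) / obj_dist_cm

# Illustrative values (not from the thesis): 6.5 cm interocular
# distance, viewer 100 cm from a 3D TV, menu rendered 80 cm away.
p = screen_parallax(6.5, 100.0, 80.0)      # crossed (negative) parallax
zero = screen_parallax(6.5, 100.0, 100.0)  # object on the screen plane
```

At obj_dist equal to the viewing distance the separation is zero (zero parallax), matching the screen-plane condition compared in the experiment.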
The results show that the factors with the greatest influence on users are button size and inter-button distance: movement time decreases as buttons grow larger and the distance between them shrinks. Objective distance measurement with cursor assistance effectively reduced the error rate. Because the human eye misjudges distance in stereoscopic environments, performance was relatively better under the slight-negative-parallax condition, although the effect did not reach significance. Button position also affected user performance, with the lowest error rate for the button at the visual center. These results can be applied to menu design in the peripersonal space of stereoscopic environments.
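The size/distance trade-off reported above is conventionally captured by Fitts' index of difficulty. A minimal sketch of the Shannon formulation follows; the regression coefficients a and b are placeholders, not values fitted in this study.

```python
import math

def index_of_difficulty(amplitude: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty (bits)."""
    return math.log2(amplitude / width + 1.0)

def predicted_movement_time(amplitude: float, width: float,
                            a: float = 0.1, b: float = 0.15) -> float:
    """MT = a + b * ID, with placeholder coefficients (seconds)."""
    return a + b * index_of_difficulty(amplitude, width)

# Larger buttons and shorter distances give a lower ID and hence a
# shorter predicted movement time, matching the direction of the
# reported results (distances/widths in cm, chosen for illustration).
easy = predicted_movement_time(amplitude=10.0, width=4.0)  # big, near
hard = predicted_movement_time(amplitude=30.0, width=1.0)  # small, far
assert easy < hard
```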
As 3D technologies such as AR and VR become popular while consistent design principles are still lacking, this study reviews current research on stereoscopic reality and sorts out several factors: the design of the display, the interaction, and task difficulty. Menus are important in system operation and are often operated by pointing. Since gesture tracking is a common input method in augmented reality, we used the Leap Motion to track the user's hands and fingers to operate the system, simulating how users press buttons in the real environment.
This study controls the display environment, the interaction mode, and the level of task difficulty to understand users' performance. People use monocular and binocular cues to perceive 3D images, and stereoscopic images are formed by the visual fusion of the two eyes' views. We compare negative parallax and zero parallax as different display environments, interaction with and without a cursor, and different levels of task difficulty.
The results show that the most important factors affecting performance are the size of the menu buttons and the distance between them. As button size increases and inter-button distance decreases, both movement time and error rate decrease. In the cursor-assisted condition, users could judge distance effectively, so the error rate was reduced. Since humans misperceive distance in stereoscopic displays, the smaller negative-parallax condition performed better. In addition, button position affected user performance: the center button had the lowest error rate. These results can be applied to the design of stereoscopic menus in peripersonal space.
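The button-press interaction described above can be illustrated with a simplified sketch: registering a press when a fingertip's z-coordinate crosses the button's plane while the tip lies within the button's bounds. This is a hypothetical stand-in for the study's Leap-based detection, using plain coordinate tuples rather than the Leap Motion SDK's objects.

```python
from typing import Iterable, Optional, Tuple

Point = Tuple[float, float, float]  # fingertip (x, y, z) in cm

def detect_press(samples: Iterable[Point],
                 button_center: Tuple[float, float],
                 button_width: float,
                 plane_z: float) -> Optional[Point]:
    """Return the first sample where the fingertip crosses the button
    plane (z decreasing through plane_z) inside the square button.

    A simplified stand-in for tracker-based press detection: a real
    system would also filter sensor noise and debounce repeat crossings.
    """
    prev_z = None
    half = button_width / 2.0
    for x, y, z in samples:
        crossed = prev_z is not None and prev_z > plane_z >= z
        on_button = (abs(x - button_center[0]) <= half
                     and abs(y - button_center[1]) <= half)
        if crossed and on_button:
            return (x, y, z)
        prev_z = z
    return None

# Hypothetical fingertip trajectory approaching a button at z = 0.
path = [(0.2, 0.1, 3.0), (0.1, 0.0, 1.2), (0.05, 0.02, -0.3)]
hit = detect_press(path, button_center=(0.0, 0.0),
                   button_width=2.0, plane_z=0.0)
```

A miss (crossing the plane outside the button bounds) returns `None`, which corresponds to the error events counted in the experiment.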