
Student: Jianli Wang (王建立)
Thesis title: 3D Gestural Pointing in Post-WIMP Interfaces (後WIMP介面中的3D手勢指向研究)
Advisor: Chien-Hsiung Chen (陳建雄)
Committee: Chih-Fu Wu (吳志富), Yen Hsu (許言), Chien-Hsiung Chen (陳建雄), Chih-Hsiang Ko (柯志祥), Jeng-Neng Fan (范振能)
Degree: Doctoral
Department: Department of Design, College of Design
Year of publication: 2022
Academic year of graduation: 110
Language: Chinese
Pages: 117
Keywords: Freehand pointing, Freehand interaction, Cursor control, Motion control, CD gain, Interaction design, Usability study
With the spread of commercial depth cameras and inertial sensors, people can now interact with computers through hand gestures. This has greatly enriched the scenarios in which we interact with computers and made the interaction itself more natural. Compared with the traditional mouse, however, freehand interaction still falls well short in usability: when the hand steers a cursor to point at and select small targets, efficiency is low, error rates are high, and the hand fatigues quickly. How to improve freehand performance in high-precision tasks and how to understand the cognitive patterns of freehand interaction are issues that urgently require thorough study.
    Through four experiments, this study starts from freehand cursor control for target pointing and selection, attempts to solve this problem, and explores a framework for freehand interaction from this perspective. First, two experiments on hand shape and the hand's Z-axis movement trajectory examine the behavioral habits and cognitive patterns of freehand pointing. A prototype is then designed based on these findings to further explore freehand interaction design. Finally, a new force-assisted prototype is designed and studied, and freehand interaction design is discussed from that angle.
    Study 1 found that, in distal interaction with large displays, users tend to use a finger for high-precision tasks and the whole hand for low-precision tasks. Accordingly, users moved small targets with an index-finger pointing hand shape and large targets with an open-hand shape. Because selecting small targets is extremely difficult, users stretched out their arms to shorten the interaction distance to the screen. In addition, movement direction had a significant effect on hand motion along the Z axis: starting from due right and rotating counterclockwise, the relationship between direction and Z-axis movement distance resembles a sine function.
    Study 2 designed a hand-cursor prototype in which an open-hand shape controls an area cursor (a cursor with a larger selection area) and an index-finger pointing shape controls a standard cursor (a cursor with a single-pixel selection area). To further assist pointing, the hand cursor's CD-gain transfer function and click-confirmation technique were improved, and the cursor's size was automatically controlled by the Z-axis movement distance. Usability testing showed that the hand cursor outperformed the conventional control method in small-target pointing and selection and achieved higher user satisfaction. Based on the results, design suggestions and applications of the hand cursor are summarized, and freehand interaction design is discussed.
    Study 3, building on the hand cursor's CD-gain transfer function from Study 2, designed a force-assisted cursor-control technique. Usability testing found that force-assisted cursor control was more efficient than the conventional cursor in small-target pointing and selection and yielded better user satisfaction. The hand's force output can be effectively combined with other modalities in human-computer interaction, and the magnitude of the force itself carries information about the required precision of the target task. The study also found that interacting with the computer bimanually is natural and effective, but bimanual coordination requires more practice to master, and performing complex continuous tasks with both hands simultaneously taxes users' cognition and leads to worse interaction performance.


    Low-cost motion- and depth-sensing technologies have changed the way humans interact with computers and have greatly enriched the scenarios in which users interact with them. Unlike mouse-based operation on desktop monitors, freehand distal pointing has no fixed surface support, which results in jitter and faster physical fatigue. These characteristics make it difficult to hold the cursor steady and to select small targets. Therefore, how to improve the performance of freehand interaction in high-precision tasks and how to understand the underlying user behaviors deserve further investigation.
    In this study, we explore the problems of freehand distal pointing through four experiments and discuss a framework for freehand interaction from this perspective. First, hand shape and hand movement in the Z dimension were examined in two experiments to explore user behaviors in freehand distal pointing. A "hand cursor" was then designed based on the results of these two experiments. Finally, a new force-assisted prototype was designed, and the framework was revisited.
    The first two experiments focus on the relationship between task precision and gestures, including hand shape and hand motion in the Z dimension. Our findings show that users prefer index-finger gestures and forward movement for high-precision selections, and open-hand gestures for large-target selections. Movement orientation had a significant effect on hand motion in the Z dimension, and the relationship between orientation and Z-dimension motion resembles a sinusoidal function.
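The reported orientation effect can be illustrated with a minimal sketch. This toy model only mirrors the qualitative finding (Z-displacement varies roughly sinusoidally with movement direction, measured counterclockwise from due right); the amplitude and baseline parameters are illustrative placeholders, not fitted values from the experiments.

```python
import math

def predicted_z_displacement(direction_deg, amplitude=1.0, baseline=0.0):
    """Toy model of the finding: Z-axis hand displacement varies
    roughly sinusoidally with movement direction, where 0 degrees
    points to the right and angles grow counterclockwise.
    amplitude and baseline are illustrative, not fitted values."""
    return baseline + amplitude * math.sin(math.radians(direction_deg))

# In this toy model, upward directions (0-180 degrees) produce
# forward (positive) Z movement, downward directions the reverse.
samples = {angle: predicted_z_displacement(angle) for angle in (0, 90, 180, 270)}
```

Under these assumptions, the model peaks at 90 degrees (straight up) and is most negative at 270 degrees, matching the sine-like shape described above.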
    The third experiment investigated a "hand cursor" technique in which an open hand controls an area cursor (i.e., a cursor with a larger active area) while an extended index finger switches the area cursor into a standard cursor (i.e., a cursor with a single-pixel active area). A pointer-acceleration (PA) technique based on an improved sigmoid transfer function and a dedicated selection technique were applied to the hand cursor to assist target acquisition, and the Z-dimension movement distance of the hand was parameterized to control the size of the hand cursor. A usability experiment compared the task performance and user satisfaction of the hand cursor and a standard cursor. The experimental results indicated that the hand cursor performed better in small-target selections and improved the user experience for young people. Based on these results, design suggestions and applications of the hand cursor are summarized to guide freehand interaction with the hand cursor.
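The hand-cursor mechanics described above can be sketched as follows. This is a simplified illustration, not the thesis' actual implementation: the function names, all numeric constants, and the linear Z-to-radius mapping are assumptions chosen only to show the structure (sigmoid speed-to-gain transfer, hand-shape mode switch, Z-controlled area-cursor size).

```python
import math

def sigmoid_cd_gain(speed, g_min=1.0, g_max=8.0, midpoint=0.3, steepness=12.0):
    """Sigmoid pointer-acceleration transfer function: slow hand
    movement gets a low CD gain for precision, fast movement a high
    gain for quick traversal. All constants are illustrative."""
    return g_min + (g_max - g_min) / (1.0 + math.exp(-steepness * (speed - midpoint)))

def cursor_update(hand_shape, hand_speed, z_distance):
    """Map the tracked hand state to a cursor mode and size:
    - "open_hand": area cursor whose radius shrinks as the hand
      moves forward (larger z_distance), easing small-target pointing
    - "index_finger": standard single-pixel cursor"""
    gain = sigmoid_cd_gain(hand_speed)
    if hand_shape == "open_hand":
        # Hypothetical linear mapping from forward reach (meters) to radius (px).
        radius = max(4.0, 60.0 - 100.0 * z_distance)
        return {"mode": "area", "radius": radius, "gain": gain}
    return {"mode": "standard", "radius": 0.5, "gain": gain}
```

A real implementation would also need hand-shape recognition and a click-confirmation technique, which this sketch leaves out.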
    The fourth experiment investigated a "force cursor" technique that combines pressure-based input from the non-preferred hand with the velocity and movement distance of the preferred hand to control a cursor for target acquisition. A usability experiment compared the task performance and user satisfaction of two variants of the force cursor and a standard cursor based on the original PA technique. The experimental results indicated that the force cursor achieved higher performance and a better user experience in target acquisition; that is, pressure-based input can be combined with other input channels in freehand interaction without adding too much cognitive burden. Based on the results, we found that bimanual interaction is a natural way to interact with the computer, although assigning overly complicated operations to both hands may overload users' cognition and lead to worse performance.
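The idea of blending the non-preferred hand's pressure into the PA function can be sketched as below. This is a hedged illustration under stated assumptions, not the thesis' actual transfer function: the constants and the multiplicative blend of pressure and velocity are hypothetical, chosen only to show how pressing harder can lower the CD gain for precise small-target acquisition.

```python
def force_assisted_gain(pressure, velocity, base_gain=4.0):
    """Sketch of a force-assisted PA function: pressure from the
    non-preferred hand (normalized to 0..1) scales the CD gain down
    so the preferred hand can point precisely at small targets, while
    faster preferred-hand movement raises the gain for fast traversal.
    Constants and the multiplicative blend are illustrative only."""
    clamped = min(max(pressure, 0.0), 1.0)
    precision_scale = 1.0 - 0.9 * clamped   # full pressure -> 10% of base gain
    velocity_scale = 1.0 + velocity          # faster movement -> higher gain
    return base_gain * precision_scale * velocity_scale
```

The design choice here is that pressure acts as an explicit "precision mode" signal, consistent with the finding that force magnitude itself carries information about the required task precision.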

    Chinese Abstract
    Abstract
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1 Introduction
      1.1 Research Background and Motivation
      1.2 Research Purpose and Scope
      1.3 Research Process
    Chapter 2 Literature Review
      2.1 Current Development of Gestural Interaction Research
        2.1.1 International Developments
        2.1.2 Developments in Taiwan
        2.1.3 Summary
      2.2 Gesture and Psycholinguistics
        2.2.1 Gesture Classification
        2.2.2 Deictic Gestures
        2.2.3 Summary
      2.3 Hand Motor Control and Target Pointing
        2.3.1 Upper-Limb Anatomy
        2.3.2 Methods of Controlling Cursor Movement by Gesture
        2.3.3 Target Selection Models
        2.3.4 Fitts' Law
        2.3.5 Cursor Pointing Techniques
        2.3.6 Force-Based Input in Human-Computer Interaction
      2.4 Gesture Recognition Technology
      2.5 Interface Design for Gestural Interaction
      2.6 Gestural Interaction and Usability Research
        2.6.1 Usability Principles
        2.6.2 Usability Evaluation of Gestural Interaction
        2.6.3 Quantitative and Qualitative Usability Metrics
    Chapter 3 Cognition and Behavior in Freehand Cursor Pointing: Hand Shape and Z-Axis Movement Trajectory
      3.1 Research Purpose
      3.2 Research Framework
      3.3 Experiment 1: Hand Shape in Freehand Pointing
        3.3.1 Participants
        3.3.2 Apparatus
        3.3.3 Experimental Design
        3.3.4 Procedure
        3.3.5 Results and Analysis
      3.4 Experiment 2: Z-Axis Movement Trajectory in Freehand Pointing
        3.4.1 Participants
        3.4.2 Apparatus
        3.4.3 Experimental Design
        3.4.4 Procedure
        3.4.5 Results and Analysis
      3.5 Design Suggestions and Summary
    Chapter 4 The Hand Cursor: Assisting Target Pointing and Selection with Hand Shape and Z-Axis Movement
      4.1 Research Purpose
      4.2 Hand Cursor Design
        4.2.1 Hand Shape
        4.2.2 Z-Axis Movement
        4.2.3 CD Gain
        4.2.4 Selection Command
      4.3 Experiment
        4.3.1 Participants
        4.3.2 Apparatus
        4.3.3 Experimental Design
        4.3.4 Tasks and Procedure
        4.3.5 Results and Analysis
      4.4 Design Suggestions and Summary
    Chapter 5 Force-Assisted Freehand Pointing
      5.1 Research Purpose
      5.2 Force-Based Human-Computer Interaction
      5.3 Force-Based PA Transfer Function
      5.4 Experiment
        5.4.1 Participants
        5.4.2 Apparatus
        5.4.3 Experimental Design
        5.4.4 Tasks and Procedure
      5.5 Results and Analysis
        5.5.1 Movement Time
        5.5.2 Satisfaction Evaluation
      5.6 Discussion
      5.7 Conclusion
    Chapter 6 Conclusions and Future Research
      6.1 Summary and Suggestions
      6.2 Future Research Plans
    References

    1. Abich, J., & Barber, D. J. (2017). The impact of human–robot multimodal communication on mental workload, usability preference, and expectations of robot behavior. Journal on Multimodal User Interfaces, 11(2), 211-225. doi:10.1007/s12193-016-0237-4
    2. Alkemade, R., Verbeek, F. J., & Lukosch, S. G. (2017). On the efficiency of a VR hand gesture-based interface for 3D object manipulations in conceptual design. International Journal of Human–Computer Interaction, 33(11), 882-901. doi:10.1080/10447318.2017.1296074
    3. Ardito, C., Buono, P., Costabile, M. F., & Desolda, G. (2015). Interaction with large displays: A survey. ACM Comput. Surv., 47(3), 1-38. doi:10.1145/2682623
    4. Balakrishnan, R. (2004). “Beating” Fitts’ law: Virtual enhancements for pointing facilitation. International Journal of Human-Computer Studies, 61(6), 857-874. doi:https://doi.org/10.1016/j.ijhcs.2004.09.002
    5. Ballendat, T., Marquardt, N., &Greenberg, S. (2010). Proxemic interaction: Designing for a proximity and orientation-aware environment. In Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, Saarbrücken, Germany, 7–10 November 2010.
    6. Baloup, M., Oudjail, V., Pietrzak, T., & Casiez, G. (2018). Pointing techniques for distant targets in virtual reality. In Proceedings of the 30th Conference on l’Interaction Homme-Machine, Brest, France, 23–26 October 2018; doi:10.1145/3286689.3286696.
    7. Barclay, K., Wei, D., Lutteroth, C., & Sheehan, R. (2011). A quantitative quality model for gesture based user interfaces. Paper presented at the Proceedings of the 23rd Australian Computer-Human Interaction Conference, Canberra, Australia.
    8. Bateman, S., Mandryk, R. L., Gutwin, C., & Xiao, R. (2013). Analysis and comparison of target assistance techniques for relative ray-cast pointing. International Journal of Human-Computer Studies, 71(5), 511-532.
    9. Baudel, T., & Beaudouin-Lafon, M. (1993). Charade: Remote control of objects using free-hand gestures. Commun. ACM, 36(7), 28-35. doi:10.1145/159544.159562
    10. Baudel, T., Beaudouin-Lafon, M., Braffort, A., & Teil, D. (1992). An interaction model designed for hand gesture input. LRI Res. Rep. 772, Sept.
    11. Baudisch, P., Cutrell, E., Robbins, D., Czerwinski, M., Tandler, P., Bederson, B., & AlexZierlinger. (2003). Drag-and-pop and drag-and-pick: Techniques for accessing remote screen content on touch- and pen-operated systems. Proceedings of Interact, pp. 57–64.
    12. Bernardis, P., & Gentilucci, M. (2006). Speech and gesture share the same communication system. Neuropsychologia, 44(2), 178-190. doi:https://doi.org/10.1016/j.neuropsychologia.2005.05.007
    13. Blanch, R., Guiard, Y., & Beaudouin-Lafon, M. (2004). Semantic pointing: improving target acquisition with control-display ratio adaptation. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria.
    14. Blanch, R., & Ortega, M. (2011). Benchmarking pointing techniques with distractors: Adding a density factor to Fitts’ pointing paradigm. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7 May 2011.
    15. Bolt, R. A. (1980). Put-that-there: Voice and gesture at the graphics interface. SIGGRAPH Comput. Graph., 14(3), 262-270. doi:10.1145/965105.807503
    16. Borg, E. & Kaijser, L. (2006). A comparison between three rating scales for perceived exertion and two different work tests. Scandinavian Journal of Medicine & Science in Sports, 16(1), 57-69.
    17. Borg, G. (1998). Borg's perceived exertion and pain scales. Champaign, IL, US: Human Kinetics.
    18. Boring, S., Jurmu, M., & Butz, A. (2009). Scroll, tilt or move it: using mobile phones to continuously control pointers on large public displays. Paper presented at the Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Design: Open 24/7, Melbourne, Australia, 2009.
    19. Boritz1, J., Booth2, K. S., & Cowan, W. B. (1991). Fitts’ law studies of directional mouse movement. In J. Neilson (Ed.), Proceedings of Graphics Interface ’91 (pp. 216–223). Toronto: CIPS.
    20. Bossavit, B., Marzo, A., Ardaiz, O., & Pina, A. (2013). Hierarchical menu selection with a body-centered remote interface. Interacting with Computers, 26(5), 389-402. doi:10.1093/iwc/iwt043
    21. Buchmann, V., Violich, S., Billinghurst, M., & Cockburn, A. (2004). FingARtips: Gesture based direct manipulation in augmented reality. Paper presented at the Proceedings of the 2nd international conference on Computer graphics and interactive techniques in Australasia and South East Asia, Singapore, 2004.
    22. Burger, B., Ferrané, I., Lerasle, F., & Infantes, G. (2012). Two-handed gesture recognition and fusion with speech to command a robot. Autonomous Robots, 32(2), 129-147. doi:10.1007/s10514-011-9263-y
    23. Burno, R. A., Wu, B., Doherty, R., Colett, H., & Elnaggar, R. (2015). Applying Fitts’ Law to Gesture Based Computer Interactions. Procedia Manufacturing, 3 (Supplement C), 4342-4349. doi:https://doi.org/10.1016/j.promfg.2015.07.429
    24. Carbini, S., Viallet, J.-E., Bernier, O., & Bascle, B. (2005). Tracking Body Parts of Multiple People for Multi-person Multimodal Interface, Berlin, Heidelberg.
    25. Card, S. K., Moran, T. P., & Newell, A. (1980). The keystroke-level model for user performance time with interactive systems. Commun. ACM, 23(7), 396-410. doi:10.1145/358886.358895
    26. Carvalho, D., Bessa, M., Magalhães, L., & Carrapatoso, E. (2018). Performance evaluation of different age groups for gestural interaction: A case study with Microsoft Kinect and Leap Motion. Universal Access in the Information Society, 17(1), 37-50. doi:10.1007/s10209-016-0518-4
    27. Cashion, J., Wingrave, C., & LaViola Jr, J. J. (2012). Dense and dynamic 3d selection for game-based virtual environments. IEEE transactions on visualization and computer graphics, 18(4), 634-642.
    28. Casiez, G., & Roussel, N. (2011). No more bricolage!: Methods and tools to characterize, replicate and compare pointing transfer functions. Paper presented at the Proceedings of the 24th annual ACM symposium on User interface software and technology, Santa Barbara, California, USA.
    29. Casiez, G., Roussel, N., & Vogel, D. (2012). 1€ Filter: a simple speed-based low-pass filter for noisy input in interactive systems, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM: Austin, Texas, USA. p. 2527-2530.
    30. Cha, Y., & Myung, R. (2013). Extended Fitts’ law for 3D pointing tasks using 3D target arrangements. Int. J. Ind. Ergon. 43, 350–355. doi:10.1016/j.ergon.2013.05.005.
    31. Chen, C.-H.; Wang, J.-L. (2020). The semantic meaning of hand shapes and Z-dimension movements of freehand distal pointing on large displays. Symmetry 2020, 12, 329.
    32. Chen, P., Hu, Y., & Yang, F. (2018). A conformal geometric algebra method for virtual hand modeling and interaction. EURASIP Journal on Image and Video Processing, 2018(1), 72. doi:10.1186/s13640-018-0318-2
    33. Chen, X. A., Schwarz, J., Harrison, C., Mankoff, J., & Hudson, S. E. (2014). Air+touch: Interweaving touch & in-air gestures. Paper presented at the Proceedings of the 27th annual ACM symposium on User interface software and technology, Honolulu, Hawaii, USA.
    34. Chen, Z., Ma, X., Peng, Z., Zhou, Y., Yao, M., Ma, Z., . . . Shen, M. (2018). User-defined gestures for gestural interaction: Extending from hands to other body parts. International Journal of Human–Computer Interaction, 34(3), 238-250. doi:10.1080/10447318.2017.1342943
    35. Cho, K., Lee, J.-H., Lee, B.-T., & Park, E. (2015). Effects of feedforward in in-air remote pointing. International Journal of Human–Computer Interaction, 31(2), 89-100. doi:10.1080/10447318.2014.959107
    36. Choi, E., Kwon, S., Lee, D., Lee, H., & Chung, M. K. (2014). Towards successful user interaction with systems: Focusing on user-derived gestures for smart home systems. Applied Ergonomics, 45(4), 1196-1207. doi:https://doi.org/10.1016/j.apergo.2014.02.010
    37. Choumane, A., Casiez, G., & Grisoni, L. (2010). Buttonless clicking: Intuitive select and pick-release through gesture analysis. Paper presented at the 2010 IEEE Virtual Reality Conference (VR).
    38. Chu, C.-C. P., Dani, T. H., & Gadh, R. (1997). Multi-sensory user interface for a virtual-reality-based computeraided design system. Computer-Aided Design, 29(10), 709-725. doi:https://doi.org/10.1016/S0010-4485(97)00021-3
    39. Clark, A., Dünser, A., Billinghurst, M., Piumsomboon, T., & Altimira, D. (2011). Seamless interaction in space. Paper presented at the Proceedings of the 23rd Australian Computer-Human Interaction Conference, Canberra, Australia.
    40. Cochet, H., & Vauclair, J. (2014). Deictic gestures and symbolic gestures produced by adults in an experimental context: Hand shapes and hand preferences. Laterality: Asymmetries of Body, Brain and Cognition, 19(3), 278-301. doi:10.1080/1357650X.2013.804079
    41. Cockburn, A., Quinn, P., Gutwin, C., Ramos, G., & Looser, J. (2011). Air pointing: Design and evaluation of spatial target acquisition with and without visual feedback. International Journal of Human-Computer Studies, 69(6), 401-414. doi:https://doi.org/10.1016/j.ijhcs.2011.02.005
    42. Coelho, J. C., & Verbeek, F. J. (2014). Pointing task evaluation of leap motion controller in 3D virtual environment. Paper presented at the In Creating the Difference: Proceedings of the Chi Sparks 2014 Conference.
    43. Crossman, E. R. F. W., & Goodeve, P. J. (1983). Feedback control of hand-movement and Fitts' law. The Quarterly Journal of Experimental Psychology Section A, 35(2), 251-278. doi:10.1080/14640748308402133
    44. Czerwinski, M., Robertson, G., Meyers, B., Smith, G., Robbins, D., & Tan, D. (2006). Large display research overview. In CHI'06 extended abstracts on Human factors in computing systems.
    45. Dam, A. v. (1997). Post-WIMP user interfaces. Commun. ACM, 40(2), 63-67. doi:10.1145/253671.253708
    46. Dasiyici, M. C. (2008). Multi-scale Cursor: Optimizing mouse interaction for large personal workspaces. M.S. Thesis, Computer Science.
    47. Debarba, H., Nedel, L., & Maciel, A. (2012). LOP-cursor: Fast and precise interaction with tiled displays using one hand and levels of precision. Paper presented at the 2012 IEEE Symposium on 3D User Interfaces (3DUI).
    48. Dong, H. W., Danesh, A., Figueroa, N., & El Saddik, A. (2015). An elicitation study on gesture preferences and memorability toward a practical hand-gesture vocabulary for smart televisions. Ieee Access, 3, 543-555. doi:10.1109/access.2015.2432679
    49. Endo, Y., Fujita, D., & Komuro, T. (2017). Distant pointing user interfaces based on 3D hand pointing recognition. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces, Brighton, UK. doi:10.1145/3132272.3132292
    50. Figueiredo, M., Bohm, K., & Teixeira, J. (1993). Advanced interaction techniques in virtual environments. Computers & Graphics, 17(6), 655-661. doi:10.1016/0097-8493(93)90114-o
    51. Fikkert, W., Vet, P. v. d., Veer, G. v. d., & Nijholt, A. (2010). Gestures for large display control, Berlin, Heidelberg.
    52. Findlater, L., Jansen, A., Shinohara, K., Dixon, M., Kamb, P., Rakita, J., & Wobbrock, J. O. (2010). Enhanced area cursors: reducing fine pointing demands for people with motor impairments. Paper presented at the Proceedings of the 23nd annual ACM symposium on User interface software and technology, New York, USA.
    53. Fitts, P. M. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47(6), 381-391. doi:10.1037/h0055392
    54. Foehrenbach, S., et al. (2008). Natural interaction with hand gestures and tactile feedback for large, high-res displays. In: MITH'08: Workshop on Multimodal Interaction Through Haptic Feedback, held in conjunction with AVI'08: International Working Conference on Advanced Visual Interfaces.
    55. Foehrenbach, S., König, W. A., Gerken, J., & Reiterer, H. (2009). Tactile feedback enhanced hand gesture interaction at large, high-resolution displays. Journal of Visual Languages & Computing, 20(5), 341-351. doi:https://doi.org/10.1016/j.jvlc.2009.07.005
    56. Freeman, D., Vennelakanti, R., & Madhvanath, S. (2012). Freehand pose-based gestural interaction: Studies and implications for interface design. Paper presented at the 2012 4th International Conference on Intelligent Human Computer Interaction (IHCI).
    57. Fruchard B., Strohmeier P., Bennewitz R., & Steimle J. (2021). Squish this: Force input on soft surfaces for visual targeting tasks. presented at the Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan. doi.org/10.1145/3411764.3445623.
    58. Gerard P. van Galen, & Jong, W. P. d. (1995). Fitts' law as the outcome of a dynamic noise filtering model of motor control. Human Movement Science, 14(4), 539-571.
    59. Casiez, G., Vogel, D., Balakrishnan, R., & Cockburn, A. (2008). The impact of control-display gain on user performance in pointing tasks. Human–computer interaction, 23(3), 215-250.
    60. Ghidary, S. S., Nakata, Y., Saito, H., Hattori, M., & Takamori, T. (2002). Multi-modal interaction of human and home robot in the context of room map generation. Autonomous Robots, 13(2), 169-184. doi:10.1023/a:1019689509522
    61. Gomez-Donoso, F., Orts-Escolano, S., Garcia-Garcia, A., Garcia-Rodriguez, J., Castro-Vargas, J. A., Ovidiu-Oprea, S., & Cazorla, M. (2017). A robotic platform for customized and interactive rehabilitation of persons with disabilities. Pattern Recognition Letters, 99, 105-113. doi:https://doi.org/10.1016/j.patrec.2017.05.027
    62. Goth, G. (2011). Brave NUI world. Commun. ACM, 54(12), 14-16. doi:10.1145/2043174.2043181
    63. Grossman, T., & Balakrishnan, R. (2005). The bubble cursor: Enhancing target acquisition by dynamic resizing of the cursor's activation area. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, Oregon, USA.
    64. Guiard, Y., Blanch, R., & Beaudouin-Lafon, M. (2004). Object pointing: A complement to bitmap pointing in GUIs. Paper presented at the Proceedings of Graphics Interface 2004, London, Ontario, Canada.
    65. Guinness, D., Jude, A., Poor, G.M., & Dover, A. (2015). Models for rested touchless gestural interaction. In Proceedings of the 3rd ACM Symposium on Spatial User Interaction, Los Angeles, Ca, USA.
    66. Haque, F., Nancel, M., & Vogel, D. (2015). Myopoint: Pointing and clicking using forearm mounted electromyography and inertial motion sensors. Paper presented at the Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea.
    67. Harrison, C. & Dey, A.K. (2008). Lean and zoom: proximity-aware user interface and content magnification, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Italy.
    68. Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (task load index): Results of empirical and theoretical research. In P. A. Hancock & N. Meshkati (Eds.), Advances in Psychology (Vol. 52, pp. 139-183): North-Holland.
    69. Heo S. & Lee G. (2011). Force gestures: augmented touch screen gestures using normal and tangential force. presented at the CHI '11 Extended Abstracts on Human Factors in Computing Systems, Vancouver, BC, Canada. doi.org/10.1145/1979742.1979895.
    70. Herot C. F., Weinzapfel G. (1978). One-point touch input of vector information for computer displays. presented at the Proceedings of the 5th annual conference on Computer graphics and interactive techniques. doi.org/10.1145/800248.807392.
    71. Herrera-Acuna, R., Argyriou, V., & Velastin, S. A. (2015). A Kinect-based 3D hand-gesture interface for 3D databases. Journal on Multimodal User Interfaces, 9(2), 121-139. doi:10.1007/s12193-014-0173-0
    72. Hespanhol, L., Tomitsch, M., Grace, K., Collins, A., & Kay, J. (2012). Investigating intuitiveness and effectiveness of gestures for free spatial interaction with large displays. Paper presented at the Proceedings of the 2012 International Symposium on Pervasive Displays, Porto, Portugal.
    73. Hincapi-Ramos, J. D., Guo, X., Moghadasian, P., & Irani, P. (2014). Consumed endurance: a metric to quantify arm fatigue of mid-air interactions. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, Ontario, Canada.
    74. Hsu, S. H., Huang, C. C., Tsuang, Y. H., & Sun, J. S. (1999). Effects of age and gender on remote pointing performance and their design implications. International Journal of Industrial Ergonomics, 23(5), 461-471. doi:https://doi.org/10.1016/S0169-8141(98)00013-4
    75. Huang, T. F., Chao, P. C. P., & Kao, Y. Y. (2012). Tracking, recognition, and distance detection of hand gestures for a 3‐D interactive display. Journal of the Society for Information Display, 20(4), 180-196.
    76. Jacob, M. G., & Wachs, J. P. (2016). Optimal modality selection for cooperative human–robot task completion. Ieee Transactions on Cybernetics, 46(12), 3388-3400. doi:10.1109/TCYB.2015.2506985
    77. Jahani, H., & Kavakli, M. (2018). Exploring a user-defined gesture vocabulary for descriptive mid-air interactions. Cognition, Technology & Work, 20(1), 11-22. doi:10.1007/s10111-017-0444-0
    78. Jetter, H.-C., Reiterer, H., & Geyer, F. (2014). Blended interaction: Understanding natural human–computer interaction in post-WIMP interactive spaces. Personal and Ubiquitous Computing, 18(5), 1139-1158. doi:10.1007/s00779-013-0725-4
    79. Jones, K. S., McIntyre, T. J., & Harris, D. J. (2020). Leap motion-and mouse-based target selection: Productivity, perceived comfort and fatigue, user preference, and perceived usability. International Journal of Human–Computer Interaction, 36(7), 621-630.
    80. Jota, R., Nacenta, M. A., Jorge, J. A., Carpendale, S., & Greenberg, S. (2010). A comparison of ray pointing techniques for very large displays. Paper presented at the Proceedings of Graphics Interface 2010, Ottawa, Ontario, Canada.
    81. Jota, R., Pereira, J. M., & Jorge, J. A. (2009). A comparative study of interaction metaphors for large-scale displays. Paper presented at the CHI '09 Extended Abstracts on Human Factors in Computing Systems, Boston, MA, USA.
    82. Jude, A., Poor, G. M., & Guinness, D. (2014a). An evaluation of touchless hand gestural interaction for pointing tasks with preferred and non-preferred hands. Paper presented at the Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, Helsinki, Finland.
    83. Jude, A., Poor, G. M., & Guinness, D. (2014b). Personal space: user defined gesture space for GUI interaction. Paper presented at the CHI '14 Extended Abstracts on Human Factors in Computing Systems, Toronto, Ontario, Canada.
    84. Kabbash, P., & Buxton, W. A. S. (1995). The "Prince" technique: Fitts' law and selection using area cursors. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Denver, Colorado, USA.
    85. Kabbash, P., MacKenzie, I. S., & Buxton, W. (1993). Human performance using computer input devices in the preferred and non-preferred hands. Paper presented at the Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands.
    86. Kajastila, R., & Lokki, T. (2013). Eyes-free interaction with free-hand gestures and auditory menus. International Journal of Human-Computer Studies, 71(5), 627-640. doi:10.1016/j.ijhcs
    87. Kakez, S., Conan, V., & Bisson, P. (1997). Virtually documented environments: A new interface paradigm for task-oriented access to information. Computer Graphics Forum, 16(3), C319-C327. doi:10.1111/1467-8659.00169
    88. Kendon, A. (1986). Current issues in the study of gesture. The biological foundations of gestures: Motor and semiotic aspects, 1, 23-47.
    89. Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge, UK: Cambridge University Press.
    90. Kettebekov, S., & Sharma, R. (2001). Toward natural gesture/speech control of a large display, Berlin, Heidelberg.
    91. Kim, H., Suh, K., & Lee, E. (2017). Multi-modal user interface combining eye tracking and hand gesture recognition. Journal on Multimodal User Interfaces, 11(3), 241-250. doi:10.1007/s12193-017-0242-2
    92. Kim, H., Oh, S., Han, S. H., & Chung, M. K. (2019). Motion-display gain: A new control-display mapping reflecting natural human pointing gesture to enhance interaction with large displays at a distance. International Journal of Human-Computer Interaction, 35(2), 180-195.
    93. Kim, M., & Lee, J. Y. (2016). Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimedia Tools and Applications, 75(23), 16529-16550. doi:10.1007/s11042-016-3355-9
    94. Kobayashi, M., & Igarashi, T. (2008). Ninja cursors: using multiple cursors to assist target acquisition on large screens. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy.
    95. Kopper, R., Bacim, F., & Bowman, D. A. (2011). Rapid and accurate 3D selection by progressive refinement. Paper presented at the 2011 IEEE Symposium on 3D User Interfaces (3DUI).
    96. Kopper, R., Bowman, D. A., Silva, M. G., & McMahan, R. P. (2010). A human motor behavior model for distal pointing tasks. International Journal of Human-Computer Studies, 68(10), 603-615. doi:https://doi.org/10.1016/j.ijhcs.2010.05.001
    97. Köpsel, A., Majaranta, P., Isokoski, P., & Huckauf, A. (2016). Effects of auditory, haptic and visual feedback on performing gestures by gaze or by hand. Behaviour & Information Technology, 35(12), 1044-1062. doi:10.1080/0144929X.2016.1194477
    98. Kouroupetroglou, G., Pino, A., Balmpakakis, A., Chalastanis, D., Golematis, V., Ioannou, N., & Koutsoumpas, I. (2012). Using wiimote for 2D and 3D pointing tasks: Gesture performance evaluation. In E. Efthimiou, G. Kouroupetroglou, & S.-E. Fotinea (Eds.), Gesture and Sign Language in Human-Computer Interaction and Embodied Communication: 9th International Gesture Workshop, GW 2011, Athens, Greece, May 25-27, 2011, Revised Selected Papers (pp. 13-23). Berlin, Heidelberg: Springer Berlin Heidelberg.
    99. Krapichler, C., Haubner, M., Lösch, A., Schuhmann, D., Seemann, M., & Englmeier, K.-H. (1999). Physicians in virtual environments-multimodal human-computer interaction. Interacting with Computers, 11(4), 427-452. doi:10.1016/s0953-5438(98)00060-5
    100. Lam, M. C., Arshad, H., Prabuwono, A. S., Tan, S. Y., & Kahaki, S. M. M. (2018). Interaction techniques in desktop virtual environment: the study of visual feedback and precise manipulation method. Multimedia Tools and Applications, 77(13), 16367-16398. doi:10.1007/s11042-017-5205-9
    101. Lee, B., Isenberg, P., Riche, N. H., & Carpendale, S. (2012). Beyond mouse and keyboard: expanding design considerations for information visualization interactions. IEEE Transactions on Visualization and Computer Graphics, 18(12), 2689-2698. doi:10.1109/TVCG.2012.204
    102. Lee B., Lee H., Lim S.-C., Lee H., Han S., & Park J. (2012). Evaluation of human tangential force input performance. in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Association for Computing Machinery, pp. 3121-3130.
    103. Lee, L. H., Yeung, N. Y., Braud, T., Li, T., Su, X., & Hui, P. (2020). Force9: Force-assisted miniature keyboard on smart wearables. presented at the Proceedings of the 2020 International Conference on Multimodal Interaction, Virtual Event, Netherlands. doi.org/10.1145/3382507.3418827.
    104. Lee L. H., Zhu Y., Yau Y., Braud T., Su X. and Hui P. (2020). One-thumb text acquisition on force-assisted miniature interfaces for mobile headsets. in 2020 IEEE International Conference on Pervasive Computing and Communications (PerCom), pp. 1-10. doi: 10.1109/PerCom45495.2020.9127378
    105. Lee, M., Billinghurst, M., Baek, W., Green, R., & Woo, W. (2013). A usability study of multimodal input in an augmented reality environment. Virtual Reality, 17(4), 293-305. doi:10.1007/s10055-013-0230-0
    106. Lewis, J. R. (2014). Usability: Lessons learned … and yet to be learned. International Journal of Human–Computer Interaction, 30(9), 663-684. doi:10.1080/10447318.2014.930311
    107. Li, A. X., Lou, X. L., Hansen, P., & Peng, R. (2016). On the influence of distance in the interaction with large displays. Journal of Display Technology, 12(8), 840-850. doi:10.1109/jdt.2016.2527704
    108. Li, X., Chen, C.-H. & Huang, Y.-P. (2016). 3D interactive system based on vision computing of direct-flective cameras. Journal of the Society for Information Display, 24(8), 521-528.
    109. Liang, H., Chang, J., Kazmi, I. K., Zhang, J. J., & Jiao, P. F. (2017). Hand gesture-based interactive puppetry system to assist storytelling for children. Visual Computer, 33(4), 517-531. doi:10.1007/s00371-016-1272-6
    110. Liang, J., & Green, M. (1994). JDCAD: A highly interactive 3D modeling system. Computers & Graphics, 18(4), 499-506. doi:https://doi.org/10.1016/0097-8493(94)90062-0
    111. Liang, Z., Xu, X., & Zhou, S. (2017). The smallest target size for a comfortable pointing in freehand space: Human pointing precision of freehand interaction. Universal Access in the Information Society, 16(2), 381-393. doi:10.1007/s10209-016-0464-1
    112. Lin, J., Harris-Adamson, C., & Rempel, D. (2019). The design of hand gestures for selecting virtual objects. Int. J. Hum. Comput. Interact, 35, 1729–1735. doi:10.1080/10447318.2019.1571783
    113. Löcken, A., Hesselmann, T., Pielot, M., Henze, N., & Boll, S. (2012). User-centred process for the definition of free-hand gestures applied to controlling music playback. Multimedia Systems, 18(1), 15-31. doi:10.1007/s00530-011-0240-2
    114. Lou, X., Peng, R., Hansen, P., & Li, X. A. (2017). Effects of user's hand orientation and spatial movements on free hand interactions with large displays. International Journal of Human-Computer Interaction, 1-14. doi:10.1080/10447318.2017.1370811
    115. Lubos, P., Bruder, G., Ariza, O., & Steinicke, F. (2016). Touching the sphere: Leveraging joint-centered kinespheres for spatial user Interaction. Paper presented at the Proceedings of the 2016 Symposium on Spatial User Interaction, Tokyo, Japan.
    116. MacKenzie, I. S. (1992). Fitts' law as a research and design tool in human-computer interaction. Hum.-Comput. Interact., 7(1), 91-139. doi:10.1207/s15327051hci0701_3
    117. MacKenzie, I. S., Kauppinen, T., & Silfverberg, M. (2001). Accuracy measures for evaluating computer pointing devices. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Seattle, Washington, USA.
    120. Mäkelä, V., Heimonen, T., & Turunen, M. (2014). Magnetic cursor: Improving target selection in freehand pointing interfaces. Paper presented at the Proceedings of The International Symposium on Pervasive Displays, Copenhagen, Denmark.
    121. Matulic, F., & Vogel, D. (2018). Multiray: Multi-finger raycasting for large displays. Paper presented at the Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal QC, Canada.
    122. McGuffin, M. J., & Balakrishnan, R. (2005). Fitts' law and expanding targets: Experimental studies and designs for user interfaces. ACM Trans. Comput.-Hum. Interact., 12(4), 388-422. doi:10.1145/1121112.1121115
    123. McGuffin, M., & Balakrishnan, R. (2002). Acquisition of expanding targets. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Minneapolis, Minnesota, USA.
    124. McNeill, D. (1992). Hand and mind: What gestures reveal about thought. Chicago, IL: University of Chicago Press.
    125. Meyer, D. E., Abrams, R. A., Kornblum, S., Wright, C. E., & Smith, J. E. K. (1988). Optimality in human motor performance: Ideal control of rapid aimed movements. Psychological Review, 95, 340-370.
    126. Mizobuchi, S., Terasaki, S., Keski-Jaskari, T., Nousiainen, J., Ryynanen, M., & Silfverberg, M. (2005). Making an impression: Force-controlled pen input for handheld devices. In CHI '05 Extended Abstracts on Human Factors in Computing Systems (pp. 1661-1664). Association for Computing Machinery.
    127. Morrel-Samuels, P. (1990). Clarifying the distinction between lexical and gestural commands. International Journal of Man-Machine Studies, 32(5), 581-590. doi:10.1016/S0020-7373(05)80034-3
    128. Morris, M. R., Danielescu, A., Drucker, S., Fisher, D., Lee, B., Schraefel, M. C., & Wobbrock, J. O. (2014). Reducing legacy bias in gesture elicitation studies. interactions, 21(3), 40-45. doi:10.1145/2591689
    129. Moscovich, T., & Hughes, J. F. (2006). Multi-finger cursor techniques. Paper presented at the Proceedings of Graphics Interface 2006, Quebec, Canada.
    130. Murata, A. (1999). Extending effective target width in Fitts' law to a two-dimensional pointing task. International Journal of Human–Computer Interaction, 11(2), 137-152. doi:10.1207/S153275901102_4
    131. Murata, A., & Iwase, H. (2001). Extending Fitts' law to a three-dimensional pointing task. Human Movement Science, 20(6), 791-805.
    132. Nancel, M., Chapuis, O., Pietriga, E., Yang, X.-D., Irani, P. P., & Beaudouin-Lafon, M. (2013). High-precision pointing on large wall displays using small handheld devices. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France.
    134. Nancel, M., Pietriga, E., Chapuis, O., & Beaudouin-Lafon, M. (2015). Mid-air pointing on ultra-walls. ACM Trans. Comput.-Hum. Interact., 22(5), 1-62. doi:10.1145/2766448
    136. Ni, T., Bowman, D. A., North, C., & McMahan, R. P. (2011). Design and evaluation of freehand menu selection interfaces using tilt and pinch gestures. International Journal of Human-Computer Studies, 69(9), 551-562. doi:10.1016/j.ijhcs.2011.05.001
    137. Nielsen, M., Störring, M., Moeslund, T. B., & Granum, E. (2004). A procedure for developing intuitive and ergonomic gesture interfaces for HCI. In A. Camurri & G. Volpe (Eds.), Gesture-Based Communication in Human-Computer Interaction: 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers (pp. 409-420). Berlin, Heidelberg: Springer Berlin Heidelberg.
    138. Norman, D. A. (2010). Natural user interfaces are not natural. interactions, 17(3), 6-10. doi:10.1145/1744161.1744163
    139. Norman, D. A., & Nielsen, J. (2010). Gestural interfaces: A step backward in usability. interactions, 17(5), 46-49. doi:10.1145/1836216.1836228
    140. Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia, 9(1), 97-113. doi:10.1016/0028-3932(71)90067-4
    141. Park, Y., Kim, J., & Lee, K. (2015). Effects of auditory feedback on menu selection in hand-gesture interfaces. IEEE MultiMedia, 22(1), 32-40. doi:10.1109/mmul.2015.5
    142. Pereira, A., Wachs, J. P., Park, K., & Rempel, D. (2015). A user-developed 3D hand gesture set for human-computer interaction. Human Factors, 57(4), 607-621. doi:10.1177/0018720814559307
    143. Peres, S. C., Nguyen, V., Kortum, P. T., Akladios, M., Wood, S. B., & Muddimer, A. (2009). Software ergonomics: Relating subjective and objective measures. Paper presented at the CHI '09 Extended Abstracts on Human Factors in Computing Systems, Boston, MA, USA.
    144. Petford, J., Nacenta, M.A., & Gutwin, C. (2018). Pointing all around you: Selection performance of mouse and ray-cast pointing in full-coverage displays. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada. doi:10.1145/3173574.3174107.
    145. Pino, A., Tzemis, E., Ioannou, N., & Kouroupetroglou, G. (2013). Using kinect for 2D and 3D pointing tasks: Performance evaluation. In M. Kurosu (Ed.), Human-Computer Interaction. Interaction Modalities and Techniques: 15th International Conference, HCI International 2013, Las Vegas, NV, USA, July 21-26, 2013, Proceedings, Part IV (pp. 358-367). Berlin, Heidelberg: Springer Berlin Heidelberg.
    146. Quek, F. K. H. (1996). Unencumbered gestural interaction. IEEE MultiMedia, 3(4), 36-47. doi:10.1109/93.556459
    147. Quek, F., McNeill, D., Bryll, R., Duncan, S., Ma, X.-F., Kirbas, C., & Ansari, R. (2002). Multimodal human discourse: Gesture and speech. ACM Trans. Comput.-Hum. Interact., 9(3), 171-193. doi:10.1145/568513.568514
    148. Ramos, G., Boulos, M., & Balakrishnan, R. (2004). Pressure widgets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 487-494). Association for Computing Machinery.
    149. Rempel, D., Camilleri, M. J., & Lee, D. L. (2015). The design of hand gestures for human-computer interaction: Lessons from sign language interpreters. International Journal of Human-Computer Studies, 72(10-11), 728-735. doi:10.1016/j.ijhcs.2014.05.003
    150. Ren, G., & O'Neill, E. (2012). 3D marking menu selection with freehand gestures. Paper presented at the 2012 IEEE Symposium on 3D User Interfaces (3DUI).
    151. Ren, G., & O'Neill, E. (2013). 3D selection with freehand gesture. Computers & Graphics, 37(3), 101-120. doi:10.1016/j.cag.2012.12.006
    152. Robertson, G., Czerwinski, M., Baudisch, P., Meyers, B., Robbins, D., Smith, G., & Tan, D. (2005). The large-display user experience. IEEE Computer Graphics and Applications, 25(4), 44-51. doi:10.1109/MCG.2005.88
    153. Rooney, C., & Ruddle, R. (2012). Improving window manipulation and content interaction on high-resolution, wall-sized displays. International Journal of Human–Computer Interaction, 28(7), 423-432. doi:10.1080/10447318.2011.608626
    154. Rubine, D. (1992). Combining gestures and direct manipulation. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Monterey, California, USA.
    155. Ruiz, J., Li, Y., & Lank, E. (2011). User-defined motion gestures for mobile interaction. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada.
    156. Sambrooks, L., & Wilkinson, B. (2013). Comparison of gestural, touch, and mouse interaction with Fitts' law. Paper presented at the Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, Australia.
    157. Schmidt, R. A., Zelaznik, H., Hawkins, B., Frank, J. S., & Quinn, J. T. (1979). Motor output variability: A theory for the accuracy of rapid motor acts. Psychological Review, 86, 415-451.
    158. Sheikh, I. H., & Hoffmann, E. R. (1994). Effect of target shape on movement time in a Fitts task. Ergonomics, 37(9), 1533-1547. doi:10.1080/00140139408964932
    159. Shen, J. C., Luo, Y. L., Wu, Z. K., Tian, Y., & Deng, Q. Q. (2016). CUDA-based real-time hand gesture interaction and visualization for CT volume dataset using leap motion. Visual Computer, 32(3), 359-370. doi:10.1007/s00371-016-1209-0
    160. Sherrington, C. S. (1907). On the proprioceptive system, especially in its reflex aspect. Brain, 29(4), 467-482. doi:10.1093/brain/29.4.467
    161. Shi, K., Irani, P., Gustafson, S., & Subramanian, S. (2008). PressureFish: A method to improve control of discrete pressure-based input. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy. doi:10.1145/1357054.1357256
    162. Shim, J., Yang, Y., Kang, N., Seo, J., & Han, T.-D. (2016). Gesture-based interactive augmented reality content authoring system using HMD. Virtual Reality, 20(1), 57-69. doi:10.1007/s10055-016-0282-z
    163. Siddhpuria, S., Malacria, S., Nancel, M., & Lank, E. (2018). Pointing at a distance with everyday smart devices. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada. doi:10.1145/3173574.3173747
    164. Son, M., Jung, J., & Park, W. (2017). Evaluating the utility of two gestural discomfort evaluation methods. PLOS ONE, 12(4), e0176123. doi:10.1371/journal.pone.0176123
    165. Soukoreff, R. W., & MacKenzie, I. S. (2004). Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts' law research in HCI. International Journal of Human-Computer Studies, 61(6), 751-789. doi:10.1016/j.ijhcs.2004.09.001
    166. Tian, F., Lyu, F., Zhang, X., Ren, X., & Wang, H. (2017). An empirical study on the interaction capability of arm stretching. International Journal of Human-Computer Interaction, 33(7), 565-575. doi:10.1080/10447318.2016.1265782
    167. Tse, E., Hancock, M., & Greenberg, S. (2007). Speech-filtered bubble ray: Improving target acquisition on display walls. Paper presented at the Proceedings of the 9th International Conference on Multimodal Interfaces, Nagoya, Aichi, Japan.
    168. Turk, M. (2001). Perceptual user interfaces. In R. A. Earnshaw, R. A. Guedj, A. v. Dam, & J. A. Vince (Eds.), Frontiers of Human-Centered Computing, Online Communities and Virtual Environments (pp. 39-51). London: Springer London.
    169. Turk, M. (2014). Multimodal interaction: A review. Pattern Recognition Letters, 36, 189-195. doi:10.1016/j.patrec.2013.07.003
    170. Vatavu, R.-D. (2012). User-defined gestures for free-hand TV control. Paper presented at the Proceedings of the 10th European Conference on Interactive TV and Video, Berlin, Germany.
    171. Vatavu, R.-D. (2013). A comparative study of user-defined handheld vs. freehand gestures for home entertainment environments. Journal of Ambient Intelligence and Smart Environments, 5(2), 187-211. doi:10.3233/ais-130200
    172. Vogel, D., & Balakrishnan, R. (2005). Distant freehand pointing and clicking on very large, high resolution displays. Paper presented at the Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology, Seattle, WA, USA.
    173. Vogel, D. & Balakrishnan, R. (2004). Interactive public ambient displays: Transitioning from implicit to explicit, public to personal, interaction with multiple users. In Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, Santa Fe, NM, USA.
    174. Wachs, J. P., Kolsch, M., Stern, H., & Edan, Y. (2011). Vision-based hand-gesture applications. Communications of the Acm, 54(2), 60-71. doi:10.1145/1897816.1897838
    175. Wachsmuth, I. (1999). Communicative rhythm in gesture and speech. In International Gesture Workshop (pp. 277-289). Springer, Berlin, Heidelberg.
    176. Wagner, P., Malisz, Z., & Kopp, S. (2014). Gesture and speech in interaction: An overview. Speech Communication, 57, 209-232. doi:10.1016/j.specom.2013.09.008
    177. Walji, A. H. (2007). Functional anatomy of the upper limb (extremity). In Biomechanics in Ergonomics (pp. 223-288). CRC Press.
    178. Ware, C. (2004). Information visualization: Perception for design. San Francisco, CA: Morgan Kaufmann.
    179. Weber, E. H. (1834). De pulsu, resorptione, auditu et tactu: Annotationes anatomicae et physiologicae. Leipzig: Koehler.
    180. Weiser, M. (2002). The computer for the 21st Century. IEEE Pervasive Computing, 1(1), 19-25. doi:10.1109/MPRV.2002.993141
    181. Wexelblat, A. (1995). An approach to natural gesture in virtual environments. ACM Trans. Comput.-Hum. Interact., 2(3), 179-200. doi:10.1145/210079.210080
    182. Wexelblat, A. (1998). Research challenges in gesture: Open issues and unsolved problems. In I. Wachsmuth & M. Fröhlich (Eds.), Gesture and Sign Language in Human-Computer Interaction: International Gesture Workshop Bielefeld, Germany, September 17–19, 1997 Proceedings (pp. 1-11). Berlin, Heidelberg: Springer Berlin Heidelberg.
    183. Wilkins, D. (2003). Why pointing with the index finger is not a universal (in sociocultural and semiotic terms). In S. Kita (Ed.), Pointing: Where language, culture, and cognition meet (pp. 171-215). Mahwah, NJ: Erlbaum.
    184. Wilson, A. D., & Cutrell, E. (2005). FlowMouse: A computer vision-based pointing and gesture input device. In M. F. Costabile & F. Paterno (Eds.), Human-Computer Interaction - Interact 2005, Proceedings (Vol. 3585, pp. 565-578). Berlin: Springer-Verlag Berlin.
    185. Wilson, G., Stewart, C., & Brewster, S. A. (2010). Pressure-based menu selection for mobile devices. Paper presented at the Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services, Lisbon, Portugal. doi:10.1145/1851600.1851631
    186. Wobbrock, J. O., Aung, H. H., Rothrock, B., & Myers, B. A. (2005). Maximizing the guessability of symbolic input. Paper presented at the CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA.
    187. Wobbrock, J. O., Morris, M. R., & Wilson, A. D. (2009). User-defined gestures for surface computing. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA.
    188. Wongphati, M., Osawa, H., & Imai, M. (2015). User-defined gestures for controlling primitive motions of an end effector. Advanced Robotics, 29(4), 225-238. doi:10.1080/01691864.2014.978371
    189. Worden, A., Walker, N., Bharat, K., & Hudson, S. (1997). Making computers easier for older adults to use: area cursors and sticky icons. Paper presented at the Proceedings of the ACM SIGCHI Conference on Human factors in computing systems, Atlanta, Georgia, USA.
    190. Wu, H., & Yang, L. (2020). User-defined gestures for dual-screen mobile interaction. International Journal of Human-Computer Interaction, 36(10), 978-992.
    191. Yoshida, T., Ogawa, J., Choi, K. Y., Bushnaq, S., Nakagaki, K., & Ishii, H. (2021). inDepth: Force-based interaction with objects beyond a physical barrier. Paper presented at the Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction, Salzburg, Austria. doi:10.1145/3430524.3442447
    192. Zaiţi, I.-A., Pentiuc, Ş.-G., & Vatavu, R.-D. (2015). On free-hand TV control: Experimental results on user-elicited gestures with Leap Motion. Personal and Ubiquitous Computing, 19(5), 821-838. doi:10.1007/s00779-015-0863-y
    193. Zhang, J., et al. (2013). Three-dimensional interaction and autostereoscopic display system using gesture recognition. Journal of the Society for Information Display, 21(5), 203-208.
    194. Ziebart, B., Dey, A., & Bagnell, J. A. (2012). Probabilistic pointing target prediction via inverse optimal control. Paper presented at the Proceedings of the 2012 ACM international conference on Intelligent User Interfaces, Lisbon, Portugal.
    195. Zimmerman, T. G., Lanier, J., Blanchard, C., Bryson, S., & Harvill, Y. (1986). A hand gesture interface device. SIGCHI Bull., 18(4), 189-192. doi:10.1145/1165387.275628
    196. 丁玉蘭 (2011). Ergonomics [人機工程學]. Beijing: Beijing Institute of Technology Press.
    197. 區國良、曾郁庭、姚玉娟 (2014). The effects of a motion-sensing digital game-based mobile learning system on learning achievement and retention [體感式數位遊戲行動學習系統對學習成就及學習保留影響之研究]. 科學教育學刊 (Chinese Journal of Science Education), 22(2), 163-184.
    198. 呂俊宏、徐俊斌、Latief, Suryawahyuni、羅文伶、潘文福、劉從義 (2015). A user acceptance evaluation of integrating motion-sensing gestural cursors into matching-game learning [體感手勢游標融入配對遊戲學習之使用者接受度評估研究]. 教育傳播與科技研究 (Research of Educational Communications and Technology), 110, 63-78.
    199. 林信志、朱菀鈴、朱昱融 (2016). Development and evaluation of a gesture-based interactive content authoring system [手勢互動內容創作系統之開發與評估]. 數位學習科技期刊 (International Journal on Digital Learning Technology), 8(2), 1-16.
    200. 陳建雄、王建立 (2021). Academic research development of gesture-based interface design: A literature review and synthesis [以手勢為互動基礎的介面設計學術研究發展:文獻回顧與綜論]. 設計學報 (Journal of Design), 26(1), 59-82.