
Graduate Student: 劉韋廷 (Wei-Tyng Liu)
Thesis Title: 應用擴增實境技術於人機介面之探討與研究─以互動遊戲為例
(A Study on Human Computer Interfaces Using Augmented Reality Techniques – Taking Interactive Games as an Example)
Advisor: 范欽雄 (Chin-Shyurng Fahn)
Committee Members: 許永和 (Yung-Hoh Sheu), 李建德 (Jin-De Li), 吳怡樂 (Yi-Le Wu)
Degree: Master
Department: Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2012
Graduation Academic Year: 100
Language: English
Number of Pages: 89
Keywords (Chinese): augmented reality, human-computer interaction, USB HID, visual tracking, edge detection, marker recognition
Keywords (English): human-computer interface, USB HID, visual tracking
Views: 310; Downloads: 3
    Augmented reality (AR) is a technology extended from virtual reality (VR). It combines three-dimensional virtual objects with the real environment, and it is characterized by a high degree of realism, interactivity, and educational value. It has gradually found a wide range of applications in many fields.
    In this thesis, we design several human-computer interaction schemes that incorporate augmented reality techniques. Using our proposed real-time visual tracking method based on edge detection, a common USB camera serves as the visual sensor, and custom markers bearing specific patterns are used to label three-dimensional virtual objects. We detect quadrangles in the source images from edges and skeleton directions, normalize each quadrangle using its detected vertices to recognize the marker, and further use these vertices to estimate the spatial relation between the marker and the camera in the real world. The proposed method allows the computer to perceive multiple markers in the camera view, identify each of them, and then superimpose the corresponding three-dimensional virtual objects at specific positions in the scene.
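    The following is a minimal sketch of the detection-and-normalization pipeline described above, written in Python with OpenCV. It substitutes a generic Canny-plus-contour quadrangle search for the thesis's edge-and-skeleton method, and the helper names (find_quadrangle_candidates, normalize_marker), thresholds, and patch size are illustrative assumptions rather than the actual implementation.

        # A minimal sketch, assuming OpenCV (cv2) and NumPy are available; it uses
        # Canny edges plus polygonal contour approximation as a stand-in for the
        # thesis's edge-and-skeleton quadrangle search, and all thresholds are
        # illustrative assumptions.
        import cv2
        import numpy as np

        def find_quadrangle_candidates(frame):
            """Return 4-vertex contours that may correspond to square markers."""
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            edges = cv2.Canny(gray, 50, 150)  # edge map (thresholds assumed)
            contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
            quads = []
            for contour in contours:
                peri = cv2.arcLength(contour, True)
                approx = cv2.approxPolyDP(contour, 0.02 * peri, True)  # polygon fit
                if (len(approx) == 4 and cv2.isContourConvex(approx)
                        and cv2.contourArea(approx) > 500):  # area threshold assumed
                    quads.append(approx.reshape(4, 2))
            return quads

        def normalize_marker(frame, quad, size=64):
            """Warp a detected quadrangle to a fronto-parallel square patch."""
            # Corner ordering (top-left, top-right, bottom-right, bottom-left) is
            # assumed; a real system would sort the vertices first.
            dst = np.array([[0, 0], [size - 1, 0], [size - 1, size - 1], [0, size - 1]],
                           dtype=np.float32)
            H = cv2.getPerspectiveTransform(quad.astype(np.float32), dst)
            return cv2.warpPerspective(frame, H, (size, size))

    The normalized patch can then be compared against the stored marker patterns to identify which marker was detected.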
    In addition, to increase the human-computer interactivity of the games, we design a variety of interaction schemes and deliver an alternative form of knowledge education through entertainment. This study conveys much of the knowledge, concepts, and music needed in daily life to users in an entertaining way, so that users can enjoy themselves while learning through interaction, and augmented reality presents players with a visual and auditory educational experience unlike before. Finally, the proposed system is objectively compared with games on the market and with common interactive devices such as the Kinect and Wii; the comparison shows that our system has a lower equipment cost, achieves interaction with simple props, and uses 3D objects in the games to improve the overall sense of reality.


    Augmented reality is a technology extended from virtual reality. It focuses on combining three-dimensional virtual objects with the real world, and it is characterized by a high degree of realism, interactivity, and educational value, among other qualities. It has gradually found a wide range of applications in various fields.
    In this thesis, we design an interactive game system that uses augmented reality techniques. By applying our real-time visual tracking technique based on edge detection and employing a common USB camera to capture source images, we can display three-dimensional virtual objects on our custom markers, composited with the source images from the camera view. First, we use edges and skeletons to detect quadrangular markers in the source images. We then normalize each quadrangle using the vertices found earlier so that the marker can be recognized, and we further estimate the spatial relation in the real world between the marker patterns and the camera. The proposed method enables the computer to sense whether each marker exists in the camera's field of view, which identity it has, and what pose and position it takes. With this information, the computer can superimpose virtual objects on the corresponding markers, so that users can interact with the virtual objects by manipulating the markers.
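    As a rough illustration of the pose and position estimation mentioned above, the sketch below recovers a rotation matrix and translation vector from the four detected marker corners in Python with OpenCV. cv2.solvePnP is used here as a stand-in for the estimation procedure derived in the thesis, and the marker side length, corner ordering, and calibration inputs are assumed values.

        # A minimal sketch, assuming OpenCV (cv2) and NumPy; it recovers the marker's
        # rotation and translation relative to the camera from the four corner points
        # via cv2.solvePnP, used here as a stand-in for the estimation procedure of
        # the thesis. The marker side length and corner ordering are assumptions.
        import cv2
        import numpy as np

        MARKER_SIDE = 0.08  # marker side length in metres (assumed value)

        # 3D corners of the square marker in its own coordinate frame (z = 0 plane),
        # ordered top-left, top-right, bottom-right, bottom-left.
        OBJECT_POINTS = np.array([
            [-MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
            [ MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
            [ MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
            [-MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
        ], dtype=np.float32)

        def estimate_marker_pose(image_corners, camera_matrix, dist_coeffs):
            """Return (R, t): the marker's rotation matrix and translation vector."""
            ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS,
                                          image_corners.astype(np.float32),
                                          camera_matrix, dist_coeffs)
            if not ok:
                return None, None
            R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
            return R, tvec

    Here camera_matrix and dist_coeffs would come from a standard camera calibration, and image_corners must be supplied in the same order as OBJECT_POINTS; the resulting pose is what lets the renderer superimpose a virtual object on the marker.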
    Furthermore, to increase the human-computer interactivity of the game system, we design many different interactive and entertaining ways of delivering various kinds of knowledge education. In this study, much of the knowledge, concepts, and music needed in daily life is conveyed to users through entertainment. The system allows users to enjoy themselves while learning through interaction, and augmented reality presents players with a visual and auditory educational experience different from before. Finally, compared with games on the market and with the Kinect and Wii, the apparatus cost of the proposed system is lower; moreover, simple props are used for interaction, and 3D objects are employed in the games to enhance the sense of reality.

    Acknowledgements i
    Chinese Abstract ii
    Abstract iii
    Contents v
    List of Figures vii
    List of Tables xi
    Chapter 1 Introduction 1
      1.1 Overview 1
      1.2 Background and motivation 2
      1.3 Our proposed methods 3
      1.4 Thesis organization 4
    Chapter 2 Related Work 5
      2.1 Virtual reality and augmented reality 5
      2.2 Human-computer interface 9
      2.3 USB HID system 11
      2.4 VRML 13
    Chapter 3 Marker Detection and Position Estimation 15
      3.1 Quadrangle detection 16
        3.1.1 Edge detection 17
        3.1.2 Thinning 19
        3.1.3 Quadrangle searching 22
      3.2 Marker recognition 27
        3.2.1 Quadrangle normalization 28
        3.2.2 Grayscale normalization 30
        3.2.3 Error reduction 33
        3.2.4 Marker identification 35
      3.3 Pose and position estimation 37
        3.3.1 Estimation of the rotation matrix 37
        3.3.2 Translation estimation 39
    Chapter 4 Human-Computer Interface 42
      4.1 Human-computer interface introduction 42
        4.1.1 Human-computer interface definition 42
        4.1.2 Human-computer interface design methods 43
      4.2 Our first system for human-computer interface 44
        4.2.1 The first interaction in our first system 45
        4.2.2 The second interaction in our first system 46
        4.2.3 The third interaction in our first system 48
        4.2.4 Overall interface windows in our first system 49
      4.3 Our second system for human-computer interface 50
        4.3.1 Getting the HID USB device driver for our second system 51
        4.3.2 Using the HID USB device driver for interaction in our second system 52
        4.3.3 Overall interface windows in our second system 54
    Chapter 5 System Implementation 55
      5.1 System overview 55
      5.2 Game parameters design 59
      5.3 Interactive events design 64
      5.4 HID USB device driver 65
    Chapter 6 Experimental Results and Discussion 67
      6.1 Tracking result 67
      6.2 System implementation result 72
      6.3 System comparison 82
    Chapter 7 Conclusion and Future Works 86
      7.1 Conclusion 86
      7.2 Future works 87
    References 88

