
Graduate Student: Wei-yuan Fan (范維媛)
Thesis Title: Development of an Interactive Virtual Pet System Using Augmented Reality Techniques (以擴增實境技術開發互動式虛擬寵物系統)
Advisor: Chin-shyurng Fahn (范欽雄)
Committee Members: Jung-tang Huang (黃榮堂), Jung-hua Wang (王榮華), Shiu-hsuan Chiu (邱士軒)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of Publication: 2011
Graduation Academic Year: 99 (ROC calendar)
Language: English
Number of Pages: 64
Chinese Keywords: augmented reality, edge detection, marker recognition, visual tracking, virtual pet
Foreign Keywords: virtual pet.



Augmented reality (AR) is a technology extended from virtual reality (VR) that superimposes three-dimensional virtual objects onto the real environment. It is characterized by a high degree of realism, interactivity, and educational value, and has gradually found wide application in many fields.
In this thesis, we design a virtual pet system based on augmented reality techniques. Using our proposed real-time visual tracking method based on edge detection, with an ordinary USB camera as the visual sensor, we employ custom markers bearing specific patterns to anchor three-dimensional virtual objects. We first detect quadrangles in the source image from their edges and skeleton directions, then normalize each quadrangle using its detected vertices in order to recognize the marker, and further use those vertices to estimate the spatial relation between the marker and the camera in the real world. The proposed method enables the computer to sense whether each marker exists in the camera's view, which identity it has, and what its pose and position are. With this information, the computer can superimpose the corresponding three-dimensional virtual objects on the specific markers, so that users can interact with the virtual objects by manipulating the markers.
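The thesis itself presents no code; as a rough sketch of the normalization step described above — mapping the four detected vertices of a quadrangular marker onto a canonical square so its pattern can be compared at a fixed size — the following NumPy example estimates that homography with the direct linear transform. The vertex coordinates and the 64-pixel square are made up for illustration and are not taken from the thesis.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (homogeneous
    coordinates) from four point correspondences via the direct linear
    transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # The solution is the right singular vector of A with the smallest
    # singular value, reshaped into a 3x3 matrix.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply H to a 2D point using homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical vertices of a detected quadrangular marker (pixel
# coordinates) and the corners of the canonical square it maps to.
quad = [(120, 80), (340, 95), (355, 300), (105, 310)]
square = [(0, 0), (64, 0), (64, 64), (0, 64)]
H = homography_from_points(quad, square)
```

Once H is known, the marker interior can be resampled into a fixed-size patch for identification, and the same four vertices can drive the pose and position estimation the abstract describes.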
Furthermore, to enrich the pet system, the virtual pet we devised has a set of individualized attribute parameters whose values evolve differently depending on how the user raises the pet. Combined with randomly generated interactive events and a variety of sound effects, this makes the virtual pet system more diverse. We also provide a save function so that users can manage their game progress at will, and, to increase the interactivity between user and pet, an information window that shows the pet's current needs and status clearly, so that the game can proceed smoothly.
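The abstract mentions attribute parameters that change with how the pet is raised, random interactive events, and a save function. A minimal sketch of such a state model might look as follows; the attribute names, update rates, and event probability are all assumptions for illustration, not the thesis's actual design.

```python
import json
import random

class VirtualPet:
    """Toy pet state with parameters that drift over time and respond
    to how the user cares for the pet."""

    def __init__(self, hunger=50, happiness=50):
        self.hunger = hunger        # 0 = full, 100 = starving
        self.happiness = happiness  # 0 = miserable, 100 = delighted

    def feed(self):
        self.hunger = max(0, self.hunger - 20)

    def play(self):
        self.happiness = min(100, self.happiness + 15)
        self.hunger = min(100, self.hunger + 5)   # playing works up an appetite

    def tick(self, rng=random):
        """Advance one time step: needs drift, and a random interactive
        event may fire (here, with an assumed 10% probability)."""
        self.hunger = min(100, self.hunger + 2)
        if rng.random() < 0.1:
            self.happiness = min(100, self.happiness + 10)

    def save(self, path):
        """Persist the pet's state so the user can resume later."""
        with open(path, "w") as f:
            json.dump({"hunger": self.hunger, "happiness": self.happiness}, f)
```

An information window like the one the abstract describes would simply render these parameters each frame so the user can see the pet's current needs.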

Acknowledgments i
Chinese Abstract ii
Abstract iii
Contents v
List of Figures vii
List of Tables xii
Chapter 1 Introduction 1
  1.1 Overview 1
  1.2 Background and motivation 2
  1.3 Thesis organization 3
Chapter 2 Related Work 4
  2.1 Virtual reality and augmented reality 4
  2.2 Virtual pets 8
  2.3 VRML 11
Chapter 3 Marker Detection and Position Estimation 12
  3.1 Quadrangle detection 13
    3.1.1 Edge detection 14
    3.1.2 Thinning 16
    3.1.3 Quadrangle searching 19
  3.2 Marker recognition 24
    3.2.1 Quadrangle normalization 25
    3.2.2 Grayscale normalization 27
    3.2.3 Error reduction 30
    3.2.4 Marker identification 32
  3.3 Pose and position estimation 34
    3.3.1 Estimation of the rotation matrix 34
    3.3.2 Translation estimation 36
Chapter 4 System Implementation 39
  4.1 System overview 39
  4.2 Pet parameters design 42
  4.3 Interactive events design 44
Chapter 5 Experimental Results and Discussion 47
  5.1 Tracking result 47
  5.2 Results of the implementation system 51
Chapter 6 Conclusion and Future Works 61
  6.1 Conclusion 61
  6.2 Future works 61
References 63

