
Author: Yu-Ru Li (李育儒)
Thesis title: A Study on Augmented Reality Combined with IPv6 Technology - Taking Multiplayer Online Games as an Example (結合擴增實境與IPv6技術之探討與研究─以網路對戰遊戲為例)
Advisor: Chin-Shyurng Fahn (范欽雄)
Committee members: Jung-Hua Wang (王榮華), Chi-Fang Lin (林啟芳), Hung-Yen Ku (古鴻炎)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of publication: 2012
Graduation academic year: 100 (ROC calendar)
Language: English
Number of pages: 118
Chinese keywords: augmented reality, edge detection, visual tracking, marker recognition
Foreign keywords: streaming media

Augmented reality (AR) is a technology extended from virtual reality (VR). It combines three-dimensional virtual objects with the real environment and is characterized by high realism, high interactivity, and high educational value; it has gradually found wide application in many fields. In this thesis, we design a multiplayer online battle system that uses augmented reality and IPv6 technology. Employing our proposed real-time visual tracking technique based on edge detection, an ordinary USB camera serves as the visual sensor, and custom markers bearing specific patterns are used to anchor the three-dimensional virtual objects. We detect quadrangles in the source image from edge and skeleton directions, normalize each detected quadrangle with its vertices to recognize the marker, and further use those vertices to estimate the spatial relation between the marker and the camera in the real world. The proposed method ultimately enables the computer to perceive multiple markers in the camera view, identify each of them, and composite the corresponding three-dimensional virtual objects at specific positions in the frame. Combined with the next-generation Internet protocol (IPv6), streaming technology is used to transmit actions and sound effects, guaranteeing game quality and smooth rendering and giving players the enjoyment of high-quality online battles. In addition, most existing augmented reality online battle systems do not take the physical topology or user addresses into account, which wastes bandwidth and increases latency. Therefore, this thesis also proposes a real-time streaming system architecture built on an IPv6 reference physical network topology, user addresses, and contribution levels; by grouping peers according to their contribution and supplying streams from nearby peers, it effectively reduces latency, improves user satisfaction, saves network bandwidth, lowers server load, and increases overall system efficiency.
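
To make the contribution-and-proximity grouping described above concrete, here is a minimal sketch in Python; it is not the thesis implementation, and the peer fields (IPv6 address, contribution score, round-trip time) and function names are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Peer:
    """Hypothetical peer record; all field names are illustrative."""
    ipv6: str            # the peer's IPv6 address
    contribution: float  # accumulated upload contribution score
    rtt_ms: float        # measured round-trip time to the requesting player

def pick_suppliers(peers, k=3):
    """Rank by contribution first (higher is better), then by proximity (lower RTT)."""
    ranked = sorted(peers, key=lambda p: (-p.contribution, p.rtt_ms))
    return ranked[:k]

if __name__ == "__main__":
    candidates = [
        Peer("2001:db8::10", contribution=40.0, rtt_ms=12.0),
        Peer("2001:db8::20", contribution=75.0, rtt_ms=48.0),
        Peer("2001:db8::30", contribution=75.0, rtt_ms=9.0),
    ]
    for p in pick_suppliers(candidates, k=2):
        print(p.ipv6, p.contribution, p.rtt_ms)
```

Under this ranking, peers that have contributed more are preferred as stream suppliers, and among equally contributing peers the nearer one (lower RTT) is chosen, mirroring the latency and bandwidth goals stated above.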


Augmented reality is a technology extended from virtual reality. It focuses on combining the real world with three-dimensional virtual objects, and it offers high realism, high interactivity, high educational value, and other desirable characteristics; it is gradually finding wide application in many fields. In this thesis, we design a multiplayer online game system that uses augmented reality and IPv6 techniques. Using our real-time visual tracking technique based on edge detection, with a common USB camera capturing the source images, we display three-dimensional virtual objects on our custom markers composited into the camera view. First, we use edges and skeletons to detect quadrangular markers in the source images. We then normalize the quadrangles using the detected vertices to recognize each marker, and we further estimate the spatial relation between the marker pattern and the camera in the real world. The proposed method enables the computer to determine whether each marker is present in the camera's field of view, which identity it has, and what its pose and position are. With this information, the computer can superimpose a virtual object on each specific marker, so that users can interact with the virtual objects by manipulating the markers. We also adopt the new-generation Internet protocol (IPv6) and use streaming technology to transfer actions and sound effects, which ensures game quality and smooth rendering. In this way, we provide players with high-quality games.
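
As a rough illustration of the pipeline summarized above (edge detection, quadrangle detection, normalization, and pose estimation), the sketch below uses standard OpenCV primitives rather than the thesis's own edge- and skeleton-based detector; the marker size, thresholds, and helper names are assumptions made for this example.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.08  # assumed marker side length in meters

def find_quadrangles(gray):
    """Detect candidate quadrangles from an edge map of the grayscale frame."""
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    quads = []
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 500:
            quads.append(approx.reshape(4, 2).astype(np.float32))
    return quads

def normalize_marker(gray, quad, size=64):
    """Warp the quadrangle to a canonical square so its pattern can be matched."""
    dst = np.float32([[0, 0], [size - 1, 0], [size - 1, size - 1], [0, size - 1]])
    H = cv2.getPerspectiveTransform(quad, dst)
    return cv2.warpPerspective(gray, H, (size, size))

def estimate_pose(quad, camera_matrix, dist_coeffs):
    """Recover the marker's rotation and translation relative to the camera."""
    half = MARKER_SIZE / 2.0
    object_pts = np.float32([[-half, -half, 0], [half, -half, 0],
                             [half, half, 0], [-half, half, 0]])
    ok, rvec, tvec = cv2.solvePnP(object_pts, quad, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else (None, None)
```

In practice the camera intrinsics (camera_matrix, dist_coeffs) would come from a prior calibration step, and the normalized marker image would be compared against the registered marker patterns to determine each marker's identity before a virtual object is rendered on it.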

Chinese Abstract
Abstract
Acknowledgments
List of Figures
List of Tables
Chapter 1 Introduction
  1.1 Overview
  1.2 Background and Motivation
  1.3 Our Proposed Methods and System
  1.4 Thesis Organization
Chapter 2 Related Work
  2.1 Virtual Reality and Augmented Reality
  2.2 Online Games
  2.3 VRML
Chapter 3 Marker Detection and Position Estimation
  3.1 Quadrangle Detection
    3.1.1 Edge Detection
    3.1.2 Thinning
    3.1.3 Quadrangle Searching
  3.2 Marker Recognition
    3.2.1 Quadrangle Normalization
    3.2.2 Grayscale Normalization
    3.2.3 Error Reduction
    3.2.4 Marker Identification
  3.3 Pose and Position Estimation
    3.3.1 Estimation of the Rotation Matrix
    3.3.2 Translation Estimation
Chapter 4 IPv6 Streaming Architecture
  4.1 IPv4/IPv6 Translation Based on Network Processor
    4.1.1 IPv6 DNS Architecture
    4.1.2 DNS Latency Analysis
  4.2 Our Streaming System Architecture
Chapter 5 System Implementation
  5.1 System Overview
  5.2 Game Parameters Design
  5.3 Interactive Events Design
Chapter 6 Experimental Results and Discussion
  6.1 Tracking Results
  6.2 Results of the Implemented System
  6.3 System Load Test Analysis
  6.4 Our Streaming Architecture Analysis
Chapter 7 Conclusion and Future Works
  7.1 Conclusion
  7.2 Future Works
References

