
Graduate Student: Ching-Yuan Shih (施景元)
Thesis Title: A Wide Field-of-View Visual Feedback System for Remote Controlled Robots (應用於遠端遙控機器人之廣視野影像回饋系統)
Advisor: Wei-Chung Teng (鄧惟中)
Committee Members: Chyi-Yeu Lin (林其禹), Li-Chen Fu (傅立成), Cheng-Yuan Tang (唐政元)
Degree: Master
Department: Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2007
Graduation Academic Year: 95 (ROC calendar; 2006-2007)
Language: Chinese
Pages: 48
Keywords: panoramic, omnidirectional, field-of-view, DirectShow, mobile robot
Views: 185; Downloads: 7


      Among all the sensory data of a remote controlled robot system, visual feedback is the most straightforward means of constructing spatial presence for the user. To improve the efficiency of controlling remote robots, many visual feedback systems have been designed around cameras with special wide-angle lenses so as to capture as much visual information as possible. However, these wide-angle lenses cause heavy image distortion, and the images must be corrected by image processing algorithms before users can recognize their content.
      This thesis designs and builds a visual feedback system that provides wide field-of-view (FOV) presence for remote controlled robots. An optical system combining three off-the-shelf webcams and two flat mirrors asynchronously captures multiple images with different FOVs from the same point of view (POV). These images are then merged and transformed by PanMultiCamFilter, an image processing filter developed with DirectShow. Mounted on a bearing mechanism driven by a programmable motor controller, the sensor offers a more flexible horizontal FOV, and the captured images are transmitted to a remote user interface over an ordinary network. The result is a wide-FOV visual feedback system with high presence and low image distortion for remote controlled robots.
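      In the thesis the merge is performed by PanMultiCamFilter inside a DirectShow filter graph; as a language-neutral illustration of the underlying idea (stitching frames that share one point of view into a single wide panorama), here is a minimal NumPy sketch. The function name merge_views, the fixed overlap width, and the linear cross-fade at each seam are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def merge_views(left, center, right, overlap=16):
    """Merge three same-height frames (H x W x 3) into one wide panorama,
    cross-fading over `overlap` columns at each seam. Plain horizontal
    concatenation is only plausible because the mirror rig gives all three
    cameras approximately the same point of view."""
    def blend(a, b):
        # Seam weights fall linearly from 1 (pure left frame) to 0 (pure right frame).
        w = np.linspace(1.0, 0.0, overlap)[None, :, None]
        seam = a[:, -overlap:] * w + b[:, :overlap] * (1.0 - w)
        return np.concatenate([a[:, :-overlap], seam, b[:, overlap:]], axis=1)

    pano = blend(left.astype(np.float64), center.astype(np.float64))
    pano = blend(pano, right.astype(np.float64))
    return pano.astype(np.uint8)
```

      Because each frame triple is merged independently, the three cameras need not be frame-synchronized, which is consistent with the asynchronous capture described above.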

    Abstract (Chinese) .................... i
    Abstract .............................. ii
    Acknowledgements ...................... iii
    Table of Contents ..................... iv
    List of Figures ....................... vi
    List of Tables ........................ vii
    Chapter 1 Introduction ................ 1
      1.1 Foreword ........................ 1
      1.2 Motivation ...................... 2
      1.3 Objectives ...................... 2
      1.4 Thesis Organization ............. 2
    Chapter 2 Literature Review ........... 3
      2.1 Remote Controlled Mobile Robots . 4
        2.1.1 Mobile Robots ............... 4
        2.1.2 Robot Control Strategies .... 5
        2.1.3 Types and Scope of Teleoperation . 5
      2.2 Visual Sensor Systems ........... 6
        2.2.1 Single-Camera Configurations  7
        2.2.2 Multi-Camera Configurations . 10
      2.3 Human-Machine Interface ......... 11
      2.4 Microsoft DirectShow ............ 11
    Chapter 3 Methodology ................. 14
      3.1 System Architecture ............. 14
      3.2 Construction of the Visual System . 15
      3.3 Core Image Processing Techniques  18
        3.3.1 PanCamFilter ................ 20
        3.3.2 MultiCamFilter .............. 22
        3.3.3 PanMultiCamFilter ........... 24
      3.4 System Network and User Interface . 27
        3.4.1 Network Core ................ 27
        3.4.2 User Interface .............. 27
      3.5 Rotary Motor Control System ..... 28
    Chapter 4 System Implementation ....... 29
      4.1 Mobile Robot Platform ........... 29
      4.2 Visual System ................... 30
      4.3 LEGO Stepper Motor and Its Control System . 31
      4.4 NRLCam.net ...................... 32
      4.5 System Operation and Results .... 33
    Chapter 5 Conclusion .................. 35
    References ............................ 36

    [1] Ken Goldberg, Michael Mascha, Steve Gentner, Nick Rothenberg, Carl Sutter, Jeff Wiegley, “Desktop Teleoperation via the World Wide Web,” Proceedings of the IEEE International Conference on Robotics and Automation, 654-659, 1995, <http://www.usc.edu/dept/raiders/>.
    [2] Kostas Daniilidis, Christopher Geyer, “Omnidirectional Vision: Theory and Algorithms,” Proceedings of the International Conference on Pattern Recognition, 15 (1), 89-96, 2000.
    [3] Hong Hua, Narendra Ahuja, “A High-Resolution Panoramic Camera,” Proceedings of Computer Vision and Pattern Recognition, Vol. 1, I-960 - I-967, 2001.
    [4] Maki Sugimoto, Georges Kagotani, Hideaki Nii, Naoji Shiroma, Masahiko Inami, Fumitoshi Matsuno, “Time Follower's Vision: A Teleoperation Interface with Past Images,” IEEE Computer Graphics and Applications, 25 (1), 54-63, 2005.
    [5] Bob Ricks, Curtis W. Nielsen, Michael A. Goodrich, “Ecological Displays for Robot Interaction: A New Perspective,” International Conference on Intelligent Robots and Systems (IROS), Vol. 3, 2855-2860, 2004.
    [6] Humanoid Robotics Project (human-cooperative and coexisting humanoid robots), July 2007, <http://www.mstc.or.jp/hrp/main.html>.
    [7] Takashi Nishiyama, Hiroshi Hoshino, Kenshi Suzuki, Ryoji Nakajima, Kazuya Sawada, Susumu Tachi, “Development of Surrounded Audio-Visual Display System for Humanoid Robot Control,” International Conference on Artificial Reality and Telexistence (ICAT), 1999.
    [8] Susumu Tachi, Kiyoshi Komoriya, Kazuya Sawada, Takashi Nishiyama, Toshiyuki Itoko, Masami Kobayashi, Kozo Inoue, “Telexistence Cockpit for Humanoid Robot Control,” Advanced Robotics, Vol. 17, No. 3, 199-217, 2003.
    [9] 元宏, 岩田洋夫 (Hiroo Iwata), “旋回式高解像度像,” 日本第6回大論文集, Sep 2001.
    [10] Thomas B. Sheridan, “Telerobotics, automation and human supervisory control,” Cambridge, MA, MIT Press, 1992.
    [11] Yee Sern Wei, “Autonomous and Rational Behavior of Sony AIBO Robot,” The University of Western Australia, 2005.
    [12] Jussi Suomela, “Tele-presence Aided Teleoperation of Semi-autonomous Work Vehicles,” Licentiate thesis, Course AS-84.147 Automaation käyttöliittymät, Helsinki University of Technology, 2007.
    [13] 鍾慶彥, “Generation Strategies for Wide Field-of-View Video Feedback in Remote Robot Teleoperation Systems” (in Chinese), Master's thesis, National Taiwan University of Science and Technology, 2005.
    [14] George A. Bekey, “Autonomous Robots: From Biological Inspiration to Implementation and Control,” The MIT Press, 2005.
    [15] Gregor Klančar, Matej Kristan, Rihard Karba, “Wide-angle camera distortions and non-uniform illumination in mobile robot tracking,” Robotics and Autonomous Systems 46 (2), 125-133, 2004.
    [16] 陳永昇, “An Introduction to Omnidirectional Cameras” (in Chinese), E-newsletter of the Computer Vision Surveillance Industry-Academia-Research Alliance, 2005, <http://140.113.87.114/cvrc/edm/skill_3.htm>.
    [17] Microsoft, “DirectX SDK,” July 2007, <http://msdn2.microsoft.com/en-us/library/ms783323.aspx>.
    [18] Mark D. Pesce, “Programming Microsoft DirectShow for Digital Video and Television,” Microsoft Press, 2003.
    [19] 其明, “DirectShow Guide” (in Chinese), Tsinghua University Press, Beijing, 2003.
    [20] ActivMedia Robotics, “PIONEER P3-DX,” July 2007, <http://www.activrobots.com/ROBOTS/p2dx.html>.
    [21] LEGO Education, “Team Challenge set with USB transmitter,” July 2007, <http://www1.lego.com/education/default.asp?l2id=0_1&page=7_1>.
    [22] James Matthews, “LEGO IR Protocol in C++,” 2001, <http://www.generation5.org/content/2001/rob08.asp>.
