
Graduate Student: 何昭慶 (Chao-Ching Ho)
Thesis Title: 影像伺服控制與三維追蹤之研究 (Visual Servoing Control Based Three-Dimensional Tracking)
Advisor: 施慶隆 (Ching-Long Shih)
Committee Members: 范光照 (Kuang-Chao Fan), 傅楸善 (Chiou-Shann Fuh), 許新添 (Hsin-Teng Hsu), 黃志良 (Chih-Lyang Hwang), 劉昌煥 (Chang-Huan Liu)
Degree: Doctorate
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2008
Academic Year of Graduation: 96 (ROC calendar)
Language: English
Number of Pages: 106
Chinese Keywords: visual servoing, three-dimensional tracking, robot control
English Keywords: visual servoing, 3D target tracking, robotics control
Chinese Abstract (translated):
    Because the relative three-dimensional position of a target in space is difficult to measure accurately, trajectory tracking in three-dimensional space is a highly challenging task. In this study, we propose a new target trajectory tracking scheme that applies machine vision methods to determine the target's three-dimensional position and motion trajectory in space. The continuously adaptive mean shift (CAMSHIFT) tracking algorithm is used to compute the target's relative position robustly and quickly, and a stereo vision algorithm is used to compute the target's relative three-dimensional position in space. The CAMSHIFT algorithm tolerates variations in ambient lighting well and is computationally faster than template matching. A pair of web cameras and their controllers is used to realize the stereo balancing and tracking system. Experimental results demonstrate that applying stereo vision to the object trajectory tracking system achieves robust, fast, and efficient visual tracking control and obstacle avoidance. The stereo visual servoing platform and software modules developed in this work, which combine vision and motion control techniques, can be applied to unmanned vehicles, robot control, security surveillance systems, and so on.
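As a concrete illustration of the color-based CAMSHIFT tracking step described above, the sketch below tracks a colored target with OpenCV's CamShift and reports its image-plane centroid each frame. It is a minimal example, not the thesis's implementation; the camera index, initial search window, and HSV thresholds are illustrative placeholders.

```python
import cv2
import numpy as np

# Minimal CAMSHIFT tracking sketch (OpenCV). All numeric values below are
# illustrative placeholders, not parameters taken from the thesis.
cap = cv2.VideoCapture(0)                        # assumed camera index
ok, frame = cap.read()

track_window = (200, 150, 80, 80)                # assumed initial (x, y, w, h) of the target
x, y, w, h = track_window

# Build a hue histogram of the target region; this is the target's color model.
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Back-project the hue histogram to get a per-pixel target-color probability image.
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # CAMSHIFT shifts and resizes the search window toward the probability peak.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    (cx, cy), _, _ = rot_rect                    # target centroid in image coordinates
    cv2.polylines(frame, [np.intp(cv2.boxPoints(rot_rect))], True, (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```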


English Abstract:
    Designing a real-time visual tracking system to perform control is a complex task, because a large amount of streaming video data must be transmitted and processed immediately while tracking the target. Building such visual servoing systems usually requires high-cost specialized hardware and the development of complicated visual control software. In this thesis, a novel low-cost, real-time visual servo control system is presented. The system uses a stereo vision setup consisting of two calibrated cameras to acquire images of the target, applies a continuously adaptive vision tracking algorithm to provide feedback on the object's position at a high frame rate, and then employs a robot manipulator controlled by a fuzzy reasoning system to acquire the target. A collision avoidance method is also presented for wheeled mobile robot navigation in indoor environments, using feature extraction and matching algorithms. The target is tracked by its predefined color. The mobile robot's direction is dynamically adjusted according to its distance from the target and from obstacles. This visual tracking and servoing system is less sensitive to lighting variations and therefore performs more efficiently. The proposed real-time 3D visual servoing framework can be applied to unmanned vehicles, robot control, and surveillance systems.
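For a calibrated and rectified camera pair, the stereo step mentioned above reduces to triangulating the tracked centroid from its horizontal disparity. The sketch below is a minimal illustration under that assumption; the focal length, baseline, and principal point are made-up example values, not the thesis's calibration results (the thesis covers calibration and binocular depth in Sections 2.6-2.7 and correspondence matching in Section 5.2.2).

```python
# Sketch: 3D position of a tracked target from a calibrated, rectified stereo pair.
# f (focal length, pixels), B (baseline, meters), and the principal point (cx0, cy0)
# are illustrative values, not the thesis's calibration results.

def target_position_3d(u_left, v_left, u_right,
                       f=700.0, B=0.12, cx0=320.0, cy0=240.0):
    """Triangulate the target centroid seen at (u_left, v_left) in the left image
    and at column u_right in the right image of a rectified stereo rig."""
    disparity = u_left - u_right      # pixels; larger disparity means a closer target
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad left/right correspondence")
    Z = f * B / disparity             # depth along the optical axis (meters)
    X = (u_left - cx0) * Z / f        # lateral offset from the left camera axis (meters)
    Y = (v_left - cy0) * Z / f        # vertical offset (meters)
    return X, Y, Z


# Example: centroids reported by the left/right color trackers.
print(target_position_3d(352.0, 261.0, 310.0))   # about 2 m ahead, slightly right and below
```

The resulting (X, Y, Z) offset is the kind of quantity a fuzzy steering rule can act on, for instance turning the robot toward the target when the lateral offset is large and slowing down as the depth shrinks.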

Table of Contents:
    CHINESE ABSTRACT i
    ABSTRACT ii
    ACKNOWLEDGEMENTS iii
    CONTENTS iv
    NOMENCLATURE vi
    LIST OF TABLES vii
    LIST OF FIGURES viii
    CHAPTER Ⅰ INTRODUCTION 1
    1.1 BACKGROUND 2
    1.2 REVIEW OF PREVIOUS WORKS 3
    1.3 THESIS STRUCTURE 7
    CHAPTER Ⅱ MACHINE VISION AND OBJECT TRACKING 9
    2.1 OBJECT TRACKING OVERVIEW 10
    2.2 COLOR SPACES AND CONVERSION 11
    2.2.1 RGB Color Space 12
    2.2.2 Normalized RGB 13
    2.2.3 HSI Space 13
    2.2.4 YCrCb 14
    2.3 BACK PROJECTION 15
    2.4 IMAGE MOMENTS 16
    2.5 CAMSHIFT TRACKING 17
    2.5.1 CAMSHIFT Algorithm 17
    2.5.2 Hybrid Tracking 19
    2.6 CAMERA CALIBRATION 20
    2.7 DEPTH CALCULATED USING BINOCULAR STEREO 25
    2.8 FUNDAMENTAL MATRIX WITH EPIPOLAR GEOMETRY 28
    2.9 DEPTH MAP CREATION 30
    CHAPTER Ⅲ 2D VISUAL SERVOING ARCHITECTURE 33
    3.1 MACHINE-VISION-BASED BALL–BEAM TRACKING SYSTEM 33
    3.2 SYSTEM FLOW 37
    3.3 EXPERIMENTAL RESULTS 39
    3.4 SUMMARY 42
    CHAPTER Ⅳ 3D VISUAL SERVOING ARCHITECTURE 43
    4.1 FUZZY VISUAL SERVO SYSTEM 43
    4.1.1 Image Capturing 45
    4.1.2 Hand–Eye Coordinate Transformation 47
    4.1.3 Fuzzy Reasoning Visual System 48
    4.2 EXPERIMENTAL RESULTS 50
    4.3 SUMMARY 56
    CHAPTER Ⅴ 3D VISUAL SERVOING FOR MOBILE ROBOTS 58
    5.1 STEREO-VISION-BASED WHEELED MOBILE ROBOT (WMR) SYSTEM 58
    5.2 SYSTEM CALIBRATION AND MACHINE VISION PROCESSING 61
    5.2.1 Image Preprocessing 63
    5.2.2 Correspondence Matching and Fuzzy Reasoning Steering 65
    5.2.3 Detection of Object Grabbing 68
    5.3 EXPERIMENTAL RESULTS 69
    5.3.1 Experiment of Picking–Placing Task 70
    5.3.2 Experiment Involving Target Following and Obstacle Avoiding 73
    5.4 SUMMARY 80
    CHAPTER Ⅵ CONCLUSIONS 82
    6.1 CONTRIBUTIONS 82
    6.2 FURTHER WORKS 83
    REFERENCES 84
    APPENDIX A OPTICAL FLOW TRACKING 88
    APPENDIX B CORNER DETECTION 89
    VITA 90
    PUBLICATION LIST 91

