
Author: 張進億 (Chin-Yi Chang)
Title: 基於影像伺服之移動物體追蹤與夾取 (Visual Servoing Based Moving Target Tracking and Grasping)
Advisor: 林其禹 (Chyi-Yeu Lin)
Committee Members: 林紀穎 (Chi-Ying Lin), 邱士軒 (Shih-Hsuan Chiu)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Publication Year: 2009
Graduation Academic Year: 97 (ROC calendar)
Language: Chinese
Pages: 80
Keywords (Chinese): 影像伺服, 追蹤, 夾取 (visual servoing, tracking, grasping)
Keywords (English): visual servoing, tracking, grasping

This thesis develops a tracking system based on visual servoing. Image processing extracts the target's information from the image, stereo vision computes the target's 3-D coordinates, and the Jacobian matrix is used to control the robot manipulator. Forward kinematics is incorporated to determine the arm's motion, and comparing the forward kinematics results with the image results yields the error of the Jacobian-based control. A Kalman filter is applied to refine the image processing results and thereby obtain a more accurate target position.
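To make the pipeline in the abstract concrete, the following is a minimal sketch (not code from the thesis) of how a rectified, parallel stereo pair can recover the target's 3-D coordinates from its matched pixel locations in the left and right images; the focal length, baseline, principal point, and pixel coordinates below are illustrative assumptions.

# Minimal stereo-triangulation sketch, assuming a rectified parallel camera pair.
# focal_px, baseline_m, cx, cy and the example pixel coordinates are made-up values.
def triangulate(u_left, v_left, u_right,
                focal_px=800.0, baseline_m=0.12, cx=320.0, cy=240.0):
    """Return (X, Y, Z) in the left-camera frame from matched pixel coordinates."""
    disparity = u_left - u_right           # horizontal pixel offset between the two views
    if disparity <= 0:
        raise ValueError("target must lie in front of both cameras")
    Z = focal_px * baseline_m / disparity  # depth from disparity
    X = (u_left - cx) * Z / focal_px
    Y = (v_left - cy) * Z / focal_px
    return X, Y, Z

# Example: the target centroid found by image processing in both images.
print(triangulate(u_left=352.0, v_left=260.0, u_right=310.0))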


This research proposes a tracking and grasping system based on visual servoing. The system makes the motion of the robot manipulator more human-like. The controller uses the Jacobian matrix to drive the manipulator, so the arm moves toward the target quickly. Because the target keeps moving, the target position obtained from image processing is no longer the current position by the time it is available; this research therefore uses a Kalman filter to predict the target position and obtain more accurate information.
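The sketch below illustrates, under assumed models and gains rather than the thesis's actual implementation, the two control ideas in the abstract: a constant-velocity Kalman filter that predicts the moving target's position one frame ahead, and a Jacobian pseudo-inverse (resolved-rate) step that maps the remaining Cartesian error to joint velocities. All matrices, noise covariances, and the placeholder Jacobian are illustrative assumptions.

import numpy as np

dt = 1.0 / 30.0                       # assumed camera frame period (s)

# State x = [px, py, pz, vx, vy, vz]; measurement z = [px, py, pz] from stereo vision.
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)            # constant-velocity transition model
H = np.hstack([np.eye(3), np.zeros((3, 3))])
Q = 1e-3 * np.eye(6)                  # process noise (assumed)
R = 1e-2 * np.eye(3)                  # measurement noise of the vision system (assumed)

x = np.zeros(6)                       # state estimate
P = np.eye(6)                         # state covariance

def kalman_step(z):
    """One predict/update cycle; returns the target position predicted one frame ahead."""
    global x, P
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the stereo-vision measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(6) - K @ H) @ P_pred
    # Extra one-frame prediction to compensate for the image-processing delay
    return (F @ x)[:3]

def joint_velocity(jacobian, ee_pos, target_pos, gain=1.0):
    """Resolved-rate step: map the Cartesian position error to joint velocities."""
    error = target_pos - ee_pos
    return gain * np.linalg.pinv(jacobian) @ error

# Usage with made-up numbers: a 6-joint arm and a 3x6 position Jacobian.
z_meas = np.array([0.40, 0.10, 0.25])              # target from stereo vision (m)
target = kalman_step(z_meas)
J = np.random.default_rng(0).normal(size=(3, 6))   # placeholder Jacobian
qdot = joint_velocity(J, ee_pos=np.array([0.35, 0.05, 0.30]), target_pos=target)
print(qdot)

The extra one-frame prediction at the end of kalman_step is one simple way to compensate for the delay between capturing an image and finishing its processing, which the abstract identifies as the reason the measured position lags the true one.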

Chinese Abstract
Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1  Introduction
  1.1 Research Motivation and Objectives
  1.2 Literature Review
  1.3 Outline of the Thesis
Chapter 2  Fundamentals of Robotics
  2.1 Definition of a Robot
  2.2 Link Coordinate Systems of Serial Robots
  2.3 Differential Relations and the Jacobian Matrix of Serial Robots
Chapter 3  Kinematic Analysis of the Robot Manipulator
  3.1 Overview of Kinematics
  3.2 Forward Kinematics
  3.3 Inverse Kinematics
Chapter 4  Image Processing
  4.1 Camera Calibration
  4.2 Image Depth
  4.3 CAMSHIFT Algorithm
  4.4 Motion Energy Method
  4.5 Kalman Filter
Chapter 5  System Architecture for Tracking Control
  5.1 System Architecture
  5.2 Multithreaded System
  5.3 Experimental Results
Chapter 6  Experimental Equipment
  6.1 Hardware
  6.2 Robot Arm Software and Operation
Chapter 7  Conclusions and Future Work
  7.1 Conclusions
  7.2 Future Work
References
Appendix
Appendix 2
Appendix 3

