| Field | Value |
|---|---|
| Student | 張甫安 (Fu-an Chang) |
| Thesis title | 以DSP實現三軸機械臂的非校正影像伺服 (DSP-Based Uncalibrated Visual Servoing for a 3-DOF Robot Manipulator) |
| Advisor | 林紀穎 (Chi-Ying Lin) |
| Committee members | 郭重顯 (Chung-Hsien Kuo), 李維楨 (Wei-Chen Lee) |
| Degree | Master |
| Department | College of Engineering, Department of Mechanical Engineering |
| Year of publication | 2014 |
| Academic year of graduation | 102 (ROC calendar) |
| Language | Chinese |
| Pages | 87 |
| Keywords (Chinese) | 相機校正 (camera calibration), 影像伺服 (visual servoing), 立體視覺 (stereo vision), 機械手臂 (robot manipulator), 嵌入式系統 (embedded system) |
| Keywords (English) | Stereo vision, Visual servo, Camera calibration, Robot manipulator, DSP |
| Access statistics | Views: 320; Downloads: 9 |
Abstract (translated from the Chinese)
Robot applications that integrate image sensors are widespread. To achieve precise control, camera calibration is a necessary step; in practice, however, the camera may be displaced by accidental collision or repositioned because of a poor field of view, forcing extra cost to be spent on re-calibration. Moreover, classical visual servo algorithms may suffer from local minima and singularity problems. To address these two issues, this study adopts online calibration of the camera's extrinsic parameters, so that when the camera is moved the robot system can detect the change and re-calibrate itself, improving the practicality of the robot visual servo system. In addition, by constructing a virtual dual-camera model and a three-dimensional visual space, this study derives a full-rank image Jacobian matrix, thereby eliminating the singularity problem. For implementation, the algorithm is distributed across two embedded systems. Finally, static positioning, dynamic tracking, and full uncalibrated visual servo experiments are conducted to verify the feasibility of the system.
Abstract
Robots equipped with visual sensors are widely used. In a vision system, camera calibration is an important step toward achieving satisfactory performance. However, when the camera is displaced, for example by accidental collision, re-calibration becomes necessary and the system cost increases. Moreover, the classical image-based visual servo control scheme is known to suffer from problems such as image-space singularities and local minima. Therefore, online updating of the camera's extrinsic parameters is preferred in practical applications, because this technique can re-calibrate the extrinsic parameters automatically and increase the practicality of the robot visual servo system. This study applies a virtual composite camera model and a 3D visual Cartesian space to compute a full-rank image Jacobian for visual tracking control; the advantage of this approach is that the resulting image Jacobian is always non-singular. This research also employs a distributed embedded system comprising two DSP boards to implement the visual servo algorithm. The experimental results demonstrate that the proposed visual servo system is feasible on embedded hardware.
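For context, the control law behind image-based visual servoing, and the reason a full-rank image Jacobian matters, can be sketched in a few lines. The sketch below is illustrative only and is not the thesis implementation: the constant 3x3 Jacobian, the gain, and the feature values are all invented for the example. It only shows why a square, full-rank Jacobian (as produced by the thesis's virtual composite camera and 3D visual space) lets the controller use an exact solve instead of a pseudo-inverse, avoiding singular configurations.

```python
import numpy as np

# Illustrative image-based visual servoing (IBVS) update, not the thesis code.
# With features mapped into a 3D visual Cartesian space, the image Jacobian J
# is square and full rank, so q_dot = -gain * J^{-1} (s - s_ref) is well posed.

def ibvs_step(s, s_ref, J, gain=0.5, dt=0.1):
    """One discrete control step, then propagate the feature through the
    first-order model s_dot = J @ q_dot."""
    q_dot = -gain * np.linalg.solve(J, s - s_ref)  # exact solve: J is full rank
    s_next = s + J @ q_dot * dt
    return q_dot, s_next

# Constant toy Jacobian for a 3-DOF arm (invertible by construction).
J = np.array([[2.0, 0.1, 0.0],
              [0.0, 1.5, 0.2],
              [0.1, 0.0, 1.8]])
s = np.array([1.0, -0.5, 0.3])   # current feature in the 3D visual space (toy)
s_ref = np.zeros(3)              # desired feature

for _ in range(200):
    _, s = ibvs_step(s, s_ref, J)

# Each step scales the feature error by (1 - gain*dt), so it converges to zero.
```

With a rectangular or rank-deficient Jacobian (the classical 2D image-plane case), the solve above would have to be replaced by a pseudo-inverse, which is exactly where the singularity and local-minimum issues discussed in the abstract arise.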