
Graduate Student: Wei-Ting Hung (洪偉庭)
Thesis Title: Development of a Sensor Fusion Based Autonomous Object Stiffness Estimation System (基於感測器融合之全自主物品剛性估測系統開發)
Advisor: Chi-Ying Lin (林紀穎)
Committee Members: Chung-Hsien Kuo (郭重顯)
Ping-Lang Yen (顏炳郎)
Degree: Master
Department: College of Engineering - Department of Mechanical Engineering
Year of Publication: 2015
Academic Year of Graduation: 103
Language: Chinese
Number of Pages: 116
Keywords (Chinese): 影像伺服、擴張型卡爾曼濾波器、感測器融合、最小平方回歸法、剛性曲線
Keywords (English): Visual servoing, Extended Kalman Filter, Sensor fusion, Least squares regression, Stiffness curve
Hits: 197; Downloads: 3
    This research aims to develop an automated system for real-time estimation of stiffness curves. After the target object is located by vision, the system grasps it, measures its stiffness curve, and fits the curve equation on-line for subsequent applications. A robot manipulator combined with a visual servoing algorithm tracks the object position; the information returned by the machine vision system and the encoders is then fused by an Extended Kalman Filter to obtain a more accurate grasping compression value. A force sensor mounted at the tip of the manipulator's gripper measures the grasping force. From the measured force and position, the object's stiffness curve is plotted, and least squares regression is then applied for curve fitting to obtain the stiffness curve equation of the current object. The experimental results confirm the feasibility of the system.
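    The abstract describes fusing the camera and encoder readings with an Extended Kalman Filter to refine the measured compression. The following is a minimal illustrative sketch of such a fusion step, not the thesis implementation: it assumes a simple linear constant-velocity model (under which the EKF update reduces to the standard Kalman form) and uses hypothetical sample rates and noise covariances.

```python
import numpy as np

# Minimal sketch (not the thesis implementation): fuse two noisy readings of
# the gripper compression -- one from the camera, one from the motor encoder --
# with a constant-velocity Kalman filter.  All noise values are hypothetical;
# with this linear measurement model the EKF reduces to the standard form.

dt = 0.05                                # sample period in seconds (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition: [position, velocity]
Q = np.diag([1e-5, 1e-4])                # process noise covariance (assumed)
H = np.array([[1.0, 0.0], [1.0, 0.0]])   # both sensors observe the position
R = np.diag([0.5 ** 2, 0.1 ** 2])        # camera noisier than encoder (assumed)

def fuse_step(x, P, z):
    """One predict/update cycle; z = [camera_mm, encoder_mm]."""
    x = F @ x                            # predict state
    P = F @ P @ F.T + Q                  # predict covariance
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y                        # corrected state
    P = (np.eye(2) - K @ H) @ P          # corrected covariance
    return x, P

x, P = np.zeros(2), np.eye(2)
for z in ([2.1, 2.0], [2.4, 2.5], [2.9, 3.0]):
    x, P = fuse_step(x, P, np.array(z))
# x[0] is the fused compression estimate; it weights the encoder more
# heavily because its measurement noise is smaller.
```

    Because the encoder's noise covariance is the smaller of the two, the gain matrix automatically trusts it more, which is the practical benefit of fusing the sensors rather than averaging them.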


    This research aims to develop an on-line stiffness estimation system that can find a mathematical expression for the stiffness property of a grasped object using a sensor fusion technique. The system uses a robot manipulator with visual servo control to autonomously grasp the object. To increase the accuracy of the object compression values, an Extended Kalman Filter is applied to fuse the data obtained from the webcam and the encoder. The grasping forces are measured by a low-cost force sensor installed on the jaw of the manipulator. The force and position data are then used to find the stiffness property of the object, and an on-line least squares method is applied to find the stiffness equation with time-varying parameters. The experimental results verify the feasibility of the proposed method.
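    The on-line least squares fit with time-varying parameters mentioned above can be sketched as a recursive least squares (RLS) update with a forgetting factor. This is an illustrative sketch only: the quadratic model F(d) = a0 + a1*d + a2*d^2, the forgetting factor, and the synthetic data are all assumptions, not the thesis's actual regression model.

```python
import numpy as np

# Minimal sketch (not the thesis implementation): recursive least squares
# fit of a quadratic stiffness curve F(d) = a0 + a1*d + a2*d^2 from
# streaming (compression, force) samples.  The forgetting factor lets the
# coefficients track time-varying stiffness; all values are hypothetical.

lam = 0.98                     # forgetting factor, 0 < lam <= 1 (assumed)
theta = np.zeros(3)            # regression coefficients [a0, a1, a2]
P = np.eye(3) * 1e3            # large initial covariance = low confidence

def rls_update(theta, P, d, f):
    """Fold one (compression d, force f) sample into the running fit."""
    phi = np.array([1.0, d, d * d])        # regressor for a quadratic model
    k = P @ phi / (lam + phi @ P @ phi)    # update gain
    theta = theta + k * (f - phi @ theta)  # correct by the prediction error
    P = (P - np.outer(k, phi @ P)) / lam   # shrink covariance, then forget
    return theta, P

# Feed synthetic samples from a known curve F = 0.5 + 2.0*d + 0.3*d^2;
# theta converges toward the true coefficients [0.5, 2.0, 0.3].
for d in np.linspace(0.0, 5.0, 50):
    theta, P = rls_update(theta, P, d, 0.5 + 2.0 * d + 0.3 * d * d)
```

    A forgetting factor below 1 discounts old samples exponentially, which is one common way to let a least squares fit follow parameters that drift over time.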

    Table of Contents
        Abstract (Chinese)
        Abstract (English)
        Acknowledgments
        Table of Contents
        List of Figures
        List of Tables
        Chapter 1  Introduction
            1.1 Preface
            1.2 Literature Review and Research Motivation
            1.3 Research Methods
            1.4 Thesis Organization and Contributions
        Chapter 2  System Architecture
            2.1 Personal Computer
            2.2 Camera
            2.3 Robot Manipulator
            2.4 Arduino Control Board
            2.5 Force Sensor
        Chapter 3  Image Processing and Machine Vision
            3.1 Digital Images
            3.2 Image Preprocessing
                3.2.1 Color Filtering
                3.2.2 Image Binarization
                3.2.3 Morphological Operations
                3.2.4 Object Centroid Computation
            3.3 Stereo Vision Construction and Camera Calibration
                3.3.1 Properties of Stereo Vision
                3.3.2 Camera Imaging Principles
                3.3.3 Coordinate System Transformations
                3.3.4 Camera Calibration
                3.3.5 3-D Coordinate Estimation
        Chapter 4  Robot Manipulator Kinematic Analysis and Control
            4.1 Manipulator Coordinate Systems
            4.2 Forward Kinematics
            4.3 Differential Kinematics
            4.4 Visual Servo Control of the Manipulator
        Chapter 5  Sensor Fusion Algorithm
            5.1 Overview of Sensor Fusion
            5.2 Overview of the Kalman Filter
                5.2.1 Extended Kalman Filter
            5.3 Derivation of the Tracking Model
                5.3.1 Camera Model
                5.3.2 Encoder Model
        Chapter 6  Curve Fitting Algorithm
            6.1 Least Squares Regression
            6.2 Derivation of the Regression Coefficients
        Chapter 7  Experimental Results and Discussion
            7.1 Static Visual Servo Tracking Results
            7.2 Sensor Fusion Tracking Results
            7.3 Stiffness Curve Estimation Results
                7.3.1 Stiffness Curve Measurement Experiments
                7.3.2 Stiffness Curve Fitting Experiments
            7.4 Experimental Summary
        Chapter 8  Conclusions and Future Work
            8.1 Conclusions
            8.2 Future Research Goals
        References

