Graduate Student: 施仁翔 (Jen-Hsiang Shih)
Thesis Title: Autonomous Golf Club Head Sand-residual Removal System Based on Stereovision and Robot Arm (基於雙目視覺與機械手臂之全自主高爾夫桿頭殘砂去除系統)
Advisor: 林其禹 (Chyi-Yeu Lin)
Oral Examination Committee: 林柏廷 (Po-Ting Lin), 李維楨 (Wei-Chen Lee)
Degree: Master
Department: Department of Mechanical Engineering, College of Engineering
Year of Publication: 2022
Graduation Academic Year: 110
Language: Chinese
Pages: 77
Chinese Keywords: 高爾夫桿頭除砂, 智慧自動化, 電腦視覺, 雙目視覺, 深度學習, 物件辨識, 六軸機械手臂
English Keywords: Golf club head sand removal, Intelligent automation, Computer vision, Stereo vision, Deep learning, Object detection, 6-axis robot arm
Many metal golf club heads are manufactured by lost-wax casting, and mold particles (residual sand) often remain in some parts of the club head after demolding. At present, a worker holds the club head in front of a sandblasting machine so that abrasive particles jetted at high speed from the nozzle strike and remove the residual sand. This manual operation is inefficient and risks damaging the fine structures inside the club head.

This thesis develops an innovative intelligent automation system that replaces manual labor and removes the residual mold sand from demolded golf club heads fully autonomously. The system uses a six-axis robot arm to grip the club head and perform the required spatial movements; a miniature stereo camera system captures images of the relevant parts of the club head; and a deep-learning object-detection network based on the Mask R-CNN architecture identifies the residual sand in each captured view and recovers the spatial positions of all residual sand on the club head. From the distribution of these positions, the sand-removal trajectory of the six-axis robot arm is generated on the fly, so that every sand-bearing region of the club head is moved, in sequence, accurately in front of the sandblasting nozzle for efficient automatic blasting. The entire sand-removal process runs fully autonomously, achieving high efficiency and reliability.

Besides relieving workers of sand-removal work in a noisy, harsh environment, this fully autonomous intelligent system improves sand-removal efficiency and ensures the club head is never over-blasted in a way that would damage its internal structure.
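The abstract states that the stereo camera system recovers the spatial positions of detected sand regions. The thesis's actual calibration pipeline is not reproduced here, but the core depth recovery step for a rectified stereo pair follows the classic triangulation relation Z = fB/d. A minimal illustrative sketch, assuming a rectified pair with known focal length, baseline, and principal point (all parameter names and values below are hypothetical, not taken from the thesis):

```python
def triangulate(u_left, u_right, v, f_px, baseline_m, cx, cy):
    """Recover a 3D point in the left-camera frame from a rectified stereo match.

    u_left, u_right: x pixel coordinates of the same point in the two images
    v:               shared y pixel coordinate (rectified images share scanlines)
    f_px:            focal length in pixels
    baseline_m:      distance between the two camera centers, in meters
    cx, cy:          principal point of the rectified images, in pixels
    """
    disparity = u_left - u_right       # larger disparity -> closer point
    z = f_px * baseline_m / disparity  # depth from Z = f * B / d
    x = (u_left - cx) * z / f_px       # back-project pixel offsets to metric X, Y
    y = (v - cy) * z / f_px
    return x, y, z


# Example with made-up calibration values: a 70 px disparity with a
# 700 px focal length and 6 cm baseline places the point 0.6 m away.
point = triangulate(u_left=400, u_right=330, v=300,
                    f_px=700.0, baseline_m=0.06, cx=320, cy=240)
```

In the system described above, each 3D point obtained this way would then be transformed from the camera frame into the robot base frame (via a hand-eye calibration) before the arm's sandblasting trajectory is planned.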