
Student: Chi-Cuong Tran
Thesis title: LaserVision Based Autonomous Industrial Robot Welding Systems
Advisor: Chyi-Yeu Lin
Committee members: Chyi-Yeu Lin, Po-Ting Lin, Chi-Ying Lin, Kuu-Young Young, Chin-Sheng Chen
Degree: Doctor
Department: College of Engineering - Department of Mechanical Engineering
Year of publication: 2023
Graduating academic year: 111
Language: English
Number of pages: 121
Keywords: LaserVision sensor, Real-time seam tracking, Seam correction, Trajectory-based control, Automatic calibration



This thesis develops an autonomous welding system for industrial robots that integrates laser vision sensors to overcome the limitations associated with system calibration in robotic welding. Current system calibration is constrained by manual procedures that are time-consuming, prone to human error, and lack the precision required for accurate welding operations. To address these limitations, an automated calibration method is proposed for laser vision sensors used in robotic welding systems. The method improves the precision and efficiency of the calibration process while ensuring accurate movement of the robot arm during welding operations. This solution also provides the basis for evaluating the efficiency and accuracy of the entire laser-vision-based robotic welding system.
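The automatic hand-eye calibration at the core of such a system amounts to solving the classical AX = XB equation that relates relative robot flange motions to the camera motions observed by the sensor. As an illustration only (the thesis's exact solver is not specified in this abstract), the sketch below uses the well-known Park-Martin least-squares closed form; all function names are hypothetical.

```python
import numpy as np

def rot_log(R):
    """Axis-angle (matrix-log) vector of a 3x3 rotation matrix."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-8:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def hand_eye_park_martin(As, Bs):
    """Solve AX = XB for the flange-to-camera transform X.

    As: relative flange motions from forward kinematics; Bs: the
    corresponding relative camera motions (4x4 homogeneous matrices).
    """
    # Rotation part: the rotation axes satisfy a_i = Rx b_i, solved in
    # closed form as Rx = (M^T M)^(-1/2) M^T  with M = sum b_i a_i^T.
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        M += np.outer(rot_log(B[:3, :3]), rot_log(A[:3, :3]))
    U, _, Vt = np.linalg.svd(M)
    Rx = Vt.T @ U.T
    if np.linalg.det(Rx) < 0:            # guard against a reflection
        Rx = Vt.T @ np.diag([1.0, 1.0, -1.0]) @ U.T
    # Translation part: stack (R_Ai - I) t_x = Rx t_Bi - t_Ai, least-squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

In practice each A_i comes from the robot controller between two poses and each B_i from the camera's pose change against a calibration target; at least two motions with non-parallel rotation axes are required for a unique solution.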
The thesis also introduces a path-planning approach with intelligent seam extraction for automatic robot welding systems. This approach offers an efficient alternative to manual teach programming and offline programming for 3D curved welds while prioritizing seam precision. It uses a two-stage process: global planning that combines 2D images with a 3D point cloud, followed by laser-vision-based local positioning that reduces the error associated with the point cloud. In addition, a robust real-time weld seam tracking system is developed using a laser vision sensor. The system addresses challenges in feature point extraction, endpoint detection, and trajectory-based control strategies, and demonstrates real-time, precise monitoring that improves the efficiency and accuracy of robotic welding applications. Overall, this research contributes to the field of automatic robot welding by providing intelligent solutions for path planning, seam extraction, and tracking, which improve the accuracy, efficiency, and overall quality of welds in industrial environments.
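Both the local positioning and the real-time tracking described above rest on the same laser-triangulation step: a detected stripe feature pixel is back-projected along the camera ray, intersected with the calibrated laser plane, and then mapped into the robot base frame through the hand-eye transform. A minimal sketch of this standard step, assuming a pinhole intrinsic matrix K and a laser plane n . p = d in the camera frame (names and conventions are illustrative, not the thesis's exact code):

```python
import numpy as np

def pixel_to_camera_point(u, v, K, plane):
    """Triangulate a laser-stripe pixel (u, v) into a 3D camera-frame point.

    K: 3x3 pinhole intrinsic matrix; plane = (nx, ny, nz, d) is the
    calibrated laser plane n . p = d, expressed in the camera frame.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray at z = 1
    n, d = np.asarray(plane[:3]), plane[3]
    s = d / (n @ ray)            # scale so the ray pierces the laser plane
    return s * ray

def camera_to_base(p_cam, T_base_flange, T_flange_cam):
    """Map a camera-frame point into the robot base frame using the
    current forward-kinematics pose and the hand-eye transform."""
    p = np.append(p_cam, 1.0)                        # homogeneous point
    return (T_base_flange @ T_flange_cam @ p)[:3]
```

Feeding each frame's feature point through this pipeline, at the robot pose recorded for that frame, yields the seam trajectory in base coordinates that the trajectory-based controller then follows.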

Table of Contents

Abstract
List of Figures
List of Tables
Chapter 1 Introduction
  1.1 Background and Motivation
  1.2 Literature Review
    1.2.1 Teach-playback
    1.2.2 Offline Programming
    1.2.3 Sensor-based Positioning
  1.3 Objectives and Scope of Study
  1.4 Thesis Organization
Chapter 2 Automatic Calibration Method for Laser Vision Sensor in Robotic Welding System
  2.1 Introduction
  2.2 System Overview
    2.2.1 Laser Vision Sensor
    2.2.2 Camera Parameters
    2.2.3 Hand-eye Calibration
    2.2.4 Laser Vision Calibration
  2.3 Automatic Calibration Method for Robotic Welding System
    2.3.1 Coordinate System
    2.3.2 Initial Positions Adjustment
    2.3.3 Automatic Hand-eye Calibration
    2.3.4 Automatic Laser Plane Calibration
    2.3.5 Reconstruction of 3D Point Cloud for Translation Component Adjustment
  2.4 Experiment and Results
    2.4.1 System Configurations
    2.4.2 Verification of Calibration
  2.5 Conclusion
Chapter 3 Intelligent Path Planning of Welding Robot Based on Multisensor Interaction
  3.1 Introduction
  3.2 System Overview
    3.2.1 Vision Sensor
    3.2.2 Coordinate System
  3.3 Proposed Global Path Planning
    3.3.1 Seam Detection Based on Deep Learning
    3.3.2 Point Cloud Reconstruction
    3.3.3 Coarse Extraction Using RGB-D Image
    3.3.4 Seam Pose Estimation
  3.4 Appropriate Local Path Planning
    3.4.1 Analysis of Weld Joint Types
    3.4.2 Laser Stripe Extraction
    3.4.3 Feature Point Extraction
  3.5 Experimental Results and Discussion
    3.5.1 Verification of Global Planning
    3.5.2 Verification of Local Positioning
    3.5.3 Results Analysis
  3.6 Conclusion
Chapter 4 Real-time Seam Tracking System Based on Laser Vision Sensor
  4.1 Introduction
  4.2 System Overview
  4.3 Weld Feature Extraction
    4.3.1 Image Denoising
    4.3.2 Laser Stripe Extraction
    4.3.3 Relative Feature Points Extraction
    4.3.4 Endpoint Detection
  4.4 Seam Tracking Algorithm
    4.4.1 Robot-sensor Synchronization
    4.4.2 Seam Trajectory Planning
    4.4.3 Initialization Measurement
    4.4.4 Forward Direction Tracking Strategies
    4.4.5 Filtering and Smoothing
  4.5 Experimental Results and Discussion
    4.5.1 Feature Points Extraction and Endpoint Detection
    4.5.2 Results Analysis
  4.6 Conclusions
Chapter 5 Conclusions and Future Works
References


Full text available from 2026/07/12 (off-campus network)
Full text available from 2026/07/12 (National Central Library: Taiwan NDLTD system)