Graduate Student: 蘇榮智 (Jung-Chic Su)
Thesis Title: 自動化光學檢測之研究 (A Study of Automated Optical Inspection)
Advisor: 唐永新 (Yeong-Shin Tarng)
Committee Members: 楊宏智 (Hong-Tsu Young), 廖運炫 (Yun-Shiuan Liao), 許新添 (Hsin-Teng Hsu), 鍾國亮 (Kuo-Liang Chung), 林原慶 (Yuan-Ching Lin), 傅光華 (Kuang-Hua Fuh), 陳亮嘉 (Liang-Chia Chen)
Degree: Doctor
Department: College of Engineering - Department of Mechanical Engineering
Publication Year: 2006
Academic Year of Graduation: 94
Language: English
Number of Pages: 152
Chinese Keywords: line detection (測直線), circle detection (測圓), defect classification (缺陷分類), structured light (結構光), flank wear (刀腹磨耗), grinding wheel (砂輪), micro-drill (微型鑽頭), loudspeaker diaphragm (喇叭震模), varistor (變阻器), edge detection (測邊), automated optical inspection (自動化光學檢測), machine vision (機器視覺)
English Keywords: Machine vision, Automated optical inspection
Chinese Abstract: The main purpose of this thesis is to develop an automated inspection system based on machine vision to enhance quality control in manufacturing plants. This study accomplishes three objectives: (1) verifying that component positions and dimensions are correct, (2) checking that component shapes fall within acceptable tolerances, and (3) inspecting component surfaces for defects. Automated optical inspection comprises the following steps: image acquisition, image processing, feature extraction, and decision-making. Four applications are used to validate the proposed feature-extraction methods and defect-recognition capability: (A) two-dimensional contour inspection, which measures changes in grinding-wheel wear to determine whether the wheel needs redressing; (B) flank-wear measurement of a micro-drill using edge detection; (C) three-dimensional surface inspection, which uses structured-light illumination to verify that the height and concentricity of a loudspeaker diaphragm meet requirements; and (D) varistor appearance inspection, which extracts component features from images, trains an ANFIS on these features, and then classifies varistor defects. These four applications show that the machine vision system developed in this study achieves excellent repeatability and superior defect-classification capability.
English Abstract: This thesis presents a machine vision system for automated inspection of industrial parts for post-manufacturing quality control. The aims of the visual inspection process are to determine whether all components are correctly located and dimensioned, whether all components are shaped within acceptable tolerances, to check for structural damage, and to inspect the surface quality of components for defects. Automated optical inspection involves the following sequence of steps: image acquisition, image processing, feature extraction, and decision-making. All this must be accomplished while ensuring that the overall completion time is comparable to that of a human inspector. Four applications are described: (1) a two-dimensional contour measurement algorithm using back lighting to measure wear on a grinding wheel, (2) an automated flank wear measurement scheme for a micro-drill using machine-vision-based edge detection, (3) three-dimensional shape measurement using a structured illumination method to measure the dimensions of a loudspeaker cone, and (4) an automated visual system for inspecting the surface appearance of ring varistors based on an adaptive neuro-fuzzy inference system (ANFIS). The experimental results show that the machine vision system can inspect or classify these components in a highly consistent and accurate manner, and can be a valuable tool for ensuring product quality.
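The four-step pipeline named in the abstract (image acquisition, image processing, feature extraction, decision-making) can be sketched in miniature as follows. This is an illustrative toy only, not the thesis's actual system: the synthetic image, the global threshold, the area/bounding-box features, and the tolerance value are all hypothetical stand-ins for the camera capture, processing, and decision rules used in the four real applications.

```python
# Minimal sketch of a four-step AOI pipeline:
# acquisition -> processing -> feature extraction -> decision-making.
# All names and thresholds are illustrative, not taken from the thesis.

def acquire_image():
    # Stand-in for camera capture: an 8x8 grayscale image (0-255)
    # with a bright 2x3 "defect" blob on a dark background.
    img = [[10] * 8 for _ in range(8)]
    for r in range(3, 5):
        for c in range(3, 6):
            img[r][c] = 200
    return img

def binarize(img, thresh=128):
    # Image processing: global threshold to isolate candidate defect pixels.
    return [[1 if px > thresh else 0 for px in row] for row in img]

def extract_features(mask):
    # Feature extraction: area and bounding box of the segmented region.
    coords = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    if not coords:
        return {"area": 0, "bbox": None}
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return {"area": len(coords),
            "bbox": (min(rows), min(cols), max(rows), max(cols))}

def decide(features, max_area=4):
    # Decision-making: reject the part if the defect area exceeds tolerance.
    return "reject" if features["area"] > max_area else "accept"

features = extract_features(binarize(acquire_image()))
print(features["area"], decide(features))  # prints: 6 reject
```

A production system would replace each stand-in with real components (a camera interface, sub-pixel edge detection or structured-light reconstruction, and a trained classifier such as an ANFIS), but the data flow between the four stages stays the same.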