
Graduate Student: HOANG THANH NHAN
Thesis Title: Implementation of Autonomous Propeller Grinding System with Deep Learning and Robotic Arm
Advisor: Chyi-Yeu Lin
Committee Members: 李維楨, 莊景崴
Degree: Master
Department: College of Engineering, Department of Mechanical Engineering
Year of Publication: 2023
Graduation Academic Year: 112
Language: English
Pages: 106
Chinese Keywords: 螺旋槳 (Propeller)
English Keywords: Propeller, Grinding, Point cloud, Transform matrix, AI

A propeller is a mechanical device that, when in motion, generates linear thrust in a working fluid such as air or water. Given the propeller's complex helicoidal geometry, lost-wax casting is the most suitable manufacturing method. Beyond the propeller's design, a key consideration is minimizing fluid-dynamic drag by ensuring the smoothness of the as-cast propeller surface. However, defects inevitably arise during production, including cracks, surface irregularities, hot spots, and swells caused by unfavorable shrinkage of the liquid metal. To address these problems, a surface-finishing process, in particular grinding of the propeller surface, becomes essential. Yet in the conventional process, manual surface grinding is inefficient and struggles to maintain consistent quality, so a breakthrough in production technology is needed.
This research proposes an innovative automated machining system for fully autonomous grinding of propeller blades. The system integrates trajectory planning for a six-axis industrial robot arm, a deep-learning AI application, and compliant grinding equipment. In this thesis, a robot trajectory-planning method employing both parametric and point cloud models is proposed. Initially, the propeller's 3D CAD file is used to build a point cloud model. Subsequently, surface trajectories are generated by combining a reference path with nearest-neighbor search. These trajectories can then be transformed to align with the robot arm's grinding-process coordinates. In addition, a Mask R-CNN network architecture based on deep-learning instance segmentation is developed to detect defects on the surface of each propeller blade. With the defect-detection model in place, the system needs to grind only the defective surface regions.
In summary, this thesis presents a comprehensive, fully automated grinding solution for machining structurally complex propeller workpieces. The efficacy of the proposed system is verified through a series of experimental trials conducted with a six-axis industrial robot arm in a real-world setup.


A propeller functions as a mechanical component that, when in motion, generates linear thrust in a working fluid such as air or water. Given the complex helicoidal surface geometry of the propeller, lost-wax casting emerges as the most suitable manufacturing method. Beyond the propeller's design, a crucial consideration is minimizing fluid-dynamic drag by ensuring the smoothness of the propeller's surface. However, defects are inevitable during production, including cracks, surface irregularities, hot spots, and swells caused by unfavorable shrinkage of the liquid metal. To address these issues, a surface-polishing process, particularly grinding of the propeller's surface, becomes essential. The conventional manual operation, however, raises concerns due to its low efficiency and its inconsistency in maintaining quality standards.
This research proposes a novel automated machining system designed for grinding propeller blades. The system integrates trajectory planning for a 6-axis industrial robot, a deep-learning application, and compliant grinding equipment. In this thesis, a methodology for robotic trajectory planning is proposed, employing both parametric and point cloud models. Initially, the 3D CAD model of the propeller is used to create a point cloud model. Subsequently, surface trajectories are generated by combining a reference path with nearest-neighbor search. These trajectories can then be transformed to align with the robot's coordinate frame for the grinding process. Furthermore, a Mask R-CNN network architecture, rooted in deep-learning instance segmentation, is formulated to detect defects on the surface of each propeller blade. With the defect-detection model implemented, the system needs to grind only the surface areas that exhibit defects.
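The trajectory-generation step described above — snapping a reference path onto the sampled point cloud via nearest-neighbor search — can be sketched roughly as follows. This is a minimal illustration with synthetic data and a brute-force search (a practical implementation would use a k-d tree, as in PCL or Open3D); it is not the thesis's actual code.

```python
import numpy as np

def project_path_to_surface(reference_path, surface_points):
    """Snap each waypoint of a reference path onto its nearest neighbor
    in the workpiece point cloud (brute-force nearest-neighbor search)."""
    # Pairwise distances, shape (num_waypoints, num_surface_points)
    diffs = reference_path[:, None, :] - surface_points[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    nearest = np.argmin(dists, axis=1)
    return surface_points[nearest]

# Synthetic example: a flat "blade" patch sampled on a grid, and a
# straight reference path hovering 2 mm above it.
xs, ys = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
surface = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
path = np.column_stack([np.linspace(0, 1, 10),
                        np.full(10, 0.5),
                        np.full(10, 0.002)])

trajectory = project_path_to_surface(path, surface)
# Each waypoint lands on the surface (z = 0) near its original (x, y).
```

In the thesis's pipeline the point cloud is sampled from the CAD model (e.g. by Poisson disk sampling) rather than built on a synthetic grid, and the resulting trajectory points still need surface-normal estimation and coordinate transformation before execution.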
In summary, this thesis presents a comprehensive automated grinding solution for executing the grinding process on structurally complex propeller workpieces. The efficacy of the proposed system is confirmed through a series of experimental trials conducted with a 6-axis industrial robot in a real-environment setup.
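The coordinate-alignment step — transforming trajectory points from the point cloud frame into the robot's grinding coordinates — amounts to applying a 4x4 homogeneous transform. A minimal numpy sketch, using an illustrative rotation and translation rather than the thesis's calibrated values:

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3)
    and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Apply a homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (T @ homogeneous.T).T[:, :3]

# Example: rotate 90 degrees about Z and shift the workpiece origin into
# the robot base frame (values are illustrative only).
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
T_robot_cloud = make_transform(Rz, np.array([0.5, 0.0, 0.3]))

waypoints_cloud = np.array([[0.1, 0.0, 0.0]])
waypoints_robot = transform_points(T_robot_cloud, waypoints_cloud)
print(np.round(waypoints_robot, 3))  # [[0.5 0.1 0.3]]
```

In practice the rotation and translation would come from the calibration chain the thesis describes (point cloud to end-effector, robot to grinding coordinate), typically estimated by registering measured reference points against the model.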

Table of Contents
摘要 (Chinese Abstract) I
ABSTRACT II
ACKNOWLEDGEMENTS III
TABLE OF CONTENTS IV
INDEX OF TABLES VII
INDEX OF FIGURES VIII
CHAPTER 1. INTRODUCTION 1
  1.1 Background and History 1
  1.2 Literature Review 6
    1.2.1 Trajectory planning using point cloud model 6
    1.2.2 Precision surface grinding 7
    1.2.3 Artificial Intelligence (AI)-based defect detection 9
  1.3 Thesis Organization 10
CHAPTER 2. ROBOT TRAJECTORY PLANNING 11
  2.1 Overview of Edge Feature Generation 11
  2.2 Feature Point Generation Based on CAD Model 12
    2.2.1 Offline Programming Platform 12
    2.2.2 Feature Point Generation 12
  2.3 Trajectory Planning Based on Point Cloud Model 13
    2.3.1 Poisson Disk Sampling 13
    2.3.2 Preliminary Planning of Grinding Trajectory 14
    2.3.3 Creating Subsequent Trajectory and Tuning Estimated Vectors 19
CHAPTER 3. CALIBRATION METHOD FOR ROBOT TRAJECTORY IN GRINDING 21
  3.1 Industrial Robot Coordinate Control 21
    3.1.1 Introduction to Robot Coordinate System 21
    3.1.2 Kinematics of Robot Arm 21
  3.2 Grinding Trajectory Calibration 26
    3.2.1 Robot grinding trajectory coordinate 26
    3.2.2 Transform matrix from Point-cloud to End-effector 29
    3.2.3 Transform matrix from Robot to Grinding coordinate 31
CHAPTER 4. DEFECT DETECTION AND DEEP LEARNING 34
  4.1 Camera Calibration 34
    4.1.1 Pinhole Camera Model 34
    4.1.2 Intrinsic Parameter 36
    4.1.3 Extrinsic Parameter 37
    4.1.4 Distortion Coefficient 38
  4.2 Object Identification using Deep Learning 39
    4.2.1 Introduction to Deep Learning 40
    4.2.2 Feature Extraction 43
  4.3 The Architecture of Mask R-CNN 47
    4.3.1 Introduction to Mask R-CNN 47
    4.3.2 Instance Segmentation 48
  4.4 Experimental Results and Discussion 51
    4.4.1 Image Calibration Results 51
    4.4.2 Object Detection and Instance Segmentation 54
CHAPTER 5. EXPERIMENTAL SETUP AND EVALUATION OF ROBOT TRAJECTORY 61
  5.1 Experimental Setup 61
    5.1.1 FANUC 6-axis Robot Arm and End Effector 61
    5.1.2 Camera System 62
    5.1.3 Grinding Machine 64
    5.1.4 System Overview 65
  5.2 Experimental Results and Discussions 69
    5.2.1 Defect Detection Result 69
    5.2.2 Propeller Grinding Result 74
    5.2.3 The System's Overall Running Time 81
CHAPTER 6. CONCLUSION AND FUTURE WORK 82
  6.1 Conclusion 82
  6.2 Future Work 83
REFERENCES 85


Full-text release date: 2034/01/30 (campus network)
Full text not authorized for public release (off-campus network)
Full-text release date: 2044/01/30 (National Central Library: Taiwan Electronic Theses and Dissertations System)