Graduate Student: Yu-Jun Yan (顏瑜君)
Thesis Title: A Smart Operator Assistance System Using Deep Learning for Angle Measurement (深度學習應用於角度計算之作業員智慧輔助系統)
Advisor: Kung-Jeng Wang (王孔政)
Committee Members: Chao Ou-Yang (歐陽超), Jong-Woei Whang (黃忠偉)
Degree: Master
Department: Department of Industrial Management, College of Management
Year of Publication: 2021
Academic Year of Graduation: 109 (ROC calendar, i.e., the 2020-2021 academic year)
Language: English
Number of Pages: 45
Chinese Keywords: 角度計算, 動作辨識, 深度學習, 作業程序監控
English Keywords: angle measurement, action recognition, deep learning, task monitoring
    Chinese Abstract (translated): On product assembly lines, human operators adapt to change more readily than machines, which makes manual workstations key to maintaining production-line flexibility. However, as assembly tasks grow more complex, operators carry a heavier cognitive load, raising the risk that product quality is degraded by human error. One common error occurs when an operator performs an assembly procedure with a hand-held tool at an improper working angle, which harms product quality and reduces production efficiency. An assistance mechanism is therefore needed to remind operators to correct improper working angles. Accordingly, this study proposes an angle monitoring system for such assembly procedures that inspects the working angles of multiple assembly tools and provides real-time feedback while minimizing interference with the operation. The system consists of two models, an angle measurement model and an action recognition model, both built on deep-learning-based object detection. The screw-fastening procedure for a high-end graphics card is used as a case study to evaluate the models. The results show that the angle measurement and action recognition models reach accuracies of 95.83% and 99.83%, respectively. In practice, by preventing angle-related procedural errors promptly and effectively, the proposed angle monitoring system maintains assembly quality standards and improves production efficiency.


    English Abstract: Manual workstations play a critical role in flexible assembly lines because human operators respond to reconfiguration faster than machines do. However, as task complexity increases, product quality becomes highly susceptible to human error due to the growing cognitive load on operators. One error that affects assembly quality is the operator's use of a hand-held tool at an unfavorable working angle when handling the workpiece. An assistive mechanism that reminds workers of an incorrect working angle is therefore needed to support the process. To this end, this study proposes an angle monitoring system that inspects the working angle of various hand-held tools and provides real-time feedback with minimal interruption to the assembly process. The proposed system consists of an angle measurement model and an action recognition model, both built on a deep-learning-based object detection algorithm. A case study on fastening a high-end graphics processing unit card is conducted to evaluate their performance. The results show that the two models achieve 95.83% and 99.83% accuracy, respectively. In practice, the proposed system is expected to safeguard assembly quality by preventing angle-related operation failures in a timely and reliable manner.
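    As a rough, hypothetical illustration of the kind of pipeline the abstracts describe, the sketch below converts bounding-box detections from an object detector such as YOLOv3 (the detector named in the table of contents) into a working-angle estimate and a real-time warning. This is a minimal sketch under assumed conventions, not the thesis's implementation: the class names "tool_tip" and "tool_grip", the 90-degree target, the 5-degree tolerance, and the sample coordinates are all invented for illustration, and the thesis's own angle formulation may differ.

    # Minimal sketch (hypothetical, not the thesis's actual code): turn object-detection
    # output for one video frame into a working-angle estimate and operator feedback.
    import math
    from typing import Dict, Tuple

    # A detection is assumed to be a bounding box (x_center, y_center, width, height) in pixels.
    Detection = Tuple[float, float, float, float]

    def working_angle(detections: Dict[str, Detection]) -> float:
        """Estimate the tool's tilt angle in degrees from the line joining the
        centers of two detected tool parts (hypothetical classes)."""
        x_tip, y_tip, _, _ = detections["tool_tip"]
        x_grip, y_grip, _, _ = detections["tool_grip"]
        # Angle of the grip-to-tip axis relative to the image's horizontal axis.
        return math.degrees(math.atan2(y_grip - y_tip, x_grip - x_tip))

    def check_angle(angle_deg: float, target_deg: float = 90.0, tol_deg: float = 5.0) -> str:
        """Compare the measured angle with an assumed target and return feedback text."""
        if abs(angle_deg - target_deg) <= tol_deg:
            return "OK"
        return (f"WARNING: working angle {angle_deg:.1f} deg deviates from "
                f"target {target_deg:.1f} deg")

    if __name__ == "__main__":
        # Hypothetical detector output for one frame of the fastening task.
        frame_detections = {
            "tool_tip": (320.0, 60.0, 24.0, 24.0),
            "tool_grip": (310.0, 180.0, 40.0, 40.0),
        }
        print(check_angle(working_angle(frame_detections)))  # prints "OK" for these sample values

    In the system the abstracts describe, an action recognition model runs alongside the angle measurement model; presumably it identifies when the relevant fastening action is in progress so that angle feedback is issued only at those moments.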

    Table of Contents:
    Abstract  i
    摘要  ii
    Table of Contents  iii
    List of Figures  iv
    List of Tables  v
    Chapter 1. Introduction  1
    Chapter 2. Literature review  3
        2.1 Assembly assistance system  3
        2.2 Tilt angle measurement  4
        2.3 Assembly action recognition  6
        2.4 Object detection  7
    Chapter 3. Method  10
        3.1 Research framework  10
        3.2 Angle measurement model  11
        3.3 Action recognition model  16
        3.4 Implementation of YOLOv3  18
        3.5 Angle monitoring system  22
    Chapter 4. Experiment results and discussion  23
        4.1 Experiment setup  23
        4.2 Model evaluation  23
            4.2.1 Angle measurement model  23
            4.2.2 Action recognition model  25
        4.3 Benchmark test  27
            4.3.1 YOLOv3 compared with Faster R-CNN in angle measurement  28
            4.3.2 YOLOv3 compared with CNN in action recognition  29
        4.4 Real-time angle-monitoring system application  31
    Chapter 5. Conclusions  33
    References  35
    Appendix 1. YOLOv3 models in angle measurement  39
    Appendix 2. YOLOv3 model in action recognition  44


    Full text is not yet available for download.
    Full-text release date: 2024/06/30 (campus network)
    Full-text release date: 2024/06/30 (off-campus network)
    Full-text release date: 2024/06/30 (National Central Library: Taiwan NDLTD system)