
Author: 陳映竹 (Ying-Chu Chen)
Thesis Title: 基於人工智慧之小型自主水下機器人目標辨識、追蹤與導航
Thesis Title (English): Artificial Intelligence Based Object Recognition and Tracking with Navigation for a Small Autonomous Underwater Robot
Advisor: 李敏凡 (Min-Fan Lee)
Committee Members: 蔡明忠 (Ming-Jong Tsai), 湯梓辰 (Tzu-Chen Tang)
Degree: Master
Department: College of Engineering - Graduate Institute of Automation and Control
Year of Publication: 2021
Graduation Academic Year: 109
Language: English
Number of Pages: 87
Keywords (Chinese): 避障, 深度學習, 機電整合, 目標追蹤, 水下機器人
Keywords (English): collision avoidance, deep learning, mechatronics, object tracking, underwater robot
Due to the complicated marine environment and the difficulty of underwater communication, object tracking, navigation, and data transmission are three important issues for autonomous underwater vehicles. However, object tracking based on non-deep-learning methods suffers from serious limitations, leading to tracking delays and missed or erroneous detections of the target, particularly under partial or complete occlusion and changes in object shape, illumination, season, and viewpoint. This study proposes a moving-object tracking and navigation system for an autonomous underwater vehicle that comprises three architectures: detection of dynamic targets, real-time tracking, and navigation toward the target. A custom-made triple-propeller, buoy-tethered autonomous underwater vehicle implements the hardware architecture that overcomes the difficulty of data transmission, and an inertial measurement unit records the trajectory for verification and comparison. A Siamese Region Proposal Network tracking algorithm with two weight-sharing branches is applied to track the target in motion. The key challenge to overcome is the one-shot detection task when the object is not identified in advance. Various complex and uncertain environmental scenarios are used to evaluate the proposed system through the deep learning model's prediction metrics (accuracy, precision, recall, P-R curve, F1 score). The tracking rate of the Siamese Region Proposal Network algorithm reaches up to 180 FPS. This study makes two major contributions to the field of unmanned underwater research: the low-cost buoy-tethered mechanism design solves the communication attenuation problem of excessively long cables, and the Siamese Region Proposal Network tracking algorithm overcomes the uncertainty of the environment.
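
The core mechanism behind the tracker described above is a pair of weight-sharing branches: the same backbone network embeds both the target template and each incoming search frame, and a cross-correlation between the two embeddings yields a response map whose peak locates the target. The following PyTorch sketch is only a minimal illustration of that weight-sharing and correlation step, not the thesis implementation; the toy backbone and the names SiameseEmbedder and cross_correlate are assumptions made here for illustration, and a full Siamese Region Proposal Network additionally attaches anchor-based classification and regression branches to the correlated features.

```python
# Minimal sketch (not the thesis code) of the weight-sharing idea behind
# Siamese trackers: one backbone embeds the template (exemplar) patch and
# the search region, and cross-correlating the two embeddings produces a
# response map whose peak indicates the most likely target location.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseEmbedder(nn.Module):
    """Tiny convolutional backbone; both inputs pass through the SAME weights."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=1),
        )

    def forward(self, x):
        return self.features(x)

def cross_correlate(template_feat, search_feat):
    """Use the template embedding as a convolution kernel over the search embedding."""
    b, c, h, w = template_feat.shape
    # Grouped convolution: correlate each batch element with its own template.
    search = search_feat.view(1, b * c, *search_feat.shape[-2:])
    kernel = template_feat.view(b * c, 1, h, w)
    response = F.conv2d(search, kernel, groups=b * c)
    response = response.view(b, c, *response.shape[-2:]).sum(dim=1, keepdim=True)
    return response  # (b, 1, H', W') similarity map

if __name__ == "__main__":
    net = SiameseEmbedder()
    template = torch.randn(1, 3, 127, 127)   # exemplar crop of the target
    search = torch.randn(1, 3, 255, 255)     # larger search region in the next frame
    response = cross_correlate(net(template), net(search))
    # The argmax of the response map gives the most likely target position.
    peak = response.flatten(2).argmax(dim=-1)
    print(response.shape, peak)
```

Because the template only needs to be embedded once and is then reused as a correlation kernel for every subsequent frame, this weight-sharing design is what allows Siamese-style trackers to reach the high frame rates reported in the abstract.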

Table of Contents
Acknowledgements
摘要
ABSTRACT
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
Chapter 2 Methods
  2.1 Robotic System
  2.2 Algorithm
    2.2.1 Panoramic image stitching with underwater image enhancement
    2.2.2 Siamese Region Proposal Network for Object Tracking
Chapter 3 Results
Chapter 4 Discussions
Chapter 5 Conclusions
References
