
Author: 林紀揚 (Ji-Yang Lin)
Title: 水敏試紙偵測與噴灑分析之研究 (Water-Sensitive Paper Detection and Spray Analysis)
Advisor: 楊傳凱 (Chuan-Kai Yang)
Committee members: 林伯慎 (Bor-Shen Lin), 劉力瑜 (Li-Yu Liu)
Degree: Master
Department: Department of Information Management, School of Management
Year of publication: 2022
Graduation academic year: 110 (2021-2022)
Language: Chinese
Number of pages: 82
Keywords: Water-Sensitive Paper, Image Segmentation, Precision Farming, Drone Spraying, Image Processing
    The global population has grown from about 1 billion in the 19th century to 7.9 billion today, and is expected to reach 9.2 billion by 2050. To meet the growing demand for food, crop yields must be increased, yet keeping crops at consistently high yields is a difficult challenge. Pests and diseases are an especially serious threat to food production, and spraying pesticides is a common and effective control method in modern agriculture. However, excessive pesticide use damages the ecological environment, and pesticide residues are harmful to human health. How to evaluate spraying results conveniently and accurately is therefore the primary goal of this thesis.
    This thesis adopts Water-Sensitive Paper (WSP) as the spray-evaluation tool. Previous studies have moved the evaluation task onto mobile devices, which allows results to be analyzed immediately, but those approaches impose many restrictions, for example the paper must be photographed at a perpendicular angle and the image must be clear and noise-free. This thesis therefore proposes a WSP image-processing pipeline that overcomes these limitations.
    A Faster R-CNN model is trained to detect the WSP in an image, GrabCut is used to extract the paper region, and the paper is then rectified to the standard paper size. After shadow removal, the sprayed droplet regions are segmented with Otsu thresholding combined with K-means clustering. The bounding-box MIoU for WSP detection is 0.9766 and the MIoU based on the paper's corner points is 0.9824. Subsequent experiments report spray-coverage percentages for three types of WSP images (regular WSP, scanned WSP, and outdoor WSP) and compare the droplet-segmentation results with those of DropLeaf, a WSP-analysis app proposed in previous work.
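    The pipeline above can be made concrete with a short sketch. The following Python/OpenCV code is not the thesis's implementation; the torchvision model configuration, score threshold, corner ordering, and the 760 × 260 output canvas (a 76 mm × 26 mm WSP at roughly 10 px/mm) are illustrative assumptions.

# Hypothetical sketch (not the thesis's code) of detection, extraction and
# calibration: a fine-tuned Faster R-CNN proposes the WSP bounding box,
# GrabCut removes the background inside that box, and a perspective warp
# maps the four paper corners to a fixed-size canvas.
import cv2
import numpy as np
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor


def build_wsp_detector(num_classes=2):
    """Faster R-CNN with its box head replaced for background + WSP classes."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model


def detect_wsp(model, image_bgr, score_thresh=0.8):
    """Return the highest-scoring predicted box as (x1, y1, x2, y2), or None."""
    rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    model.eval()
    with torch.no_grad():
        pred = model([tensor])[0]
    keep = pred["scores"] > score_thresh
    if keep.sum() == 0:
        return None
    best = pred["scores"][keep].argmax()
    return [int(v) for v in pred["boxes"][keep][best]]


def extract_wsp(image_bgr, box):
    """GrabCut initialised with the predicted box separates paper from background."""
    x1, y1, x2, y2 = box
    mask = np.zeros(image_bgr.shape[:2], np.uint8)
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, (x1, y1, x2 - x1, y2 - y1), bgd, fgd, 5,
                cv2.GC_INIT_WITH_RECT)
    fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
    return image_bgr * fg[:, :, None]


def calibrate_wsp(image_bgr, corners, size=(760, 260)):
    """Warp the four detected corners (TL, TR, BR, BL) to a standard-size canvas."""
    w, h = size
    src = np.array(corners, dtype=np.float32)
    dst = np.array([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]], np.float32)
    warp = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image_bgr, warp, size)

    In practice the corners passed to calibrate_wsp would come from the corner-detection step described in Section 3.4 of the thesis.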
    While studying water-sensitive paper we found that no public WSP image dataset is available. Because of this lack of data, this thesis also provides a WSP image-synthesis method that takes a scanned WSP as input and generates synthetic WSP images, to assist researchers who wish to conduct further experiments on water-sensitive paper.
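    The synthesis procedure itself is detailed in Section 3.6 of the thesis; the sketch below shows only one plausible, simplified realisation of the idea, namely sampling the stain and paper colours from a scanned WSP and stamping random elliptical "droplets" onto a blank canvas. The function name, droplet count, and size ranges are hypothetical.

# Illustrative only: a simple way to generate synthetic WSP-like images from a
# scanned WSP, by estimating the paper and stain colours with an Otsu split and
# stamping random elliptical droplets onto a blank canvas. The thesis's actual
# synthesis procedure may differ substantially.
import cv2
import numpy as np


def synthesize_wsp(scan_bgr, n_drops=150, size=(760, 260), seed=None):
    """Create one synthetic WSP image from a scanned WSP (assumes the scan
    contains both stained and unstained pixels)."""
    rng = np.random.default_rng(seed)
    gray = cv2.cvtColor(scan_bgr, cv2.COLOR_BGR2GRAY)
    # Stains (dark blue) are darker than the yellow paper, so Otsu + inversion
    # marks stain pixels as 255.
    _, stain = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    paper_color = scan_bgr[stain == 0].mean(axis=0)
    stain_color = scan_bgr[stain == 255].mean(axis=0)
    # Start from a uniform "unsprayed" paper and stamp random droplets.
    w, h = size
    canvas = np.full((h, w, 3), paper_color, dtype=np.uint8)
    for _ in range(n_drops):
        center = (int(rng.integers(0, w)), int(rng.integers(0, h)))
        axes = (int(rng.integers(2, 15)), int(rng.integers(2, 15)))
        angle = float(rng.uniform(0, 180))
        cv2.ellipse(canvas, center, axes, angle, 0, 360, stain_color.tolist(), -1)
    return canvas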


    The world population has grown from 1 billion in the 19th century to 7.9 billion in 2022 and is expected to reach 9.2 billion by 2050. To satisfy the increasing demand for food, crop productivity must be enhanced, yet keeping crops at high productivity is a challenge. Pests and diseases are among the most harmful factors for crop growth and harvest, and spraying pesticides is one of the most effective and commonly used control methods in modern commercial cropping systems. Nevertheless, overusing pesticides damages the field ecosystem, and pesticide residues are harmful to the human body. Therefore, evaluating spraying results precisely and conveniently is our main purpose.
    This thesis uses Water-Sensitive Paper (WSP) images as the spray-evaluation tool. Previous research imposes several limitations, such as requiring the capturing angle to be perpendicular to the WSP and the image to be free of shadow and noise. We propose a method that overcomes these limitations.
    We train a Faster R-CNN model to detect the WSP in an image, apply the GrabCut algorithm to extract the WSP from the predicted bounding box, and then calibrate the WSP to the standard paper size. Because the WSP images are captured outdoors, we use image-processing methods to remove uneven shadows. In the segmentation step, we combine Otsu binarization with K-means clustering in the RGB and HSV color spaces to partition the spray droplets.
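    As a rough illustration of the shadow-removal and segmentation stage described above, the sketch below estimates the slowly varying illumination with a large median blur and divides it out, then combines an Otsu threshold with a two-cluster K-means in HSV space. The fusion rule (intersection of the two masks) and all parameter values are assumptions made for illustration, not the thesis's exact procedure, which also works in the RGB space.

# Rough sketch (parameters and fusion rule are assumptions, not the thesis's
# exact procedure): divide out an estimated illumination field to remove uneven
# shadows, then intersect an Otsu mask with a two-cluster K-means mask in HSV.
import cv2
import numpy as np


def remove_uneven_shadow(gray):
    """Estimate slowly varying illumination with a large median blur, divide it out."""
    background = cv2.medianBlur(gray, 51)
    return cv2.divide(gray, background, scale=255)


def segment_droplets(wsp_bgr):
    gray = cv2.cvtColor(wsp_bgr, cv2.COLOR_BGR2GRAY)
    flat = remove_uneven_shadow(gray)

    # Cue 1: global Otsu threshold (droplet stains are darker than the paper).
    _, otsu_mask = cv2.threshold(flat, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Cue 2: K-means with k = 2 on HSV pixels; the cluster whose centre has the
    # lower V (brightness) is taken as the stained cluster.
    hsv = cv2.cvtColor(wsp_bgr, cv2.COLOR_BGR2HSV)
    samples = hsv.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, 2, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    droplet_cluster = int(np.argmin(centers[:, 2]))
    kmeans_mask = (labels.reshape(hsv.shape[:2]) == droplet_cluster).astype(np.uint8) * 255

    # Keep pixels supported by both cues, then remove small specks.
    mask = cv2.bitwise_and(otsu_mask, kmeans_mask)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))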
    In the experiments, the WSP bounding-box MIoU is 0.9766 and the MIoU of the four WSP corner points is 0.9824. We compare the coverage percentages obtained on three types of WSP images (regular WSP, scanned WSP, and outdoor WSP) with the results of the DropLeaf app. Because no WSP image dataset exists, we also provide a synthesis method that creates synthetic WSP images from a single real WSP image, which may help researchers interested in WSP processing.
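    The two quantities reported above can be computed as in the minimal sketch below: coverage as the fraction of droplet pixels in the segmented mask, and IoU shown here for axis-aligned boxes (the corner-point MIoU in the thesis is evaluated on the quadrilateral paper region, and MIoU is the mean IoU over the test images).

# Minimal sketches of the reported quantities: spray coverage as the fraction of
# droplet pixels in the segmented mask, and IoU between two axis-aligned boxes.
import numpy as np


def coverage_percentage(droplet_mask):
    """Percentage of the WSP area covered by spray (nonzero pixels = droplets)."""
    return 100.0 * np.count_nonzero(droplet_mask) / droplet_mask.size


def box_iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0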

    Table of Contents
    Abstract (Chinese)
    Abstract (English)
    Acknowledgements
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
        1.1  Research Background
        1.2  Research Motivation and Objectives
        1.3  Thesis Organization
    Chapter 2  Literature Review
        2.1  Water-Sensitive Paper
        2.2  Object Detection
        2.3  Image Segmentation
    Chapter 3  Algorithm Design and System Implementation
        3.1  System Pipeline
        3.2  WSP Detection
        3.3  WSP Segmentation
        3.4  WSP Calibration
            3.4.1  WSP Corner Detection
            3.4.2  WSP Edge Detection
        3.5  WSP Analysis
            3.5.1  Threshold-Based Segmentation
            3.5.2  Machine-Learning-Based Segmentation
            3.5.3  Noise Removal
        3.6  WSP Image Synthesis
            3.6.1  Synthesis Method
    Chapter 4  Experimental Results and Evaluation
        4.1  System Environment
        4.2  WSP Image Sources
        4.3  WSP Detection Results and Evaluation
        4.4  WSP Segmentation Results and Evaluation
        4.5  WSP Analysis Results and Evaluation
            4.5.1  Regular WSP Evaluation
            4.5.2  Scanned WSP Evaluation
            4.5.3  Outdoor WSP Evaluation
            4.5.4  Noise Removal
            4.5.5  Spray Analysis
        4.6  WSP Image Synthesis Results and Analysis
            4.6.1  User Study
    Chapter 5  Conclusions and Future Work
    References

    References

    [1] M. Roser, H. Ritchie, and E. Ortiz-Ospina, "World population growth," Our World in Data, first published 2013, most recent substantial revision May 2019.
    [2] A. Neveu, "Feeding the world in 2050," Comptes Rendus de l'Académie d'Agriculture de France, vol. 95, no. 1, pp. 57-72, 2009.
    [3] D. C. Tsouros, S. Bibi, and P. G. Sarigiannidis, "A review on UAV-based applications for precision agriculture," Information, vol. 10, no. 11, p. 349, 2019.
    [4] P. Radoglou-Grammatikis, P. Sarigiannidis, T. Lagkas, and I. Moscholios, "A compilation of UAV applications for precision agriculture," Computer Networks, vol. 172, art. 107148, 2020.
    [5] R. W. Fisher and D. R. Menzies, "Effect of spray droplet density and exposure time on the immobilization of newly-hatched oriental fruit moth larvae," Journal of Economic Entomology, vol. 69, no. 4, pp. 438-440, 1976.
    [6] B. Brandoli, G. Spadon, T. Esau, P. Hennessy, A. C. Carvalho, S. Amer-Yahia, and J. F. Rodrigues-Jr, "DropLeaf: A precision farming smartphone tool for real-time quantification of pesticide application coverage," Computers and Electronics in Agriculture, vol. 180, art. 105906, 2021.
    [7] H. Zhu, M. Salyani, and R. D. Fox, "A portable scanning system for evaluation of spray deposit distribution," Computers and Electronics in Agriculture, vol. 76, no. 1, pp. 38-43, 2011.
    [8] C. Nansen, J. C. Ferguson, J. Moore, L. Groves, R. Emery, N. Garel, and A. Hewitt, "Optimizing pesticide spray coverage using a novel web and smartphone tool, SnapCard," Agronomy for Sustainable Development, vol. 35, no. 3, pp. 1075-1085, 2015.
    [9] J. C. Ferguson, R. G. Chechetto, C. C. O'Donnell, B. K. Fritz, W. C. Hoffmann, C. E. Coleman, ..., and A. J. Hewitt, "Assessing a novel smartphone application, SnapCard, compared to five imaging systems to quantify droplet deposition on artificial collectors," Computers and Electronics in Agriculture, vol. 128, pp. 193-198, 2016.
    [10] B. B. Machado, G. Spadon, M. S. Arruda, W. N. Goncalves, A. C. Carvalho, and J. F. Rodrigues-Jr, "A smartphone application to measure the quality of pest control spraying machines via image analysis," in Proceedings of the 33rd Annual ACM Symposium on Applied Computing, pp. 956-963, 2018.
    [11] A. R. Marcal, "Robust detection of water sensitive papers," in International Conference Image Analysis and Recognition, Springer, Cham, pp. 218-226, 2018.
    [12] C. R. Turner and K. A. Huntington, "The use of a water sensitive dye for the detection and assessment of small spray droplets," Journal of Agricultural Engineering Research, vol. 15, no. 4, pp. 385-387, 1970.
    [13] Syngenta, "Water-sensitive paper for monitoring spray distributions," Basel: Syngenta Crop Protection AG, 2002.
    [14] M. Cunha, C. Carvalho, and A. R. Marcal, "Assessing the ability of image processing software to analyse spray quality on water-sensitive papers used as artificial targets," Biosystems Engineering, vol. 111, no. 1, pp. 11-23, 2012.
    [15] J. P. A. R. Cunha, A. C. Farnese, and J. J. Olivet, "Computer programs for analysis of droplets sprayed on water sensitive papers," Planta Daninha, vol. 31, no. 3, pp. 715-720, 2013.
    [16] Ö. B. Özlüoymak and A. Bolat, "Development and assessment of a novel imaging software for optimizing the spray parameters on water-sensitive papers," Computers and Electronics in Agriculture, vol. 168, art. 105104, 2020.
    [17] W. C. Hoffmann and A. J. Hewitt, "Comparison of three imaging systems for water-sensitive papers," Applied Engineering in Agriculture, vol. 21, no. 6, pp. 961-964, 2005.
    [18] D. G. Lowe, "Object recognition from local scale-invariant features," in Proceedings of the Seventh IEEE International Conference on Computer Vision, pp. 1150-1157, 1999.
    [19] N. Dalal and B. Triggs, "Histograms of oriented gradients for human detection," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 886-893, 2005.
    [20] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, no. 7553, pp. 436-444, 2015.
    [21] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You only look once: Unified, real-time object detection," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 779-788, 2016.
    [22] R. Girshick, J. Donahue, T. Darrell, and J. Malik, "Rich feature hierarchies for accurate object detection and semantic segmentation," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 580-587, 2014.
    [23] R. Girshick, "Fast R-CNN," in Proceedings of the IEEE International Conference on Computer Vision, pp. 1440-1448, 2015.
    [24] S. Ren, K. He, R. Girshick, and J. Sun, "Faster R-CNN: Towards real-time object detection with region proposal networks," Advances in Neural Information Processing Systems, vol. 28, 2015.
    [25] H. D. Cheng, X. H. Jiang, Y. Sun, and J. Wang, "Color image segmentation: Advances and prospects," Pattern Recognition, vol. 34, no. 12, pp. 2259-2281, 2001.
    [26] N. Otsu, "A threshold selection method from gray-level histograms," IEEE Transactions on Systems, Man, and Cybernetics, vol. 9, no. 1, pp. 62-66, 1979.
    [27] S. S. Al-Amri and N. V. Kalyankar, "Image segmentation by using threshold techniques," arXiv preprint arXiv:1005.4020, 2010.
    [28] M. Sharif, S. Mohsin, M. Y. Javed, and M. A. Ali, "Single image face recognition using Laplacian of Gaussian and discrete cosine transforms," International Arab Journal of Information Technology, vol. 9, no. 6, pp. 562-570, 2012.
    [29] W. X. Kang, Q. Q. Yang, and R. P. Liang, "The comparative research on image segmentation algorithms," in 2009 First International Workshop on Education Technology and Computer Science, vol. 2, pp. 703-707, IEEE, 2009.
    [30] S. Beucher, "The watershed transformation applied to image segmentation," Scanning Microscopy, vol. 1992, no. 6, art. 28, 1992.
    [31] C. R. Jung, "Multiscale image segmentation using wavelets and watersheds," in 16th Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI 2003), pp. 278-284, IEEE, 2003.
    [32] N. Dhanachandra and Y. J. Chanu, "A survey on image segmentation methods using clustering techniques," European Journal of Engineering and Technology Research, vol. 2, no. 1, pp. 15-20, 2017.
    [33] S. Naz, H. Majeed, and H. Irshad, "Image segmentation using fuzzy clustering: A survey," in 2010 6th International Conference on Emerging Technologies (ICET), pp. 181-186, IEEE, 2010.
    [34] Y. Y. Boykov and M. P. Jolly, "Interactive graph cuts for optimal boundary & region segmentation of objects in N-D images," in Proceedings of the Eighth IEEE International Conference on Computer Vision (ICCV 2001), vol. 1, pp. 105-112, IEEE, 2001.
    [35] M. Marsh, "Implementing the 'GrabCut' segmentation technique as a plugin for the GIMP," [Online]. Available: https://www.cs.ru.ac.za/research/g02m1682/. [Accessed 20 July 2022].
    [36] C. Rother, V. Kolmogorov, and A. Blake, "'GrabCut': Interactive foreground extraction using iterated graph cuts," ACM Transactions on Graphics (TOG), vol. 23, no. 3, pp. 309-314, 2004.
    [37] S. Minaee, Y. Y. Boykov, F. Porikli, A. J. Plaza, N. Kehtarnavaz, and D. Terzopoulos, "Image segmentation using deep learning: A survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
    [38] P. Campadelli, D. Medici, and R. Schettini, "Color image segmentation using Hopfield networks," Image and Vision Computing, vol. 15, no. 3, pp. 161-166, 1997.
    [39] O. Ronneberger, P. Fischer, and T. Brox, "U-Net: Convolutional networks for biomedical image segmentation," in International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 234-241, 2015.
    [40] F. Bolelli, S. Allegretti, and C. Grana, "One DAG to rule them all," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
    [41] F. Bolelli, S. Allegretti, L. Baraldi, and C. Grana, "Spaghetti labeling: Directed acyclic graphs for block-based connected components labeling," IEEE Transactions on Image Processing, vol. 29, pp. 1999-2012, 2019.
    [42] C. Harris and M. Stephens, "A combined corner and edge detector," in Alvey Vision Conference, vol. 15, no. 50, pp. 10-5244, 1988.
    [43] J. Sklansky, "Finding the convex hull of a simple polygon," Pattern Recognition Letters, vol. 1, no. 2, pp. 79-83, 1982.
    [44] S. Liao, L. Y. Liu, T. A. Chen, K. Y. Chen, and F. Hsieh, "Color-complexity enabled exhaustive color-dots identification and spatial patterns testing in images," PLoS ONE, vol. 16, no. 5, e0251258, 2021.
    [45] T. Wolf, "Water sensitive paper for assessing spray coverage," 20 May 2015. [Online]. Available: https://sprayers101.com/wsp-coverage/. [Accessed 1 July 2022].
    [46] V. Tadić, M. Marković, I. Plaščak, M. Stošić, J. Lukinac-Čačić, and B. Vujčić, "Impact of technical spraying factors on leaf area coverage in an apple orchard," Tehnički vjesnik, vol. 21, no. 5, pp. 1117-1124, 2014.

    Full text available from 2025/08/15 (campus network)
    Full text available from 2028/08/15 (off-campus network)
    Full text available from 2028/08/15 (National Central Library: Taiwan thesis and dissertation system)