
Graduate Student: Wu, Ge-Han (吳霽函)
Thesis Title: Thrips Monitoring System in Mango Greenhouses based on Deep Learning Image Recognition (基於深度學習影像辨識之芒果溫室薊馬監測系統實作)
Advisor: Huang, Nen-Fu (黃能富)
Committee Members: Chen, Chun-Liang (陳俊良); Chang, Yao-Chung (張耀中)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Institute of Information Systems and Applications
Publication Year: 2023
Graduation Academic Year: 111 (ROC calendar)
Language: English
Number of Pages: 70
Chinese Keywords: 害蟲監測, 智慧農業, 物件偵測, 深度學習, 類神經網路, 芒果, 薊馬, 小黃薊馬, 花薊馬
English Keywords: Pest Monitoring, Intelligent Agriculture, Object Detection, Deep Learning, Neural Network, Mango, Thrips, Yellow Tea Thrips, Flower Thrips


    The mango (Mangifera indica L.) is a popular fruit with high economic value not only in Taiwan but worldwide. Thrips (order Thysanoptera) are pests of many crops, including mangoes; their bites lower the fruiting rate and reduce fruit value, causing huge economic losses, and they are among the most notorious pests in greenhouse cultivation. Accurate monitoring of pest populations is an important means of reducing losses and a cost-effective control method, but traditional methods not only rely on manpower and expertise, they also struggle to provide continuous and scalable monitoring solutions.
    In this thesis, we propose and develop a feasible thrips monitoring system by combining a high-resolution camera setup with an image recognition model based on the Faster R-CNN architecture. After validation, our device was able to collect sticky trap images at resolutions of up to 64 megapixels every half hour, and our model identified thrips with 90% accuracy and an F1-score of 0.95. We also developed an easy-to-use, web-based user interface that provides visual image recognition results and thrips population analysis combined with environmental data.
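The reported scores follow the standard detection-count definitions of precision, recall, and F1. As an illustration only (the counts below are hypothetical and are not the thesis's actual confusion matrix), the relationship can be sketched as:

```python
# Precision, recall, and F1 computed from detection counts, as commonly
# used to evaluate object-detection models such as Faster R-CNN.
# The example counts are hypothetical, chosen only to show how an
# F1-score of 0.95 can arise.

def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Compute precision, recall, and F1 from true positives,
    false positives, and false negatives."""
    precision = tp / (tp + fp)  # fraction of detections that are real thrips
    recall = tp / (tp + fn)     # fraction of real thrips that were detected
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall, "f1": f1}

# Hypothetical counts: 95 correct detections, 5 false alarms, 5 misses.
metrics = detection_metrics(tp=95, fp=5, fn=5)
print(metrics)  # precision = recall = F1 = 0.95 for these counts
```

F1 is the harmonic mean of precision and recall, so it only reaches 0.95 when both quantities are high; a model with many false alarms or many misses cannot achieve it.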

    Abstract i
    中文摘要 (Chinese Abstract) ii
    Contents iii
    List of Figures v
    List of Tables ix
    Chapter 1. Introduction 1
    Chapter 2. Related Works 5
      2.1 Pest Monitoring Research 5
      2.2 Integrated Pest Detection Systems 9
      2.3 Object Detection Methods Based on CNN/DCNN 11
    Chapter 3. Design and Implementation 14
      3.1 High Resolution Camera Devices 15
      3.2 Dataset and Image Preprocess 22
        3.2.1 Scanned_202202 dataset 22
        3.2.2 V4k_202206 dataset 27
        3.2.3 'tonly' dataset suffix 29
      3.3 Thrips Detection Model Design and Training 30
        3.3.1 Model Architecture 30
        3.3.2 Image Augmentation 32
        3.3.3 Model Training and Fine-tuning 34
      3.4 Monitoring System Design and Architecture 35
        3.4.1 Camera System and Event Scheduler 37
        3.4.2 Inference Starter 37
        3.4.3 IPAP Inference System 39
        3.4.4 Result Post-Processor 40
      3.5 The ThripsUI 42
    Chapter 4. Experimental Result 47
      4.1 Validation Metrics 47
      4.2 Experimental Results 51
        4.2.1 Models Training on scanned_202202 52
        4.2.2 Models Training on v4k_202206 55
    Chapter 5. Conclusion and Future Work 60
    References 64

