| Graduate Student | 劉欣豪 (Liu, Hsin-Hao) |
|---|---|
| Thesis Title | 應用機器視覺於蝴蝶蘭盆苗葉片長度辨識 (Leaf Length Estimation of Orchid Seedlings Using Machine Vision) |
| Advisor | 陳榮順 (Chen, Rong-Shun) |
| Committee Members | 白明憲 (Bai, Ming-Sian), 陳宗麟 (Chen, Tsung-Lin) |
| Degree | Master |
| Department | Department of Power Mechanical Engineering, College of Engineering |
| Year of Publication | 2022 |
| Graduation Academic Year | 110 |
| Language | Chinese |
| Pages | 61 |
| Keywords (Chinese) | 蘭花盆苗生長監控、機器視覺、玫瑰曲線方程式、深度影像 |
| Keywords (English) | Orchid Plant Growth Monitoring, Rose Curve Equation, Depth Image, Machine Vision |
Because a greenhouse holds a very large number of orchid potted seedlings and the plants grow slowly, periodic manual sampling and inspection consume considerable time and labor. This research therefore develops a machine vision recognition system for measuring the leaf length of greenhouse orchid potted seedlings. The system periodically measures the growth trend of the upper leaf length and compares the results with the growth yield index provided by the cooperating manufacturer, to confirm whether the growth state meets the expected yield standard. In practice, a depth camera mounted overhead captures color and depth images of the upper leaves of a single potted seedling. Image processing is applied for noise reduction and smoothing, and the watershed algorithm segments the upper-leaf image. A modified rose curve equation is then fitted to the outer contour of each leaf to obtain the fitting parameters, and a fitness index is defined to judge the correctness of the leaf segmentation and the completeness of each leaf, so that only complete leaves are retained for measurement.
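To make the contour-fitting and screening step more concrete, the following is a minimal Python sketch, assuming a standard rose curve r(θ) = a·cos(k(θ − θ₀)) is fitted with SciPy to a leaf contour obtained from the watershed segmentation. The thesis's actual modified rose curve equation and its fitness index are not reproduced here; the function names and input arrays (`fit_leaf_contour`, `contour_xy`, `centroid_xy`) are hypothetical, and an R²-style score stands in for the thesis's fitness criterion.

```python
import numpy as np
from scipy.optimize import curve_fit


def rose_curve(theta, a, k, theta0):
    # Standard rose curve r = a * cos(k * (theta - theta0)); an illustrative
    # stand-in for the thesis's modified rose curve equation.
    return a * np.cos(k * (theta - theta0))


def fit_leaf_contour(contour_xy, centroid_xy):
    """Fit a rose curve to one segmented leaf contour.

    contour_xy  : (N, 2) array of contour pixel coordinates (hypothetical,
                  e.g. extracted after watershed segmentation).
    centroid_xy : (2,) array, centroid of the leaf region.
    Returns the fitted parameters (a, k, theta0) and an R^2-style fitness
    score used to screen out incomplete or wrongly segmented leaves.
    """
    # Express the contour in polar coordinates about the leaf centroid.
    dx = contour_xy[:, 0] - centroid_xy[0]
    dy = contour_xy[:, 1] - centroid_xy[1]
    r = np.hypot(dx, dy)
    theta = np.arctan2(dy, dx)

    # Least-squares fit of the rose curve parameters.
    p0 = (r.max(), 2.0, 0.0)  # rough initial guess
    params, _ = curve_fit(rose_curve, theta, r, p0=p0, maxfev=10000)

    # Coefficient of determination as a simple fitness index.
    residuals = r - rose_curve(theta, *params)
    fitness = 1.0 - np.sum(residuals**2) / np.sum((r - r.mean()) ** 2)
    return params, fitness
```

A leaf whose fitness score falls below a chosen threshold would be treated as occluded or mis-segmented and excluded from the length measurement, mirroring the screening described above.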
For each complete leaf image, sample points taken along the symmetry line of the fitted curve are deprojected and accumulated in sequence to obtain the actual leaf length. To verify the applicability of this machine vision measurement system during the growth of orchid potted seedlings, long-term tracking experiments were conducted in the cooperating manufacturer's orchid cultivation greenhouse: the upper leaf length of the same seedlings was measured periodically, the growth trends at different stages were compared, and the measurement accuracy and the seedling growth yield were evaluated.
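As an illustration of the deprojection step, the sketch below converts pixels sampled along the leaf's symmetry line into 3D camera-space points and sums the distances between consecutive points to estimate the real leaf length, assuming an Intel RealSense depth stream accessed through pyrealsense2 (`rs2_deproject_pixel_to_point` and `get_distance` are real SDK calls). How the symmetry-line pixels are sampled, and the function and variable names here, are assumptions rather than the thesis's exact procedure.

```python
import numpy as np
import pyrealsense2 as rs


def leaf_length_from_depth(symmetry_pixels, depth_frame, intrinsics):
    """Estimate the real leaf length (in meters) from a depth image.

    symmetry_pixels : iterable of (u, v) pixels sampled along the leaf's
                      symmetry line, ordered from base to tip (hypothetical
                      output of the curve-fitting step).
    depth_frame     : rs.depth_frame from an aligned RealSense stream.
    intrinsics      : rs.intrinsics of the depth stream.
    """
    points = []
    for u, v in symmetry_pixels:
        depth = depth_frame.get_distance(int(u), int(v))
        if depth == 0:
            continue  # skip pixels with no valid depth reading
        # Deproject the 2D pixel plus its depth into a 3D camera-space point.
        points.append(
            rs.rs2_deproject_pixel_to_point(intrinsics, [float(u), float(v)], depth)
        )

    # Sum the distances between consecutive 3D samples along the leaf.
    points = np.asarray(points)
    if len(points) < 2:
        return 0.0
    return float(np.linalg.norm(np.diff(points, axis=0), axis=1).sum())
```

Accumulating piecewise 3D distances, rather than taking a single base-to-tip distance, lets the estimate follow the curvature of the leaf, which is the intent of superimposing the deprojected samples in sequence.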