A Deep Learning-Based Model for Weed Localization, Detection, and Classification in Potato Fields

Article type: Computer science article

Authors

1 Department of Computer Engineering, Faculty of Engineering, University of Kurdistan, Sanandaj, Iran

2 Department of Biosystems Engineering, Faculty of Agriculture, University of Kurdistan, Sanandaj, Iran

Abstract

Weed control in potato fields with the least consumption of inputs (herbicides or mechanical operations) is one of the main goals of sustainable agriculture. In precision agriculture, machine vision-based systems can be used as on-the-go sensing units to distinguish weeds from the main crop. This paper presents a new deep learning-based model for detecting weeds in potato fields. To this end, a comprehensive database of a cultivated potato field was first compiled, containing images taken at different stages of plant growth, at different camera-to-ground distances, at different hours of the day, and under different environmental conditions. Then, the positions of all plants in the field are determined using the deep YOLOv3 algorithm. Finally, three convolutional neural networks were developed to separate weeds from the main crop and to determine the weed species. The results show that the YOLOv3 algorithm localizes the plants in the images well. The method also distinguishes weeds from potato plants in the test image set with 99.64% accuracy. The classification results for 9 different weed species using the developed deep learning models are also acceptable: the overall accuracies of the EN-Inception-V3, EN-VGG-16, and HCNN models are 99.82%, 99.89%, and 92.83% in the training phase, and 96.69%, 90.32%, and 82.67% in the test phase, respectively. Moreover, combining these models raises the detection accuracy to 98.2%.
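The two-stage pipeline described above (YOLOv3 to localize every plant, then a CNN classifier applied to each detected plant) can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not the authors' code: the weight, config, and model file names, the 0.5 objectness threshold, and the 224x224 classifier input size are all hypothetical placeholders.

```python
# Minimal sketch of the two-stage pipeline: YOLOv3 localizes every plant,
# then a CNN classifies each crop as potato or one of 9 weed species.
# All file names and thresholds below are hypothetical placeholders.
import cv2
import numpy as np
from tensorflow import keras

# Stage 1: plant localization with a (hypothetically) trained YOLOv3 model.
net = cv2.dnn.readNetFromDarknet("yolov3-plants.cfg", "yolov3-plants.weights")
image = cv2.imread("field.jpg")
h, w = image.shape[:2]
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

boxes = []
for output in outputs:              # one array per YOLO output scale
    for det in output:              # det = [cx, cy, bw, bh, objectness, class scores...]
        if float(det[4]) > 0.5:     # assumed objectness threshold
            cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
            x, y = max(int(cx - bw / 2), 0), max(int(cy - bh / 2), 0)
            boxes.append((x, y, int(bw), int(bh)))

# Stage 2: classify each localized plant with a trained CNN.
classifier = keras.models.load_model("weed_classifier.h5")
for (x, y, bw, bh) in boxes:
    crop = cv2.resize(image[y:y + bh, x:x + bw], (224, 224))
    probs = classifier.predict(crop[np.newaxis] / 255.0, verbose=0)[0]
    print(f"box=({x}, {y}, {bw}, {bh})  class={probs.argmax()}  p={probs.max():.2f}")
```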

Article Title [English]

A New Deep Vision-Based Identifier as an Intelligent Herbicide Spraying Agent for Potato Farm Application

Authors [English]

  • Halo Omer Anvar 1
  • Fardin Akhlaghian Tab 1
  • Mohsen Ramezani 1
  • Kaveh Mollazade 2
1 Department of Computer Engineering, University of Kurdistan, Sanandaj, Iran
2 Department of Biosystems Engineering, University of Kurdistan, Sanandaj, Iran
Abstract [English]

Weeding in potato fields with the least consumption of inputs (herbicide or mechanical operations) is one of the main goals of sustainable agriculture. In precision agriculture, machine vision-based systems are used as on-the-go sensing units to distinguish weeds from the main crop. This paper presents a new approach based on deep learning to detect weeds in potato fields. For this purpose, a comprehensive database was first created from images of the potato field acquired at different stages of plant growth, at different camera-to-ground distances, at different hours of the day, and under different environmental conditions. Then, the location of the plants in the field (both weeds and potato plants) was determined using the deep YOLOv3 algorithm. Finally, three different types of convolutional neural networks were developed to separate weeds from the main crop and to determine the weed species. The results showed that the YOLOv3 algorithm localizes the plants in the images well. The EN-Inception-V3 classifier distinguished weeds from potato plants in the test image set with 99.42% accuracy. The classification results for 9 different weed species using the developed deep learning models were satisfactory: the overall accuracies of the EN-Inception-V3, EN-VGG-16, and HCNN models were 99.82%, 99.89%, and 92.83% in the training phase, and 96.69%, 90.32%, and 82.67% in the test phase, respectively. It should be noted that combining the models raises the weed-species detection accuracy to 98.2%.
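The abstract reports that combining the three classifiers improves weed-species accuracy to 98.2% but does not state the fusion rule. One plausible reading is soft voting over the three networks' class probabilities; the sketch below illustrates that assumption, with hypothetical model file names.

```python
# Soft-voting ensemble sketch: average the class probabilities of the three
# CNNs (EN-Inception-V3, EN-VGG-16, HCNN). The fusion rule and the model
# file names are assumptions; the paper's actual combination may differ.
import numpy as np
from tensorflow import keras

models = [keras.models.load_model(p)
          for p in ("en_inception_v3.h5", "en_vgg16.h5", "hcnn.h5")]

def ensemble_predict(batch):
    """Return, for each image in the batch, the class with the highest averaged probability."""
    probs = np.mean([m.predict(batch, verbose=0) for m in models], axis=0)
    return probs.argmax(axis=1)
```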

Keywords [English]

  • Convolutional neural networks
  • Machine vision
  • Precision agriculture
  • Deep model
  • Environmentally friendly model