Leveraging deep semantic segmentation for assisted weed detection

Published: 18 February 2025

Authors: F. Garibaldi-Márquez, G. Flores, L. M. Valentín-Coronado

Abstract

In agriculture, identifying and controlling weeds is crucial, as these plant species threaten crop growth and development by competing for vital resources such as nutrients, water, and light. A promising solution to this problem is the adoption of smart weed control systems (SWCS), which significantly reduce the use of harmful chemical products. Furthermore, SWCS lead to reduced production costs and a more sustainable, eco-friendly approach to farming. However, implementing SWCS in natural fields is challenging, mainly because of the difficulty of accurately localizing plants. To address this issue, a visual identification system can be employed to label plants in images through a process known as semantic segmentation. In this work, we have implemented, validated, and compared three deep learning approaches for the semantic pixel segmentation of high densities of both crops (Zea mays) and weeds (narrow-leaf and broad-leaf weeds): Mask Region-based Convolutional Neural Network (Mask R-CNN), Mask R-CNN enhanced with an Atrous Spatial Pyramid Pooling module (Mask R-CNN-ASPP), and a proposed Residual U-Net architecture. Data augmentation and transfer learning were also applied. Model performance was evaluated with the well-known metrics Precision, Recall, Dice similarity coefficient (DSC), and mean Intersection-over-Union (mIoU). The DSC and mIoU of the Mask R-CNN-ASPP models were up to 10.63% and 10.54% higher, respectively, than those of the Mask R-CNN models. Nonetheless, the proposed Residual U-Net architecture outperformed the Mask R-CNN-ASPP networks on all metrics, reaching a DSC of 92.98% and an mIoU of 87.12%. We therefore conclude that the proposed Residual U-Net architecture is the best alternative for semantic segmentation in images with high plant density.
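The reported metrics can all be computed directly from predicted and ground-truth label masks. The following is a minimal NumPy sketch (not the authors' evaluation code), assuming integer-labelled masks with one id per class, e.g. soil, crop, and weed:

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Per-class Precision, Recall, Dice (DSC) and IoU from label masks.

    pred, truth: 2-D integer arrays of class labels (e.g. 0 = soil,
    1 = crop, 2 = weed). Returns (per-class dict, mean IoU).
    """
    classes = np.union1d(np.unique(pred), np.unique(truth))
    per_class, ious = {}, []
    for c in classes:
        # Pixel-level confusion counts for class c
        tp = np.sum((pred == c) & (truth == c))
        fp = np.sum((pred == c) & (truth != c))
        fn = np.sum((pred != c) & (truth == c))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        dsc = 2 * tp / (2 * tp + fp + fn) if tp + fp + fn else 0.0
        iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
        per_class[int(c)] = {"precision": precision, "recall": recall,
                             "dsc": dsc, "iou": iou}
        ious.append(iou)
    return per_class, float(np.mean(ious))
```

Note that DSC and IoU are monotonically related per class (DSC = 2·IoU/(1+IoU)), which is why the two scores in the abstract move together.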
Our research addresses the challenge of weed identification and control in agriculture, helping farmers produce crops more efficiently while minimizing environmental impact.
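The ASPP enhancement mentioned in the abstract applies several dilated (atrous) convolutions in parallel, so the network aggregates context at multiple scales without further downsampling. A minimal PyTorch sketch of the idea follows; the dilation rates and channel sizes here are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class ASPP(nn.Module):
    """Atrous Spatial Pyramid Pooling: parallel dilated convolutions
    capture context at several receptive-field sizes at once."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        # padding == dilation keeps the spatial size unchanged for k=3
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r)
            for r in rates
        ])
        # 1x1 convolution fuses the concatenated multi-scale responses
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x):
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.project(feats)
```

For example, `ASPP(256, 64)` maps a `(N, 256, H, W)` feature map to `(N, 64, H, W)` while mixing four receptive-field scales.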



How to Cite

Garibaldi-Márquez, F., Flores, G. and Valentín-Coronado, L. M. (2025) “Leveraging deep semantic segmentation for assisted weed detection”, Journal of Agricultural Engineering. doi: 10.4081/jae.2025.1741.
