Comparison of semantic segmentation algorithms for the estimation of botanical composition of clover-grass pastures from RGB images


KARTAL S.

ECOLOGICAL INFORMATICS, vol. 66, 2021 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 66
  • Publication Date: 2021
  • DOI: 10.1016/j.ecoinf.2021.101467
  • Journal Name: ECOLOGICAL INFORMATICS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, PASCAL, BIOSIS, CAB Abstracts, Geobase, Pollution Abstracts, Veterinary Science Database
  • Keywords: Semantic segmentation, Clover-grass, Precision agriculture, Deep learning
  • Çukurova University Affiliation: Yes

Abstract

In the dairy industry, estimation of the in-field clover-grass ratio is an important factor in composing feed rations for cows. Accurate estimation of the grass and clover ratios enables smart decisions to optimize seeding density and fertilization, resulting in increased yield and a reduced amount of chemicals used. In practice, this process is still primarily performed by the human eye, which is labor-intensive, subjective, and error-prone. As a result, plant species ratio estimation with traditional methods is impractical and often misleading. Modern semantic segmentation models applied to digital images offer a promising alternative to overcome these drawbacks. In this paper, an extensive comparison of Deep Learning (DL) models for estimating the ratio of clover, grass, and weeds in red, green, and blue (RGB) images is presented. Three DL architectures (Unet, Linknet, FPN) are combined with ten randomly initialized encoders (variations of VGG, DenseNet, ResNet, Inception, and EfficientNet) to construct thirty different segmentation models. The models were evaluated on a publicly available dataset provided by the Biomass Prediction Challenge. The best segmentation accuracy, 76.7%, was reached by the FPN-InceptionResNetV2 model. This result indicates the great potential of deep convolutional neural networks for the segmentation of plant species in RGB images. Furthermore, this study lays the foundation for our next set of experiments with DL to improve the benchmarks and will further the quest to identify phenotype characteristics from agricultural imagery collected in the field.
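To illustrate the kind of architecture-encoder pairing and ratio estimation the abstract describes, the following is a minimal sketch. It assumes the Keras-based segmentation_models library, a 512x512 input tile size, and a clover/grass/weeds class ordering; none of these details are confirmed by the paper, and the actual implementation may differ.

```python
# Hypothetical sketch: build one of the thirty architecture-encoder
# combinations (FPN decoder + InceptionResNetV2 encoder) and derive
# per-class pixel ratios from its predicted masks.
import numpy as np
import segmentation_models as sm

CLASSES = ["clover", "grass", "weeds"]  # assumed class ordering

# encoder_weights=None gives a randomly initialized encoder,
# matching the "randomly initialized encoders" in the abstract.
model = sm.FPN(
    backbone_name="inceptionresnetv2",
    encoder_weights=None,
    classes=len(CLASSES),
    activation="softmax",
    input_shape=(512, 512, 3),  # assumed tile size
)
model.compile(optimizer="adam", loss="categorical_crossentropy")

def class_ratios(rgb_batch: np.ndarray) -> dict:
    """Estimate clover/grass/weed ratios as per-class pixel fractions."""
    probs = model.predict(rgb_batch)            # (N, H, W, num_classes)
    labels = probs.argmax(axis=-1)              # per-pixel class index
    counts = np.bincount(labels.ravel(), minlength=len(CLASSES))
    return dict(zip(CLASSES, counts / counts.sum()))
```

After training on annotated field images, calling `class_ratios` on a batch of RGB tiles would return the estimated botanical composition, which is the quantity the study compares across the thirty models.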