Synthetic Datasets

Model-Based Synthetic Datasets for Sugar Beet Crop/Weed Detection

Selective weeding is one of the key challenges in the field of agricultural robotics. To accomplish this task, a farm robot should be able to accurately detect plants and to distinguish between crop and weeds. Most of the promising state-of-the-art approaches make use of appearance-based models trained on large annotated datasets. Unfortunately, creating large agricultural datasets with pixel-level annotations is an extremely time-consuming task, which effectively penalizes the usage of data-driven techniques. We address this problem by procedurally generating large, realistic synthetic datasets using the Unreal Engine 4 graphics engine (more information in the related publication reported below). The generated data can be used directly to train a model or to supplement real-world images. The complete dataset list:

  • Synthetic SugarBeet Random Weeds: The dataset is composed of sugar beet instances and random weeds
  • Synthetic SugarBeet Galium: The dataset is composed of sugar beet instances and Galium aparine weeds
  • Synthetic SugarBeet Capsella: The dataset is composed of sugar beet instances and Capsella bursa-pastoris weeds
  • Synthetic SugarBeet Galium Capsella: The dataset is composed of sugar beet instances and both of the aforementioned weeds

Images in the datasets are arranged according to the following folder list:
rgb: folder containing RGB synthetic images
gt: folder containing grayscale, single-channel pixel-wise annotations (0: soil; 1: crop; 2: weed)
gt_color: folder containing RGB, three-channel pixel-wise annotations (black: soil; green: crop; red: weeds)
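As a minimal sketch of how the two annotation formats relate, the snippet below maps a single-channel gt label image to the gt_color encoding described above. The palette follows the color scheme stated in the folder list; the file-loading step is omitted and a tiny hand-made array stands in for a real annotation (in practice the arrays would be read from the gt folder, e.g. with PIL or OpenCV).

```python
import numpy as np

# Color lookup table indexed by class id, matching the documented scheme:
# 0 = soil (black), 1 = crop (green), 2 = weed (red).
PALETTE = np.array([
    [0, 0, 0],      # 0: soil  -> black
    [0, 255, 0],    # 1: crop  -> green
    [255, 0, 0],    # 2: weed  -> red
], dtype=np.uint8)

def gt_to_color(gt: np.ndarray) -> np.ndarray:
    """Map an HxW label image to the HxWx3 RGB gt_color encoding."""
    return PALETTE[gt]

# Tiny example annotation standing in for a real file from gt/.
gt = np.array([[0, 1],
               [2, 0]], dtype=np.uint8)
color = gt_to_color(gt)
print(color.shape)  # (2, 2, 3)
```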

Also check out our video:

When using this dataset in your research, please cite us:

        @inproceedings{dicicco2017iros,
          author    = {Di Cicco, Maurilio and Potena, Ciro and Grisetti, Giorgio and Pretto, Alberto},
          title     = {Automatic Model Based Dataset Generation for Fast and Accurate Crop and Weeds Detection},
          booktitle = {Proc. of the {IEEE/RSJ} International Conference on Intelligent Robots and Systems ({IROS})},
          year      = {2017}
        }