Mapping Datasets

Multi-Sensor Datasets for Mapping and Pose Estimation in Farming Scenarios

Self-localization and mapping capabilities are crucial for Unmanned Ground Vehicles (UGVs) in farming applications. Approaches based only on visual cues or on low-cost GPS are prone to fail in such scenarios; multi-sensor approaches are more effective. In order to benchmark pose estimation and mapping algorithms in an agricultural scenario, we collected two challenging datasets in Eschikon, Switzerland, at the ETH crop science facility. Both datasets include several heterogeneous sensors, together with ground truth.

The complete list of datasets follows:



Data Format

Datasets are stored in '.bag' format and are readable with the rosbag tools. For convenience, we temporally split each dataset into files of about 6 GB each. The filename format is:
<DatasetA or DatasetB>_<Start timestamp>_<End timestamp>.bag
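As an illustration, the naming convention above can be parsed with a small helper (hypothetical code, not part of the dataset tools; it assumes integer timestamps in the filename):

```python
import re

# Matches the convention <DatasetA or DatasetB>_<Start timestamp>_<End timestamp>.bag
# (assumption: timestamps are plain integers)
BAG_NAME_RE = re.compile(r"^(Dataset[AB])_(\d+)_(\d+)\.bag$")

def parse_bag_name(filename):
    """Return (dataset, start, end) for a bag filename, or None if the
    name does not follow the convention."""
    m = BAG_NAME_RE.match(filename)
    if m is None:
        return None
    dataset, start, end = m.groups()
    return dataset, int(start), int(end)
```

This makes it easy to sort the bag files of one dataset chronologically before processing them.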
Topic names and descriptions:
  • /robot/odom: robot odometry, stored as nav_msgs/Odometry
  • /sensor/camera/downJai/*: downward-looking RGB-NIR camera, stored as sensor_msgs/Image
  • /sensor/camera/frontJai/*: forward-looking RGB-NIR camera, stored as sensor_msgs/Image
  • /sensor/camera/grasshopper: forward-looking high-resolution RGB camera, stored as sensor_msgs/Image
  • /sensor/camera/vi_sensor/*: visual-inertial sensor composed of a stereo camera pair and an IMU, stored as sensor_msgs/Image and sensor_msgs/Imu
  • /sensor/gps/evk7ppp/ublox_gps_evk7ppp/fix: Precise Point Positioning GPS, stored as sensor_msgs/NavSatFix
  • /sensor/gps/neomp8/ublox_gps_mp8/fix: RTK GPS, stored as sensor_msgs/NavSatFix
  • /sensor/laser/vlp16/front/velodyne_packets: forward-looking Lidar sensor, stored as velodyne_msgs/VelodyneScan
  • /sensor/laser/leica/position: ground-truth position measured by a Leica laser tracker
  • /tf: transforms among the reference systems fixed on the robot
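The GPS topics above publish fixes in geographic coordinates (latitude/longitude). For evaluating trajectories in a local metric frame, a fix can be projected with an equirectangular approximation, which is adequate over the extent of a single field. This is a generic sketch, not part of the dataset tools:

```python
import math

# Mean Earth radius in meters (spherical approximation of WGS-84)
EARTH_RADIUS = 6371000.0

def fix_to_local_xy(lat, lon, lat0, lon0):
    """Project a (lat, lon) GPS fix, in degrees, to local metric
    east/north coordinates relative to an origin (lat0, lon0), using an
    equirectangular approximation valid for small areas."""
    d_lat = math.radians(lat - lat0)
    d_lon = math.radians(lon - lon0)
    x_east = EARTH_RADIUS * d_lon * math.cos(math.radians(lat0))
    y_north = EARTH_RADIUS * d_lat
    return x_east, y_north
```

For example, a fix 0.001 degrees north of the origin maps to roughly 111 m along the north axis.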
Bags can be merged according to their timestamps by using the TSS tool in the pose graph tools library. The calibrations among the sensors and the /odom reference frame are reported in the calibration file.
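Independently of TSS, the time-ordered merge of several sensor streams can be sketched generically: given streams that are already sorted by timestamp, a k-way heap merge yields one globally ordered stream. This is a minimal illustration of the idea, not the TSS implementation:

```python
import heapq

def merge_streams(*streams):
    """Merge already time-sorted streams of (timestamp, message) pairs
    into a single stream ordered by timestamp."""
    return heapq.merge(*streams, key=lambda pair: pair[0])

# Usage with two toy streams (hypothetical messages):
odom = [(0.0, "odom0"), (0.2, "odom1")]
gps = [(0.1, "fix0")]
merged = list(merge_streams(odom, gps))
# merged == [(0.0, "odom0"), (0.1, "fix0"), (0.2, "odom1")]
```

The heap merge is linear in the total number of messages and logarithmic in the number of streams, which matters when combining many sensor topics at once.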


In addition to the datasets, we provide some software packages we developed within the Flourish project:

  • TSS: a tool developed to manage sensory data gathered by the farming robots [source code]
  • MCAPS: Multi-Cue Agricultural Positioning System, a multi-sensor, pose-graph based framework developed for accurate localization within the field [source code]

Check out our video:

When using this dataset and/or software in your research, please cite:

@article{imperoli2018effective,
  title   = {An Effective Multi-Cue Positioning System for Agricultural Robotics},
  author  = {Imperoli, Marco and Potena, Ciro and Nardi, Daniele and Grisetti, Giorgio and Pretto, Alberto},
  journal = {IEEE Robotics and Automation Letters},
  volume  = {3},
  number  = {4},
  year    = {2018},
  pages   = {3685--3692},
}


We are grateful to Wolfram Burgard for providing us with the Bosch BoniRob, and to Raghav Khanna and Frank Liebisch for helping us acquire the datasets.