Autonomous Space Robotics Lab: The Gravel Pit Lidar Intensity Imagery Dataset
Overview
The Gravel Pit Lidar Intensity Imagery Dataset is a collection of 77,754 high-framerate laser range and intensity images gathered at a suitable planetary analogue environment in Sudbury, Ontario, Canada. The data were collected during a visual teach and repeat experiment in which a 1.1km route was taught and then autonomously re-traversed (i.e., the robot drove in its own tracks) every 2-3 hours for 25 hours. The dataset is subdivided into the individual 1.1km traversals of the same route, at varying times of day (ranging from full sunlight to full darkness). These data should be of interest to researchers who develop algorithms for visual odometry, simultaneous localization and mapping (SLAM) or place recognition in three-dimensional, unstructured and natural environments. In concert with state-of-the-art techniques, this dataset creates ample opportunity for loop closure; in addition to having multiple traversals of the same path, the trajectory was specifically chosen to include both small- and large-scale loops. The lidar scans were taken at a 480x360 resolution at 2Hz, while driving roughly 0.3-0.4 meters per second; therefore, one of the challenges in using this dataset is to compensate for the motion distortion in the data. All of the data are presented in either human-readable text files or images, and are accompanied by Matlab parsing scripts for ease of use.
Hardware Setup
The robuROC 6 used to gather the dataset carried a number of payloads. The payloads relevant to this dataset were a high-framerate Autonosys LVC0702 lidar and a Thales DG-16 Differential GPS unit. The Autonosys LVC0702 lidar provides 500,000 measurements per second with 15-bit intensity at a maximum range of ~53.5m. In this dataset, the lidar was configured with a 90°H/30°V FOV, capturing images at a resolution of 480x360 at 2Hz. The Thales DG-16 Differential GPS unit has a Circular Error Probability (CEP) of 0.4m, with 95% of measurements occurring within 0.9m.
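From the specifications above, one can derive the sensor's nominal per-pixel angular resolution and confirm that the configured frame rate fits within the lidar's measurement budget. A quick sanity check in Python (the numbers are taken directly from the paragraph above; the variable names are our own):

```python
# Nominal figures from the hardware description above.
fov_h_deg, fov_v_deg = 90.0, 30.0     # configured field of view
width, height = 480, 360              # image resolution (pixels)
frame_rate_hz = 2.0
max_meas_per_sec = 500_000            # sensor's maximum measurement rate

# Per-pixel angular resolution (degrees per pixel).
res_h = fov_h_deg / width             # azimuth: 0.1875 deg/px
res_v = fov_v_deg / height            # elevation: ~0.0833 deg/px

# Measurements consumed per second at this configuration.
meas_per_sec = width * height * frame_rate_hz   # 345,600, under the budget
```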
Overview of Datasets
Each of the entries below contains data products related to a unique traversal of the 1.1km route (conducted at different times of day). For ease of use, the data have been post-processed and packaged into a few different products. The Header .zip file contains general dataset information, including the GPS tracks and a few alignment matrices. The first major data product, Imagestacks, contains a sequence of images, generated from the raw sensor data, in the Tagged Image File Format (TIFF). Each intensity image in the sequence is accompanied by corresponding azimuth, elevation, range and timestamp images. TIFF was chosen because it supports both 32- and 64-bit floating-point images, and because it is simple to load using either Matlab or OpenCV, which leverages LibTiff (link). The second major data product, SURF and Matches, contains a set of sparse SURF features extracted using the ASRL GPU-accelerated SURF implementation (link). The sparse measurements are then corrected using the supplied calibration model (see the autonosys_apply_calib helper function). Finally, two sets of frame-to-frame matches are provided. The first set contains the initial guesses based only on the SURF descriptor. The second set contains the inliers that remain after being passed through a RANSAC algorithm that accounts for the motion distortion in the image.
In summary, the available downloads are:

Downloads
To start, it is recommended that the user download either the sample dataset (.zip [189 MB]) or the Header and Imagestacks for the 'teach 1' dataset, and get a feel for the data by using some of the example visualization code available in Helpful Tools. The sample set consists of typical data collected over 30 meters of traversal (including both straight motion and a gradual turn). Important note: the dataset provides FTP download links. Some browsers no longer enable FTP links by default, so you may need to change the browser settings to download directly; alternatively, the wget command avoids the browser altogether.
Additional Notes
Description of Data Products
In this section, we detail the format of the data in addition to specifics such as experimental considerations and post-processing details. Each traverse dataset consists of a series of folders corresponding to the various data products. These folders contain either TIFF images or comma-delimited, human-readable text files.

Coordinate Frames
For clarity, we provide an image below illustrating the various coordinate frames that relate to the measurement data. The three frames in the image are the sensor frame, F_s, the GPS frame, F_g, and the inertial frame, F_i. This dataset uses homogeneous transformation matrices to express the translation and rotation between frames. For example, to transform a 3D point P from a frame F_a to a frame F_b, a 4x4 transformation matrix, T_ba, may be used in the following manner:

    p_b = T_ba p_a

where p_a is the homogeneous vector from the origin of F_a to point P, expressed in F_a; p_b is the homogeneous vector from the origin of F_b to point P, expressed in F_b; T_ba is composed of C_ba, the rotation matrix from F_a to F_b, and r_b, the translation from F_b to F_a, expressed in F_b.

The first matrix we provide is the 4x4 homogeneous transformation matrix relating the sensor frame and the GPS frame. For simplicity, and given the scale of the CEP, this transform is assumed static and is provided only for the nominal position of the pods. Second, each traverse dataset contains a transformation matrix to bring the initial local GPS frame into angular alignment with the inertial GPS data. This alignment matrix is calculated by performing a simple point-to-point least-squares optimization between the first 30 meters of our visual odometry estimate and the GPS data.

The file format for matrices, matrix_<name>.txt, is straightforward. The first line contains the comma-separated number of rows and columns in the matrix. The following lines contain the floating-point data of the matrix (comma-separated for columns and line-separated for rows).

Dataset Header File
The header file is a human-readable, comma-delimited text file with information pertaining to the contents of the dataset.
There is a single header line in the file to describe the contents of each data column. The following shows the file naming convention and more detailed descriptions of the contained items: <dataset-name>_header.txt

GPS Data File
The GPS data file is a human-readable, comma-delimited text file containing the GPS coordinate at each frame capture. During the collection phase, the GPS and camera were not synchronized. To account for this, the provided GPS coordinates have been interpolated from the original data using the timestamp at each frame. There is a single header line in the file to describe the contents of each data column. The following shows the file naming convention and more detailed descriptions of the contained items: <dataset-name>_gps.txt

Image Stack
For each frame listed in the header file, there exists a set of .tif images that make up a single image stack. Images of different types are separated and named accordingly:
img_azimuth/<dataset-name>_<id>_img_azimuth.tif - 32-bit floating-point azimuth angle (radians)
img_elevation/<dataset-name>_<id>_img_elevation.tif - 32-bit floating-point elevation angle (radians)
img_range/<dataset-name>_<id>_img_range.tif - 32-bit floating-point range measurement (meters)
img_time/<dataset-name>_<id>_img_time.tif - 64-bit floating-point time (seconds since beginning of experiment)
img_intensity16/<dataset-name>_<id>_img_intensity16.tif - 16-bit raw intensity measurement
img_intensity8/<dataset-name>_<id>_img_intensity8.tif - 8-bit intensity measurement (range corrected)
img_mask/<dataset-name>_<id>_img_mask.tif - 8-bit mask image to identify missing pixel data (0 - bad, 255 - good)
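Once the azimuth, elevation and range images of a stack are loaded (e.g., via Matlab or OpenCV, as noted above), each pixel can be converted to a 3D point in the sensor frame. A minimal Python sketch of that spherical-to-Cartesian conversion; the axis convention used here (x forward, y left, z up) is an assumption, so confirm it against the coordinate-frame figure and the supplied Matlab helpers:

```python
import numpy as np

def spherical_to_xyz(azimuth, elevation, rng):
    """Convert per-pixel azimuth/elevation (radians) and range (meters)
    images into a 3D point cloud in the sensor frame.

    Assumed convention (verify against the dataset's frame figure):
    x forward, y left, z up.
    """
    x = rng * np.cos(elevation) * np.cos(azimuth)
    y = rng * np.cos(elevation) * np.sin(azimuth)
    z = rng * np.sin(elevation)
    # Works for scalars or full 480x360 images; last axis holds (x, y, z).
    return np.stack([x, y, z], axis=-1)
```

Applied to full images, this yields a 480x360x3 array; the mask image should then be used to discard invalid pixels.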
SURF Feature File
The SURF feature files contain a list of SURF features extracted from the Autonosys lidar intensity data. There is one SURF feature file for every image stack in the dataset. When using the sub-pixel (u,v) coordinate to extract measurements from the image stacks, the four surrounding pixels were used for bilinear interpolation. Additionally, the azimuth, elevation and range measurements have been idealized using the supplied calibration model (see the autonosys_apply_calib helper function). There is a single header line in the file to describe the contents of each data column. The following shows the file naming convention and more detailed descriptions of the contained items: <dataset-name>_<id>_surf.txt

Feature Match File
The feature match file contains a list of indices that relate SURF features in sequential frames. Two types of match files have been provided. The first are the raw matches, which are based solely on the SURF feature descriptors. The second are the filtered matches, which provide only 'inlier' matches based on a RANSAC algorithm that accounts for motion distortion. There is a single header line in the file to describe the contents of each data column. The following shows the file naming convention and more detailed descriptions of the contained items: <dataset-name>_<id-A>_<id-B>_<matches-name>.txt
Helpful Tools
The following tools are provided to assist in utilizing the data.
Matlab Functions/Scripts
Credits
This dataset was the work of Sean Anderson, Colin McManus, Hang Dong, Erik Beerepoot and Timothy D. Barfoot. If you use the data provided by this website in your work, please use the following citation: Anderson S, McManus C, Dong H, Beerepoot E, and Barfoot T D. "The Gravel Pit Lidar-Intensity Imagery Dataset". University of Toronto Technical Report ASRL-2012-ABL001. (pdf)
Acknowledgements
The collection of this data would not have been possible without the support of many people. In particular, we would like to thank the staff of Ethier Sand and Gravel in Sudbury, Ontario, Canada for allowing us to conduct our field tests on their grounds, and Dr. James O'Neill from Autonosys for his help in preparing the lidar sensor for our field tests. We also wish to thank DRDC Suffield, MDA Space Missions, NSERC, the Canada Foundation for Innovation and the Canadian Space Agency for providing us with the financial and in-kind support necessary to conduct this research.