UTIAS ASRL

Autonomous Space Robotics Lab

Chi Hay Tong

Postdoctoral Researcher
BASc (Eng Sci Computer, Toronto), PhD (Toronto)

Note: This page is out of date. I've moved to the Mobile Robotics Group at the University of Oxford.

Contact Info

Institute for Aerospace Studies
University of Toronto
4925 Dufferin Street, Room 150B
Toronto, ON, M3H 5T6, Canada

chihay.tong [at] utoronto.ca


Clearpath Husky A200 in the Canadian Space Agency's Mars Emulation Terrain

Research

Laser-Based 3D Mapping and Navigation in Planetary Environments

Focus:

  • Navigation in unstructured environments using a laser rangefinder
  • Robust 3D SLAM for stop-and-scan mapping
  • Continuous-time state estimation for continuous-sweep localization


Projects

Laser-Based Visual Odometry

Intensity image and feature tracks from the Autonosys Lidar.

The ability to compute motion estimates from onboard sensing is a core competency for mobile robots. While stereo camera-based visual odometry (VO) has emerged as the dominant method for motion estimation, passive cameras are inherently sensitive to ambient lighting conditions. We avoid this dependence on external lighting by replacing the camera in the standard VO pipeline with a laser rangefinder, constructing intensity images from the laser return values. As a result, the same sparse appearance-based approach can be employed, and the additional issues arising from motion distortion are addressed by considering the estimation problem in continuous time. This is facilitated by Gaussian Process Gauss-Newton (GPGN), an algorithm for non-parametric, continuous-time, nonlinear, batch state estimation.
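The image-formation step can be sketched as a simple angular binning of laser returns. The sketch below is illustrative only: the array names, resolution, and linear angle-to-pixel mapping are assumptions for the example, not the Autonosys sensor's actual scan geometry.

```python
import numpy as np

def intensity_image(az, el, intensity, width=64, height=32):
    """Rasterize lidar returns into a 2D intensity image by binning
    azimuth/elevation angles into pixel coordinates.

    Assumes a nonzero angular span in both axes; later returns
    overwrite earlier ones that land in the same pixel.
    """
    cols = np.round((az - az.min()) / np.ptp(az) * (width - 1)).astype(int)
    rows = np.round((el - el.min()) / np.ptp(el) * (height - 1)).astype(int)
    img = np.zeros((height, width))
    img[rows, cols] = intensity
    return img
```

Standard sparse feature detection and matching can then run on the resulting image much as it would on a camera image, with the advantage that the pixel values do not depend on ambient lighting.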

Continuous-Time Batch State Estimation

Modelling the rover trajectory in continuous time.

Recent research has shown that local batch estimators are more accurate than traditional filtering techniques, providing significant performance improvements even under fixed computational constraints. However, when an estimation problem involves high-rate sensing, such as inertial measurement units (100-1000 Hz) or continuously scanning lidars (up to 500,000 Hz), batch estimation approaches become computationally intractable, because the robot state must be estimated at each measurement time. One possible solution is to move the estimation problem into continuous time. Representing the robot state as a continuous function of time decouples the number of state variables from the number of measurements, while maintaining the rigorous mathematical underpinnings of the maximum-likelihood approach. We are exploring machinery from both parametric and non-parametric regression theory to develop novel continuous-time batch state estimation algorithms that can handle previously intractable problems in robotics.

Collaborators: Paul Furgale, Tim Barfoot
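The decoupling can be illustrated with a toy scalar example: a Gaussian-process posterior mean represents the "trajectory" as a continuous function of time, so a fixed handful of estimation times carries the whole state, yet the trajectory can be queried at arbitrary measurement times. The kernel, length scale, and stand-in values below are made up for the illustration; they are not the formulation used in our work.

```python
import numpy as np

def se_kernel(a, b, length=0.2, var=1.0):
    """Squared-exponential covariance between two vectors of times."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Six estimation times fix the problem size...
t_est = np.linspace(0.0, 1.0, 6)
x_est = np.sin(2.0 * np.pi * t_est)        # stand-in state values

# ...yet the posterior mean can be evaluated at any measurement time.
t_query = np.array([0.2, 0.55, 0.9])
K = se_kernel(t_est, t_est) + 1e-9 * np.eye(len(t_est))
x_query = se_kernel(t_query, t_est) @ np.linalg.solve(K, x_est)
```

However many measurement times are queried, only the six estimation-time variables enter the solve, which is the computational point of the continuous-time representation.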

3D SLAM for Mapping Planetary Worksite Environments

With the renewed worldwide interest in developing new methods for establishing a permanent presence on extraterrestrial surfaces, it is likely that robotics will play a large role in the precursor missions to manned deployment. Much like a construction project on Earth, detailed maps will be required to conduct operations such as site selection, site preparation, and base construction. Similarly, precise localization will be necessary to facilitate load transport, scientific investigation, and in-situ resource utilization. With this motivation, we developed a robust framework suitable for conducting three-dimensional Simultaneous Localization and Mapping (3D SLAM) in a planetary worksite environment. Operation in a planetary environment imposes sensing restrictions, as well as challenges due to the rugged terrain. By utilizing a laser rangefinder mounted on a rover platform, we demonstrated an approach that is able to create globally consistent maps of natural, unstructured 3D terrain. Extensive validation of the framework was conducted using data gathered at two different planetary analogue facilities. [video]

3D SLAM results for the a100_dome_vo dataset.
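The notion of global consistency can be conveyed with a hypothetical one-dimensional pose-graph miniature: odometry links chain poses together, a loop-closure link disagrees slightly with what the chain implies, and a joint least-squares solve spreads the discrepancy over the trajectory. This toy linear problem is only a sketch of the idea, not the 3D nonlinear framework itself.

```python
import numpy as np

# Three 1D poses: a prior anchoring x0, two odometry constraints of
# 1.0 m each, and a loop-closure measurement of 1.8 m that conflicts
# with the 2.0 m implied by the odometry chain.
A = np.array([[ 1.0,  0.0, 0.0],   # x0      = 0.0  (anchor)
              [-1.0,  1.0, 0.0],   # x1 - x0 = 1.0  (odometry)
              [ 0.0, -1.0, 1.0],   # x2 - x1 = 1.0  (odometry)
              [-1.0,  0.0, 1.0]])  # x2 - x0 = 1.8  (loop closure)
b = np.array([0.0, 1.0, 1.0, 1.8])

# Solving all constraints jointly distributes the disagreement.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The solution settles between the conflicting measurements (x2 is about 1.87 m) instead of letting the odometry error accumulate unchecked, which is the behaviour a globally consistent map requires at much larger scale.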

The Canadian Planetary Emulation Terrain 3D Mapping Dataset

Canadian Space Agency's Mars Emulation Terrain.

The Canadian Planetary Emulation Terrain 3D Mapping Dataset is a collection of three-dimensional laser scans gathered at two unique planetary analogue rover test facilities in Canada. These facilities offer emulated planetary terrain in controlled environments, at scales manageable for algorithm development. The dataset is subdivided into four subsets, gathered using panning laser rangefinders mounted on mobile rover platforms. It should be of interest to field robotics researchers developing algorithms for laser-based Simultaneous Localization and Mapping (SLAM) of three-dimensional, unstructured, natural terrain. All of the data are presented in human-readable text files, accompanied by Matlab parsing scripts to facilitate their use. [website]

Speeded Up Speeded Up Robust Features

Sample results from SUSURF on an image obtained at Devon Island.

One of the research directions of the Autonomous Space Robotics Lab has been to explore the use of appearance-based methods for rover navigation. To enable robust, real-time performance, we have developed a GPU implementation of the SURF algorithm and released the code as an open-source library. [website]


Education

Ph.D., Aerospace Engineering, University of Toronto, 2013

B.A.Sc., Engineering Science (Computer Option), University of Toronto, 2008


Teaching

AER1513: State Estimation for Aerospace Vehicles - Teaching Assistant (Fall 2011)

AER334: Numerical Methods I - Teaching Assistant (Fall 2008)


Presentations

Tutorial on Data Association and Outlier Rejection Techniques for SLAM (presented at the CRV 2010 SLAM Camp) [slides]