Autonomous Space Robotics Lab
Chi Hay Tong
Postdoctoral Researcher
Institute for Aerospace Studies
Research
Focus: Laser-Based 3D Mapping and Navigation in Planetary Environments
Projects
Laser-Based Visual Odometry
The ability to compute motion estimates from onboard sensing is a core competency for mobile robots. While stereo camera-based visual odometry (VO) has emerged as the dominant method for motion estimation, passive cameras are inherently sensitive to ambient lighting conditions. We avoid this dependence on external lighting by replacing the camera in the standard VO pipeline with a laser rangefinder, constructing intensity images from the return values. As a result, the same sparse appearance-based approach can be employed, and the additional issues arising from motion distortion are addressed by posing the estimation problem in continuous time. This is facilitated by Gaussian Process Gauss-Newton (GPGN), an algorithm for nonparametric, continuous-time, nonlinear, batch state estimation.
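The key idea above, forming a camera-like intensity image from laser return values so that standard sparse feature pipelines still apply, can be sketched as follows. This is a minimal illustration, not the lab's actual pipeline: the angular-grid projection, image resolution, and variable names are all assumptions.

```python
import numpy as np

def intensity_image(azimuth, elevation, intensity, width=640, height=480):
    """Project laser returns onto a 2D pixel grid using their intensities.

    azimuth/elevation are per-return scan angles (radians); intensity is
    the per-return intensity value. The grid resolution and the direct
    angle-to-pixel mapping are illustrative assumptions.
    """
    # Normalize the angular coordinates onto the pixel grid.
    u = ((azimuth - azimuth.min()) / (np.ptp(azimuth) + 1e-12) * (width - 1)).astype(int)
    v = ((elevation - elevation.min()) / (np.ptp(elevation) + 1e-12) * (height - 1)).astype(int)

    img = np.zeros((height, width))
    img[v, u] = intensity  # on pixel collisions, the last return wins

    # Rescale to 8-bit so standard sparse feature detectors can run on it.
    rng = np.ptp(img)
    if rng > 0:
        img = 255.0 * (img - img.min()) / rng
    return img.astype(np.uint8)
```

The resulting 8-bit image can then be fed to any appearance-based feature detector, which is what lets the stereo VO machinery carry over unchanged.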
Continuous-Time Batch State Estimation
Recent research has shown that local batch estimators are more accurate than traditional filtering techniques, providing significant improvements in performance even under fixed computational constraints. However, when an estimation problem involves high-rate sensing, such as inertial measurement units (100-1000 Hz) or continuously-scanning lidars (up to 500,000 Hz), batch estimation approaches become computationally intractable, since the robot state must be estimated at each measurement time. One possible solution is to move the estimation problem into continuous time. Representing the robot state as a continuous function of time decouples the number of state variables from the number of measurements, while maintaining the rigorous mathematical underpinnings of the maximum-likelihood approach. We are exploring machinery from both parametric and nonparametric regression theory to develop novel continuous-time batch state estimation algorithms that can address previously intractable problems in robotics.
Collaborators: Paul Furgale, Tim Barfoot
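The decoupling described above can be illustrated with plain Gaussian process regression: a small set of estimation times carries the state, and the trajectory can then be queried at any measurement time. This is only a scalar sketch; the squared-exponential kernel and hyperparameters are illustrative choices, not the specific prior used in GPGN.

```python
import numpy as np

def gp_query(t_est, x_est, t_query, ell=0.5, var=1.0, noise=1e-4):
    """Posterior mean of a scalar GP trajectory at arbitrary query times.

    t_est/x_est: sparse estimation times and state values.
    t_query: times at which the continuous trajectory is evaluated.
    """
    def kern(a, b):
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / ell) ** 2)

    K = kern(t_est, t_est) + noise * np.eye(len(t_est))  # covariance at estimation times
    K_star = kern(t_query, t_est)                        # query-vs-estimation covariance
    return K_star @ np.linalg.solve(K, x_est)            # GP posterior mean
```

Note that the cost of the solve depends only on the number of estimation times, no matter how many query (measurement) times are evaluated, which is exactly the property that makes high-rate sensors tractable.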
3D SLAM for Mapping Planetary Worksite Environments
The Canadian Planetary Emulation Terrain 3D Mapping Dataset
The Canadian Planetary Emulation Terrain 3D Mapping Dataset is a collection of three-dimensional laser scans gathered at two unique planetary analogue rover test facilities in Canada. These facilities offer emulated planetary terrain in controlled environments, at scales manageable for algorithm development. The dataset is subdivided into four individual subsets, gathered using panning laser rangefinders mounted on mobile rover platforms. It should be of interest to field robotics researchers developing algorithms for laser-based Simultaneous Localization And Mapping (SLAM) in three-dimensional, unstructured, natural terrain. All of the data are presented in human-readable text files, accompanied by Matlab parsing scripts to facilitate their use. [website]
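Since the files are human-readable text, they can also be loaded outside Matlab. The snippet below is a guess at a typical layout (one whitespace-delimited return per line, columns [x, y, z, intensity]); the actual column ordering and file naming should be taken from the dataset's documentation and bundled parsing scripts.

```python
from io import StringIO

import numpy as np

def load_scan(path_or_file):
    """Load one laser scan from a whitespace-delimited text file.

    The assumed column layout ([x, y, z, intensity] per line) is
    illustrative only; consult the dataset documentation for the
    real file formats.
    """
    data = np.loadtxt(path_or_file)
    points = data[:, :3]       # 3D point coordinates
    intensities = data[:, 3]   # per-return intensity values
    return points, intensities
```

`np.loadtxt` accepts either a filename or a file-like object, so the same loader works on files from disk or on in-memory text.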
Speeded Up Speeded Up Robust Features
One research direction of the Autonomous Space Robotics Lab has been to explore appearance-based methods for rover navigation. To enable robust, real-time performance, we developed a GPU implementation of the SURF algorithm and released the code as an open-source library. [website]
Education
Ph.D., Aerospace Engineering, University of Toronto, 2013
B.A.Sc., Engineering Science (Computer Option), University of Toronto, 2008
Teaching
AER1513: State Estimation for Aerospace Vehicles - Teaching Assistant (Fall 2011)
AER334: Numerical Methods I - Teaching Assistant (Fall 2008)
Presentations
Tutorial on Data Association and Outlier Rejection Techniques for SLAM (presented at the CRV 2010 SLAM Camp) [slides]