
On temporal and spatial calibration for high accuracy visual-inertial motion estimation

Posted on: 2012-12-28
Degree: Ph.D.
Type: Dissertation
University: University of Southern California
Candidate: Kelly, Jonathan Scott
Full Text: PDF
GTID: 1458390011453625
Subject: Engineering
Abstract/Summary:
The majority of future autonomous robots will be mobile, and will need to navigate reliably in unknown and dynamic environments. Together, visual and inertial sensors are able to supply accurate motion estimates and are well suited for use in many robot navigation tasks. Beyond egomotion estimation, fusing high-rate inertial sensing with monocular vision enables other capabilities, such as independent motion segmentation and tracking, moving obstacle detection and ranging, and dense metric 3D mapping, all from a mobile platform.

A fundamental requirement in any multisensor system is precision calibration. To ensure optimal performance, the sensors must be properly calibrated, both intrinsically and relative to one another. In a visual-inertial system, the camera and the inertial measurement unit (IMU) require both temporal and spatial calibration; that is, estimates are needed of the relative timing of measurements from each sensor and of the six degree-of-freedom transform between the sensors. Obtaining this calibration information is typically difficult and time-consuming, however. Ideally, we would like to build power-on-and-go robots that are able to operate for long periods without the usual requisite manual sensor (re-)calibration.

This dissertation describes work on combining visual and inertial sensing for navigation applications, with an emphasis on the ability to temporally and spatially (self-)calibrate a camera and an IMU. Self-calibration refers to the use of data exclusively from the sensors themselves to improve estimates of related system parameters.

The primary difficulty in temporal calibration is that the correspondences between measurements from the different sensors are initially unknown, and hence the relative time delay between the data streams cannot be computed directly. We instead formulate temporal calibration as a registration problem, and introduce an algorithm called Time Delay Iterative Closest Point (TD-ICP) as a novel solution. The algorithm operates by aligning curves in a three-dimensional orientation space, and incorporates the uncertainty in the camera and IMU measurements in a principled way.

We then develop a sequential filtering approach for calibration of the spatial transform between the sensors, estimating the transform parameters with a sigma-point Kalman filter (SPKF). Our formulation rests on a differential geometric analysis of the observability of the camera-IMU system; this analysis shows for the first time that the IMU pose and velocity, the gyroscope and accelerometer biases, the gravity vector, the metric scene structure, and the sensor-to-sensor transform can all be recovered from camera and IMU measurements alone. While calibrating the transform, we simultaneously localize the IMU and build a map of the surroundings. No additional hardware or prior knowledge about the environment in which the robot operates is necessary.

Results from extensive simulation studies and from laboratory experiments are presented, demonstrating accurate camera-IMU temporal and spatial calibration. Further, our results indicate that calibration substantially improves motion estimates, and that the local scene structure can be recovered with high fidelity.

Together, these contributions represent a step towards fully autonomous robotic systems that are capable of long-term operation without the need for manual calibration.
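
The abstract describes TD-ICP only at a high level. As a rough illustration of the underlying time-delay registration idea, the following sketch recovers a camera-IMU delay by sliding a camera-derived rotation-rate curve against the IMU gyroscope curve and scoring the alignment. This is a simplified grid search over scalar rates, not the iterative closest point formulation in orientation space (with measurement uncertainty) that TD-ICP uses; all function and variable names are hypothetical.

    import numpy as np

    def estimate_time_delay(t_imu, w_imu, t_cam, w_cam,
                            delay_range=(-0.2, 0.2), step=1e-3):
        """Grid search for the delay d such that a camera sample stamped
        t corresponds to IMU time t + d.

        t_imu, w_imu: IMU timestamps and gyro rotation-rate magnitudes (rad/s).
        t_cam, w_cam: camera timestamps and rotation-rate magnitudes derived
            from relative orientations between consecutive frames.
        """
        best_d, best_cost = 0.0, np.inf
        for d in np.arange(delay_range[0], delay_range[1] + step, step):
            # Resample the dense IMU curve at the delay-shifted camera times
            # and score the alignment by mean squared error.
            cost = np.mean((np.interp(t_cam + d, t_imu, w_imu) - w_cam) ** 2)
            if cost < best_cost:
                best_d, best_cost = d, cost
        return best_d

    # Synthetic check: a 50 ms delay on a smoothly varying rotation rate.
    def rate(t):
        return 1.0 + np.sin(2 * np.pi * 0.4 * t)   # rad/s profile

    t_imu = np.arange(0.0, 10.0, 0.005)             # 200 Hz IMU
    t_cam = np.arange(0.5, 9.0, 1.0 / 30.0)         # 30 Hz camera
    true_delay = 0.05
    print(estimate_time_delay(t_imu, rate(t_imu),
                              t_cam, rate(t_cam + true_delay)))  # ~0.05

In practice the camera-side curve would come from relative orientations between consecutive frames, and the full TD-ICP registration aligns curves in three-dimensional orientation space rather than comparing scalar rates.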
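
For concreteness, the quantities the abstract identifies as jointly observable can be collected into a single filter state for the SPKF. The sketch below is a hypothetical parameterization; the field names and the quaternion representation are assumptions, not the dissertation's actual formulation.

    import numpy as np
    from dataclasses import dataclass, field

    @dataclass
    class CalibrationState:
        """Hypothetical state vector for camera-IMU spatial self-calibration."""
        p_wi: np.ndarray                 # IMU position in the world frame, (3,)
        q_wi: np.ndarray                 # IMU orientation, unit quaternion, (4,)
        v_wi: np.ndarray                 # IMU velocity in the world frame, (3,)
        b_g: np.ndarray                  # gyroscope bias, (3,)
        b_a: np.ndarray                  # accelerometer bias, (3,)
        g_w: np.ndarray                  # gravity vector in the world frame, (3,)
        p_ic: np.ndarray                 # camera position in the IMU frame, (3,)
        q_ic: np.ndarray                 # camera orientation in the IMU frame, (4,)
        landmarks: np.ndarray = field(   # metric scene structure, (N, 3)
            default_factory=lambda: np.zeros((0, 3)))

A sigma-point filter over such a state would propagate sigma points through the IMU kinematics and update them against camera observations of the landmarks, consistent with the simultaneous localization and mapping described above.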
Keywords/Search Tags: Calibration, Temporal and spatial, Inertial, Motion, IMU, System