
Pose Tracking On Mobile Devices With Monocular Camera And IMU

Posted on: 2018-04-26
Degree: Master
Type: Thesis
Country: China
Candidate: J B Guo
Full Text: PDF
GTID: 2382330569475067
Subject: Information and Communication Engineering
Abstract/Summary:
This thesis addresses robust and ultrafast pose tracking on mobile devices such as smartphones and small drones. Existing methods, relying on either visual analysis or inertial sensing alone, are either too computationally heavy to achieve real-time performance on a mobile platform or not sufficiently robust to handle the unique challenges of mobile scenarios, including rapid camera motion and the long exposure times of mobile cameras. Since today's mobile devices are equipped with both cameras and inertial sensors, this thesis explores the full potential of visual and inertial tracking on these devices by presenting two pose tracking systems that make combined use of both sensor types. The first system follows the model-based tracking framework: it obtains the relative pose between the camera and a tracking target by tracking the target's location. The second system is a visual odometry system that tracks the change in camera pose through simultaneous localization and mapping. The main contributions of this thesis are as follows.

(1) This thesis presents a novel hybrid tracking system that uses on-device inertial sensors to greatly accelerate visual feature tracking and improve its robustness. In particular, the system adaptively resizes each video frame based on inertial sensor data and applies a highly efficient binary feature matching method to track the object pose in each resized frame with little loss of accuracy. The tracking result is revised periodically by a model-based feature tracking method to reduce accumulated error. Furthermore, an inertial tracking method, together with a scheme for fusing its results with the feature tracking results, further improves robustness and efficiency. We first evaluate the hybrid system on a dataset of 16 video clips with synchronized inertial sensing data and then assess its performance in a mobile augmented reality application. Experimental results demonstrate that our method outperforms a state-of-the-art feature tracking method, a direct tracking method, and the Vuforia SDK, and that it runs at more than 40 Hz on a standard smartphone.

(2) This thesis presents a modified visual odometry system that uses on-device inertial sensors to greatly accelerate the tracking process. In particular, the system adaptively resizes each video frame based on inertial sensor data. In the visual tracking step, we use a modified pyramidal Lucas-Kanade algorithm that incorporates spatial and depth constraints for fast and accurate camera pose estimation. Furthermore, we use an ultrafast binary feature descriptor computed directly from the intensities of a resized and smoothed image patch around each pixel, which is sufficiently effective for relocalization. A quantitative evaluation on public datasets demonstrates that our system achieves better tracking accuracy and up to 2.1x faster tracking speed compared with the state-of-the-art monocular SLAM systems LSD-SLAM and ORB-SLAM. For the relocalization task, our system is 2.0-4.6x faster than DBoW2 and achieves similar accuracy.

This thesis focuses on pose tracking on mobile devices and explores how to make full use of the visual and inertial sensors available on them. To handle challenging mobile tracking scenarios, new methods for visual tracking and visual-inertial fusion are proposed. Experimental results show that our approach enables fast and stable pose tracking on mobile devices.
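The abstract does not give implementation details for the IMU-driven adaptive frame resizing used by both systems. A minimal sketch of the idea, assuming a simple thresholding policy on gyroscope magnitude (the thresholds, function names, and scale factors here are illustrative assumptions, not the thesis's actual parameters):

```python
import numpy as np

def choose_scale(gyro_rad_s, slow=0.5, fast=2.0):
    """Map angular speed (rad/s) to a downscale factor.

    Faster rotation means more motion blur, so fine image detail is
    less useful and a smaller frame can be tracked more cheaply.
    """
    speed = float(np.linalg.norm(gyro_rad_s))
    if speed < slow:
        return 1.0
    if speed < fast:
        return 0.5
    return 0.25

def resize(frame, scale):
    """Downsample a grayscale frame by integer striding (cheapest option)."""
    step = int(round(1.0 / scale))
    return frame[::step, ::step]
```

A real system would smooth the scale decision over time to avoid flickering between resolutions from frame to frame.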
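The second system's relocalization descriptor binarizes intensities of a resized, smoothed patch. As a rough sketch of how such a descriptor and its Hamming-distance matching could work (the thresholding rule against the patch mean is an assumption; the thesis's exact binarization scheme is not specified in the abstract):

```python
import numpy as np

def patch_descriptor(patch):
    """Binary descriptor: bit k is 1 iff pixel k exceeds the patch mean.

    The patch is assumed to be already resized and smoothed; comparing
    each pixel to the mean makes the bit string invariant to additive
    brightness changes.
    """
    p = np.asarray(patch, dtype=np.float32).ravel()
    return p > p.mean()

def hamming(d1, d2):
    """Number of differing bits; small distances indicate a likely match."""
    return int(np.count_nonzero(d1 != d2))
```

Matching by Hamming distance over boolean arrays (or packed bit words with popcount in an optimized implementation) is what makes this kind of descriptor fast enough for relocalization on a phone.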
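The keywords list a Kalman filter as the visual-inertial fusion mechanism, but the abstract gives no formulation. As a hypothetical one-dimensional illustration of the principle (integrated gyro rate as the prediction, the visual tracker's angle as the correction; state layout and noise values are invented for the sketch):

```python
class ScalarKalman:
    """1-D Kalman filter fusing a gyro rate with a visual angle measurement."""

    def __init__(self, q=1e-4, r=1e-2):
        self.x = 0.0   # angle estimate (rad)
        self.p = 1.0   # estimate variance
        self.q = q     # process noise: gyro integration drift per step
        self.r = r     # measurement noise: visual tracker jitter

    def predict(self, gyro_rate, dt):
        """Propagate the angle with the gyro; uncertainty grows."""
        self.x += gyro_rate * dt
        self.p += self.q
        return self.x

    def update(self, visual_angle):
        """Correct with the visual measurement; uncertainty shrinks."""
        k = self.p / (self.p + self.r)        # Kalman gain
        self.x += k * (visual_angle - self.x)
        self.p *= (1.0 - k)
        return self.x
```

The same predict/update split captures why the fusion helps: the gyro keeps the estimate alive through frames where vision fails, and visual updates bound the gyro's drift.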
Keywords/Search Tags: mobile device, IMU, pose tracking, fusion, Kalman filter, visual odometry, relocalization