
Generic camera calibration for omnifocus imaging, depth estimation and a train monitoring system

Posted on: 2016-03-14
Degree: Ph.D
Type: Dissertation
University: University of Illinois at Urbana-Champaign
Candidate: Kumar, Avinash
Full Text: PDF
GTID: 1478390017976758
Subject: Electrical engineering
Abstract/Summary:
Calibrating an imaging system for its geometric properties is an important step toward understanding the process of image formation and devising techniques to invert this process to decipher interesting properties of the imaged scene. In this dissertation, we propose new optically and physically motivated models for achieving state-of-the-art geometric and photometric camera calibration. The calibration parameters are then applied as input to new algorithms for omnifocus imaging, 3D scene depth from focus, and machine-vision-based intermodal freight train analysis.

In the first part of this dissertation, we present new progress in camera calibration, with applications to omnifocus imaging, 3D scene depth from focus, and point spread function calibration. We propose five new calibration methods for cameras whose imaging model can be represented by ideal perspective projection with small distortions due to lens shape (radial distortion) or a misaligned lens-sensor configuration (decentering). In the first method, we generalize the pupil-centric imaging model to handle arbitrarily rotated lens-sensor configurations, where the sensor tilt is taken to be about the physical optic axis. For this setting, we derive an analytical solution to linear camera calibration based on the collinearity constraint relating known world points to measured image points, assuming no radial distortion. Our second method considers the much simpler case of the Gaussian thin-lens imaging model with a non-frontal image sensor and proposes an analytical solution to the linear calibration equations derived from the collinearity constraint. In the third method, we generalize the radial alignment constraint to non-frontal sensor configurations and derive an analytical solution to the resulting linear camera calibration equations. In the fourth method, we propose the use of focal stack images of a known checkerboard scene to calibrate cameras having a non-frontal sensor.
In the fifth method, we show that radial distortion results from the entrance pupil location changing as a function of the incident image rays, and we propose a collinearity-based camera calibration method under this imaging model. Based on this model, we propose a new focus measure for omnifocus imaging and apply it to compute 3D scene depth from focus. We then propose a point spread function (PSF) calibration method which computes the PSF of a CMOS image sensor using Hadamard patterns displayed on an LCD screen placed at a fixed distance from the sensor.

In the second part of the dissertation, we describe a machine-vision-based train monitoring system, in which we propose a motion-based background subtraction method to remove the background visible through the gaps of an intermodal freight train. The background-subtracted image frames are used to compute a panoramic mosaic of the train and to measure gap lengths in pixels. Converting the gap lengths to metric units using the calibration parameters of the video camera allows the fuel efficiency of the loading pattern of the given intermodal freight train to be analyzed.
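Several of the methods above solve linear calibration equations derived from the collinearity constraint relating known world points to measured image points. As a rough illustration of that style of solution, the classical direct linear transform (DLT) handles the analogous frontal-sensor case: each 3D-2D correspondence contributes two homogeneous linear equations in the entries of the projection matrix. The sketch below uses synthetic data and a hypothetical camera matrix; it is not the dissertation's tilted-sensor or pupil-centric derivation.

```python
import numpy as np

def dlt_calibrate(world_pts, image_pts):
    """Estimate a 3x4 projection matrix P from >= 6 non-coplanar 3D-2D
    correspondences via the collinearity (DLT) constraint: each pair
    contributes two rows of a homogeneous linear system A p = 0."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # The homogeneous least-squares solution is the right singular
    # vector of A associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    P = Vt[-1].reshape(3, 4)
    return P / P[-1, -1]  # fix the arbitrary scale

# Hypothetical example: project synthetic non-coplanar points with a
# known P_true, then recover it from the correspondences alone.
P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 2.0]])
world = np.array([[0, 0, 1], [1, 0, 2], [0, 1, 3], [1, 1, 1],
                  [2, 1, 2], [1, 2, 3], [2, 2, 4]], dtype=float)
homog = np.hstack([world, np.ones((7, 1))]) @ P_true.T
image = homog[:, :2] / homog[:, 2:3]
P_est = dlt_calibrate(world, image)
```

With noise-free correspondences, `P_est` matches `P_true` up to the fixed scale; real calibration data would call for a nonlinear refinement after this linear step.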
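The depth-from-focus pipeline built on the fifth method evaluates a focus measure across a focal stack and assigns each pixel the depth of its sharpest slice. The dissertation proposes its own focus measure, which the abstract does not specify; the sketch below substitutes a classical modified-Laplacian-style measure purely to illustrate the per-pixel argmax structure.

```python
import numpy as np

def focus_measure(img):
    """Classical stand-in focus measure: absolute second differences
    along rows and columns (NOT the dissertation's proposed measure).
    Output is cropped by one pixel on each border."""
    d2x = np.abs(img[:, 2:] + img[:, :-2] - 2 * img[:, 1:-1])[1:-1, :]
    d2y = np.abs(img[2:, :] + img[:-2, :] - 2 * img[1:-1, :])[:, 1:-1]
    return d2x + d2y

def depth_from_focus(stack, depths):
    """For each pixel, pick the focal-slice depth maximizing the focus
    measure; the index map doubles as an omnifocus composition guide."""
    scores = np.stack([focus_measure(img) for img in stack])
    best = np.argmax(scores, axis=0)
    return np.asarray(depths)[best], best

# Toy two-slice stack: a flat (defocused) slice at depth 1.0 and a
# high-frequency checkerboard (in-focus) slice at depth 2.0.
h, w = 16, 16
blur = np.full((h, w), 0.5)
sharp = (np.indices((h, w)).sum(axis=0) % 2).astype(float)
depth_map, idx = depth_from_focus([blur, sharp], depths=[1.0, 2.0])
print(depth_map[0, 0])  # -> 2.0
```

Every pixel selects the checkerboard slice here, since the flat slice has zero second differences everywhere.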
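The pixel-to-metric gap-length conversion in the train monitoring system depends on the calibrated camera parameters. Under a simple pinhole, fronto-parallel assumption (the dissertation's calibration model is richer), the conversion reduces to similar triangles; all numbers below are hypothetical.

```python
def gap_length_meters(gap_px, focal_px, depth_m):
    """Pinhole similar-triangles conversion: a span of gap_px pixels,
    seen by a camera with focal length focal_px (in pixel units) at a
    fronto-parallel distance depth_m, spans gap_px * depth_m / focal_px
    meters in the scene."""
    return gap_px * depth_m / focal_px

# Hypothetical numbers: a 400-pixel gap, 1200 px focal length,
# trackside camera 6 m from the train.
print(gap_length_meters(400, 1200.0, 6.0))  # -> 2.0
```

The focal length in pixels and the camera-to-train distance are exactly the quantities the calibration stage supplies, which is why gap lengths measured on the panoramic mosaic can be reported in metric units.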
Keywords/Search Tags: Imaging, Camera, 3D scene depth from focus, Train, Image, Point spread function, Method