
An information-theoretic study of image formation, detection and post-processing

Posted on: 2005-02-19  Degree: Ph.D.  Type: Dissertation
University: The University of New Mexico  Candidate: Hope, Douglas Alan
GTID: 1458390008980435  Subject: Physics
Abstract/Summary:
This dissertation presents a comprehensive analysis of image formation, detection, and processing from the viewpoint of information theory. Statistical information is based on the concept of a communication channel. When a statistical ensemble of inputs is sent through a channel, one input at a time, the noise and other limitations of the channel produce an ensemble of degraded outputs. The amount of statistical information successfully transmitted through the channel measures how well one can, on average, associate a channel output with its corresponding input. In the context of imaging, the channel consists of the medium through which the object is imaged together with the imaging system itself. The input ensemble is a statistical ensemble of object scenes, and the output ensemble is the corresponding ensemble of images generated by the imaging system. The amount of information about the object passed by the imaging system is reduced by system limitations.

Image post-processing uses prior knowledge to obtain an improved estimate of the object scene. The actual improvement can be assessed using statistical information. Deconvolution, the most fundamental image-restoration technique, is shown to add information by increasing the resolution of the image. An information-based comparison of a variety of linear and nonlinear deconvolution techniques is presented for images formed by astronomical telescopes.

An optical interferometer is another important imaging system considered here. It acquires image data in the Fourier domain by measuring the visibilities and phases of fringes at different spatial frequencies. Among other things, we show that a reduction in the number of photons involved in measuring the complex fringe visibilities corresponds to a loss of object information conveyed by those visibility measurements, and hence in the maps synthesized by the interferometer.
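The channel-based notion of statistical information described above can be made concrete with a small numerical sketch (illustrative only, not taken from the dissertation): for a discrete channel with a known transition matrix, the mutual information between the input and output ensembles can be computed directly. The binary symmetric channel used here is a standard textbook example, not one of the imaging channels studied in the text.

```python
import numpy as np

def mutual_information(p_input, p_out_given_in):
    """Mutual information I(X;Y) in bits for a discrete channel.

    p_input:        shape (nx,), prior probabilities of the channel inputs.
    p_out_given_in: shape (nx, ny), transition matrix p(y|x).
    """
    p_joint = p_input[:, None] * p_out_given_in   # p(x, y)
    p_output = p_joint.sum(axis=0)                # marginal p(y)
    denom = p_input[:, None] * p_output           # p(x) p(y)
    mask = p_joint > 0                            # skip zero-probability terms
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / denom[mask])))

# Binary symmetric channel with 10% crossover probability and uniform input:
p_x = np.array([0.5, 0.5])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.1, 0.9]])
print(mutual_information(p_x, p_y_given_x))  # ≈ 0.531 bits
```

A noiseless channel (identity transition matrix) would transmit the full 1 bit per use; the 10% crossover noise degrades the output ensemble and reduces the transmissible information to about 0.531 bits, mirroring how noise in an imaging channel reduces the object information reaching the image.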
A comparison of the CLEAN and MEM deconvolution algorithms is then presented. The problem of self-calibration, an important consideration for ground-based interferometry, is also treated from the viewpoint of information theory. It is shown that the initial model, the number of fringe-visibility measurements, and the type of CLEAN windows used all have a quantifiable impact on the information recovered by deconvolution.

Speckle-interferometric post-processing using an object-support constraint is then discussed. Both noise-based and information-based characterizations of such support constraints are presented. It is shown that, in general, an increase in information correlates with a decrease in noise, but there are important departures from this naively expected conclusion that clarify the fundamental differences between information-based and noise-based analyses. Finally, a new, purely information-based post-processing algorithm for speckle interferometry is proposed and treated at length.
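As a rough illustration of the CLEAN idea discussed above (a toy sketch of the Högbom variant, not code from the dissertation), the algorithm repeatedly locates the brightest pixel of the residual map, records a fraction (the loop gain) of its flux as a point-source component, and subtracts a correspondingly scaled, shifted copy of the dirty beam:

```python
import numpy as np

def hogbom_clean(dirty_map, dirty_beam, gain=0.1, n_iter=200, threshold=0.0):
    """Toy Högbom CLEAN loop.

    Returns (components, residual): the accumulated point-source model
    and the residual map after subtraction of the scaled dirty beams.
    """
    residual = dirty_map.copy()
    components = np.zeros_like(dirty_map)
    beam_peak = np.unravel_index(np.argmax(dirty_beam), dirty_beam.shape)
    for _ in range(n_iter):
        peak = np.unravel_index(np.argmax(residual), residual.shape)
        flux = residual[peak]
        if flux <= threshold:
            break
        components[peak] += gain * flux
        # Align the beam peak with the residual peak (np.roll wraps around,
        # which is acceptable for this toy example with a compact beam).
        dy, dx = peak[0] - beam_peak[0], peak[1] - beam_peak[1]
        shifted = np.roll(np.roll(dirty_beam, dy, axis=0), dx, axis=1)
        residual -= gain * flux * shifted
    return components, residual
```

For a single point source of flux 2 observed through a unit-peak beam, the component map converges geometrically toward the true flux while the residual shrinks by a factor of (1 - gain) per iteration; in a real interferometric map, CLEAN windows restrict where peaks may be searched, which is one of the factors whose information impact is quantified in the text.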
Keywords/Search Tags: Information, Image, Post-processing, Imaging system, Noise