Statistical inference in multisensory perception and learning

Posted on: 2010-08-12
Degree: Ph.D
Type: Dissertation
University: University of California, Los Angeles
Candidate: Wozny, David R
Full Text: PDF
GTID: 1445390002973373
Subject: Engineering
Abstract/Summary:
This dissertation investigates the computational principles that govern multisensory processing. Multisensory processing is ubiquitous within the human nervous system, as we are constantly subjected to multiple sensations at any given moment. The human observer therefore provides an excellent model system for investigating how to intelligently process streams of information from multiple sensory channels. Two equally important components of multisensory processing are deciding (1) which signals come from disparate sources and should be processed independently, and (2) which signals come from a common source and should be integrated. If the signals are to be integrated, the subsequent problem is determining how the information should be combined to improve performance. I aim to further the development of a unified computational theory that can explain these critical components of multisensory processing.

The computational theories in this dissertation are tested against human performance in perceptual experiments. First, I investigate whether a general principle can account for processing sensory information across the three modalities of vision, audition, and touch. The findings demonstrate that multisensory processing across the three modalities is well characterized by a Bayesian model of perception that does not a priori assume that the sensory signals are caused by the same source.

The second group of experiments investigates the localization of auditory and visual stimuli in space. I was specifically interested in how Bayesian models of sensory processing characterize forms of multisensory recalibration. The results support an adaptation model in which small, consistent crossmodal deviations are likely to be attributed to sensory errors, so the likelihood of sensory encoding is recalibrated, while larger, inconsistent errors can be attributed to the causal structure of the environment. I also investigate the strategy used for perceptual decision making in a spatial localization task. The results show that the majority of observers' responses are best characterized by a probability matching strategy, as opposed to the commonly assumed strategy of minimizing the mean squared error. These results demonstrate that under certain conditions perceptual judgments can be subject to heuristics more commonly observed in cognitive decision making, and they underscore the importance of considering alternative objectives when evaluating perceptual performance.
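The kind of model the abstract describes, a Bayesian causal inference model that does not presuppose a common source, combined with a probability-matching readout, can be sketched as follows. This is a minimal illustration under assumed Gaussian likelihoods and a Gaussian spatial prior; the parameter values (sigma_v, sigma_a, sigma_p, mu_p, p_common) and the function name are placeholders for illustration, not the dissertation's fitted values or code.

import numpy as np

def causal_inference_localization(x_v, x_a,
                                  sigma_v=2.0, sigma_a=8.0,   # assumed sensory noise (deg)
                                  sigma_p=15.0, mu_p=0.0,     # assumed spatial prior
                                  p_common=0.5,               # assumed prior P(common cause)
                                  rng=None):
    """Sketch: Bayesian causal inference for audio-visual localization.

    Given noisy visual and auditory measurements x_v and x_a, infer the
    probability that they share a common cause, compute the optimal
    location estimate under each causal structure, and read out a final
    auditory estimate by probability matching.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Likelihood of the measurements if they arise from ONE source
    # (the unknown source location is integrated out analytically).
    var1 = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
            + sigma_a**2 * sigma_p**2)
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * sigma_p**2
                             + (x_v - mu_p)**2 * sigma_a**2
                             + (x_a - mu_p)**2 * sigma_v**2) / var1) \
              / (2 * np.pi * np.sqrt(var1))

    # Likelihood if they arise from TWO independent sources.
    var_v, var_a = sigma_v**2 + sigma_p**2, sigma_a**2 + sigma_p**2
    like_c2 = np.exp(-0.5 * ((x_v - mu_p)**2 / var_v
                             + (x_a - mu_p)**2 / var_a)) \
              / (2 * np.pi * np.sqrt(var_v * var_a))

    # Posterior probability that the two signals share a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common
                                    + like_c2 * (1 - p_common))

    # Precision-weighted optimal estimates under each causal structure.
    s_common = ((x_v / sigma_v**2 + x_a / sigma_a**2 + mu_p / sigma_p**2)
                / (1/sigma_v**2 + 1/sigma_a**2 + 1/sigma_p**2))
    s_aud_alone = ((x_a / sigma_a**2 + mu_p / sigma_p**2)
                   / (1/sigma_a**2 + 1/sigma_p**2))

    # Probability matching: commit to the integrated estimate with
    # probability post_c1, otherwise to the segregated estimate.
    s_hat = s_common if rng.random() < post_c1 else s_aud_alone
    return s_hat, post_c1

The single stochastic draw at the end is what distinguishes probability matching from the mean-squared-error (model-averaging) readout the abstract contrasts it with; the latter would instead return the posterior-weighted mean of the two candidate estimates on every trial.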
Keywords/Search Tags: Multisensory, Perceptual