
Research on the Brain Mechanism of Multisensory Integration

Posted on: 2011-07-27    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Q Liu    Full Text: PDF
GTID: 1115360302497585    Subject: Development and educational psychology
Abstract/Summary:
To perceive the complex external environment, our brains make use of multiple cues derived either from different sensory modalities (e.g., vision, touch, and audition) or from the same sensory modality (e.g., within vision: binocular disparity, texture, and motion). Information from different cues is often efficiently merged to form a unified and robust percept, rather than being perceived as incoherent shapes, colors, orientations, etc. This process is referred to as multisensory integration. Converging evidence from human behavioral research has demonstrated that stimuli from two or more sensory modalities presented in close spatial and temporal proximity have a facilitative effect on behavioral performance: multimodal stimuli lead to faster detection times and more accurate discrimination than the constituent unimodal stimuli.

Previous researchers have investigated the neural mechanism of multisensory integration from both behavioral and neuroimaging angles. Behavioral studies provide evidence that information from different sensory modalities is integrated as a weighted average, with prior experience contributing as an additional source of information; moreover, the weight assigned to each modality can be modulated throughout processing by prior knowledge of the reliability of its information. Neuroimaging studies have demonstrated that multisensory integration engages a network of brain areas rather than a single region, and which areas are activated depends on the properties of the stimuli.
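The reliability-weighted averaging described above can be sketched numerically. In the standard maximum-likelihood formulation (not spelled out in this abstract, so the details here are an illustrative assumption), each cue's weight is proportional to its inverse variance, and the fused estimate is more reliable than any single cue:

```python
import numpy as np

def integrate_cues(estimates, sigmas):
    """Reliability-weighted combination of cue estimates.

    Each cue's weight is proportional to 1/sigma^2 (its reliability),
    so more reliable cues dominate the fused percept.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(sigmas, dtype=float) ** 2
    weights = 1.0 / variances
    weights /= weights.sum()
    fused = np.dot(weights, estimates)
    # The fused variance is below that of the best single cue.
    fused_var = 1.0 / (1.0 / variances).sum()
    return fused, fused_var

# Hypothetical example: a visual cue locates an object at 10.0 (sigma = 1)
# while an auditory cue says 14.0 (sigma = 2); vision gets 4x the weight.
est, var = integrate_cues([10.0, 14.0], [1.0, 2.0])
```

Here the fused estimate lands much closer to the reliable visual cue (10.8 rather than the unweighted mean 12.0), mirroring the behavioral finding that cue weights track reliability.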
The brain contains function-specific areas responsible for either unisensory processing or multisensory integration; the interaction of information across modalities involves not only feedback connections between unisensory and multisensory areas, but also feedforward connections among unisensory areas.

Previous studies have several shortcomings. First, regarding experimental paradigm, most earlier studies presented unimodal and bimodal stimuli in random order. Such a paradigm can introduce a modality switching effect that renders the measurement of multisensory integration inaccurate, and this effect is especially prominent in EEG recordings. A necessary first step, therefore, is to fully characterize the modality switching effect and design a measurement paradigm that eliminates it. Second, in behavioral research it remains unknown whether prior knowledge of cue reliability exerts its effect during the perceptual-processing stage or the decision stage. There has been no experimental paradigm that can both manipulate prior knowledge and be adapted to neuroimaging, and hence no effective way to investigate the neural mechanism by which prior knowledge affects multisensory processing. Finally, the neural mechanism of integrating visual word information with auditory phonetic information is of great significance for understanding the neural basis of literacy, so further study is needed to determine whether a specific cognitive-neural mechanism exists for word-phonology integration.

To address these shortcomings, the present study first tested whether the modality switching effect can contaminate classic measurements of multisensory integration.
Experiments 2 and 3 then characterized the modality switching effect by manipulating both the signal intensity and the positional relationship of successive stimuli. The overall results identified three factors that can distort the measurement of multisensory integration: (1) the modality switching effect produced by a change of modality between successive stimuli, which mixes a switch-related ERP component into the unimodal ERP waveforms; (2) dynamic modulation of the alertness level for the current response by the intensity of the preceding stimulus in the other modality; and (3) dynamic modulation of the perceptual threshold within a modality by the intensity of the preceding stimulus in the same modality. Clean behavioral and ERP measurements of multisensory integration can be obtained only when all three factors are eliminated, and the existing measurement paradigm was modified accordingly.

To investigate the effect of prior knowledge, we used a novel audiovisual letter recognition task. Auditory and visual letters were always presented simultaneously, but the color of the letters specified the probability of audiovisual congruency (e.g., green = high probability (HP), blue = low probability (LP)). The new prior about cue-congruency probability began to affect behavioral results at about the 900th trial, so the ERP data were analyzed in two phases: the "prior knowledge forging phase" (before about trial 900) and the "prior knowledge affecting phase" (after about trial 900). The effects of prior knowledge were revealed through difference waveforms generated by subtracting the ERPs in the LP condition from those in the HP condition. A frontal-central probability effect (90-120 ms) was observed in the prior knowledge forging phase, suggesting that the brain assigns more weight to stimuli that change more frequently.
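The HP-minus-LP difference-waveform computation described above is a simple subtraction of condition-averaged ERPs, followed by averaging over the time window of interest. A minimal sketch with synthetic data (array shapes, sampling rate, and channel count are illustrative assumptions, not the dissertation's actual recording parameters):

```python
import numpy as np

# Synthetic condition-averaged ERPs: channels x time samples,
# assuming 64 channels and 500 samples at 1000 Hz (0-499 ms post-stimulus).
rng = np.random.default_rng(0)
n_channels, n_samples = 64, 500
erp_hp = rng.normal(size=(n_channels, n_samples))  # high-probability (green) condition
erp_lp = rng.normal(size=(n_channels, n_samples))  # low-probability (blue) condition

# Probability effect = HP minus LP difference waveform.
diff_wave = erp_hp - erp_lp

# Mean amplitude of the effect in the 90-120 ms window
# (at 1000 Hz, sample indices 90:120), one value per channel.
window = slice(90, 120)
effect = diff_wave[:, window].mean(axis=1)
```

In practice the per-channel effect amplitudes would then be tested statistically over the frontal-central electrode cluster reported in the abstract.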
In the prior knowledge affecting phase, two probability effects were observed: a right parietal-occipital effect (40-96 ms) and a frontal-central effect (170-200 ms). These results suggest that prior knowledge about the reliability of sensory cues affects multisensory integration at an early perceptual-processing stage: the prior is extracted early in visual processing and modulates activity in multisensory cortical areas. Moreover, more attentional resources are required during the prior knowledge forging phase.

The mechanism of audiovisual word integration was investigated using Chinese monosyllabic words, chosen for their abundance. The ERP component related to semantic integration was identified by manipulating the categorical congruency of the audiovisual stimuli. Fully congruent audiovisual stimuli elicited a larger audiovisual P2 amplitude than incongruent stimuli, and the two waveforms differed significantly, indicating that the audiovisual P2 is the component related to semantic integration. In Experiment 6, differences in brain activity during the processing of color-phonology versus word-phonology audiovisual stimuli were analyzed with fMRI. An audiovisual integration effect occurred in the left superior temporal gyrus only for word-phonology stimuli, whereas an integration effect occurred in the supramarginal gyrus in both tasks. Furthermore, during word-phonology integration an effect also occurred in the right inferior parietal lobule, an area involved only in Chinese character processing. These results indicate that the brain areas involved in multisensory integration vary with the task and the properties of the stimuli.
Keywords/Search Tags: Multisensory integration, Bayesian decision framework, Modality switching effect, Prior knowledge, Superior temporal gyrus (STG), Event-related potential (ERP), Functional magnetic resonance imaging (fMRI)