
Research On The Cross-modal Representation Of Visual And Auditory Information Based On Functional Magnetic Resonance Imaging

Posted on: 2022-06-29
Degree: Doctor
Type: Dissertation
Country: China
Candidate: J Gu
Full Text: PDF
GTID: 1524307034461024
Subject: Computer Science and Technology

Abstract/Summary:
The human brain can represent information from different modalities, and its efficient, low-consumption working mechanism is an important reference for artificial intelligence systems that must adapt to complex environments. Based on the neural mechanism of cross-modal representation, the brain can efficiently analyze multimodal information, which improves the utilization of information and reduces processing redundancy. It is therefore of great significance to study the neural mechanism of cross-modal information representation in the brain. In recent years, functional magnetic resonance imaging (fMRI) has been widely used for its excellent spatial resolution and has greatly advanced the study of brain neural mechanisms. This thesis collected fMRI data of the brain during auditory perception, auditory imagery, and visual perception, and investigated the cross-modal representation of visual and auditory information to provide a cognitive-theoretical basis for brain-inspired intelligence. The research was carried out in the following parts.

First, based on the co-activated brain regions of auditory perception and imagery, this thesis analyzed and compared the neural representations in the two modalities, and identified both the correlations and the differences between auditory perception and imagery in the activated brain regions. We collected fMRI data while subjects performed perception and imagery tasks with complex sounds. Using univariate analysis and multi-voxel pattern analysis (MVPA), we found that voxels in the superior temporal gyrus and parts of the frontal-parietal regions were activated in both modalities, and we decoded the brain signals of auditory perception and imagery. In addition, the activation patterns showed modality specificity of the neural representations in these co-activated regions: the activation intensity in the superior temporal gyrus, precentral cortex, and pre-supplementary motor area differed significantly between auditory perception and imagery. By making full use of the activation information in co-activated brain regions, this study analyzed the cross-modal neural mechanism, which is of significance for the extraction and application of cross-modal brain signals.

Second, considering the importance of functional integration among brain regions during information representation, this thesis further explored the neural representations of auditory perception and imagery from the perspective of brain information interaction. Based on the same fMRI data, and taking the co-activated brain regions as nodes, we obtained functional connectivity networks whose structures were highly consistent but whose connectivity parameters differed between auditory perception and imagery. We performed correlation analysis on the activation signals in the co-activated regions identified in the first part of the work, and then compared the functional connectivity results between the two modalities. The results showed that the connectivity values between areas in the superior temporal gyrus and the right precentral cortex were significantly higher in auditory perception than in imagery. Moreover, modality decoding based on functional connectivity network parameters showed that the networks of auditory perception and imagery could be reliably distinguished. Voxel-wise functional connectivity analysis further verified where the voxels with significant connectivity differences between the two modalities were distributed. This functional connectivity analysis complements the findings on the correlations and differences between auditory perception and imagery, and provides a new perspective for investigating the neural mechanisms of different modal information representations.

The above studies show close correlations between the neural representations of the same object in different modalities, which can in turn affect the cross-modal representation of perceived information. To explore the characteristics of cross-modal information representation in the perceptual cortex, we collected subjects' fMRI data during visual and auditory perception tasks. Based on MVPA, we carried out decoding research on the early visual cortex (regions V1, V2, and V3) and the auditory cortex (primary (A1) and secondary (A2) auditory cortex). The results showed that V2, V3, A1, and A2 could represent the category information of cross-modal stimuli, while the neural representation patterns corresponding to visual and auditory perception showed modality specificity. The study also suggested that rich cognitive experience with the stimuli could improve the decoding accuracy of sound information in V2 and V3, and that visual features able to induce imagery activity made cross-modal information easier to decode in A2. This study provides new experimental evidence on the debated cross-modal characteristics of the early visual and auditory cortices, supplementing research on the cognitive mechanism of multimodal information processing.

In summary, this research compared the neural representations of visual perception, auditory perception, and auditory imagery, and studied the cross-modal representation characteristics of visual and auditory information. Cross-modal and multimodal information processing is an important advantage of the human brain. The study enriches our understanding of the neural mechanisms of different modalities of information, provides a theoretical basis for the application of brain information, and is expected to offer brain inspiration for the use of multimodal information in the field of artificial intelligence.
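The MVPA-based modality decoding described above can be illustrated with a minimal sketch: a linear classifier is trained on multi-voxel activation patterns and evaluated with cross-validation, where above-chance accuracy indicates modality-specific patterns in shared regions. This is a hypothetical illustration on synthetic data, not the thesis's actual pipeline; the data dimensions, noise levels, and classifier choice are assumptions for the sake of the example.

```python
# Hypothetical MVPA sketch: decoding modality (perception vs. imagery)
# from simulated voxel activation patterns with a linear SVM.
# All data are synthetic; this is not the thesis's analysis pipeline.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200  # assumed trial and voxel counts

# Two modality-specific mean patterns over the same "co-activated" voxels:
# correlated (shared regions) but not identical (modality specificity).
pattern_perception = rng.normal(0.0, 1.0, n_voxels)
pattern_imagery = pattern_perception + rng.normal(0.0, 0.8, n_voxels)

# Simulated single-trial patterns = mean pattern + trial noise.
X = np.vstack([
    pattern_perception + rng.normal(0.0, 1.0, (n_trials, n_voxels)),  # perception
    pattern_imagery + rng.normal(0.0, 1.0, (n_trials, n_voxels)),     # imagery
])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = perception, 1 = imagery

# 5-fold cross-validated decoding accuracy; chance level is 0.50.
scores = cross_val_score(LinearSVC(dual=False), X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

In a real analysis the rows of `X` would come from preprocessed fMRI trial responses within the co-activated regions, and statistical significance of the decoding accuracy would be assessed against a permutation-based null distribution rather than the nominal chance level alone.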
Keywords/Search Tags: Cross-modal representation, Visual and auditory perception, Auditory imagery, MVPA, Brain decoding, Functional connectivity, fMRI