
The Modality Effect In Emotion Perception

Posted on: 2019-11-28    Degree: Master    Type: Thesis
Country: China    Candidate: H M Zhang    Full Text: PDF
GTID: 2405330566979058    Subject: Basic Psychology
Abstract/Summary:
It is well known that the emotional states of others can be perceived through several channels. First, facial expression: people convey different emotional states through the muscles of the eyes, face, and mouth. Second, gesture and body expression: people adopt different body postures in different emotional states, and gestures alone can also be used to express emotions or thoughts. Third, vocal prosody: the pitch, speed, and intensity of the voice are all effective means of expressing emotion. These three forms of non-verbal communication are collectively referred to as "body language", and they are received and processed through the human visual and auditory modalities. Nonetheless, little is currently known about how emotion perception and its neural substrates differ between facial expression and vocal prosody.

A prior study found that event-related potential amplitudes in the neutral condition were significantly smaller than in the facial-anger and bimodal-anger conditions, but not smaller than in the vocal-anger condition, across all brain regions. These results suggested that the emotion perception effect is stronger in the visual modality than in the auditory modality. However, that study had several limitations. First, it did not directly compare emotion perception across modalities. Second, it did not include positive emotions (e.g., happiness). More importantly, it did not match the unimodal emotional stimuli on valence, intensity, and arousal. Its result therefore cannot convincingly establish that facial expression is associated with enhanced emotion perception compared with vocal prosody. In the current studies, we therefore first selected positive and negative (happiness/anger) unimodal emotional materials (visual/auditory), matched all unimodal stimuli on intensity, valence, and arousal, and then directly compared whether these unimodal emotional stimuli produce a modality effect on emotion perception.

In the first study, we recruited twenty-five college students from Southwest University to take part in paid experiments. The experimental materials were standardized through a pilot study, which included seven types of stimuli (facial anger, vocal anger, bimodal anger, facial happiness, vocal happiness, bimodal happiness, and a neutral condition). All unimodal and bimodal emotional materials were rated as significantly different from the neutral materials in intensity, valence, and arousal, whereas the facial and vocal emotional materials were rated as similar in intensity, valence, and arousal. Subjects were asked to freely view the emotional stimuli displayed on the screen and then to judge the valence of the emotional expression (anger, happiness, or neutral) in the facial, vocal, or bimodal condition. Functional magnetic resonance imaging (fMRI) was used to record subjects' neural activity while they viewed the stimuli and performed the valence judgment task. Study 1 thus examined whether matched unimodal emotional stimuli produce a modality effect on emotion perception at the neural level.

Behaviorally, accuracy was higher and response latencies were shorter in the facial than in the vocal modality for happy expressions. The whole-brain fMRI results, however, revealed activation only in regions associated with face processing (e.g., fusiform gyrus and inferior temporo-occipital regions). Moreover, the contrast of bimodal relative to facial emotion perception did not show any surviving activation, irrespective of emotion category; that is, the neural activations elicited by facial materials were not significantly intensified by the addition of simultaneous vocal emotion. Furthermore, region-of-interest (ROI) analysis showed a higher percent signal change (PSC) for the facial than for the vocal modality in the superior temporal sulcus during anger, but not during happiness or neutral stimulation, indicating that facial expression is associated with enhanced emotion perception compared with equivalent vocal prosody. The medial prefrontal cortex, however, showed similar PSC for the facial and vocal modalities.
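As an aside for readers unfamiliar with the PSC measure, the sketch below illustrates one common way to compute percent signal change from an ROI time series. This is a minimal illustration, not the pipeline used in the thesis: the time series is simulated, and the block indices and baseline definition are hypothetical.

    import numpy as np

    def percent_signal_change(roi_ts, condition_idx, baseline_idx):
        # PSC of an ROI time series for one condition, relative to a baseline
        # period: 100 * (condition mean - baseline mean) / baseline mean.
        baseline = roi_ts[baseline_idx].mean()
        condition = roi_ts[condition_idx].mean()
        return 100.0 * (condition - baseline) / baseline

    # Hypothetical example: mean BOLD signal from a superior temporal
    # sulcus ROI, one value per acquired volume.
    rng = np.random.default_rng(0)
    roi_ts = 1000 + rng.normal(0, 5, size=200)   # simulated ROI time series
    roi_ts[50:60] += 12                          # simulated response to facial anger

    facial_anger_volumes = np.arange(50, 60)     # volumes during facial-anger blocks
    rest_volumes = np.arange(0, 20)              # volumes during rest (baseline)

    psc = percent_signal_change(roi_ts, facial_anger_volumes, rest_volumes)
    print(f"PSC for facial anger in the STS ROI: {psc:.2f}%")

In the thesis, PSC values of this kind were then compared between the facial and vocal conditions within each ROI.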
However, the valence judgment task of Study 1 did not allow a direct assessment of perceived emotion intensity, and the neuroimaging results showed that facial stimuli, relative to vocal prosody, recruited more intense neural processing in the occipito-temporal network, whose regions are well known for their roles in visual selective attention. The enhanced perceptual processing of facial versus prosodic stimuli in these regions may therefore result from their specific roles in visual processing. To remove such modality-specific effects and to verify the conclusion of Study 1, Study 2 used an affective priming task that directly assessed the experienced emotion intensity for bimodal target stimuli following unimodal primes.

In the second study, we recruited thirty-three college students from Southwest University to take part in paid experiments. The materials of Study 2 were the same as those of Study 1 and were divided into unimodal and bimodal emotional stimuli; all priming stimuli were unimodal, and all target stimuli were bimodal. Subjects first viewed the priming stimulus, then the target stimulus, and finally rated the intensity, valence, and arousal of the target stimulus within a response window.

The results showed that for angry targets preceded by angry primes, intensity and arousal ratings were significantly lower, but valence ratings higher, after prosodic compared with facial priming. For happy targets preceded by angry primes, intensity and valence ratings were significantly higher (i.e., more pleasant) but arousal ratings lower after prosodic relative to facial priming, suggesting that a negative facial prime interfered with positive ratings of the happy target more than the corresponding prosodic prime did. For neutral targets, there was no significant difference between the two priming conditions. Thus, across the valence, arousal, and intensity ratings, Study 2 consistently showed that the emotional perception of bimodal anger was enhanced, while that of bimodal happiness was hampered, following facial relative to prosodic anger priming. This further verifies, as a reliable phenomenon, that facial expression produces more intense emotional perception than the equivalent vocal prosody.
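As a sketch of how such rating comparisons could be tested (a minimal illustration with simulated data; the abstract does not specify the thesis's actual statistical models), the within-subject design makes a paired t-test the natural choice:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_subjects = 33

    # Simulated per-subject mean intensity ratings of bimodal angry targets
    # after facial-anger vs. prosodic-anger primes (hypothetical 1-9 scale).
    after_facial_prime = rng.normal(6.8, 0.8, n_subjects)
    after_prosodic_prime = rng.normal(6.2, 0.8, n_subjects)

    # Each subject contributes to both conditions, so compare the two
    # priming conditions with a paired t-test.
    t, p = stats.ttest_rel(after_facial_prime, after_prosodic_prime)
    print(f"facial vs. prosodic prime: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")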
Taken together, the two studies show that facial expression is associated with enhanced emotional perception compared with the equivalent vocal prosody: for the same emotional content, perceiving it through facial expression yields stronger feelings than perceiving it through prosody. These findings suggest that, when perceiving and regulating emotion in daily life, we should make full use of the different modalities of human perception to serve our goals. For example, we might express positive emotions such as happiness through facial expression, and negative emotions such as sadness and anger through vocal prosody.
Keywords/Search Tags:functional magnetic resonance imaging, emotion perception, facial expression, vocal prosody, modality