
An ERP Study On The Effect Of Chord On The Processing Of Quantity-name Collocation

Posted on: 2024-09-30  Degree: Doctor  Type: Dissertation
Country: China  Candidate: X Wang  Full Text: PDF
GTID: 1525306923483964  Subject: Chinese Language and Literature
Abstract/Summary:
Music has played a critical role in the evolution of language, and both music and language are closely woven into daily life. More importantly, both are sound sequences that unfold gradually over time, integrating discrete elements into hierarchical systems governed by their own syntactic rules. Music can also express certain concepts and thereby be understood by listeners. Because of these similarities, whether the two domains share neural processing resources in the brain has attracted widespread scholarly attention. Although previous studies have addressed this issue with relatively fruitful results, considerable room remains for clarifying the underlying mechanisms. Most prior work has been conducted on Indo-European languages, so its conclusions rest on morphosyntax. At the syntactic level, linguistic morphosyntax and musical syntax may share neural processing resources at both the early and late stages; at the semantic level, findings on whether neural resources are shared across the two domains remain inconclusive. In particular, there is little evidence from non-alphabetic languages such as Chinese, so the generality of these results needs to be verified in further studies.

In the present dissertation, Chinese sentences (the linguistic stimuli) were presented visually, word by word, while chord sequences (the musical stimuli) were presented aurally at the same time, with each word paired with one chord, to both musicians and non-musicians. Event-related potentials (ERPs) were recorded to examine whether Chinese sentences and music sequences share neural resources when processed concurrently. The research comprises four parts, each containing two ERP experiments. Linguistic violations (syntactic or semantic), integration difficulty (long- or short-distance dependencies), and group (musicians or non-musicians) were the manipulated variables. The results led to the following conclusions:

(1) For local (short-distance) dependencies, only syntactic integration resources were shared between language and music. At the semantic level, the N400 elicited by semantic violations in language was independent of musical semantic processing, indicating that this N400 did not reflect semantic integration and hence produced no interaction between language and music. At the syntactic level, the P600 elicited by syntactic violations was enhanced by musical syntactic integration, suggesting that syntactic integration resources are shared across domains.

(2) For long-distance dependencies, integration resources were shared at both the semantic and syntactic levels. At the semantic level, the P200 elicited by early semantic integration and the N400 elicited by long-distance semantic integration at the late stage were both affected by musical semantic processing, showing that language and music share semantic integration resources at both early and late stages. At the syntactic level, the P600 elicited by syntactic violations interacted with musical syntax, showing that late-stage syntactic integration resources are shared across the two domains.

(3) Musical training can modulate the neural mechanisms linking language and music. For local syntactic dependencies, musicians showed a significantly larger P600 than non-musicians, with a more left-lateralized distribution of the N400 and P600. For long-distance dependencies, the two groups showed opposite amplitude trends for semantic and syntactic integration: when both language and music were violated, non-musicians showed significantly larger N400 and P600 amplitudes than for language violations alone, whereas musicians' N400 and P600 amplitudes were significantly reduced. This demonstrates that musical training can reshape the processing mechanisms of language and music. Non-musicians employ two sets of neural mechanisms that are identical in nature for language and music, so a facilitation effect arises when both operate simultaneously; musicians, through training, have gradually merged these two mechanisms into one, so that language and music compete for the same resources.

In conclusion, these results imply that integration resources are shared by language and music, and that music processing interacts with language processing at both early and late stages. Musical training can modulate the neural mechanisms across the two domains: musicians gradually integrate two separate sets of processing mechanisms into one, allowing the brain to allocate processing resources optimally.
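The component measures discussed above (N400, P600, P200) are typically quantified as mean ERP amplitudes within fixed time windows. The following is a minimal NumPy sketch of that procedure, not the dissertation's actual analysis pipeline: the sampling rate, window bounds, electrode count, and simulated signal are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.2, 1.0, 1 / fs)      # epoch from -200 ms to 1000 ms

# Simulated single-trial EEG at one centro-parietal electrode:
# a negative deflection peaking near 400 ms (N400-like) plus Gaussian noise.
n_trials = 40
n400_signal = -5e-6 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
epochs = n400_signal + rng.normal(0, 2e-6, size=(n_trials, times.size))

# Grand-average ERP across trials.
erp = epochs.mean(axis=0)

def mean_amplitude(erp, times, t_start, t_end):
    """Mean ERP amplitude (in volts) within [t_start, t_end] seconds."""
    mask = (times >= t_start) & (times <= t_end)
    return erp[mask].mean()

# Conventional (assumed) component windows.
n400_amp = mean_amplitude(erp, times, 0.3, 0.5)  # N400: 300-500 ms
p600_amp = mean_amplitude(erp, times, 0.5, 0.8)  # P600: 500-800 ms
```

In a real design like the one described, such window means would be computed per condition (e.g., language violation alone vs. combined language-and-music violation) and per group, then compared statistically to test for the amplitude enhancements and reductions reported above.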
Keywords/Search Tags:language processing, music processing, neural resources, ERP