
Validating the Writing Part B of the GSEEE

Posted on: 2021-05-21
Degree: Master
Type: Thesis
Country: China
Candidate: L M Fang
Full Text: PDF
GTID: 2415330632951014
Subject: English Linguistics and Applied Linguistics
Abstract/Summary:
This study investigated how topics might affect task difficulty comparability and test takers' writing performance on Writing Part B of the Graduate School Entrance English Examination (GSEEE). It aimed to determine whether essay scores and textual features varied across topics, and thus whether topic constitutes a construct-irrelevant factor that poses a threat to test validity. Situated in the Chinese context, the study was based on the topics used in GSEEE Writing Part B over the past decade (2009-2019); three representative topics were selected from the original GSEEE test papers: a social topic (2015), an educational topic (2017), and a personal topic (2019). The Assessment Use Argument (AUA) proposed by Bachman & Palmer (2010) was adopted as the theoretical framework for validation.

The data consisted of 109 essays written by 45 potential test takers and 436 ratings from four trained raters. Multi-facet Rasch Measurement (MFRM) was used to analyze difficulty comparability and the effect of topic on scoring, and Coh-Metrix (an automated text analysis tool) was adopted to generate a finer-grained analysis of the textual features of the task responses.

Statistical results revealed that task comparability, scoring, and textual features did not vary across topics, indicating that no significant topic effect was found in GSEEE Writing Part B. In the analysis of task difficulty comparability, no significant difference in group means was found (p = 0.06), and a separation index of 0.65 together with the Chi-square statistic (p = .01) in FACETS indicated that the three topics were equally difficult. In the scoring analysis, Cronbach's alpha and the internal-consistency reliability estimates for the five rating categories indicated that inter-rater reliability was at an acceptable level and that the same abilities were being measured by the GSEEE rating rubric; no significant bias/interaction was found between topic and test takers, raters, or rubric categories. Regarding textual features, no significant differences were found among the textual features generated under the three topics. Implications were also discussed in this study.
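As an illustration of the inter-rater reliability statistic reported above, Cronbach's alpha can be computed from an examinee-by-rater score matrix as sketched below. This is a generic sketch, not code from the thesis; the function name and the toy score matrix are hypothetical.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x raters) score matrix.

    alpha = k/(k-1) * (1 - sum of per-rater variances / variance of totals)
    where k is the number of raters (columns). Hypothetical helper,
    not from the thesis.
    """
    k = ratings.shape[1]
    rater_vars = ratings.var(axis=0, ddof=1)          # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)       # variance of examinee totals
    return k / (k - 1) * (1 - rater_vars.sum() / total_var)
```

When all raters assign identical scores, the statistic reaches its maximum of 1.0; values above roughly .70 are conventionally read as acceptable inter-rater consistency.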
Keywords/Search Tags: prompt, GSEEE writing task, validation, Assessment Use Argument