
A Study On Feedback In College English Writing Based On Automated Writing Evaluation System

Posted on: 2016-09-26
Degree: Master
Type: Thesis
Country: China
Candidate: C Yuan
Full Text: PDF
GTID: 2295330470463434
Subject: Curriculum and pedagogy
Abstract/Summary:
Automated writing evaluation (AWE) programs, built on educational technology, automatically provide assessment and feedback on students' compositions. This approach to assessing compositions and essays has become a major direction in feedback research. Juku, the system used in this study to evaluate students' writing, is one such program. Previous studies on AWE have generally centered on its reliability, its validity, comparisons with human raters, and participants' responses to its feedback. The purpose of this study is to take Juku as an example and examine the effects of automated essay feedback on the writing development of Chinese college EFL learners.

The subjects were 160 EFL sophomores and 5 English writing teachers from a university in Jiangxi. Both quantitative and qualitative methods were used: a questionnaire, semi-structured interviews, and analysis of students' writing samples. The research questions are: (1) What are students' attitudes toward the use of AWE and Juku's feedback? (2) What are teachers' attitudes toward the use of AWE? (3) What effects does Juku's feedback have on learners' revision of essays and on the development of their writing accuracy?

The results reveal that, on the whole, students held positive attitudes toward the integrated use of Juku. Juku's feedback was found to be most helpful in checking grammar, suggesting word usage, and correcting mechanical mistakes. However, students did not recognize the helpfulness of automated feedback on content and organization. Moreover, most students considered that the grades given by the program could not reflect their actual English writing ability and that the revision feedback given by Juku was sometimes wrong. A majority of students also thought the automated feedback was not specific or informative enough to guide their revisions. The teachers' views on the potential of AWE corrective feedback to help students with grammar and mechanics were largely positive, although they were not completely satisfied with the quality of the feedback; teachers' work on organization and content remains irreplaceable. As for writing accuracy, Juku corrected over half of students' errors (53.6%), while approximately 11.5% of its feedback was wrong. Such inadequate feedback messages can be confusing and frustrating for learners who lack the grammatical knowledge to filter out these false alarms. In sum, although Juku sometimes offered inappropriate feedback, most of its feedback messages were useful in helping EFL learners improve their writing accuracy.

Since no AWE system is free of drawbacks and flaws, learners and teachers who use such systems should be cautious. Based on the major findings of this study, it is suggested that English writing teachers should not expect AWE systems to replace the role of teachers; they should provide students with additional assistance where possible, especially feedback on content development and organization. In addition, the developers of AWE systems should revise the feedback mechanism, seek ways to eliminate misleading or unusable suggestions, and provide more detailed and reliable feedback on writing. Finally, further research and the development of more sophisticated mechanisms for providing reliable, individualized feedback on content and organization are needed.
Keywords/Search Tags: Automated Writing Evaluation, Computerized Feedback, College English Writing