Article

Rethinking the role of automated writing evaluation (AWE) feedback in ESL writing instruction

Journal

JOURNAL OF SECOND LANGUAGE WRITING
Volume 27, Pages 1-18

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.jslw.2014.10.004

Keywords

AWE; ESL writing; Corrective feedback; Mixed-methods research

Abstract

The development of language processing technologies and statistical methods has enabled modern automated writing evaluation (AWE) systems to provide feedback on language and content in addition to an automated score. However, concerns have been raised about the instructional and assessment value of AWE in writing classrooms, and the findings from the few classroom-based studies of its impact on writing instruction and performance are largely inconclusive. Meanwhile, given that research provides favorable evidence for the reliability of AWE corrective feedback, and that writing accuracy is both an important and frustrating issue for ESL writers, it is worthwhile to examine more specifically the impact of AWE corrective feedback on writing accuracy. This study therefore used a mixed-methods design to investigate how Criterion® affected writing instruction and performance. Results suggested that Criterion® led to increased revisions and that its corrective feedback helped students improve accuracy from a rough draft to a final draft. Interviews with the instructors also confirmed these potential benefits. The students' perspectives were mixed, and the extent to which their views varied may depend on their language proficiency level and on their instructors' use of and perspectives on AWE. © 2014 Elsevier Inc. All rights reserved.
