Article

The impact of automated writing evaluation (AWE) on EFL learners' peer and self-editing

Journal

Education and Information Technologies
Volume 28, Issue 6, Pages 6645-6665

Publisher

Springer
DOI: 10.1007/s10639-022-11458-x

Keywords

Automated writing evaluation; Peer editing; Self-editing; Writing assessment; Online writing checkers

Automated Writing Evaluation (AWE) is a machine-based technique for assessing learners' writing, and it has recently been widely implemented to improve learners' editing strategies. Several studies have compared self-editing with peer editing, but only a few have compared automated peer editing with automated self-editing. To fill this research gap, the present study implements the AWE software WRITER for peer and self-editing. A pre-post quasi-experimental research design with convenience sampling was used to compare automated and non-automated editing of cause-effect essay writing. Forty-four Arab EFL learners were assigned to four groups: two peer- and self-editing control groups and two automated peer- and self-editing experimental groups. The quasi-experimental design was triangulated with qualitative data from participants' retrospective notes and questionnaire responses collected during and after automated editing. The quantitative data were analyzed using non-parametric tests, and the qualitative data underwent thematic and content analysis. The results reveal that the AWE software positively affected both the peer- and self-editing experimental groups; however, no significant difference was detected between them. The qualitative data reflect participants' positive evaluation of both the software and the automated peer and self-editing experience.
