Journal
ASSESSING WRITING
Volume 20, Pages 53-76
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.asw.2013.04.001
Keywords
Automated essay scoring; High-stakes assessment; Writing; Race-to-the-Top; Performance assessment; Human raters
Funding
- William and Flora Hewlett Foundation
Abstract
This article summarizes the highlights of two studies: a national demonstration that contrasted commercial vendors' performance on automated essay scoring (AES) with that of human raters, and an international competition to match or exceed commercial vendor performance benchmarks. In these studies, the automated essay scoring engines performed well on five of seven measures and approximated human rater performance on the other two. With additional validity studies, it appears that automated essay scoring holds the potential to play a viable role in high-stakes writing assessments. (C) 2013 Elsevier Ltd. All rights reserved.
Authors