Article

State-of-the-art automated essay scoring: Competition, results, and future directions from a United States demonstration

Journal

ASSESSING WRITING
Volume 20, Pages 53-76

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.asw.2013.04.001

Keywords

Automated essay scoring; High-stakes assessment; Writing; Race-to-the-Top; Performance assessment; Human raters

Funding

  1. William and Flora Hewlett Foundation


This article summarizes the highlights of two studies: a national demonstration that contrasted commercial vendors' performance on automated essay scoring (AES) with that of human raters; and an international competition to match or exceed commercial vendor performance benchmarks. In these studies, the automated essay scoring engines performed well on five of seven measures and approximated human rater performance on the other two. With additional validity studies, it appears that automated essay scoring holds the potential to play a viable role in high-stakes writing assessments. © 2013 Elsevier Ltd. All rights reserved.
