Article

Automated essay evaluation with semantic analysis

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 120, Pages 118-132

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.knosys.2017.01.006

Keywords

Automated scoring; Essay evaluation; Natural language processing; Semantic attributes; Semantic feedback

Abstract

Essays are considered the most useful tool to assess learning outcomes, guide students' learning process, and measure their progress. Manual grading of students' essays is a time-consuming process, but is nevertheless necessary. Automated essay evaluation represents a practical solution to this task; however, its main weakness is its predominant focus on vocabulary and text syntax, with limited consideration of text semantics. In this work, we propose an extension of existing automated essay evaluation systems that incorporates additional semantic coherence and consistency attributes. We design the novel coherence attributes by transforming sequential parts of an essay into a semantic space and measuring the changes between them to estimate the coherence of the text. The novel consistency attributes detect semantic errors using information extraction and logic reasoning. The resulting system (named SAGE - Semantic Automated Grader for Essays) provides semantic feedback for the writer and achieves significantly higher grading accuracy compared with nine other state-of-the-art automated essay evaluation systems. (C) 2017 Elsevier B.V. All rights reserved.
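
The abstract does not give implementation details for the coherence attributes, so the following Python sketch only illustrates the general idea described above: split an essay into consecutive segments, project the segments into a shared vector space, and treat the similarity between neighbouring segments as a rough coherence signal. The segment size and the TF-IDF representation are assumptions made here for illustration; they are not SAGE's actual semantic-space method.

# Illustrative sketch only: estimate essay coherence by embedding consecutive
# segments in a simple TF-IDF space and measuring how much neighbouring
# segments change. Segment size and TF-IDF are assumptions for illustration;
# the paper's SAGE system defines its own semantic coherence attributes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def coherence_score(essay: str, sentences_per_segment: int = 3) -> float:
    # Naive sentence split; a real system would use a proper tokenizer.
    sentences = [s.strip() for s in essay.split(".") if s.strip()]
    segments = [
        " ".join(sentences[i:i + sentences_per_segment])
        for i in range(0, len(sentences), sentences_per_segment)
    ]
    if len(segments) < 2:
        return 1.0  # Too short to measure change between segments.

    # Project all segments into one shared vector space.
    vectors = TfidfVectorizer().fit_transform(segments)

    # Average similarity between consecutive segments: higher values mean
    # smaller semantic jumps, i.e. a rough proxy for coherence.
    sims = [
        cosine_similarity(vectors[i], vectors[i + 1])[0][0]
        for i in range(len(segments) - 1)
    ]
    return sum(sims) / len(sims)


if __name__ == "__main__":
    print(coherence_score(
        "The cat sat. The cat slept. Dogs bark. "
        "Dogs run. Economics studies markets. Markets set prices."
    ))

A coherent essay, whose neighbouring segments reuse related vocabulary, yields a higher score than one that jumps between unrelated topics, which is the intuition behind measuring changes between sequential parts of the text.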
