Article

Automated Scoring of Students' Use of Text Evidence in Writing

Journal

READING RESEARCH QUARTERLY
Volume 55, Issue 3, Pages 493-520

Publisher

WILEY
DOI: 10.1002/rrq.281

Keywords

Writing; Assessment; Instructional strategies, methods, and materials; Research methodology; Learning sciences; Correlation; Early adolescence

Funding

  1. Spencer Foundation
  2. William T. Grant Foundation
  3. Learning Research and Development Center, University of Pittsburgh
  4. Institute of Education Sciences, U.S. Department of Education [R305A160245]

Abstract

Despite the importance of analytic text-based writing, relatively little is known about how to teach this important skill. A persistent barrier to research that would provide insight into best practices for teaching this form of writing is the lack of outcome measures that assess students' development in analytic text-based writing and that are feasible to implement at scale. Automated essay-scoring (AES) technologies offer one potential approach to increasing the feasibility of research in this area, provided that the scores yield information about substantive dimensions of writing aligned to new standards and are sensitive to variation in literacy instruction. The authors describe an approach to using AES technologies to provide information about students' skill at marshaling text evidence in the upper elementary grades. Specifically, they examined 1,529 responses to a response-to-text assessment (RTA) from 65 fifth- and sixth-grade language arts classrooms, from which they also collected data on instruction via logs, text-based writing assignments, and surveys. Through correlational, univariate, and multilevel multivariate analyses, the authors found validity evidence supporting automated scoring of the RTA: close correspondence between human and AES scores, alignment of AES scores with the components of instruction expected to predict variation in students' writing quality, and associations between AES scores and other measures of student achievement. These findings provide encouraging evidence that AES technologies, as applied to the RTA, can generate valid inferences about students' ability to marshal text evidence in writing and thus could be a useful tool for advancing large-scale writing research.
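
To make the reported human-AES correspondence concrete, below is a minimal sketch (not the authors' code) of how agreement between paired human and automated scores is commonly quantified in AES research, using Pearson correlation and quadratic weighted kappa. The score arrays, the 1-4 rubric scale, and the scikit-learn implementation are illustrative assumptions, not details drawn from the study.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired scores on a 1-4 rubric; placeholders, not study data.
human_scores = np.array([2, 3, 1, 4, 3, 2, 4, 1, 3, 2])
aes_scores = np.array([2, 3, 2, 4, 3, 2, 3, 1, 3, 2])

# Pearson correlation: strength of linear correspondence between the
# human and automated score sets.
pearson_r = np.corrcoef(human_scores, aes_scores)[0, 1]

# Quadratic weighted kappa: chance-corrected agreement that penalizes
# larger human-machine score discrepancies more heavily than adjacent ones.
qwk = cohen_kappa_score(human_scores, aes_scores, weights="quadratic")

print(f"Pearson r = {pearson_r:.2f}")
print(f"Quadratic weighted kappa = {qwk:.2f}")
```

Quadratic weighted kappa is the conventional agreement statistic in AES evaluations because, unlike raw correlation, it corrects for chance agreement on a discrete rubric scale.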

