Article

Different Approaches to Assessing the Quality of Explanations Following a Multiple-Document Inquiry Activity in Science

Publisher

Springer
DOI: 10.1007/s40593-017-0138-z

Keywords

Automatic assessment; Mental models; Explanations; Causal structure; Causal relations; Machine learning; Natural language processing

Funding

  1. Institute of Education Sciences [R305B070460, R305F100007]
  2. National Science Foundation, Directorate for Education and Human Resources, Division of Undergraduate Education [1535299]

Abstract

This article describes several approaches to assessing student understanding from the written explanations that students generate during a multiple-document inquiry activity on a scientific topic (global warming). The work attempts to capture the causal structure of student explanations as a way to detect the quality of students' mental models and understanding of the topic, combining approaches from Cognitive Science and Artificial Intelligence and applying them to Education. First, several attributes of the explanations are explored through hand coding and existing technologies (LSA and Coh-Metrix). Then, we describe an approach for inferring the quality of explanations using a novel, two-phase machine-learning approach that detects causal relations and the causal chains present within student essays. The results demonstrate the benefits of using a machine-learning approach for detecting content, but also highlight the promise of hybrid methods that combine machine learning, LSA, and Coh-Metrix for detecting student understanding. Opportunities to use automated approaches within Intelligent Tutoring Systems that provide feedback toward improving student explanations and understanding are discussed.
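To make the two-phase idea concrete, the following is a minimal, purely illustrative sketch: the paper's actual classifiers and features are not reproduced here, and the cue-word heuristic, function names, and example sentences are all assumptions introduced for illustration. Phase 1 flags causal relations in sentences; phase 2 links relations whose effect matches another relation's cause into multi-step causal chains.

```python
# Hypothetical sketch of a two-phase causal-structure pipeline, in the
# spirit of the approach described above. The real system uses trained
# machine-learning models; this toy version uses cue words instead.

CAUSAL_CUES = ("because", "due to", "leads to", "causes", "results in")

def detect_causal_relations(sentences):
    """Phase 1 (toy): find sentences containing a causal connective
    and split each into a (cause, effect) pair."""
    relations = []
    for sentence in sentences:
        lower = sentence.lower()
        for cue in CAUSAL_CUES:
            if cue in lower:
                if cue in ("because", "due to"):
                    # "X because Y" -> cause Y, effect X
                    effect, _, cause = lower.partition(cue)
                else:
                    # "X causes Y" -> cause X, effect Y
                    cause, _, effect = lower.partition(cue)
                relations.append((cause.strip(" ,."), effect.strip(" ,.")))
                break
    return relations

def build_causal_chains(relations):
    """Phase 2 (toy): extend a chain whenever a relation's cause
    matches the chain's current endpoint, yielding multi-step chains."""
    chains = []
    for cause, effect in relations:
        extended = False
        for chain in chains:
            if chain[-1] == cause:
                chain.append(effect)
                extended = True
        if not extended:
            chains.append([cause, effect])
    return chains
```

For example, the sentences "Burning fossil fuels causes more CO2 in the atmosphere." and "More CO2 in the atmosphere leads to warming." yield two relations that phase 2 joins into a single three-node causal chain, which a quality measure could then score by length or by overlap with an expert model.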
