Overview of Automated Analysis

Constructed response questions, in which students must explain phenomena in their own words, give instructors more meaningful opportunities to identify their students' learning obstacles than multiple-choice questions do. However, the realities of typical large-enrollment undergraduate classes limit the options faculty have for analyzing student writing.

In the Automated Analysis of Constructed Response (AACR) Research Group, we are exploring computerized lexical analysis of student writing in large-enrollment undergraduate biology and geology courses. We have created libraries that categorize student responses with greater than 90% accuracy, and these categories can be used to predict expert ratings of student responses with accuracy approaching the inter-rater reliability among the expert raters themselves. The same techniques provide insight into students' use of analogical thinking, a fundamental part of scientific modeling, and have the potential to improve assessment practices across STEM disciplines.
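The core idea of lexical analysis for categorizing responses can be sketched in a few lines: a response is tokenized, and category membership is scored by the presence of category-specific terms. The category names and term lists below are purely illustrative assumptions, not the AACR libraries or their actual lexicons; real systems use much richer term resources and statistical scoring.

```python
import re
from collections import Counter

# Hypothetical category lexicons for illustration only; the AACR
# libraries use far larger, empirically derived term sets.
CATEGORIES = {
    "energy": {"energy", "atp", "fuel", "calories"},
    "matter": {"mass", "atoms", "carbon", "molecules"},
}

def tokenize(text):
    """Lowercase the response and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def categorize(response):
    """Return the set of categories whose terms appear in the response.

    A single response may express several ideas, so it can fall
    into more than one category at once.
    """
    tokens = Counter(tokenize(response))
    scores = {cat: sum(tokens[term] for term in terms)
              for cat, terms in CATEGORIES.items()}
    return {cat for cat, score in scores.items() if score > 0}

example = "Plants convert light energy into chemical energy stored in molecules."
print(sorted(categorize(example)))  # both an energy idea and a matter idea
```

Categorizations like these, produced at scale, are what can then be compared against expert ratings of the same responses.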