Comparing Formative Feedback Reports: Human and Automated Text Analysis of Constructed Response Questions in Biology

Title: Comparing Formative Feedback Reports: Human and Automated Text Analysis of Constructed Response Questions in Biology
Publication Type: Conference Paper
Year of Publication: 2013
Authors: Weston, M., Parker, J. M., Urban-Lurain, M.
Refereed Designation: Refereed
Conference Name: National Association for Research in Science Teaching
Date Published: 04/2013
Publisher: NARST
Conference Location: Rio Grande, Puerto Rico
Keywords: AACR, AACR-pub, constructed response, DQC, dqc-pub, formative feedback, hand scoring, lexical analysis, photosynthesis, text analysis
Abstract: Constructed response questions can offer a detailed look into students' reasoning skills and understanding of key concepts, but they take considerable time to analyze. This trade-off between the time required to analyze constructed responses and their ability to reveal student thinking has made them a desirable but out-of-reach option for instructors in large-enrollment courses. Automated text analysis can alleviate the time burden of constructed response questions by speeding up the scoring process while still revealing the level of detail a human reader looks for. This report compares the quality of, and time needed for, two instructors' analyses of a hand-scored sample of responses to a constructed response question on cell metabolism with an analysis done using statistical modeling of automated text analysis results. We found that automated text analysis can obtain the same information an instructor would look for in responses. Additionally, it can summarize the entire set of responses in virtually the same amount of time. In this study, the automated text analysis together with the discriminant analysis took more time than the instructors spent on their analyses, but most of the time-consuming work would not need to be repeated with new data in the future.
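
The abstract describes pairing automated lexical analysis of student responses with discriminant analysis trained on hand-scored examples. Below is a minimal sketch of that general approach using scikit-learn; the paper's actual lexical-analysis software, feature sets, and scoring categories are not given here, so the example responses, labels, and model choice are illustrative assumptions only.

    # Minimal sketch: bag-of-words lexical features + discriminant analysis,
    # assuming scikit-learn. All data below is hypothetical, not from the study.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Hypothetical hand-scored training set: responses plus instructor codes.
    responses = [
        "The plant converts light energy into glucose during photosynthesis.",
        "The mass comes from water and nutrients in the soil.",
        "Carbon dioxide from the air is fixed into sugars.",
        "The plant eats food from the soil to grow.",
    ]
    labels = ["scientific", "misconception", "scientific", "misconception"]

    # Lexical analysis step: reduce each response to term counts.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(responses).toarray()

    # Discriminant analysis step: model the hand-scored categories.
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)

    # Once trained, new responses can be scored without further hand coding,
    # which is why the up-front work need not be repeated with new data.
    new = vectorizer.transform(
        ["Sugars are built from CO2 taken in by leaves."]
    ).toarray()
    print(clf.predict(new))

The design point the abstract makes is captured in the last step: the expensive work (hand scoring a training sample and fitting the model) happens once, after which scoring a fresh batch of responses is essentially instantaneous.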