Simsek, Duygu; Sandor, Agnes; Buckingham Shum, Simon; Ferguson, Rebecca; De Liddo, Anna and Whitelock, Denise
(2015).
DOI: https://doi.org/10.1145/2723576.2723603
Abstract
When assessing student essays, educators look for the students' ability to present and pursue well-reasoned, strong, valid and logical arguments, and for their ability to use examples, evidence and personal interpretations for and against a particular position throughout a written piece. Such scholarly argumentation is often articulated through rhetorical metadiscourse. Educators therefore necessarily examine metadiscourse in students' writing as a signal of the intellectual moves that make their reasoning visible, and both students and educators could benefit from powerful automated textual analysis capable of detecting rhetorical metadiscourse. However, such language technologies need to be validated in higher education contexts, since they were originally developed for non-educational applications. This paper describes an evaluation study of one such language analysis tool, the Xerox Incremental Parser (XIP), applied to undergraduate social science student essays, using the mark awarded as a measure of the quality of the writing. The study assesses the quality of the XIP's output through correlational studies and multiple regression analysis, and finds statistically significant correlations between the output of the XIP and the grades of student essays.
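To make the evaluation approach concrete, the sketch below illustrates the kind of analysis the abstract describes: correlating per-essay counts of rhetorical moves with grades, and fitting a multiple regression over several such counts. This is not the paper's actual pipeline; the feature names, data values, and helper functions are invented for illustration, and the statistics are implemented in plain Python so the sketch is self-contained.

```python
# Illustrative sketch (NOT the paper's pipeline): relate hypothetical
# XIP-style rhetorical-move counts to essay marks via Pearson correlation
# and ordinary least squares. Pure Python, no external dependencies.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def fit_ols(X, y):
    """Multiple regression coefficients via the normal equations
    (X'X)b = X'y, solved by Gaussian elimination with partial pivoting.
    Each row of X should already include a leading 1 for the intercept."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, k))) / A[r][r]
    return coef

# Invented toy data: counts of two hypothetical rhetorical-move types
# per essay (e.g. "summarizing" and "contrasting" sentences flagged by
# a parser) alongside the mark each essay received.
summary_moves  = [2, 5, 3, 8, 6, 1, 7, 4]
contrast_moves = [1, 4, 2, 6, 5, 0, 6, 3]
grades         = [55, 68, 60, 80, 72, 50, 78, 65]

print("r(summary, grade) =", round(pearson(summary_moves, grades), 3))
X = [[1.0, s, c] for s, c in zip(summary_moves, contrast_moves)]
print("OLS coefficients  =", [round(v, 2) for v in fit_ols(X, grades)])
```

In the actual study the predictors would be XIP's detected metadiscourse categories and the response the tutor-awarded mark; significance testing of the correlations would be added on top of this.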