Can a computer marked exam improve retention?

Rosewell, Jon (2012). Can a computer marked exam improve retention? In: ALT-C 2012: A Confrontation with Reality, 11-13 Sep 2012, Manchester, UK.



Distance learning modules (particularly low-cost introductory and enrichment modules) may show poor retention compared to traditional campus courses. The perceived difficulty of exams and end-of-module assessments (EMA) appears to deter some students from submitting. In contrast, interactive computer-marked assignments (iCMA) are typically attempted by most students.

Can retention therefore be improved by changing the format of part of the final assessment to an iCMA?

'Robotics and the meaning of life' is a 10-point, 10-week general-interest Open University module. The assessment comprised a mid-module iCMA and a final written EMA. The iCMA (a Moodle quiz) provided detailed feedback only after the submission deadline. The EMA included short-answer questions, a programming question and an essay. The EMA was script-marked, and feedback was limited to an overall score and performance profile provided well after the end of the course.

The intervention simply replaced the script-marked short-answer questions with a second iCMA covering the same content through similar questions. The programming and essay questions were retained unchanged as a written, script-marked EMA.

The hypothesis to be tested was that retention would increase: students would be more likely to submit the final iCMA, their confidence would increase, and they would be motivated to submit the written EMA.

Quantitative data were gathered for patterns of submission, course completion and pass rates for two presentations (124 and 220 students); data were also available for thirteen previous presentations (1814 students). Structured interviews were carried out to probe student preferences, confidence and engagement.

More students submitted the iCMA (86%) than the EMA (81%). Although the two had the same deadline, 91% of students submitted the iCMA before the EMA. They submitted the iCMA well in advance of the deadline (median 4 days 15 hrs) but kept the EMA open as long as possible (median 18 hrs before deadline; 11% submitted in the final hour). These patterns strongly suggest that students were more confident with the iCMA than with the EMA. Completion rates were the highest recorded: 88% and 89%, compared to 79% for pre-intervention presentations. Overall pass rates were also improved (83% and 85% cf. 76%). This can be ascribed to improved submission rates alone: the pass rate and mean scores among those who submitted were unchanged, giving confidence that the assessment difficulty was unaltered.

Student interviews suggested that students did attempt the final iCMA before the EMA and had greater confidence in obtaining a good mark for the iCMA than the EMA. Students valued the mix of assessment methods and felt it produced a robust result; although some expressed concern over the correctness of computer marking, they appreciated the detailed feedback it provided.

This intervention suggests that a change of assessment format can improve student engagement and pass rates without compromising rigour.
