
Does the Quality and Quantity of Exam Revision Impact on Student Satisfaction and Performance in the Exam Itself?: Perspectives from Undergraduate Distance Learners

Cross, Simon; Whitelock, Denise and Mittelmeier, Jenna (2016). Does the Quality and Quantity of Exam Revision Impact on Student Satisfaction and Performance in the Exam Itself?: Perspectives from Undergraduate Distance Learners. In: EDULEARN16 Proceedings, IATED Academy, pp. 5052–5061.

Full text available as: PDF (Accepted Manuscript), 343kB


This paper reports the findings of a large-scale survey into the student experience of assessment at the UK’s largest distance learning university. The survey covered three key aspects of assessment: formative assessment, revision for examination, and the examination/end of module, with a view to providing insight for more effective learning design and learning analytics. The analysis meets an urgent need to better understand the assessment analytics associated with the ‘revision’ period – the weeks leading up to an examination that may be crucial in ensuring student assessment success, building confidence, and improving progression to the next course. Using results from an online questionnaire (n=281) sent to undergraduate distance learners and follow-up telephone interviews (n=13), this paper examines some of the relationships between the revision and examination experiences. Specific regard is paid to the usefulness of revision resources, time spent revising, enjoyment, reflection and learning, exam preparedness and clarity, mark satisfaction, and score received. The paper begins with an overview of the central findings of the survey, followed by a focus on the relationship between the ‘revision’ period and the examination itself. In particular, aspects of student experience, performance and self-reported learning effort are explored. The research represents an important step in extending the scope of assessment analytics and in better understanding the opportunities for providing more timely, targeted or personalised learning support. Key findings of the analysis are that revising for an exam and the exam itself are relatively distinct experiences: there is no significant correlation between time spent revising, the usefulness of revision resources, and module exam score. Similarly, revision for learning, revision design and satisfaction with revision resources appear as distinct factors in the student experience.
These results have clear implications for the design and teaching of assessment.

Item Type: Conference or Workshop Item
Copyright Holders: 2016 IATED
ISBN: 84-608-8860-6, 978-84-608-8860-4
ISSN: 2340-1117
Keywords: Assessment, exam, revision, student satisfaction, revision analytics, learning analytics, distance learning
Academic Unit/School: Faculty of Wellbeing, Education and Language Studies (WELS)
Learning and Teaching Innovation (LTI) > Institute of Educational Technology (IET)
Learning and Teaching Innovation (LTI)
Research Group: Centre for Research in Education and Educational Technology (CREET)
OpenSpace Research Centre (OSRC)
Item ID: 46937
Depositing User: Simon Cross
Date Deposited: 05 Aug 2016 13:57
Last Modified: 01 May 2019 11:03