Cross, Simon; Aristeidou, Maria; Rossade, Klaus-Dieter; Wood, Carlton and Brasher, Andrew
(2022).
URL: https://conference.eadtu.eu/
Abstract
This paper reports findings from a research study at the Open University, UK, into the quality of distance learners’ remote online exam experience and the differences in experience between online (pandemic) and in-person (pre-pandemic) modes of examination. Our research responds to the perpetual need for greater insight into the exam experience (Bevitt, 2015) and is uniquely positioned in two ways: firstly, we make use of a robust reference dataset collected before the Covid pandemic; and, secondly, we asked students about their experience preparing and revising for the exam as well as the exam itself. Exam revision represents an important transitional period for learners (Entwistle & Entwistle, 2003).
Our reference dataset is the second iteration of the university's Student Experience of Feedback, Assessment and Revision (SEFAR) survey. This was sent to a representative sample of undergraduate students, who answered with respect to courses taken in 2018–2019. Twenty of the approximately 120 question items in the survey were about exams. These, in part, extend item constructs used previously (Gibbs & Dunbar-Goddet, 2007; Vattøy et al., 2021), although we distinguished between the experience of revising for the exam and that of taking it, because the relationship between the two is not straightforward (Cross et al., 2016). Our items also asked about anxiety (Falchikov & Boud, 2007), exam preparation (Payne & Brown, 2011), mark satisfaction and enjoyment. SEFAR also asked about formative assessment, tutor marking and feedback, assessment literacies and assessment networks. Our comparison dataset is taken from a 2022 survey of university students' views about remote online exams since the start of the Covid pandemic. The datasets are judged to be sufficiently similar, and a subset of each (n=168 and n=190 respectively) was used for this study.
Our results show that, overall, the shift to online exams did not impact the quality of distance learners' experience of revising for exams or taking the exam itself. For example, we found no significant change in the revision experience across six of seven measures, including the perceived benefits of learning whilst revising, enjoyment, and support. However, we found that students felt less anxious revising for online exams, and our survey indicates possible reasons for this.
The quality of the exam experience itself was largely unaffected by the move from in-person to remote online exams. No significant differences were found for seven of the nine measures, including question clarity, question relevance, mark satisfaction, enjoyment, anxiety, and sense of achievement. However, we found that satisfaction with the exam environment was significantly higher for online exams, and that learners felt the online exam was harder than they expected compared to the in-person exam. Consideration will also be given to gender and age.
The paper will conclude by using our datasets to explore whether improvements observed in exam experience are a result of changing the mode of assessment from in-person to online or a product of longer-term institutional progress in quality enhancement. This is an important question for anyone seeking to attribute causality to the shift to online exams during Covid.
This version of the paper rectifies a misprint in Table 8.