The Open University
Investigating Different Types of Assessment in Massive Open Online Courses

Papathoma, Tina (2015). Investigating Different Types of Assessment in Massive Open Online Courses. MRes thesis, The Open University.

Full text available as: PDF (Version of Record), Download (2MB)

Abstract

The current technological era has strongly influenced the development of learning environments, creating new opportunities for teaching, learning and assessment. The emergence of Massive Open Online Courses (MOOCs), in particular, has attracted the attention of higher education institutions and course designers. MOOCs may provide thousands of students with the opportunity to learn from anywhere and at their convenience. Assessment is a component of the learning environment that drives student learning. However, only a small proportion of the existing literature on assessment investigates its use for the enhancement of educational growth, as most of the literature is concerned with how to use assessment for purposes of grading and ranking (Rowntree, 1987). Assessment has a double role in learning: it motivates students to study in order to undertake it, and it provides the necessary feedback on their performance so that students can track their learning progress (Rowntree, 1987).

Research in MOOCs is currently growing, focusing on different aspects such as the “questionable course quality, high dropout rate, unavailable course credits, complex copyright, limited hardware and ineffective assessments” (Chen, 2014). Assessment in MOOCs has mostly been investigated from a perspective that looks at how the grading load can be reduced by adopting automated techniques, the aims of each technique, and new potential approaches able to assess high-level cognition. In short, researchers are currently testing tools that automatically score essays and give learners feedback in an effective way (see Balfour, 2013). However, the current literature is inconclusive on learners’ voices and standpoints regarding the different assessment types in the MOOC context, and more research is needed.

This study explores learners’ views on assessment types in Massive Open Online Courses: whether any of these types affects their enrolment in and completion of a course, and in what respects each type of assessment is effective in supporting their learning experience. Auto-assessment, peer-assessment and self-assessment are the types under investigation, as they are frequently used in MOOCs and therefore are the most commonly discussed in the literature (see Balfour, 2013; Suen, 2013; Wilkowski et al., 2014). The study draws upon the literature on assessment in general and on assessment in MOOCs in particular. The concept of online communities, i.e. the groups of learners that form in MOOCs, is also discussed in detail.

Online ethnographic approaches are employed to explore the issue in question, using online interviewing and observation methods. Thematic analysis is carried out on a sample of 12 MOOC participants from online interviews and 13 posts from online observations. The outcome of this qualitative study reveals that, even though participants identify benefits in peer assessment, there is a preference for automated assessment, since it is a familiar and clear type of assessment for them. Moreover, self-assessment is not popular among participants. Learners’ comments also reveal that clear guidance for assessment helps them to carry out peer assessment more effectively. Some learners also consider that a combination of assessment types may have a positive effect on students’ learning, as each type serves a different purpose.

Item Type: Thesis (MRes)
Copyright Holders: 2015 The Author
Keywords: Education technology
Academic Unit/School: Faculty of Wellbeing, Education and Language Studies (WELS)
Item ID: 61445
Depositing User: ORO Import
Date Deposited: 24 May 2019 10:19
Last Modified: 24 Jun 2019 22:29
URI: http://oro.open.ac.uk/id/eprint/61445

© The Open University