A multidimensional evaluation framework for personal learning environments

Law, Effie Lai-Chong and Wild, Fridolin (2014). A multidimensional evaluation framework for personal learning environments. In: Kroop, Sylvana; Mikroyannidis, Alexander and Wolpers, Martin eds. Responsive Open Learning Environments: Outcomes of Research from the ROLE Project. Cham: Springer International Publishing, pp. 49–77.

Full text available as: PDF (Version of Record), 368kB
DOI (Digital Object Identifier) Link: https://doi.org/10.1007/978-3-319-02399-1_3

Abstract

Evaluating highly dynamic and heterogeneous Personal Learning Environments (PLEs) is extremely challenging. Components of PLEs are selected and configured by individual users based on their personal preferences, needs, and goals. Moreover, the systems usually evolve over time based on contextual opportunities and constraints. As such dynamic systems have no predefined configurations or user interfaces, traditional evaluation methods often fall short or are even inappropriate. Obviously, a host of factors influence the extent to which a PLE successfully supports a learner in achieving specific learning outcomes. We categorize such factors along four major dimensions: technological, organizational, psycho-pedagogical, and social. Each dimension is informed by relevant theoretical models (e.g., the Information System Success Model, Community of Practice, self-regulated learning) and subsumes a set of metrics that can be assessed with a range of approaches. Among others, usability and user experience play an indispensable role in the acceptance and diffusion of innovative technologies such as PLEs. Traditional quantitative and qualitative methods such as questionnaires and interviews should be deployed alongside emergent ones such as learning analytics (e.g., context-aware metadata) and narrative-based methods. Crucial for maximal validity of the evaluation is the triangulation of empirical findings from multi-perspective (end-users, developers, and researchers), mixed-method (qualitative, quantitative) data sources. The framework utilizes a cyclic process to integrate findings across cases with a cross-case analysis in order to gain deeper insights into the intriguing questions of how and why PLEs work.

Item Type: Book Section
Copyright Holders: 2015 The Authors
ISBN: 3-319-02398-5, 978-3-319-02398-4
Keywords: evaluation; multi-method; usability; user experience; community of practice; self-regulated learning; diffusion of innovation; cross-case analysis; automated monitoring
Academic Unit/School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Knowledge Media Institute (KMi)
Faculty of Science, Technology, Engineering and Mathematics (STEM)
Research Group: Centre for Research in Computing (CRC)
Item ID: 41834
Depositing User: Kay Dave
Date Deposited: 14 Jan 2015 13:30
Last Modified: 28 Nov 2016 18:07
URI: http://oro.open.ac.uk/id/eprint/41834