Whitelock, Denise; Gilbert, Lester; Hatzipanagos, Stylianos; Watt, Stuart; Zhang, Pei; Gillary, Paul and Saucedo, Alejandra
Addressing the challenges of assessment and feedback in higher education: a collaborative effort across three UK universities.
In: INTED 2012: 6th International Technology, Education and Development Conference, 5–7 March 2012, Valencia, Spain. International Association of Technology, Education and Development (IATED), pp. 3354–3359.
Assessment has been identified as one of the major challenges faced by Higher Education Institutions (Whitelock et al., 2007). In response, Open Mentor (OM) was developed, in a project funded by the Joint Information Systems Committee (JISC), as a learning support tool that helps tutors reflect on the quality of the feedback they give students on electronically submitted assignments. Its development was grounded in convincing evidence of systematic connections between different types of tutor comment and the level of attainment in an assignment (Whitelock et al., 2004). OM analyses, filters, and classifies tutor comments through an algorithm based on Bales' Interaction Process Analysis. As a result, tutors' feedback comments are classified into four categories: Positive reactions, Teaching points, Questions and Negative reactions. The feedback provided is then compared against the ideal number of comments expected for an assignment awarded a mark in a given band. OM provides reports to support tutors in reflecting on the structure, content and style of their feedback.
The JISC-funded Open Mentor technology transfer (OMtetra) project continues the work initiated by the Open University by implementing OM at the University of Southampton and King's College London. OMtetra aims to take up OM and extend its use by developing the system further, ultimately offering better support to tutors and students in the assessment process. A group of tutors from the University of Southampton and King's College London are currently using OM in their teaching and assessment. In this paper, we explore potential improvements to OM in three aspects: user interface, technology implementation and analysis algorithm design.
For the user interface, suggested additions to OM include a simple entry form through which tutors can validate the results of the analysis of their feedback comments, together with enhancements that make it easier to upload student and module information into the system. For the technology implementation, OM currently relies on a built-in database of users that must be maintained separately from institutional systems; a more flexible authentication module would simplify deployment in new environments and thus promote uptake by a larger number of institutions. To reach this goal, the system will be migrated to an open source framework that provides out-of-the-box integration with various authentication systems. The third aspect is the analysis algorithm. Currently, OM classifies tutors' comments into the four categories by applying an underlying text-matching algorithm. This method could be improved by allowing tutors to confirm or correct a comment's classification through the OM interface and by adding a free-text classification algorithm. As the number of users grows, the keywords used during analysis will be dynamically expanded, making the analysis process more comprehensive and intelligent.
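The dynamic keyword expansion mentioned above could, for instance, work along the following lines: when a tutor confirms or corrects a comment's category, the comment's content words are absorbed into that category's lexicon. Everything here is an assumption for illustration; OM's actual mechanism may differ, and the seed keywords, stopword list and function name are invented.

```python
# Hypothetical seed lexicon; sets make the expansion step idempotent.
KEYWORDS = {
    "Positive reactions": {"good", "excellent"},
    "Teaching points": {"consider", "remember"},
    "Questions": {"why", "how"},
    "Negative reactions": {"incorrect", "weak"},
}

# A tiny illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "is", "this", "to", "of", "and", "you"}


def confirm_classification(comment, category, keywords=KEYWORDS):
    """Record a tutor-confirmed category for a comment by adding the
    comment's content words to that category's keyword set."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    keywords[category] |= words - STOPWORDS
```

In this way each tutor confirmation enlarges the vocabulary available to the text-matching step, so later comments phrased in similar terms are more likely to be classified without manual intervention.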
OMtetra is an ongoing project with considerable potential. We believe that the outcomes of the development and trial implementations of OM will contribute significantly to the area of assessment in higher education.