Agreement is overrated: A plea for correlation to assess human evaluation reliability

Amidei, Jacopo; Piwek, Paul and Willis, Alistair (2019). Agreement is overrated: A plea for correlation to assess human evaluation reliability. In: Proceedings of the 12th International Conference on Natural Language Generation, 29 Oct - 1 Nov 2019, Tokyo, Japan.

Full text available as: PDF (Version of Record), 543kB
Abstract

Inter-Annotator Agreement (IAA) is used as a means of assessing the quality of NLG evaluation data, in particular, its reliability. According to existing scales of IAA interpretation – see, for example, Lommel et al. (2014), Liu et al. (2016), Sedoc et al. (2018) and Amidei et al. (2018a) – most data collected for NLG evaluation fail the reliability test. We confirmed this trend by analysing papers published over the last 10 years in NLG-specific conferences (in total 135 papers that included some sort of human evaluation study). Following Sampson and Babarczy (2008), Lommel et al. (2014), Joshi et al. (2016) and Amidei et al. (2018b), this phenomenon can be explained in terms of irreducible human language variability. Using three case studies, we show the limits of considering IAA as the only criterion for checking evaluation reliability. Given human language variability, we propose that for human evaluation of NLG, correlation coefficients and agreement coefficients should be used together to obtain a better assessment of the reliability of the evaluation data. This is illustrated using the three case studies.
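To make the abstract's distinction concrete, the sketch below (not taken from the paper; annotator data, function names, and the choice of Cohen's kappa and Spearman's rho are illustrative assumptions) shows two annotators who rank a set of NLG outputs identically but use the rating scale differently. Exact-match agreement (kappa) is poor, yet the rank correlation is perfect — the kind of case where reporting both coefficients gives a fuller picture of reliability.

```python
from collections import Counter

def cohen_kappa(a, b):
    """Cohen's kappa: chance-corrected exact agreement between two annotators."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

def spearman_rho(a, b):
    """Spearman rank correlation, using average ranks for ties."""
    def ranks(xs):
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        r = [0.0] * len(xs)
        i = 0
        while i < len(xs):
            j = i
            while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # 1-based average rank for the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg_rank
            i = j + 1
        return r
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# Hypothetical data: two annotators rate 8 generated texts on a 1-5 scale.
# Annotator B is systematically one point harsher, so the two never agree
# exactly, but their relative rankings of the outputs coincide.
ann_a = [5, 4, 4, 3, 5, 2, 3, 4]
ann_b = [4, 3, 3, 2, 4, 1, 2, 3]
print(round(cohen_kappa(ann_a, ann_b), 3))   # negative: "worse than chance" agreement
print(round(spearman_rho(ann_a, ann_b), 3))  # perfect rank correlation
```

An IAA-only reading of this data would call it unreliable; the correlation shows the annotators in fact order the outputs identically and differ only in how they anchor the scale.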

Item Type: Conference or Workshop Item
Academic Unit/School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Computing and Communications
Faculty of Science, Technology, Engineering and Mathematics (STEM)
Item ID: 67962
Depositing User: Jacopo Amidei
Date Deposited: 04 Nov 2019 09:19
Last Modified: 25 Mar 2020 03:45
URI: http://oro.open.ac.uk/id/eprint/67962
