The Open University

The effect of semantic relatedness measures on multi-label classification evaluation

Nowak, Stefanie; Llorente, Ainhoa; Motta, Enrico and Rüger, Stefan (2010). The effect of semantic relatedness measures on multi-label classification evaluation. In: Proceedings of the ACM International Conference on Image and Video Retrieval - CIVR '10, p. 303.

Full text: Not publicly available (Version of Record). Due to publisher licensing restrictions, this file is not available for public download.


In this paper, we explore different ways of formulating new evaluation measures for multi-label image classification when the vocabulary of the collection adopts the hierarchical structure of an ontology. We apply several semantic relatedness measures based on web-search engines, WordNet, Wikipedia and Flickr to the ontology-based score (OS) proposed in 'S. Nowak and H. Lukashevich. Multilabel Classification Evaluation using Ontology Information. In Proc. of IRMLeS Workshop, ESWC, 2009'. The final objective is to assess the benefit of integrating semantic distances into the OS measure. Hence, we have evaluated them in a real-case scenario: the results (73 runs) provided by 19 research teams during their participation in the ImageCLEF 2009 Photo Annotation Task. Two experiments were conducted with a view to understanding which aspect of the annotation behaviour is captured more effectively by each measure. First, we compare the system rankings induced by the different evaluation measures, computing the Kendall τ and Kolmogorov–Smirnov correlation between pairs of rankings. Second, we investigate how stably the different measures react to artificially introduced noise in the ground truth. We conclude that the distributional measures based on image information sources show a promising behaviour in terms of ranking and stability.
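The ranking comparison described in the abstract can be illustrated with a small sketch. The function below computes the Kendall rank correlation between two rankings of the same systems, as one would when comparing the orderings induced by two evaluation measures; the run names and ranks are purely hypothetical, and the paper itself may use a tie-aware variant of τ.

```python
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    """Kendall rank correlation between two rankings of the same systems.

    rank_a, rank_b: dicts mapping system name -> rank position (1 = best).
    Assumes no ties: tau = (concordant - discordant) / total pairs.
    """
    systems = list(rank_a)
    concordant = discordant = 0
    for s, t in combinations(systems, 2):
        # A pair is concordant if both rankings order s and t the same way.
        d = (rank_a[s] - rank_a[t]) * (rank_b[s] - rank_b[t])
        if d > 0:
            concordant += 1
        elif d < 0:
            discordant += 1
    n_pairs = len(systems) * (len(systems) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical rankings of four annotation runs under two measures:
# the plain OS measure and an OS variant with a semantic distance.
rank_os  = {"run1": 1, "run2": 2, "run3": 3, "run4": 4}
rank_sem = {"run1": 1, "run2": 3, "run3": 2, "run4": 4}
print(kendall_tau(rank_os, rank_sem))  # → 0.666... (one swapped pair out of six)
```

A τ close to 1 indicates that the two measures rank the 73 submitted runs almost identically, while a low τ signals that integrating the semantic distance changes which systems are judged best.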

Item Type: Conference or Workshop Item
Copyright Holders: 2010 ACM
Extra Information: Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
Keywords: evaluation; image annotation; evaluation measures; benchmark; WordNet
Academic Unit/School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Knowledge Media Institute (KMi)
Faculty of Science, Technology, Engineering and Mathematics (STEM)
Research Group: Centre for Research in Computing (CRC)
Item ID: 23434
Depositing User: Kay Dave
Date Deposited: 05 Oct 2010 10:20
Last Modified: 09 Dec 2018 10:28