The Open University
 

The effect of semantic relatedness measures on multi-label classification evaluation

Nowak, Stefanie; Llorente, Ainhoa; Motta, Enrico and Rüger, Stefan (2010). The effect of semantic relatedness measures on multi-label classification evaluation. In: Proceedings of the ACM International Conference on Image and Video Retrieval (CIVR '10), 5-7 July 2010, Xi'an, China, p. 303.

Full text: not publicly available for download due to copyright restrictions.
DOI (Digital Object Identifier) Link: http://dx.doi.org/10.1145/1816041.1816086

Abstract

In this paper, we explore different ways of formulating new evaluation measures for multi-label image classification when the vocabulary of the collection adopts the hierarchical structure of an ontology. We apply several semantic relatedness measures based on web-search engines, WordNet, Wikipedia and Flickr to the ontology-based score (OS) proposed in 'S. Nowak and H. Lukashevich. Multilabel Classification Evaluation using Ontology Information. In Proc. of IRMLeS Workshop, ESWC, 2009'. The final objective is to assess the benefit of integrating semantic distances into the OS measure. Hence, we have evaluated them in a real case scenario: the results (73 runs) provided by 19 research teams during their participation in the ImageCLEF 2009 Photo Annotation Task. Two experiments were conducted with a view to understanding which aspect of the annotation behaviour is most effectively captured by each measure. First, we compare the system rankings brought about by the different evaluation measures. This is done by computing the Kendall τ and Kolmogorov-Smirnov correlation between the rankings produced by pairs of measures. Second, we investigate how stably the different measures react to artificially introduced noise in the ground truth. We conclude that the distributional measures based on image information sources show a promising behaviour in terms of ranking and stability.
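The ranking comparison described in the abstract rests on Kendall τ, which scores the agreement between two orderings by counting concordant versus discordant pairs. As an illustration only (the system names and rank values below are hypothetical, not data from the paper), a minimal pure-Python sketch of the τ-a statistic:

```python
from itertools import combinations

def kendall_tau(ranks_a, ranks_b):
    """Kendall tau-a: (concordant - discordant) / number of pairs.

    ranks_a and ranks_b hold the rank each system receives under
    two different evaluation measures, in the same system order.
    """
    n = len(ranks_a)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        # A pair is concordant if both measures order systems i and j
        # the same way, discordant if they disagree.
        s = (ranks_a[i] - ranks_a[j]) * (ranks_b[i] - ranks_b[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical rankings of five runs under two measures;
# only the second- and third-placed runs swap positions.
tau = kendall_tau([1, 2, 3, 4, 5], [1, 3, 2, 4, 5])
print(tau)  # → 0.8
```

A τ of 1 means the two measures rank all systems identically, -1 means fully reversed orderings; values near 1 for a new measure indicate it preserves the ranking behaviour of the baseline OS measure.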

Item Type: Conference Item
Copyright Holders: 2010 ACM
Extra Information: Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
Keywords: evaluation; image annotation; evaluation measures; benchmark; WordNet
Academic Unit/Department: Knowledge Media Institute
Interdisciplinary Research Centre: Centre for Research in Computing (CRC)
Item ID: 23434
Depositing User: Kay Dave
Date Deposited: 05 Oct 2010 10:20
Last Modified: 26 Oct 2012 07:40
URI: http://oro.open.ac.uk/id/eprint/23434
