Overview of INEX 2014

Bellot, Patrice; Bogers, Toine; Geva, Shlomo; Hall, Mark; Huurdeman, Hugo; Kamps, Jaap; Kazai, Gabriella; Koolen, Marijn; Moriceau, Véronique; Mothe, Josiane; Preminger, Michael; SanJuan, Eric; Schenkel, Ralf; Skov, Mette; Tannier, Xavier and Walsh, David (2014). Overview of INEX 2014. In: Kanoulas, Evangelos; Lupu, Mihai; Clough, Paul; Sanderson, Mark; Hall, Mark; Hanbury, Allan and Toms, Elaine eds. Information access evaluation. Multilinguality, multimodality, and interaction: 5th International Conference of the CLEF Initiative, CLEF 2014, Sheffield, UK, September 15-18, 2014. Proceedings. Lecture Notes in Computer Science, 8685. Springer, Cham, pp. 212–228.

DOI: https://doi.org/10.1007/978-3-319-11382-1_19


INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2014 evaluation campaign, which consisted of three tracks. The Interactive Social Book Search Track investigated user information-seeking behavior when interacting with various sources of information for realistic task scenarios, and how the user interface impacts search and the search experience. The Social Book Search Track investigated the relative value of authoritative metadata and user-generated content for search and recommendation, using a test collection with data from Amazon and LibraryThing, including user profiles and personal catalogues. The Tweet Contextualization Track investigated tweet contextualization: helping users understand a tweet by providing them with a short background summary generated from relevant Wikipedia passages aggregated into a coherent summary. INEX 2014 was an exciting year for INEX, in which we ran our workshop as part of the CLEF labs for the third time. This paper gives an overview of all the INEX 2014 tracks, their aims and tasks, the test collections built, and the participants, and provides an initial analysis of the results.
