Herrmannova, Drahomira; Patton, Robert; Knoth, Petr and Stahl, Christopher (2017).
DOI: https://doi.org/10.1145/3057148.3057154
URL: https://dl.acm.org/citation.cfm?id=3057154
Abstract
In this paper we show that citation counts and Mendeley readership are poor indicators of research excellence. Our experimental design builds on the assumption that a good evaluation metric should be able to distinguish publications that have changed a research field from those that have not. The experiment has been conducted on a new dataset for bibliometric research, which we call TrueImpactDataset. TrueImpactDataset is a collection of research publications of two types: research papers that are considered seminal work in their area, and papers that provide a survey (a literature review) of a research area. The dataset also contains related metadata, including DOIs, titles, authors and abstracts. We describe how the dataset was built and provide overview statistics of the dataset. We propose to use the dataset for validating research evaluation metrics. Using this data, we show that widely used research metrics distinguish excellent research only poorly.
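The evaluation idea described in the abstract, that a useful metric should separate seminal papers from survey papers, can be illustrated with a short sketch. The file name ("true_impact.csv"), its column names, and the use of ROC AUC as the separability measure below are illustrative assumptions, not the paper's exact data format or protocol.

```python
# Sketch: measure how well a metric (here, citation count) separates
# seminal papers from survey papers via pairwise comparisons, which is
# equivalent to computing ROC AUC.
# NOTE: "true_impact.csv" with "type" and "citations" columns is a
# hypothetical layout, not the actual TrueImpactDataset format.
import csv

def auc(seminal_scores, survey_scores):
    """Probability that a random seminal paper outscores a random survey paper."""
    wins = ties = 0
    for s in seminal_scores:
        for v in survey_scores:
            if s > v:
                wins += 1
            elif s == v:
                ties += 1
    total = len(seminal_scores) * len(survey_scores)
    return (wins + 0.5 * ties) / total if total else float("nan")

seminal, survey = [], []
with open("true_impact.csv", newline="") as f:
    for row in csv.DictReader(f):
        score = float(row["citations"])
        (seminal if row["type"] == "seminal" else survey).append(score)

# A value near 0.5 means the metric barely distinguishes the two groups;
# a value near 1.0 means it separates them well.
print(f"ROC AUC of citation counts: {auc(seminal, survey):.3f}")
```

The same comparison can be repeated with Mendeley readership in place of citation counts to reproduce the kind of metric-by-metric assessment the abstract describes.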
About
- Item ORO ID: 59079
- Item Type: Conference or Workshop Item
- ISBN: 1-4503-5240-5, 978-1-4503-5240-6
- Keywords: information retrieval; scholarly communication; publication datasets; data mining; research evaluation; semantometrics
- Academic Unit or School:
  - Faculty of Science, Technology, Engineering and Mathematics (STEM) > Knowledge Media Institute (KMi)
  - Faculty of Science, Technology, Engineering and Mathematics (STEM)
- Research Group: Big Scientific Data and Text Analytics Group (BSDTAG)
- Copyright Holders: © 2017 Association for Computing Machinery
- Depositing User: Drahomira Herrmannova