Research Metrics

  • One metric can’t fully capture the research impact of an individual, institution or other entity
  • A “basket of metrics” approach is more nuanced and progressive

Responsible Research Assessment

Citation justice

  • Research metrics, particularly those based on citations, perpetuate inequities in academia
  • Citation-based metrics:
    • privilege works by white, male researchers
      • female authors are cited less on average than male authors; this gap spans disciplines and does not reflect differences in research quality
      • racialized and other marginalized researchers are cited less than their white counterparts, even if they have more expertise
  • Citation justice advocates for considering equity in citations
    • Cite scholars from marginalized groups
    • Cite works which may not fit narrow definitions of “scholarly,” for example oral histories

Metrics tools and resources

Explore a variety of tools and resources to help you measure and demonstrate the impact of your research.

Citation metrics

Citation metrics treat article citations as the gold standard for measuring research impact.

  • Used to evaluate individual researchers, departments/centres, institutions, disciplines, countries and other entities.
  • Challenges:
    • vulnerability to “gaming”
    • failure to adequately capture differences between disciplines and journals
    • tendency to privilege pure over applied research.

Citation metrics sources:

  • H-index
    • Author-level metric: the largest number (h) of an author’s articles that have each been cited at least h times (a short calculation sketch follows this list)
    • Available from Google Scholar
    • Other h metrics include:
      • H-core – set of top cited h articles from a journal
      • H-median – median of citation counts in a journal’s h-core
      • h5-index, h5-core, h5-median – publication metrics for a journal’s articles published in the past five years
  • Impact Factor
    • A journal’s Impact Factor is the number of citations in a given year to documents the journal published in the two previous years, divided by the total number of documents published in those two years (a worked example follows this list)
    • Proprietary designation controlled by Clarivate Analytics and only applicable to publications included in Journal Citation Reports
    • Available via Web of Science
  • SCImago Journal Rank
    • Similar to Google PageRank, SJR ranks journals by their average prestige per article
    • SJR calculations consider both the number of citations and the prestige of the citing journals
    • Freely available; based on information from the Scopus database
    • Developed by a research group from the Consejo Superior de Investigaciones Científicas (CSIC) and the universities of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares
  • Source-Normalized Impact per Paper (SNIP)
    • Measures the average citation impact of a journal’s publications and corrects for differences in citation practices between disciplines
    • Based on data from Scopus
    • Produced by the Centre for Science and Technology Studies, Leiden University
  • Eigenfactor Score
  • CiteScore
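
The h-index calculation described above can be sketched in a few lines of Python (this example is not from the guide and uses invented citation counts):

    def h_index(citation_counts):
        # h-index: the largest number h such that the author has h papers,
        # each cited at least h times
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical author with five papers cited 10, 8, 5, 4 and 1 times: h-index = 4
    print(h_index([10, 8, 5, 4, 1]))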
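
The two-year Impact Factor arithmetic works the same way; a minimal sketch with invented figures (the function name and numbers are illustrative only):

    def impact_factor(citations_this_year, items_published_prev_two_years):
        # Citations received this year to documents the journal published in the
        # two previous years, divided by the number of documents published in
        # those two years
        return citations_this_year / items_published_prev_two_years

    # Hypothetical journal: 300 citations in 2023 to the 150 documents it
    # published in 2021-2022 gives an Impact Factor of 2.0
    print(impact_factor(300, 150))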

Altmetrics

  • Alternative metrics include broader impact measures such as media coverage, social media sharing and public engagement
  • Can be used by individual researchers, departments, institutions, publications and more
  • For an overview of altmetrics, advice, usage examples and more, consult the Altmetrics Guide created by the University of Waterloo

Challenges:

  • May fail to capture a full range of scholarship by omitting non-article outputs such as preprints, posters, data sets, conference proceedings, etc.
  • Data not normalized
  • Newer, not as widely known as citation metrics

Popular Altmetrics tools:

  • Dimensions
    • Pulls and links data from various sources, including grants, conference proceedings, preprints, books and chapters, journal articles, patents, clinical trials, and data sets
    • Tracks:
      • Citations and citation ratios
      • Social media sharing
      • Mentions in public policy documents
      • Wikipedia references
      • Media coverage
    • Includes:
      • Researcher profiles and metrics
      • Article level metrics
      • Disciplinary, institutional comparisons
  • Altmetric
  • Altmetric bookmarklet
    • Browser bookmarklet that provides altmetrics for individual articles which have DOIs and/or are available via PubMed or arXiv.
  • Plum Analytics
    • Captures metrics for all types of scholarship and categorizes according to usage, captures, mentions, social media and citations
    • Specific products for institutions, institutional repositories, research departments/groups, research funders
  • Impactstory
    • For individual researchers
    • Tracks and ranks all research outputs via data from citations, social media, data and code repositories and other sources
    • Links to users’ ORCID profiles
    • Free accounts for Twitter users
  • ResearchGate Score

The Metrics Toolkit is an evidence-based resource to help you explore and select metrics that best fit your discipline and desired outcome – e.g. cultural impact, attention/reach.

An introduction to research metrics and the Dimensions database

  • Workshop materials from the Brock Library

Measuring research output through bibliometrics

Assessing the impact of research

Approaches to assessing impacts in the Humanities and Social Sciences

 

Questions?

Contact Elizabeth Yates, Research and Scholarly Communication Librarian, at eyates@brocku.ca