- One metric can’t fully capture the research impact of an individual, institution or other entity
- A “basket of metrics” approach is more nuanced and progressive
Responsible Research Assessment
- Institutions, funders & researchers are increasingly challenging reliance on quantitative measures of research impact
- Academia is moving towards responsible research assessment (RRA): “an umbrella term for approaches to assessment which incentivise, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.” — Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment.
- Some examples of advocacy for RRA:
- Declaration on Research Assessment (DORA)
- Catalyst for holistic, equitable research evaluation across all disciplines
- Coalition for Advancing Research Assessment (CoARA)
- Advocates for primarily qualitative assessment, balanced with quantitative data
- Highlights importance of considering diversity, inclusiveness and collaboration
Citation justice
- Research metrics, particularly those based on citations, perpetuate inequities in academia
- Citation-based metrics:
- privilege works by white, male researchers
- female researchers are cited less on average than their male counterparts; this gap spans disciplines and does not reflect research quality
- racialized and other marginalized researchers are cited less than their white counterparts, even if they have more expertise
- Citation justice advocates for considering equity in citations
- Cite scholars from marginalized groups
- Cite works that may not fit narrow definitions of “scholarly,” such as oral histories
Metrics tools and resources
Explore a variety of tools and resources to help you measure and demonstrate the impact of your research.
Citation metrics focus on article citations as the gold standard for measuring the impact of research.
- Used to evaluate individual researchers, departments/centres, institutions, disciplines, countries and other entities.
- Challenges:
- vulnerability to “gaming”
- failure to adequately capture differences between disciplines and journals
- tendency to privilege pure over applied research.
Citation metrics sources:
- Google Scholar Citations
- Track citations to your articles
- See who is citing them
- Graph citations over time
- H-index
- Author-level metric: the number (h) of an author’s articles that have each been cited at least h times
- Available from Google Scholar
- Other h metrics include:
- H-core – set of top cited h articles from a journal
- H-median – median of citation counts in a journal’s h-core
- h5-index, h5-core, h5-median – publication metrics for a journal’s articles published in the past five years
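The h-index definition above can be expressed as a short calculation. A minimal sketch, using a hypothetical set of citation counts:

```python
def h_index(citations):
    """Return the largest h such that h articles have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this article still has at least `rank` citations
        else:
            break
    return h

# Hypothetical author with five articles cited 10, 8, 5, 4 and 3 times:
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four articles cited at least 4 times each)
```

The fifth article, with only 3 citations, cannot raise the index to 5, so h stays at 4.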
- Impact Factor
- A journal’s Impact Factor is the number of citations in a given year to documents published in the two previous years, divided by the number of citable documents published in those same two years
- Proprietary designation controlled by Clarivate Analytics and only applicable to publications included in Journal Citation Reports
- Available via Web of Science
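The Impact Factor formula above is a simple ratio; a sketch with hypothetical numbers:

```python
def impact_factor(citations, citable_items):
    """Citations in a given year to documents from the two previous years,
    divided by the number of citable items published in those two years."""
    return citations / citable_items

# Hypothetical journal: 500 citations in 2018 to documents published in
# 2016-2017, during which it published 200 citable items:
print(impact_factor(500, 200))  # -> 2.5
```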
- SCImago Journal Rank
- Similar to Google PageRank, SJR ranks journals by their average prestige per article
- SJR calculations include both citations and journal prestige
- Freely available; based on information from the Scopus database
- Developed by the SCImago research group, drawing on the Consejo Superior de Investigaciones Científicas (CSIC) and the universities of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares
- Source-Normalized Impact per Paper (SNIP)
- Measures the average citation impact of a journal’s publications and corrects for differences in citation practices between disciplines
- Based on data from Scopus
- Produced by the Centre for Science and Technology Studies, Leiden University
- Eigenfactor Score
- Measures how frequently articles from a journal have been cited in a given year
- Weighted by size; larger journals have higher scores
- Developed at the University of Washington
- CiteScore
- For a given title, CiteScore counts the citations received in one year by all documents published in the prior three years, divided by the number of those documents
- Developed by Elsevier; Based on data from Scopus
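CiteScore as described above (one year of citations over a three-year publication window) can be sketched with hypothetical per-year document counts:

```python
def cite_score(citations_in_year, docs_by_year, year):
    """Citations received in `year` by documents published in the prior
    three years, divided by the number of those documents."""
    window = (year - 3, year - 2, year - 1)
    n_docs = sum(docs_by_year[y] for y in window)
    return citations_in_year / n_docs

# Hypothetical title: 90 documents over 2015-2017, cited 180 times in 2018:
docs = {2015: 30, 2016: 30, 2017: 30}
print(cite_score(180, docs, 2018))  # -> 2.0
```

The main contrast with the Impact Factor is the wider (three-year) window and the use of all document types rather than only citable items.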
- Alternative metrics include broader impact measures such as media coverage, social media sharing, public engagement
- Can be used by individual researchers, departments, institutions, publications and more
- For an overview of altmetrics, advice, usage examples and more consult the Altmetrics Guide created by the University of Waterloo
Challenges:
- May fail to capture a full range of scholarship by omitting non-article outputs such as preprints, posters, data sets, conference proceedings, etc.
- Data are not normalized across platforms or disciplines
- Newer, not as widely known as citation metrics
Popular Altmetrics tools:
- Dimensions
- Pulls and links data from various sources, including grants, conference proceedings, preprints, books and chapters, journal articles, patents, clinical trials, and data sets
- Tracks:
- Citations and citation ratios
- Social media sharing
- Mentions in public policy documents
- Wikipedia references
- Media coverage
- Includes
- Researcher profiles and metrics
- Article level metrics
- Disciplinary, institutional comparisons
- Altmetric
- Tracks myriad sources for mentions of research outputs to calculate an Altmetric score
- Sources tracked include news media, social media, citations, public policy documents and multimedia platforms
- Specific Altmetric streams for researchers, institutions and publishers
- Articles in the Brock Digital Repository feature Altmetric scores; see example
- Altmetric bookmarklet
- Bookmarklet that displays altmetrics for individual articles which have DOIs and/or are available via PubMed or arXiv.
- Plum Analytics
- Captures metrics for all types of scholarship and categorizes according to usage, captures, mentions, social media and citations
- Specific products for institutions, institutional repositories, research departments/groups, research funders
- Impactstory
- For individual researchers
- Tracks and ranks all research outputs via data from citations, social media, data and code repositories and other sources
- Links to users’ ORCID profiles
- Free accounts for Twitter users
- ResearchGate Score
- ResearchGate calculates a score based on peer evaluations of users’ contributions
- Contributions can include publications, data, etc
- Your RG score is weighted by the RG score of whoever is evaluating your work
The Metrics Tool Kit is an evidence-based resource to help you explore and select metrics that best fit your discipline and desired outcome – e.g. cultural impact, attention/reach.
An introduction to research metrics and the Dimensions database
- Workshop materials from the Brock Library
Measuring research output through bibliometrics
- From the Working Group on Bibliometrics, University of Waterloo. Winter, 2016
Assessing the impact of research
- A model for tracking diffusion of research outputs and activities; from the Bernard Becker Medical Library, Washington University
Approaches to assessing impacts in the Humanities and Social Sciences
- From Federation for the Humanities and Social Sciences: Addresses benefits & risks of impact assessment; strengths and weaknesses of assessment approaches
Research Metrics © 2018 by Elizabeth Yates is licensed under CC BY-SA 4.0