Research Impact Challenge

Day 5: Monitor your impact

Welcome to the Last Day of the Research Impact Challenge! The challenges thus far have explored different ways to create an online scholarly presence. Now that you have curated that presence, today's task is to monitor the impact of your research outputs.

To complete today's challenge, you will explore tools that can be used to track your impact. 

  1. You can set up citation alerts to receive notifications when your research is cited.
    • Google Scholar citation alerts
    • Web of Science citation alerts
    • Scopus citation alerts 
  2. You can monitor page views and downloads. While you can't be certain that views and downloads mean people are reading your work, they can be a good proxy for interest. Some repositories, like Dryad, provide this information so you can track interest in your datasets. 
  3. Publishers like PLOS display traditional citations, alternative metrics, page views, and download information for individual articles directly on their websites. Many of these metrics are available only there. Some publishers, like PeerJ and Frontiers, will even send you an email to update you on views, downloads, and use. 
  4. Academic social networks like Academia.edu and ResearchGate report views and downloads for your uploaded content. You can also turn on email notifications by visiting your "Account Settings."
  5. Altmetric.com allows you to track alternative metrics (i.e., altmetrics). Install the Altmetric Bookmarklet in your browser (available for Chrome, Firefox, Safari) to instantly get altmetric data for articles with a DOI, PubMed ID, or ArXiv ID. Visit the bookmarklet link, and drag the "Altmetric It!" button into your browser's menu bar. 

It is important to keep in mind that no single metric aggregates all impact information. You may want to adopt a hybrid approach and combine different metrics to demonstrate your impact. Impact metrics also vary across disciplines: fifty citations may seem like a lot in one discipline but represent only a small fraction of typical counts in another. And while high metrics can be a good sign, low metrics do not necessarily mean the work hasn't made a significant impact. 

For more information, please contact the Research Communications Librarian.

Finding an appropriate metric

There is a wide range of data and tools available for measuring research impact. Databases like Scopus, Web of Science, Dimensions, and Google Scholar aggregate data to record the number of times a journal article has been cited by other publications. These citations can then be used to produce author metrics, article metrics, and journal metrics that indicate impact.

Note that each database aggregates from different journals and sources with some overlap. Because of these differences, the metric for the same paper/journal/author in one database may differ from that of another.

For more information on metrics and impact, register for a Research Impact Series class from the Lane Library. 

Author-level metrics measure the impact and productivity of a researcher. The most common is the h-index, which is often used as a “yardstick” to measure and compare researchers and scholars. The h-index was created by physicist Jorge Hirsch in 2005 to give “an estimate of the importance, significance and broad impact of a scientist’s cumulative research contributions … A scientist has index h if h of his or her Np papers have at least h citations each and the other (Np - h) papers have ≤ h citations each” (Hirsch, 2005). This calculation combines two variables to attribute impact to a researcher: the number of publications and the number of citations they receive.
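
To make the definition concrete, here is a minimal sketch of how an h-index can be computed from a list of per-paper citation counts. The function name and sample numbers are hypothetical and for illustration only.

```python
def h_index(citation_counts):
    """Compute an h-index from a list of per-paper citation counts.

    A researcher has index h if h of their papers have at least
    h citations each (Hirsch, 2005).
    """
    # Sort citation counts from highest to lowest.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        # The h-index is the largest rank at which the paper in that
        # position still has at least that many citations.
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical example: five papers with these citation counts give an
# h-index of 3 (three papers each cited at least 3 times).
print(h_index([10, 8, 3, 2, 1]))  # -> 3
```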

To locate the h-index of a researcher, search by author in Google Scholar, Scopus, or Web of Science.

Article-level metrics, or citation metrics, are used to determine whether a publication (commonly an article) has been cited by other works. The number of times an article is cited can indicate its importance to a particular field of study, the popularity of the topic, or its contested nature. Citation count is useful, but it should not be the only criterion used to evaluate the impact of an article; the number of times a paper is cited does not indicate its actual quality. Some disciplines have relatively few journals and lower citation activity, so their counts should not be compared with those of other disciplines. 

You can locate citation counts and article metrics from Dimensions Plus, Scopus, Web of Science, and Google Scholar. Altmetric also aggregates data to provide non-traditional citation metrics for articles. To learn more, please contact the Research Communications Librarian.

Journal-level metrics track citation patterns within journals and identify which journals are highly cited. The best-known journal metric is the Journal Impact Factor (JIF), published in the Journal Citation Reports (JCR) by Clarivate, the vendor of Web of Science. It is important to note that metrics can't tell you which are the 'best' journals, but they can help you identify journals that receive more attention on average than others. While publishing in a highly cited or highly discussed journal won't guarantee that your paper will be read, cited, or shared, it can help raise the profile of your work and strengthen your CV. Ultimately, however, the decision of where to publish depends on many factors that are beyond the scope of metrics.
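
For context, the JIF for a given year is the number of citations received that year to items the journal published in the previous two years, divided by the number of citable items it published in those two years. Here is a minimal sketch of that calculation with hypothetical numbers:

```python
def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Journal Impact Factor for year Y: citations received in Y to items
    published in years Y-1 and Y-2, divided by the number of citable items
    published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 1,200 citations in 2023 to articles published in
# 2021-2022, which together contained 400 citable items.
print(journal_impact_factor(1200, 400))  # -> 3.0
```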

To locate journal metrics, search by journal name or subject area in Scopus, Journal Citation Reports, the Eigenfactor journal rankings, or the SCImago Journal Rank (SJR). This is just a snapshot of the available journal rankings. 

Alternative metrics (i.e., altmetrics) are a newer class of metrics that trace how published research outputs are accessed, disseminated, shared, and discussed on the web. They supplement formal citation counts and the metrics that rely on them. As an ever-evolving class of metrics, there is no definitive list of data sources. Commonly included metrics are article page views; downloads; comments on a publisher’s platform; shares on Facebook, LinkedIn, and Twitter; Zotero and CiteULike social bookmarks; Wikipedia references; mentions on Reddit or other discussion boards; and shares on Mendeley or other social networks for researchers. Keep in mind that online social behavior shifts as platforms rise and fall in popularity, and altmetrics will reflect those shifts.

Altmetrics data are available through PlumX Metrics in Scopus and on other journal and publisher websites. To instantly see altmetric data for any published research output with a DOI, PubMed ID, or ArXiv ID, install the free Altmetric Bookmarklet in your browser (available for Chrome, Firefox, and Safari). Visit the link and drag the "Altmetric It!" button into your browser's menu bar.
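
Beyond the bookmarklet, altmetric data can also be retrieved programmatically. The sketch below assumes Altmetric's free Details Page API endpoint (https://api.altmetric.com/v1/doi/<doi>) and a few commonly documented response fields; check Altmetric's current documentation for terms of use, rate limits, and exact field names before relying on it.

```python
import json
import urllib.error
import urllib.request

def altmetric_summary(doi):
    """Fetch a small summary of Altmetric attention data for a DOI.

    Assumes the free Altmetric Details Page API; the response field
    names used here are assumptions based on its public documentation.
    """
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            record = json.load(response)
    except urllib.error.HTTPError as err:
        # A 404 typically means Altmetric has recorded no attention for this DOI.
        if err.code == 404:
            return None
        raise
    return {
        "title": record.get("title"),
        "altmetric_score": record.get("score"),
        "tweeters": record.get("cited_by_tweeters_count"),
        "news_outlets": record.get("cited_by_msm_count"),
    }

# Hypothetical usage with one of your own article DOIs:
# print(altmetric_summary("10.1234/example-doi"))
```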

Metrics Toolkit

The Metrics Toolkit can help you navigate the research metrics landscape. Take a few minutes to browse the website and check out the different metrics and their use cases. Go to 'Explore Metrics' and use the tabs at the top to filter by publication type and see which metrics are suggested for your impact needs. Remember, there may be more than one appropriate indicator, and you may want to use a combination of metrics to demonstrate your impact.