Riding the crest of the altmetrics wave: How librarians can help prepare faculty for the next generation of research impact metrics

Scott Lapinski; Heather Piwowar; Jason Priem

* Contact Claire Stewart, series editor and head of digital collections and scholarly communication service at Northwestern University, with article ideas.


Over the last decade, scholars have begun a great migration into online spaces, moving workflows and discussions to online platforms like Mendeley, blogs, Twitter, Facebook, and more. In these new spaces, once-invisible interactions like reading, saving, discussing, and recommending become visible. They leave traces. Observing these traces can inform new metrics of scholarly influence and impact—so-called altmetrics.1

These altmetrics are fast: data appears in days or weeks, instead of the years required by citations. More importantly, they are diverse, tracking impacts all across a quickly changing scholarly communication landscape populated by:

  • diverse products beyond the paper, including datasets, software, and blog posts;
  • diverse platforms beyond the journal, like institutional repositories and online communities; and
  • diverse audiences beyond the academy, including practitioners, clinicians, and the general public.

University faculty, administration, librarians, and publishers alike are beginning to discuss how and where altmetrics can be useful for evaluating a researcher’s academic contribution.2 As interest grows, libraries are in a unique position to help facilitate an informed dialogue with the various constituencies that will intersect with altmetrics on campus, including both researchers (students and faculty) and academic administrative offices (faculty affairs, research and grants, promotion and tenure committees, and so on).

Librarians can provide this support in three main ways: informing emerging conversations with the latest research, supporting experimentation with emerging altmetrics tools, and engaging in early altmetrics education and outreach.

Know the literature

Librarians can begin by familiarizing themselves with the current state of discussion around altmetrics. Good places to start include a recent SPARC report,3 Finbar Galligan and Sharon Dyas-Correia’s excellent overview,4 and the recent ASIS&T Bulletin special issue on altmetrics.5 Librarians should also be familiar with the growing body of peer-reviewed research on altmetrics. An important concept from this literature is the idea of “impact flavors,” a way to understand the distinctive patterns in the diverse impacts of individual products. A product featured in mainstream media stories, blogged about, and downloaded by the public, for instance, has a very different flavor of impact than one heavily saved and discussed by scholars, which is in turn different from one highly cited in research papers. Altmetrics can help researchers, funders, and administrators optimize for the mix of flavors that best fits their particular goals.6
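
To make the idea of flavors concrete, the sketch below compares invented metric profiles for three hypothetical papers using cosine similarity. All counts are made up for illustration, and the per-metric normalization is our own simplification (tools like ImpactStory address the same scaling problem with percentiles); it is not the method used in the cited research.

```python
# A minimal sketch of "impact flavors": each product's metrics form a
# profile, and profiles can be compared to see which products have
# similar kinds of impact. All numbers are invented for illustration.
from math import sqrt

def normalize(products):
    """Scale each metric to [0, 1] across products so that one
    high-volume metric (e.g., downloads) does not swamp the profile."""
    maxima = {}
    for profile in products.values():
        for metric, count in profile.items():
            maxima[metric] = max(maxima.get(metric, 0), count)
    return {name: {m: c / maxima[m] for m, c in profile.items() if maxima[m]}
            for name, profile in products.items()}

def cosine(a, b):
    """Cosine similarity between two normalized metric profiles."""
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in set(a) | set(b))
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

products = {
    # A "public" flavor: media stories, blog posts, heavy downloads.
    "paper_a": {"news": 12, "blogs": 8, "downloads": 3000,
                "mendeley": 15, "citations": 2},
    # A "scholarly" flavor: heavily saved and cited, little public buzz.
    "paper_b": {"news": 0, "blogs": 1, "downloads": 400,
                "mendeley": 220, "citations": 45},
    "paper_c": {"news": 1, "blogs": 0, "downloads": 350,
                "mendeley": 250, "citations": 38},
}
norm = normalize(products)
print(round(cosine(norm["paper_a"], norm["paper_b"]), 2))  # ~0.15: different flavors
print(round(cosine(norm["paper_b"], norm["paper_c"]), 2))  # ~0.98: similar flavors
```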

Other noteworthy research has examined correlations between altmetrics and traditional citations, finding that some altmetrics sources, particularly Mendeley, are significantly correlated with citation counts (around .5 in several studies).7,8,9 The same research shows that other sources, like Facebook bookmarks, correlate only slightly with citations; this suggests that they track different kinds of impact. Other early touchstones include studies exploring the predictive potential of altmetrics,10 growing adoption of the social media tools that inform altmetrics,11 and insights from article readership patterns.12
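
For readers who want to replicate this kind of analysis on their own data, the sketch below computes a simple Pearson correlation between Mendeley reader counts and citation counts. The per-article counts are invented; the cited studies used real article samples and often rank-based (Spearman) correlations.

```python
# A minimal sketch of correlating an altmetrics source with citations.
# The per-article counts below are invented for illustration.
from math import sqrt
from statistics import mean

mendeley_readers = [5, 12, 30, 8, 55, 2, 40, 18, 25, 60]
citations        = [2, 10, 22, 3, 35, 1, 18, 15, 9, 44]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r = {pearson(mendeley_readers, citations):.2f}")
```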

Know the tools

Altmetrics are in active use today: several tools allow scholars to collect and share the broad impact of their research portfolios. In the same way a librarian would experiment with new features added to a once-familiar search interface just before the fall semester, librarians can play around with altmetrics tools to add them to their bibliographic instruction repertoire. Familiarity will enable a librarian to do easy demonstrations, discuss strengths and weaknesses, contribute to product development direction, and serve as a resource point for campus scholars and administration during the upcoming transition to Web-native scholarship.

A great place to start experimenting is ImpactStory, a nonprofit Web application created by two of this article’s coauthors (Jason Priem and Heather Piwowar) and supported by the Alfred P. Sloan Foundation. Scholars upload their articles, datasets, software repositories, and other products to ImpactStory using Google Scholar, ORCID, or DOI lists. ImpactStory then gathers and reports both altmetrics and traditional citations for each product. As shown in figure 1, metrics are displayed as percentiles relative to similar products; data can be exported for further analysis. ImpactStory is built on open-source code, offers open data, and is free to use.13
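
ImpactStory’s data can also be retrieved programmatically. The endpoint and response shape in the sketch below are our assumptions for illustration only; consult ImpactStory’s own documentation for the actual API.

```python
# A hypothetical sketch of fetching metrics for one product by DOI.
# The endpoint URL and the "metrics" field are assumptions, not the
# documented ImpactStory API; check http://impactstory.org for details.
import json
import urllib.request

doi = "10.1186/1471-2105-14-16"  # an example DOI
url = f"https://api.impactstory.org/item/doi/{doi}"  # assumed endpoint

with urllib.request.urlopen(url) as resp:
    item = json.load(resp)

for metric, value in item.get("metrics", {}).items():
    print(metric, value)
```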


Figure 1: ImpactStory report for a BioMed Central article.

PlumX is another tool that provides impact profiles for scholars. Like ImpactStory, PlumX is a Web application that displays altmetrics for a wide range of scholarly products. PlumX is available through university-wide subscription; users can experiment with a free demo version.14

Integrate altmetrics into library outreach and education

Establishing strong familiarity with altmetrics tools will allow librarians to enhance standard bibliographic instruction with this added perspective. As opportunities for outreach and engagement present themselves, librarians will find that even a brief demo of a tool like ImpactStory, with its ability to pull together usage data from the variety of resources hosting a scholar’s work, will stimulate interest among content producers and users alike.

Help with specific research databases, guidance on publishing choices, and assistance with bibliographic formatting for an author’s manuscript are all common requests at the library. As students and faculty engage with the library in any of these activities, whether through drop-in research consultations or invitations to speak at student orientations and faculty meetings, librarians can touch on altmetrics and demonstrate where these metrics will intersect with published research.

Showing article-level metrics, as in the PLoS journals, or download statistics from PubMed Central and other institutional repositories allows for a quick introduction to venues where researchers (as “content producers”) may consider participating. Depending on the options available on one’s campus, it can also be an opportunity to highlight the benefits of participating in open access venues, and to discuss “impact” as something more closely tied to an individual’s scholarship rather than a number assigned exclusively to a journal title. The conversation can also segue into a useful discussion of the limitations of relying too heavily on the more familiar Journal Impact Factor (JIF) or h-index (an author-level metric calculated from the number of articles an author has published and the citations each of those articles has received).
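
As a concrete example of article-level metrics, PLoS exposes its ALM data through a public Web API. The version, parameters, and response fields in this sketch are assumptions about that service and may have changed; check the current ALM documentation (including whether an API key is required) before relying on it.

```python
# A hedged sketch of pulling article-level metrics for one PLoS article.
# The v5 endpoint, "info=summary" parameter, and response fields are
# assumptions for illustration; verify against the current ALM docs.
import json
import urllib.request
from urllib.parse import urlencode

doi = "10.1371/journal.pone.0004803"  # the clickstream paper cited in note 12
query = urlencode({"ids": doi, "info": "summary"})
url = f"http://alm.plos.org/api/v5/articles?{query}"

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

article = data["data"][0]
print(article.get("viewed"), "views;", article.get("cited"), "citations")
```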

Where traditional measures like the JIF or h-index are “encouraged” by faculty or departments on campus as the way to measure the impact of a scholar’s article output, the library can play an important role by introducing those same faculty and departments to altmetrics. A researcher or a faculty affairs office may ask the library for assistance and instruction in calculating an h-index within various databases (Web of Science, Scopus, and so on). Integrating altmetrics into these instruction sessions is in the same spirit as providing a researcher with several additional choices of primary resources for a research project: we need to make researchers aware of the options available for evaluating the impact of scholarship, and of the relevant research, so they can make informed choices.
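
Since the h-index comes up so often in these consultations, it can help to show how simple the underlying calculation is. Below is a minimal sketch with invented citation counts; in practice the counts would come from Web of Science, Scopus, or Google Scholar.

```python
# h-index: the largest h such that h of an author's papers have at
# least h citations each. Citation counts below are invented.
def h_index(citation_counts):
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1, 0]))  # prints 3
```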

Libraries should take advantage of opportunities to demonstrate ImpactStory or other altmetrics tools to multiple constituency groups and to share visual examples of how these metrics have been integrated into a researcher’s profile. If a faculty member has not yet had a chance to visualize usage data from a conference presentation on SlideShare, or from a video interview posted on Vimeo, here is a chance to, if not pique interest, at least raise awareness of the possibilities.

The added benefit of sharing these examples with researchers on campus may extend beyond an introduction to altmetrics, providing a window into online communities that are sharing scholarship in ways in which the researcher had not yet considered participating.

Conclusion

Traditional impact measures, most commonly the JIF and the h-index, continue to be the source of much debate and, over the years, have provoked many suggestions for ways in which their interpretation (or algorithms) could be improved. Altmetrics are not a complete answer to the shortcomings inherent in these traditional measures. However, altmetrics do allow assessment directly at the level of the individual product, rather than the publication venue. Moreover, they cover the growing diversity in scholarly products, platforms, and people.

Of course, early excitement about altmetrics’ potential must be tempered by appropriate caution; research into the validity and reliability of altmetrics is still in its infancy. However, as we transition from a paper-native to a Web-native scholarly communication system,15 these new metrics are likely to grow in importance. As they do, librarians are well positioned to inform and support researchers and decision makers in their use.


Notes
1. J. Priem, D. Taraborelli, P. Groth, and C. Neylon, “Alt-metrics: A Manifesto,” altmetrics.org, 2010, http://altmetrics.org/manifesto (accessed May 15, 2013).
2. F. Galligan and S. Dyas-Correia, “Altmetrics: Rethinking the Way We Measure,” Serials Review 39 (2013): 56–61.
3. G. Tananbaum, “Article-Level Metrics: A SPARC Primer,” SPARC, www.sparc.arl.org/bm~doc/sparc-alm-primer.pdf.
4. Galligan and Dyas-Correia, “Altmetrics: Rethinking the Way We Measure.”
5. H. Piwowar, “Altmetrics: What, Why and Where?,” ASIS&T Bulletin, 2013, www.asis.org/Bulletin/Apr-13/AprMay13_Piwowar.html.
6. J. Priem, H. Piwowar, and B. M. Hemminger, “Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact,” 2012, http://arxiv.org/abs/1203.4745 (accessed May 15, 2013).
7. Ibid.
8. J. Bar-Ilan, S. Haustein, I. Peters, J. Priem, and H. Shema, “Beyond Citations: Scholars’ Visibility on the Social Web,” 2012, http://arxiv.org/abs/1205.5611 (accessed May 15, 2013).
9. X. Li, M. Thelwall, and D. Giustini, “Validating Online Reference Managers for Scholarly Impact Measurement,” Scientometrics 91 (2011): 461–471.
10. G. Eysenbach, “Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact,” Journal of Medical Internet Research 13, no. 4 (2011): e123.
11. J. Priem, K. Costello, and T. Dzuba, “Prevalence and Use of Twitter among Scholars,” presented in New Orleans, LA, 2011, http://figshare.com/articles/Prevalence_and_use_of_Twitter_among_scholars/104629.
12. J. Bollen, H. Van de Sompel, A. Hagberg, L. Bettencourt, and R. Chute, “Clickstream Data Yields High-Resolution Maps of Science,” PLoS ONE 4 (2009): e4803.
13. http://impactstory.org.
14. www.plumanalytics.com.
15. J. Priem, “Scholarship: Beyond the Paper,” Nature 495 (2013): 437–440.
Copyright © 2013 Scott Lapinski, Heather Piwowar, and Jason Priem
