Something to know a bit about
Have you heard about this…this altmetrics (short for alternative assessment metrics or alternative metrics) business? We know it has to be important because it is one of the topics for the Chapter Sharing Roundtables at this year’s Medical Library Association Conference. We know it might even be super important because it is right at the top of the list of 25 topics being discussed. It beat out Building a Network of Partners, which came in second, followed by Consumer Health, Copyright Issues, and Embedded Librarians. Wow!
Why is it a good idea for librarians to pay enough attention to altmetrics that we can carry on an intelligent conversation with an…Altmetricologist (fyi, not a real word…yet)? We hope that by reading through the rest of this article and following the links to some websites, you’ll be able to determine for yourself whether this is something you need to add to your Things-To-Know-More-About list.
According to the good folk at altmetrics.org*, altmetrics is “the creation and study of new metrics on the Social Web for analyzing and informing scholarship.”
The folks at www.altmetric.com (no apparent affiliation with altmetrics.org) explain that altmetrics works on the premise that researchers are very interested in what other people think of their work (it’s true…they do care) and whether it is having some sort of impact. Altmetrics goes beyond, or provides an alternative to, the traditional method of measuring the impact of published research, which operates at the journal level through impact factors based on how many times a paper has been cited. Altmetrics instead harvests the more real-time buzz around someone’s published work. When papers are published, the authors hope and pray that people are discussing their work. Typically, people share what they think of other people’s work on social media sites, publishers’ sites, daily newspapers, government regulatory sites, and wherever else people go to discuss published research. Discussion of a paper in any one of these outlets produces impact. Measuring that discussion requires access to companies like Altmetric.com, whose tools monitor online discussion outlets, compile what is being said, assign scores to different levels of discussion, and roll it all up into one report for a specific published work.
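To make the idea concrete, here is a minimal sketch of how a mention-scoring step like the one described above might work. The source names and weights below are invented for illustration; real providers such as Altmetric.com use their own proprietary sources and weightings.

```python
# Hypothetical sketch: weight and sum online mentions of one paper.
# Sources and weights are made up for illustration only.
SOURCE_WEIGHTS = {
    "news": 8.0,     # mainstream news coverage counts heavily
    "blog": 5.0,     # blog posts
    "policy": 3.0,   # government/regulatory documents
    "social": 1.0,   # social media mentions
}

def mention_score(mentions):
    """Compile one attention score from per-source mention counts.

    `mentions` maps a source type (e.g. "news") to the number of
    times the paper was discussed in that kind of outlet.
    Unknown source types contribute nothing.
    """
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Example: a paper picked up by 2 news outlets, 3 blogs,
# and mentioned 40 times on social media.
score = mention_score({"news": 2, "blog": 3, "social": 40})
print(score)  # 2*8 + 3*5 + 40*1 = 71.0
```

A real service layers much more on top of this (deduplication, author identity, time decay), but the core idea is just this kind of weighted tally rolled into one report per paper.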
To help better explain altmetrics, let’s look at a scenario from Altmetric.com. A paper published in a research journal reports on a particularly effective way to get HIV/AIDS patients to comply with their doctor’s prescribed therapy. A nurse reads the paper, begins using the method, and also finds it to be effective. The nurse then shares these results on an email list, such as any of the great lists hosted by the Health Care Education Association. That sharing, and any potential ripple effects, can be collated using altmetric tools, and the impact of the paper can be measured. Without altmetrics, however, the authors of the paper might never know about this impact, because the nurse wouldn’t report the successful implementation in a formal venue like a journal.
Altmetrics can potentially shorten the time it takes for peer-reviewed scholarly work to be identified as a good (or bad) contribution to science, using crowdsourcing to get it done. Altmetrics.org notes that more and more researchers are sharing their publications online through social/reference-manager apps like Zotero and Mendeley, which use a form of altmetrics to provide an impact score for published research.
We strongly recommend that you read altmetrics.org’s manifesto. It addresses the statement “no one can read everything,” a statement we librarians can relate to very well, especially since it is basically the entire premise for why library card catalogs…oops…we mean…why OPACs exist. Many of us have been saying this for years, getting a little defensive when uninformed people ask us to justify the importance of our role in analyzing and informing scholarship. How many of you have said, “because nobody can read everything,” when a student asks why they need to become proficient with PubMed MEDLINE? The manifesto also rationally discusses how altmetrics can address the crush of abstracts from new academic literature being indexed in MEDLINE. Remember back to the 1950s, when Index Medicus indexed a respectable 100,000–170,000 abstracts a year? The number indexed in 2010 was over 900,000. Yikes! Oh, and remember that not every biomedical journal is indexed in MEDLINE, right? Double Yikes!
Now, where were we…oh, yes…
Altmetrics.org also has a nice list of tools that use altmetrics. Among them, you’ll probably recognize the name “PLoS.” Their tool, the PLoS Impact Explorer, makes it possible to find online conversations about PLoS papers found on altmetrics.org. Altmetrics.org also provides some nice links to social media resources you might want to check out: @Mendeley, @GoogleGroups, @LinkedIn, and @FriendFeed.
You can also find several videos on altmetrics for those of you who want to get your altmetrics-geek-on. They range in length from 30 minutes to a little over two hours. If you only have time to watch one, we recommend the first video on the list, Jason Priem’s 2012 lecture to an audience of librarians at Purdue. Just think: you can view it right at your desk while eating your lunch.
The site nicely tracks who is writing about altmetrics on their press page. There are publications from The Guardian, IEEE Spectrum, Chronicle of Higher Ed, Forbes, and some cool blogs.
If you’d like to talk with John or Rachel about the Altmetrics’ing (again, not a real word…yet) taking place around you, please reach out to us. We’d like to hear from you and learn what you think. Really! We do.
*Researching who is behind altmetrics.org led us to Dr. Jason Priem of the University of North Carolina at Chapel Hill. If you want to be amazed, check out his CV. You can tell he is totally invested in finding ways to blend the social aspect of the Web with science.
– John Bramble, Utah/Technology Coordinator
– Rachel Vukas, Kansas/Technology Coordinator