Research Commons

Connect. Collaborate. Contribute.

Author: Nancy Courtney (Research Impact Librarian, University Libraries)

Peer Review Week 2017: Publons – Acknowledgement for Peer Review

This is the third post in a three-part series on peer review hosted by Publishing and Repository Services and the Research Commons to celebrate the 3rd annual Peer Review Week. The first post reflected on editing the open peer review journal Empirical Musicology Review. The second post discussed the interdisciplinary, moderated, and peer-reviewed Latinx Talk, published by The Ohio State University Libraries. This third post looks at tracking peer review participation using the web service Publons.

Peer review is an important part of the academic publishing process. As the production of new knowledge and the number of journals and journal submissions grow, finding and cultivating peer reviewers takes on greater urgency, while time-pressed researchers may be reluctant to participate in an activity that often counts only as service when it comes to promotion and tenure. Publons (publons.com) is a web service, established in 2013, whose “mission is to speed up research by harnessing the power of peer review.” It attempts to do this by providing a way for researchers to track and demonstrate their reviewing work, by offering online training for new peer reviewers, and by helping journal editors and potential reviewers connect.

Publons profiles allow researchers to display their reviewing and editorial work without compromising reviewer anonymity or journal policies. Unless the journal and the review’s author allow otherwise, reviews are noted only by journal name and year. Publons verifies that the reviews were done through automatic linkages with partner publishers (such as Wiley, Springer Nature, Sage, and Taylor & Francis) or by the reviewer providing acknowledgement emails from editors, which Publons then confirms with the editor. The idea is that getting credit for reviewing work may encourage researchers to volunteer for more reviews. Information about reviewing activity can be made public or private and can be linked to the researcher’s ORCID profile.

Another service is the Publons Academy, an online training course with ten modules consisting of videos (about ten minutes each) covering: how to dissect an article’s methodology, data, and results; how to structure your review and provide constructive feedback; ethical issues; the peer review process from the reviewer’s and the journal’s points of view; and the differences between pre- and post-publication reviews. Anyone can view the videos, but to complete the course and receive an endorsement on your Publons profile, you must complete several reviews and work with a supervisor who provides guidance and feedback.

Finally, Publons provides a mechanism for you to search for journals in your area and let them know that you are interested in reviewing for them.  Developing a strong track record of reviewing makes you more attractive to journal editors looking for peer reviewers in your field.

For more on Peer Review Week, including available webinars, visit the Peer Review Week activities page: https://peerreviewweek.wordpress.com/activities/ 

To learn more about the Libraries Publishing Program, visit the website: go.osu.edu/librarypublishing

CiteScore vs. Impact Factor: How Do We Rate Journal Quality?

In December 2016, Elsevier launched a new journal metric, CiteScore, that takes direct aim at the hegemony of the Impact Factor, a product of Clarivate Analytics (formerly part of Thomson Reuters). The two companies already have competing bibliographic citation databases in Scopus (Elsevier) and the Web of Science (Clarivate).

The Impact Factor has had a long reign in academe. Beginning in 1975 as a byproduct of the Science Citation Index, it provided a unique, objective means of rating journals based on their citations and quickly became a standard measure of journal quality. When someone says they want a journal’s impact factor, they really mean the Impact Factor from Journal Citation Reports (JCR) and nothing else. Because of this dominance, the Impact Factor has come in for a lot of criticism over the years due to its inherent limitations, among them:

  • Since the Impact Factor is derived from journals indexed in the Web of Science, no other journals can have an Impact Factor.
  • Since the Impact Factor only looks at citations in the current year to articles in the previous two years (the calculation is sketched just after this list), it only works well for disciplines in which rapid citation is the standard.
  • It doesn’t take into account disciplinary differences in expected numbers of citations.
  • There is no JCR for arts & humanities, therefore no Impact Factor for those journals.
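
To make the two-year citation window concrete, the calculation behind the score looks roughly like this (2017 here is just an example year):

    \mathrm{Impact\ Factor}_{2017} = \frac{\text{citations received in 2017 to items published in 2015–2016}}{\text{citable items published in 2015–2016}}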

To be fair, the JCR also reports a 5-year impact factor, but it is an alternative, not the official Impact Factor, as are several other measures reported in the JCR, such as the Immediacy Index, the Cited and Citing Half-Lives, and the Article Influence Score.

Other attempts have been made to offer alternatives to the Impact Factor that differ in methodology, such as the Eigenfactor Score, which uses the same journal source list as the JCR, and the Scimago Journal Rankings and SNIP (Source Normalized Impact per Paper), which use the Scopus journal list.

Now, CiteScore has arrived to compete with the Impact Factor, luring users in with these benefits:

  • It is free to access on the Scopus Journal Metrics website (JCR requires a paid subscription).
  • It is calculated from the Scopus journal list, which is much larger than the Web of Science list and includes more social sciences and humanities journals.
  • It provides a 3-year citation window, rather than the 2-year window of the Impact Factor (the corresponding formula is sketched after this list).
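
For comparison with the Impact Factor calculation sketched above, the CiteScore calculation looks roughly like this (again, the year is only an example):

    \mathrm{CiteScore}_{2016} = \frac{\text{citations received in 2016 to documents published in 2013–2015}}{\text{documents of all types published in 2013–2015}}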

(There is also the CiteScore Tracker, where you can watch your journal’s score go up or down on a monthly basis. While this may interest some journal editors, I cannot recommend it for anyone else.)

As with the Impact Factor, CiteScore does not take into account disciplinary differences, though the website displays other metrics that do. Another factor that Elsevier touts as a benefit, but that has drawn some criticism, is that CiteScore counts documents of all types in its calculation, while the Impact Factor’s denominator counts only “citable” documents (e.g., articles, reviews, and meeting papers, but not editorials, letters, or abstracts). While this seems reasonable, it means that journals that publish these other types of material get a lower CiteScore relative to journals that do not. Elsevier’s own journals have benefitted from this distinction, which has fueled the criticism.
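
A quick illustration with made-up numbers (purely hypothetical, not any real journal): suppose a journal publishes 200 articles and 50 editorials over its counting window and receives 600 citations to that material. A denominator limited to citable items yields a higher ratio than one that counts every document:

    \frac{600}{200} = 3.0 \qquad \text{versus} \qquad \frac{600}{250} = 2.4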

So, where does this leave us when it comes to journal quality? We should remember that all these scores are just numbers. They’re interesting and tell us a little about citations. They don’t tell us anything about the journals that are not indexed in Scopus or Web of Science. A journal’s score doesn’t say anything about the quality of an individual article within the journal. The scores can be subject to gaming, though the producers eventually seem to catch on to that.

A lot of research is being produced and there is a lot of pressure on researchers to publish. Trying to force a large quantity of articles through the small funnel of Web of Science journals in pursuit of a high Impact Factor seems like it would just lead to a big delay in dissemination. Wouldn’t it be a better choice to expand the list of acceptable journals by entertaining some additional measures of impact? Or even take the journal itself out of the equation and just focus on the individual article and its impact?

For more information on journal metrics and links to sources, see the Journal Metrics section of the Research Impact guide at http://guides.osu.edu/c.php?g=608754&p=4233834 .

Browser Extensions Help Find Open Versions of Research Articles

Two browser extensions, Unpaywall (unpaywall.org) and the Open Access Button (openaccessbutton.org), offer an easy way to discover open access content when you need it – while you are viewing an article’s information online. Once installed, Unpaywall automatically appears as a tab on the side of the browser when you are viewing a research article’s page. The tab is green if full text is available and gray if it is not. Unpaywall is a new tool from Impactstory, a nonprofit organization supported by grants from the National Science Foundation and the Alfred P. Sloan Foundation, which claims that the extension will work on 65-85% of articles.
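
Behind that green-or-gray tab, the extension is checking Unpaywall’s index of open copies by the article’s DOI, and the same index can be queried directly. The minimal Python sketch below assumes the v2 REST endpoint, the email query parameter, and the is_oa / best_oa_location response fields as I understand them from Unpaywall’s documentation; treat those details as assumptions to verify, and note the DOI is only a placeholder.

    import json
    import urllib.request

    EMAIL = "you@example.edu"     # Unpaywall asks callers to identify themselves by email
    DOI = "10.1234/example-doi"   # placeholder: replace with the DOI you want to check

    # Assumed v2 endpoint; check the current Unpaywall API documentation before relying on it.
    url = f"https://api.unpaywall.org/v2/{DOI}?email={EMAIL}"

    with urllib.request.urlopen(url) as response:
        record = json.loads(response.read().decode("utf-8"))

    if record.get("is_oa"):
        # best_oa_location describes the best open copy Unpaywall knows about
        best = record.get("best_oa_location") or {}
        print("Open version found:", best.get("url_for_pdf") or best.get("url"))
    else:
        print("No open version recorded for this DOI.")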

The Open Access Button, which has been around for a few years, requires you to click on it when viewing an article page to initiate a search for available open versions. (You can also use the search box on the webpage if you do not want to install the browser extension). If no open version is available, you are presented with the option to initiate a request to the author to make the article accessible. Open Access Button will search for and request datasets as well as articles.

Both extensions are free and simple to install and use. Neither is perfect. Unpaywall does not always display properly if you are already behind a paywall. For example, it showed a free version of a subscription article when I was off campus but showed that same article as not free when I was logged in through the proxy server, even though a free version really was available. Also, since it uses an open or closed padlock symbol, it can easily be confused with the similar symbols used on journal webpages to indicate subscription access. The Open Access Button does not do well searching for titles of articles and is not as instant as the web page would have you believe. Still, they are both worth using.

Both projects were recently highlighted in the Chronicle of Higher Education.