When you’re a librarian working with open access publishing, there is a question that comes up a lot. It’s one that many of us dread, because it tends to come with a lot of baggage, and it can be tricky to answer in a way that satisfies the querent. The question is, “What about predatory open access publishers?” Sometimes it’s asked as an attempt to discredit OA publishing as a whole, in which case it’s likely that no amount of logical argumentation and no set of facts will be acceptable as a response. More often, though, it’s asked in the context of problem-solving. Predatory OA is a threat – to vulnerable junior scholars, to authors in developing countries, to the enterprise of scholarly publishing as a whole – so what should we do about it? It’s tempting to toss off a quick, “Don’t give them your work to publish. Problem solved!” It has the advantage of brevity, but it doesn’t do much to address the very real fears of scholars who don’t have the training and the experience to confidently evaluate the worth of a given publication. To give me something to point people to when the question comes up, and to provide a useful alternative to lists of predatory publishers (more on this in a minute), I decided to share my own understanding of what constitutes a ‘predatory’ publisher and offer a set of criteria by which authors can evaluate publications. It doesn’t provide any easy answers, but hopefully it provides some useful guidance.
Why can’t I just look at a list?
Before we get to the good stuff, it’s worth taking a moment to consider the many lists in this space. ‘Tis the season, so we’re going to call them “nice” lists and “naughty” lists.
“Nice” lists provide a venue for legitimate OA publishers to identify themselves, and for scholars to confirm that they are, indeed, dealing with a reputable entity. (A couple of examples: the Directory of Open Access Journals and the Open Access Scholarly Publishers Association.) These are usually opt-in in nature, in that publishers who want to be included can apply and be evaluated according to a set of criteria established by the owner of the list. Inclusion in a “nice” list is a good sign, but there are three major pitfalls in leaning too heavily on them:

1. Because they are opt-in, you can’t assume that a publisher isn’t legitimate if they don’t appear.
2. No system is perfect, and it’s always possible for a bad egg to slip through the cracks.
3. Just because they meet someone’s criteria doesn’t mean they would meet yours. More on this in a bit.
“Naughty” lists provide a way for individuals or organizations to name bad actors. There are some fairly well-known ones, but I’m not going to link to any, because I find this list category especially problematic and I don’t want to throw my link-support around. The pitfalls are similar to those of the “nice” lists:

1. They are not comprehensive. No one could possibly catch every scammy publisher, and the landscape shifts too quickly for anyone to stay on top of it reliably. Just because it’s not on the list, doesn’t mean it’s good!
2. Publishers change. There have been well-intentioned organizations that started off with low-quality offerings but managed to turn things around, just as there have been reputable ones that followed the slippery slope to exploitation. Knowing what a publisher was doing then doesn’t always tell you what they’re doing now, and it’s not always clear how often a list is updated or under what circumstances.
3. The same criteria issues apply to “naughty” lists as “nice” lists, with some added potential for malfeasance. As with any public take-down, they can be a useful vehicle for grudge-settling and agenda-pushing.

“Naughty” lists can be a useful piece of evidence as you evaluate a publication, but take them with an especially large grain of salt, and learn what you can about the person or organization that created them.
Identifying “predatory” publishers
You may notice I have yet to define the phrase “predatory publisher.” That wasn’t an accident. It’s a tricky thing to do, and I wanted to give the definition the breathing room it deserves. Some people will tell you that any journal that charges its authors a fee to publish is predatory. They tend to be from fields where author charges are unheard of (unlike many disciplines in the sciences, where authors regularly fork over page charges to publish in subscription journals), grant funding is scarce, and well-established OA journals are thin on the ground. They also tend to have a rosy view of the subscription model of publishing as being free from the depredations of market capitalism. (Obviously, they tend not to be librarians.) Other people work from lists of positive and negative publisher attributes, asserting that a publisher that lacks one or more good qualities, or has one or more bad qualities, is “predatory.” Still others offer a vague statement along the lines of, “They take your money and don’t give you anything in return.”
I find it useful to think about journals in three categories: “good” journals, “bad” journals, and scams. Publishers labeled predatory can easily land in the second and third categories, and even occasionally in the first. Let’s take them one at a time, out of order.
I’m going to borrow from the vague definition in the previous section: a scam journal takes your money and doesn’t give you anything in return. In fact, they’re not really journals at all – they’re advance-fee scams. How do you know you’re dealing with one? Like all lists, this one isn’t foolproof, but here are some things to look out for:
- Little or no published scholarship
- Lack of a named editor or editorial board
- Promises of full peer review with incredibly fast turnaround times (days, sometimes)
- The content on the journal site (home page, ‘about’ page, etc.) doesn’t make sense
- A quick Google search turns up angry rants about them
Your more sophisticated scams will avoid most of those, and can sometimes conjure a slick-looking journal website with a (fake) editorial board, etc., but most of them aren’t trying that hard and can be identified pretty easily.
Let’s jump to the other end of the spectrum, and talk about good journals. Lists aside, I think it makes the most sense to consider a journal “good” if it:
- Improves your work through thoughtful peer review and attentive editing. To get a sense of this before you submit your work, check out the editors, look at the journal’s description of its peer review process, and, most importantly, read some articles. If the journal is consistently publishing high-quality scholarship, free from major copyediting errors, there’s a good chance that your work would benefit from its attention.
- Helps your research find an appropriate audience. Look at the scope of the journal – is it likely to be read by the people you want to see your work? Look at who has published in it – are they people you would consider your colleagues? Check to see if it shows up in the important databases in your field – are people doing similar research likely to find it?
- Adds to your reputation as a scholar. This one will require you to ask around. Keeping in mind who you most want to impress (which may vary widely depending on where you are in your career), see if the people around you recognize the journal by name, and what they think about it. While you’re at it, you might as well ask them for suggestions of other journals to consider.
If you find a journal that does all of those things, and is interested in publishing your work, congratulations! You have identified a good journal and your work is in good hands. Of course, in this imperfect world, you may not always be able to find that perfect fit, which brings us to…
A “bad” journal doesn’t do those things. Or, rather, it doesn’t do the ones that are most important to you at a given moment. Maybe you’ve found a journal that is skillfully edited and laser-focused on your scholarly niche, but no one in your department has ever heard of it. Does that make it a bad journal? Maybe. How badly do you need that local name recognition? Is this article the linchpin of your tenure case? Is there a ‘magic’ list of journals that you have to publish in to be promoted? Maybe it’s worth it to entrust your magnum opus to a more widely-known publication that doesn’t offer the same benefits.
You see what I’m getting at – I can’t tell you if a journal is “good” or “bad.” I can probably help you avoid a worthless scam, but only you know what you want to get out of any given interaction with the world of scholarly publishing. Take the time to think about your situation and what you want for your work, and to look closely at any publication you are considering submitting to – the time invested will pay off in spades.
Some further advice
I realize that the strategy above can be challenging to apply, so I’m going to leave you with some concrete pieces of advice as you look at journals:
- Know your obligations: Does your funder require you to provide public access to your work? Does your college or department have a faculty open access resolution? Are there other restrictions on where you can publish? This may be an obvious one, but don’t forget that there are other people invested in your research, and make sure you know of any conditions that will affect your choice of journal.
- Contact the editor: If you’ve scoured the journal site and asked all of your colleagues and still can’t decide whether you’re dealing with a good journal, don’t hesitate to contact the editor with questions. Good editors are always on the lookout for high-quality research in their field, and should be willing to answer a few questions from a prospective author. If the editor isn’t responsive to an email, that’s a good clue you may not want to trust them with your work.
- Ask your librarian: Don’t hesitate to ask your subject specialist librarian for help identifying an appropriate journal or evaluating one you’ve found. The conversation will be extra productive if you approach them with a good understanding of your needs and goals.
Update, December 18, 2015: Grand Valley State University Libraries has what I think is an especially nice list of “Open Access Journal Quality Indicators.”