The rise of low-quality and predatory open access journals and conferences worries Derek Lowe
Scientists know which journals are the big, impressive ones in their field. And they can easily list the ones just under that layer – the solid, reliable journals that are unlikely to publish the next Nobel prize discovery, but are equally unlikely to publish junk. Unfortunately, the list of journals does not end there, and the proportion of junk begins to increase as you continue down the list.
The edge of the junk pile consists of papers on subjects that no one finds particularly interesting or particularly useful, but which are at least honest (albeit small) contributions. Then you descend into repetitious paper-slicing, as the proportion of new (or new-ish) material gets smaller and smaller compared to the restatements of earlier work. Beyond that, you have papers that seem to be mostly restatements of someone else’s work entirely, and so on further down the slope.
This is the scientific underworld, and it reaches its fullest extent in a depressingly long list of journals that will publish anything, anything at all, just so long as the cheque clears. By this point, we’ve moved past the papers that aren’t very interesting, down to entire journals, entire families of journals, that no one cares about, that no one ever reads, and that no one, in many cases, ever should.
This is the seamy side of open-access (OA) publishing. There are many good journals that use the open access model, where contributors defray the costs of publication and the results are open to anyone to read. But once the word got out that contributors would willingly hand over money, an ever-expanding list of charlatans came along to pick up some of that cash for themselves. And they’ve found some kindred spirits among their customers.
Some of the scientists who publish in the predatory OA journals are honestly deluded or simply without a clue, and are happy to find a venue (or a tomb) for their manuscript. But others know just what they’re doing and who they’re dealing with, and they don’t care: they need publications, and have found a place where publications can be bought.
The engine that drives this worthless cycle is the insistence of some hiring and evaluating committees around the world that everyone publish, all the time. The longer the list of publications you show up with, the better it goes for you, apparently, and it doesn’t seem to matter what percentage of them are garbage and published in Int. J. Garbage, Garbage Lett. or Acta Garbalogica.
Do you need to have some conference presentations on your record as well? Not a problem: there are garbage conferences out there for you, bizarre ‘meetings’ that cheerfully accept every single abstract about whatever, as long as the cheque clears. Want to chair a session? Write a larger cheque. How anyone can stand to attend such a gathering is beyond me, but I suspect that the halls are large and the audiences small. The next step, which for all I know is already happening, is the conference where no one bothers to present at all, or even to attend the thing, although that last step would cut down on the opportunities for using someone else’s money for travel.
As an industrial scientist, here’s where I have to note that it’s academia, loosely defined, that is largely responsible for this state of affairs. A much smaller proportion of people from industry publish in the bottom-of-the-barrel journals or attend the fake conferences, simply because they can be evaluated in other ways. Is the company making money? Does anything you’re doing look like helping the company to make any money, now or in the future? For better or worse, a list of publications in compost-pile journals does not answer such questions very convincingly. The industrial world has its own ways of wasting money, some of them spectacularly brainless, but this isn’t one of them. It’s the worst of the granting agencies and the worst of the tenure and promotion committees that are providing the incentives for all this worthless activity.
Evaluating scientists is not easy. That’s always been the case, and the shortcuts to doing it have been around a long time too. Counting papers and conferences is easy, but stupid – as easy and as stupid as counting numbers of compounds in an industrial chemistry lab. And while we’re not going to get rid of stupidity or greed any time soon, we could at least try to give them less room to run.