Online platform aims to make peer review faster, less biased and less of a burden on researchers

The chemistry journal Synlett, published by Thieme, has successfully trialled a new system of peer review that allows a crowd of 100 approved researchers to comment on manuscripts online.

The idea was born out of frustration with traditional peer review, says the journal’s editor in chief, Benjamin List of the Max Planck Institute for Coal Research in Germany. ‘What we do [at the moment] is sloppy – we let two people give their opinions and on the basis of this a paper gets accepted or not,’ he tells Chemistry World. ‘It’s slow and the quality can vary.’

‘Our solution was to utilise the full power of the internet [where] we can all connect to each other in an instant.’

Similar online approaches, such as post-publication peer review sites that allow people to comment on papers, are becoming increasingly popular, but these can attract comments from anonymous members of the public, which may be abusive or unconstructive. The Synlett team believes their system – which they call intelligent crowd reviewing – captures the best of both worlds.

Their crowd consists of 100 hand-picked reviewers – some recommended by the editorial board, others researchers who volunteered to take part. These reviewers can access a secure, bespoke online platform and view papers being considered for publication. They can annotate and comment on manuscripts, as well as respond to comments left by others.

‘We think it’s a lot more effective not only making small improvements [to manuscripts] but also avoiding the publication of big flaws. This could reduce the number of retractions overall,’ says Denis Höfler, List’s PhD student and the project’s editorial assistant.

As with traditional peer review, comments are anonymous. List thinks this is important, and it sets the Synlett platform apart from similar systems such as F1000 Research, where the identities of reviewers are not kept secret. ‘I think refereeing needs to be anonymous. Then you can be really frank and candid and say whatever your opinion is,’ says List. ‘We need arguments – we need really critical, valuable comments that are of substance.’

The journal trialled the platform throughout 2016 and it was a success, with each uploaded manuscript attracting dozens of comments from the crowd within a matter of days. The comments and discussions were of sufficient quality to allow the journal’s editors to make editorial decisions in a fraction of the time it normally takes to organise peer review.

The system has several advantages beyond speed, List says. ‘It can clear up biased refereeing, for example if there is one negative or super positive review versus 99 neutral ones,’ he says. ‘That’s a much more solid basis to make a decision.’

Being able to leave comments of any length reduces the burden on individual reviewers, he adds, and because each researcher can spend less time reviewing a particular paper, they can potentially contribute to more papers.

Recruiting enough qualified reviewers to make up the crowd – something those who were sceptical of the idea were initially worried about – turned out to be the easiest part of the project, according to Höfler. ‘There are a lot of people out there who are eager to comment and think the idea has potential,’ he says.

Philip Moriarty, a physicist at the University of Nottingham, UK, who has spoken in favour of post-publication and crowd-based approaches to peer review, says the platform seems to be a ‘really good innovation’. ‘It’s exactly the right balance,’ he says. ‘I have a lot of time for [the website] PubPeer but the anonymity aspect can be abused. The fact that [Synlett’s] editors know who the reviewers are even though it’s anonymous […] means that flame wars aren’t going to break out and it’s not going to descend into a free-for-all.’

The platform was trialled with 10 manuscripts, but the journal now wants to expand its use, with the ultimate aim of handling all of its papers this way. The team says the intelligent crowd-based approach could also work for other journals, or even for other processes involving peer review, such as grant applications.