In June, a video popped up on Facebook: ‘Is your food fake or real? Find out with these 16 easy tests at home!’ The tests include heating up rice to melt the bits of plastic supposedly mixed in with the real grains to ‘increase manufacturer profit’, and holding a slice of cheese over an open flame to check whether it is ‘processed cheese with chemicals [that make it] difficult to melt’.

Within a few hours, 6.5 million people had watched the video, and more than 3 million had shared it – a viral success. The only problem: the video was almost entirely bogus.

While the video’s title promised food experiments for a lay audience, what it delivered was a mix of half-truths and outright lies. ‘Plastic rice’ is an almost decade-old urban legend that has never been substantiated. And while processed cheese does indeed contain additives such as emulsifiers, these help it melt more evenly than regular cheese – the opposite of what the video claims.

Karen Douglas, a leading conspiracy theory researcher at the University of Kent, UK, says it is unsurprising that people worry about what goes into their food, but that they often find it difficult to tell real information from fake. ‘It is therefore very easy for fake scientific information to spread far and wide on the internet.’

Waking up to the problem

In early 2019, New York City and the state of Washington declared states of emergency after being swept by separate measles outbreaks, 19 years after the disease had been declared eliminated in the US. More than 1000 people were infected; most had not received the measles vaccine because they, or their parents, believed it could harm them.

Debunking can make a difference, but it is not always easy finding the right way to do it

Karen Douglas, University of Kent

US member of Congress Adam Schiff wrote an open letter to Facebook chief executive Mark Zuckerberg, urging him to address the vaccine misinformation then running rampant on the platform. ‘There is strong evidence to suggest that at least part of the source of this trend is the degree to which medically inaccurate information about vaccines surface[s] on the websites where many Americans get their information, among them Facebook and Instagram,’ Schiff wrote.

Although no research confirms that social media has played a part in the anti-vaccine movement, Facebook took note. The platform decided to crack down on scientific misinformation, including falsehoods about vaccines and sensational health claims.

As part of this strategy, the social media giant is paying 50 organisations to debunk hoaxes. One of these fact checkers, Lead Stories, picked up the viral food video two days after it was first published, labelling some of its claims ‘definitely false, others misleading and others so general in nature that they are impossible to verify’.

Nevertheless, the video continued to spread, amassing 87 million views on Facebook – and more on other platforms – after one week. ‘Debunking can make a difference, but it is not always easy finding the right way to do it,’ Douglas says. So does fact checking really reduce the harm fake science news can do?

Misinformation nation?

Millions of people follow science pages on Facebook, and 33% of Americans consider social media an important way to get their science news, according to a Pew Research Center survey. This reflects a wider trend: almost half of all UK and US residents report getting their daily news from social networks. And Facebook is ahead in the race, being more popular as a news source than either YouTube or Twitter.

Among Facebook’s 30 most popular science pages are IFL Science with almost 26 million followers and National Geographic with 44 million followers. However, the 30 most followed pages rarely report on the science issues that frequently appear in traditional news outlets, such as climate change or gene editing. Many focus entirely on health and food topics, some of them offering scientifically questionable advice.

Raw food advocate David Wolfe, for example, has almost 11 million followers, making his the seventh most popular science-related page on Facebook. Wolfe has been called out repeatedly – including by scientists such as David Gorski, a surgical oncologist and editor of Science-Based Medicine – for promoting pseudoscientific ideas.

‘A lie can travel halfway around the world before the truth can get its boots on,’ Mark Twain once said. Or was it Winston Churchill? It turns out that neither man ever said any such thing – but it is true that falsehoods usually spread faster and reach more people than the truth.

In 2018, social media expert Sinan Aral at the Massachusetts Institute of Technology, US, and his team analysed 126,000 stories tweeted by 3 million people over more than a decade. They found that lies tend to be more novel than truths, are 70% more likely to be retweeted and routinely reach up to 100,000 people. True stories rarely diffuse to more than 1000 people, and take about six times longer than falsehoods to reach that many individuals.

Checking up

The 2016 US presidential election came with a major uptick in intentionally misleading political news on social networks. One analysis found that the most popular false election stories on Facebook were shared more widely than the most popular mainstream news stories.

Although there was no proof that misinformation spread on social media had influenced the election result, people were worried. In December 2016, Facebook introduced a tool for users to report hoaxes and misleading news, and signed up 25 independent fact checking organisations.

People who read anti-vaccine conspiracy theories are often influenced by this material to avoid vaccinating their children

Karen Douglas, University of Kent

Although a lot of disinformation is related to politics, some fact checkers are starting to focus more on the misuse of scientific data. ‘One of the big focuses has been health information, because we’ve identified there is a real potential risk of harm,’ says Tom Phillips, who leads the fact checking team at UK-based charity Full Fact.

‘Conspiracy theories on important topics such as climate change and vaccines can influence people’s behaviours,’ Douglas agrees. ‘For example, people who read anti-vaccine conspiracy theories are often influenced by this material to avoid vaccinating their children.’

User reports and machine learning tools help Facebook identify potentially misleading videos, images and articles. These are automatically compiled into a list for each fact checking partner, who can then choose which pieces of content to tackle.

Full Fact checks 30 to 40 pieces of content each month, and Lead Stories around 60 – though the time it takes to evaluate a story varies dramatically. Scientific claims are often the trickiest to debunk. ‘A lot of the time, claims will be slightly mangled – some source, way back in the chain, was, in fact, real,’ explains Phillips. His team tries to unwrap the layers of misunderstanding and track down the original study.

‘Sometimes we just discover that [something] was copy-pasted from a satire website, and that makes the fact checking a whole lot simpler,’ says Lead Stories’ fake news expert Maarten Schenk. If all else fails, fact checkers will search for reliable source material and consult with experts – a process that can take weeks.

Fact checkers then apply one of eight rating options, including ‘true’, ‘false’ and ‘satire’. Posts rated as false, or as containing a mix of accurate and misleading information, are demoted so that they appear lower in users’ timelines. This, according to Facebook, reduces views by around 80%.

Moreover, a link to the debunking story is shown to everyone who clicks the post’s share button. This, says Schenk, is better than deleting the post entirely, ‘because if you delete something, then people wonder “What was it? Why are they trying to hide it?”’

Making a difference

What remains unclear is the effect fact checking has on individuals. Fake news warnings might, in fact, give people a false sense of security. Massachusetts Institute of Technology cognitive scientist David Rand and colleagues found that attaching warning labels to some, but not all, misleading headlines increases the perceived accuracy of the fake news stories that have not been fact checked.

We are providing feedback to editors so that they become aware of major inaccuracies that are being published in their outlet

Emmanuel Vincent, Science Feedback

Moreover, experiments have shown that correcting misinformation – in areas such as vaccination, politics and climate change – can, ironically, reinforce people’s misconceptions. But a new study by Cornelia Betsch and Philipp Schmid from Germany’s University of Erfurt found no evidence for this so-called backfire effect – at least when it comes to swaying an audience in public discussions with science deniers. In their tests, people were less likely to believe misinformation if they were either given the facts or informed about the rhetorical techniques science deniers use, such as impossible expectations and the selective use of data.

‘I think [it] is very, very important that we have independent fact checkers, and that they are engaged with social media to tell the users not to trust a piece of content,’ says Schmid. ‘The more effective way of doing that would be to have a disclaimer that tells the user why something might be fake, which means readers activate their own immune system against misinformation.’

Emmanuel Vincent is the director of Science Feedback, another of Facebook’s partners, which focuses on misleading science in the mainstream media. He says the first goal of fact checkers is not to convince everybody ‘that what they say is correct or incorrect’. Instead, he hopes to have an impact on the wider information landscape. ‘We are providing feedback to editors so that they become aware of major inaccuracies that are being published in their outlet – they might not have realised that before,’ Vincent explains.

Correct or contest?

Once Facebook’s fact checkers have done their work, publishers have two options: dispute the rating or publish a correction. The viral food video’s publisher, First Media, sent Lead Stories ‘an eight-page pdf to support their video with information and evidence’, says Schenk. However, most of the sources, which First Media also shared with several news outlets, were obscure, and some directly contradicted the video’s claims.

We are calling on Facebook to be a bit more transparent and to release more data so we can examine our impact

Tom Phillips, Full Fact

First Media later agreed to publish a correction but, according to Schenk, instead deleted the video altogether. By that point, the video had been going viral for more than a week and had undoubtedly generated a sizeable amount of advertising money.

According to a Facebook email shared with BuzzFeed News in 2017, it takes more than three days from the site finding and flagging a post to fact checkers rating it. ‘The question of impact is important to us if something has already gone viral,’ says Phillips from Full Fact. ‘Some of the stuff we see on Facebook is years old.’

According to two studies – neither of them peer reviewed – Facebook’s efforts seem to have reduced the overall amount of misinformation on the platform since 2017. Nevertheless, misinformation remains a problem: a BuzzFeed News analysis found that the 50 biggest fake news stories of 2018 generated a total of 22 million shares, likes and comments on the social network.

Face facts

Fact checkers have mixed feelings about their work with Facebook. ‘What is unclear at this stage is what the long-term effect is: whether the effort is actually helping to curb the spread of misinformation from the sources that keep publishing such content,’ says Science Feedback’s Vincent.

In a report released in late July, Full Fact expressed concern that the programme’s impact is unclear. ‘We are calling on Facebook to be a bit more transparent and to release more data so we can examine our impact,’ Phillips says. According to a Poynter Institute study conducted in late 2018, other fact checkers are similarly uncertain whether their work does anything at all.

It’s not enough to point out that information is false

Maarten Schenk, Lead Stories

Another problem Full Fact identifies is scale. ‘We think that there needs to be a more open discussion of how to scale this kind of work up,’ says Phillips. ‘It would be good if Facebook was more explicit about their plans, for example what they’re doing with machine learning methods.’

Julia Bain from Facebook’s integrity partnerships team responds that ‘many of the recommendations in the report are being actively pursued by our teams as part of continued dialogue with our partners, and we know there’s always room to improve. This includes scaling the impact of fact checks through identical content matching and similarity detection, continuing to evolve our rating scale to account for a growing spectrum of types of misinformation, piloting ways to utilise fact checkers’ signals on Instagram and more.’

Despite his criticism, Phillips thinks that the fact checking programme is worthwhile. ‘We do think that other social networks should consider doing something similar,’ he says.

‘We definitely have to make sure that the good information is out there, and preferably somewhere near the bad information,’ says Schenk. ‘But it’s not enough to point out that information is false – you also have to expose the mechanisms behind it. Who is spreading it, and why? How are they making money out of it?’

Schenk compares the battle against misinformation to a marathon rather than a sprint. ‘As countermeasures shift and evolve, so do the threats. What works today might be obsolete tomorrow.’