By Liliana Shymanska, Corporate Communications Officer at the British Science Association


Meet a team that is fighting fake news.

Neuroscientist Professor Gina Rippon, computer scientist Robert Elliott Smith PhD FRSA and journalism lecturer Dr Gavin Evans share their unique insights on the misinformation that threatens the public perception of climate, public health and more, in a panel discussion held at this year’s British Science Festival.

This panel of cross-industry experts led a rich discussion on how social media algorithms help spread fake news, the language to watch for when spotting misinformation in science, and how the fallacies of race science have spread over time.

The media has dynamics

A few years ago, Robert became concerned about the effects algorithms were having on society. He found, however, that the algorithms themselves had changed very little over his 30-year career in Artificial Intelligence (AI). What had really changed was how we receive our news.

Robert began his talk with a quote from Marshall McLuhan, a philosopher whose work is among the cornerstones of the study of media theory: “The medium is the message”.

In the 1960s, when this phrase was coined, media broadcasters were only granted licences if their coverage was honest, equitable and balanced, as airwaves were a limited resource. Since then, the limitations of bandwidth have disappeared. In came bandwagoning, the idea of staying “on message”, and the soundbite – all used to fit news into the few moments broadcasters have our attention.

Social media, on the other hand, has its own set of dynamics. Robert recalled simulating social media networks in a lab by modelling people as computational “agents”.

When the agents were given the opportunity to share opinions on divisive issues, such as vaccination, the entire network chose a side, resulting in a polarised network. The two opposing sides then became "trapped" in echo chambers, surrounded by those who hold the same beliefs as them and cut off from the chance to engage with others in debate or discussion.
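The panel did not share any code, but the dynamic Robert describes can be sketched with a toy agent-based model. In this hypothetical "unfriend and re-follow" simulation (all parameters and rules here are invented for illustration, not the panel's actual model), agents repeatedly drop ties to those they disagree with and follow like-minded agents instead, until the network splits into echo chambers:

```python
import random

random.seed(1)

N = 60  # number of agents
opinions = [random.choice([+1, -1]) for _ in range(N)]  # e.g. pro- / anti-vaccination

# Start from a random "follow" network of 200 ties.
edges = set()
while len(edges) < 200:
    a, b = random.sample(range(N), 2)
    edges.add((min(a, b), max(a, b)))

def cross_ties():
    """Ties connecting agents who disagree with each other."""
    return [e for e in edges if opinions[e[0]] != opinions[e[1]]]

# Echo-chamber dynamic: repeatedly drop a tie between disagreeing
# agents and follow a random like-minded agent instead.
while cross_ties():
    a, b = random.choice(cross_ties())
    edges.discard((a, b))
    like_minded = [c for c in range(N) if c != a and opinions[c] == opinions[a]]
    c = random.choice(like_minded)
    edges.add((min(a, c), max(a, c)))

# Every remaining tie now connects agents who already agree: the
# network has split into two echo chambers with no channel for debate.
print(len(cross_ties()))  # 0
```

Each round removes exactly one disagreeing tie and never creates a new one, so the split is guaranteed: the model ends with zero ties crossing the divide, mirroring the "trapped" networks Robert observed.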

Mind your language

“I am sure everyone in this room has a story about a strange factoid on the science of sex differences that they’ve come across…” said Gina in her opening remarks.

Gina’s area of work, combatting myths around sex differences in the brain, is riddled with misinformation, but where does it come from?

Sometimes, the problem lies at the source. Blame is often put on journalists for misunderstanding the science, or on how information can be manipulated within social media. But in some cases, researchers themselves need to take responsibility for what they are saying.

Gina talked through examples of this problem with the audience, and put forward some “neurononsense-spotting” guidelines, summarised in the table below. The guidelines were inspired by an article demystifying what estate agents really mean – for instance, “deceptively spacious” = a cupboard.

What they say: Men and women have different [xxx]
What you hear: At last the truth; men are from Mars, women are from Venus
What they should say: “On average”

What they say: “Significant”
What you hear: Important; pay attention
What they should say: There is a 5%/1% probability that this finding is due to chance; we will replicate this study to confirm the findings

What they say: “Essential”
What you hear: Really, really necessary – a must have
What they should say: Some kind of inbuilt characteristic; part of an individual’s biologically determined essence

What they say: “Profound”, “Fundamental”
What you hear: Proof at last – those pesky scientists have finally caught up
What they should say: Less than 1% of the 134k comparisons I carried out were statistically significant
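That last row hints at the multiple-comparisons problem: scan enough variables and some will pass a significance threshold purely by chance. As a rough illustration (not from the panel – the group sizes and comparison counts here are made up), the following simulates thousands of comparisons between two groups drawn from the same distribution, so any “difference” it finds is noise:

```python
import random
import statistics

random.seed(0)
n_comparisons = 10_000  # stand-in for a large scan of brain measures
n = 30                  # participants per group

false_positives = 0
for _ in range(n_comparisons):
    # Both groups come from the SAME distribution: no real difference exists.
    men = [random.gauss(0, 1) for _ in range(n)]
    women = [random.gauss(0, 1) for _ in range(n)]
    # Welch's t statistic for the difference in group means
    se = (statistics.variance(men) / n + statistics.variance(women) / n) ** 0.5
    t = (statistics.fmean(men) - statistics.fmean(women)) / se
    if abs(t) > 1.96:  # roughly the p < 0.05 threshold
        false_positives += 1

rate = false_positives / n_comparisons
print(rate)  # close to 0.05: a 5% threshold flags ~5% of pure-noise comparisons
```

With a 5% threshold, around 1 in 20 pure-noise comparisons comes out “significant” – which is why a handful of significant results out of 134k comparisons is evidence of nothing at all.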

A dangerous revival of bad science

“The claim that there is a link between race and intelligence is the main tenet of what is known as race science or, in many cases, scientific racism”, explained Gavin.

This idea that certain races are inherently more intelligent than others has been thoroughly debunked; however, it is experiencing a dangerous revival.

A mix of media sources – driven by a small group of anthropologists, IQ researchers, psychologists and pundits – has caused such ideas to reach and influence new audiences with potentially harmful consequences.

After Robert, Gina and Gavin’s opening remarks, the panel then proceeded to take questions from the audience on the best ways to prevent the spread of fake news.

Who or what should decide what fake news is? Should this fall under corporations, governments, or the public?

Robert took us back to the moment that he first looked into fake news. At the time, Wikipedia described fake news as consisting of articles in The Onion (an American satirical digital media company). Essentially, the term derived from a type of satire that pretended to be the news. It was later co-opted and taken mainstream by Donald Trump, who used it to mean news that didn’t align with his views.

For Robert, fake news means “pernicious information that's causing social mal-effect”. In that case, he believes responsibility for the spread of fake news should fall with the broadcaster. Regulation could be established through collaborations between governments, the corporations themselves, and grassroots movements focussed on targeting misinformation. However, according to Robert, none of those options have had much success.

Fake news isn’t necessarily false information. Robert notes that categorically defining something as true or false isn’t always straightforward. This subtler problem could be tackled by human-centric regulation, as was seen in the broadcast era.

In Gavin’s area of work, some major social media platforms, such as YouTube, have taken action against those promoting fake news related to race science by removing and disabling their accounts. This stance has been taken by various platforms due to public pressure. However, as new channels are constantly being created, this is an ongoing battle.

From Gina’s point of view, if the topic alludes to science, then scientists in the relevant field should be the ones to decide what is fake news. Although as Gina mentioned previously, in some cases, scientists themselves are – inadvertently or not – responsible for generating and propagating fake news.

What do you think children should be taught in school to help them to learn to critically consider the information they are presented with?

The next generation will be exposed to considerably more sources of information than ever before. From social media to robotics and AI, we’re long past the days of relying on a single daily newspaper.

As Professor of Cognitive Neuroimaging at Aston University, Gina recalls her long-standing battle with students around plagiarism and where they are getting their information from (exclusively copying and pasting from Wikipedia doesn’t usually cut it).

Gina suggests training young people to challenge the sources of information they come across: is this information repeated in two independent sources? Are those sources reliable? Can we break down the ‘estate-agent jargon’?

Is it possible to convince someone who has retroactively justified a previously held belief with bad science?

“I would like to say yes, it’s very easy and people respond very favourably to facts that change their minds but in my experience, that doesn’t happen very often,” Gavin admits. With algorithms driving people towards similar content and deepening polarisation, changing someone’s mind has probably become even harder in recent years. According to Gavin, in cases like this, any alternative information presented to people can easily be pigeonholed as fake news.

Robert agreed, saying that “there are the unpersuadable”. But he also noted that there are others who simply get stuck in the environment around them, particularly when it comes to social media. Following some of the pointers below can help to reach those who aren’t in the “unpersuadable core” on a particular issue or topic:

  • “Unfriend” less vigorously, and add friends more vigorously. This way, you are effectively opening up channels of communication rather than acting like an algorithm yourself.
  • Don’t hate on headlines. Read the whole article before you make a judgement, as headlines are often misleading.
  • Identify reputable writers that you like and promote their work.
  • Post more comments instead of simply liking and sharing.

Gina’s advice was to always question sources of information, even if they support your own views on a topic. She recalled a recently published story claiming that Texas was being overwhelmed with hospitalisations due to overdoses of ivermectin (a drug that some anti-vaccine sources have incorrectly promoted as a substitute for a COVID-19 vaccine).

On closer inspection, the story turned out to be false: a reporter had merged two unrelated articles in an attempt to create a great headline. Although the underlying message behind the article (that the best way to fight the COVID-19 pandemic is mass vaccination) is correct, sharing fake news in this way can further convince vaccine sceptics that their stance is justified.

Gavin adds that fake news retains its ‘stickability’ through a combination of the clarity of the information and the number of times it is repeated.

How you can fight fake news further

You can play your part in promoting accurate and responsible communication of research on sex/gender and the brain by taking part in the Noise in Neuroscience project.

The project aims to create an authoritative set of good practice guidelines for responsible communication of sex difference research. But they need your help to make them a success.

Noise in Neuroscience are keen to hear your views on what the guidelines should cover, and examples of good or poor practice in communicating research findings on sex/gender and the brain.

Their quick survey (about 3 minutes) is open to all. For more information visit: