
What can we learn from the Public Attitudes to Science 2014 survey?


by Graphic Science


In March 2014, the latest version of the Public Attitudes to Science report was published by the Department for Business, Innovation and Skills in partnership with Ipsos MORI. But what can we learn from the PAS survey?

  • 45% of the public feel informed about science (PAS 2014 infographic)
  • 51% of the more affluent half of the UK population (ABC1s) feel informed, compared with 35% of the less affluent half (C2DEs) (p59, main report)
  • For most people (54%), knowing more about science does not make them more worried. But there’s a difference between those who feel informed and those who don’t: knowing more does make 18% and 29% respectively more worried (p63).

Is feeling informed the same as being informed? Not necessarily (p61). So does the picture change if you look at knowledge scores instead of how informed participants feel? In this case, not really: a look at the data tables (available as a 1,200-page PDF or as SPSS files) shows that 18% of those with high knowledge scores, versus 32% of those with low knowledge scores, say knowing more makes them more worried. Could the data tell us whether the “informed worried” differ from the “uninformed worried”? Probably. Would it be useful to know? Do you think it would?
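For anyone who wants to poke at the tables themselves, the kind of comparison above is a one-liner once the data are in a dataframe. This is purely an illustrative sketch: the toy data below are constructed to match the 18% vs 32% figures quoted, not drawn from the real SPSS files (which you would load with something like pandas.read_spss), and the column names are our own invention.

```python
import pandas as pd

# Hypothetical respondent-level data shaped like the PAS tables: one row
# per respondent, with a knowledge-score group and whether they say that
# knowing more about science makes them more worried.
df = pd.DataFrame({
    "knowledge": ["high"] * 100 + ["low"] * 100,
    "more_worried": [True] * 18 + [False] * 82 + [True] * 32 + [False] * 68,
})

# Percentage in each knowledge group saying knowing more makes them more
# worried; the mean of a boolean column is the proportion of True values.
worried_pct = df.groupby("knowledge")["more_worried"].mean().mul(100).round()
print(worried_pct.to_dict())  # {'high': 18.0, 'low': 32.0}
```

The same groupby could just as easily split the “worried” respondents by how informed they feel, which is exactly the “informed worried” vs “uninformed worried” question posed above.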

So much information, so many answers, so many more questions!

The PAS 2014 report is an extensive, dense, detailed, fascinating and selective look at the survey results. Members of the Graphic Science team have spent hours poring over it, highlighting the interesting, the curious and the questionable; occasionally meandering into the data to find out more or gain clarification, but it feels as though we’ve barely touched it.

And there are so many perspectives from which to view it.

  • Interested in gender differences? You will discover a distinct divide between male and female attitudes
  • Seeking the long view? There are glimpses of emerging trends throughout
  • Trying to reach new audiences? Keep the attitudinal groups in mind as you read

How can we possibly digest and absorb all this information and use it to inform what we do?

We decided to use our regular(ish) Twitter grey literature club #SciCommLit to find out what colleagues across the science communication community thought of the report and how it applies to them. We had some good discussions in a session focussing on Chapter 6 - Trust and confidence in science (storify.com/GraphicScience/scicommlit-pas2014-public-trust-in-science) and another looking at Chapter 5 - Discussing science in a digital age (storify.com/GraphicScience/scicommlit-may-2014-pas2014-engaging-with-science).

However, we couldn’t help noticing that the only people really engaged in conversations about the content of the PAS itself were the people who had been involved in producing it. Maybe it was a case of wrong place, wrong time, but where was the community at large? Where were the representatives from the 2,900 subscribers to the PSCI-Comm mailing list and the couple of dozen re-tweeters? Who is reading and engaging with this report?

These discussions could be going on elsewhere, but from our broader experience we suspect they are not, and this seems like a paradox for the science communication community.

We seek engagement with the public, but don’t immerse ourselves in the outcomes of a report that tells us what the public thinks. We are staunch advocates of the scientific method and an evidence-based approach, but don’t properly take into account rigorously conducted research that could inform and develop our own professional practice.

Do you agree? Have you read the report? What do you think?

Join the debate...