People & Science

A publication of the British Science Association

22/12/2014


Wising up

Research from the Sciencewise Expert Resource Centre has taken a magnifying glass to the problems of engaging experts in public dialogue, evaluating engagement and running large-scale dialogue. Suzannah Lansdell, Diane Warburton and Pippa Hyam report their results.

Successful scientific developments

Suzannah Lansdell

Although some public dialogues have brought scientists together with the public, most still treat the interaction as largely ‘top down’ rather than as a two-way conversation.


There are very few examples of processes in which experts are an integral part of the dialogue, taking part alongside public participants rather than simply imparting information for the public to use in their deliberations. This is an area of future opportunity, but to get it right we need to make that role explicit in the aims of the dialogue, and we need to give experts the skills to undertake it.

My research developed some key practical tips for making sure that the role of experts is fully considered in the planning and design of public dialogue. Little of it is rocket science, but following these tips will make for a richer experience for the project, the experts and the public alike.

Core steps

The research identified twelve core steps running through a dialogue project, from inception through to dissemination and evaluation.

The first step, at the beginning of a project, is to assess the context of the issue under deliberation. This will inform the sort of expertise you may need to engage, the mechanism by which you engage experts and the role you might want them to take.

Expert views alone may not represent the full range needed. You may need, for example, people with experience of a condition or service as well as scientists, academics and other stakeholders.

Supporting experts

Another step is to invest time in briefing and supporting experts. Evaluations show that this is time well spent. Recognising that, for many experts, public dialogue is not a familiar environment, you need to cover what is required, how the day or session will look and feel, how they should present material and why it is being asked for.

You also need to follow up, evaluate and continue to engage with the experts after the dialogue. Dialogues should respect the engagement of experts and, as a minimum, keep them informed of developments and evaluate their experiences.

Commissioners and deliverers of public dialogue cited the difficulty of engaging experts. Taking part is not currently something that enhances a scientist’s career; in fact, it can be seen as detrimental. A key challenge in sustaining and improving future expert involvement is finding ways to formally and informally recognise the part experts play in public dialogue.

The report is available at www.Sciencewise-erc.org.uk/cms/strategic-work-streams/

Evaluating engagement

Diane Warburton is plugging the gaps

Evidence Counts is new research from Sciencewise-ERC on evaluating public engagement.1  It summarises the existing evidence from evaluations, and shows that the benefits include improving policy and policy making. Public engagement can strengthen and enrich the evidence base for policy design by incorporating public knowledge and experience, thus enabling policy solutions to be more relevant and robust. Engagement also helps make policy making more transparent and accountable. 


Engagement has also helped scientists and policymakers understand where there is public support for difficult decisions, and where the public draw the line. As Professor Kathy Sykes, Chair of the Sciencewise-ERC Steering Group, has pointed out, this knowledge helps politicians, scientists and policymakers to be both 'braver and wiser' in their decision-making.

Benefits and gaps

Evidence from evaluations also shows that engagement can spread understanding and awareness, even of complex scientific and technical issues, not just among public participants but among the family, friends and colleagues they talk to afterwards. Also, participants are nearly always very positive about the experience of taking part in dialogue. They become more enthusiastic about future engagement, which is crucial for wider benefits such as strengthening democracy, social capital and social cohesion.

The research identifies two major gaps in the evidence on public engagement, both around economic value. The first is identifying the financial benefits of public dialogue, and whether it can in practice save costs and increase benefits in the longer term through easier, quicker and better policy dissemination and implementation. The second is the extent to which public dialogue can be shown to be cost effective. The research reviews previous studies but finds no agreement on how, or even whether, to assess costs and benefits, and little evidence on the financial costs, risks or negative impacts of dialogue.

New framework

Evidence Counts aims to fill these gaps by providing a new, four-stage framework for evaluating public dialogue. It includes templates for collecting data and analysing the balance of costs and benefits through qualitative and comparative assessments that link to a new set of 12 criteria for assessing quality and impacts.

The aim of the new framework is to enable future evaluations to gather evidence to test the economic value of public dialogue, which is likely to be especially important in the current economic climate, and in planning cost-effective public dialogue in future. In the short term, Sciencewise-ERC will use the framework to evaluate the projects it funds, and work with partners to develop the approach further.

1 Diane Warburton, Evidence Counts. Understanding the value of public dialogue. Available at www.Sciencewise-erc.org.uk/cms/strategic-work-streams/

Engagement: large or small?

Pippa Hyam sifts the evidence

Recent engagement practice suggests that processes involving very large numbers of participants are on the rise. We reviewed policymakers’ motivations for ‘going big’ and aimed to establish under what circumstances bigger is better, in order to provide them with some guidance.

To provide value for money, (expensive) large-scale processes should achieve substantially better results than small-scale processes. But do they?

Opinion research or dialogue


If the purpose is to do sophisticated opinion research (or deliberative research), large-scale engagement is unlikely to be the appropriate solution. If the purpose is to achieve decisions based on common understanding, increasing the number of participants may well be beneficial.

The rules that apply to social or opinion research do not apply to public dialogue. Many argue that involving greater numbers makes findings more robust. However, in interactive processes, the added value of more participants rapidly decreases above a number that allows for a fair degree of diversity, which can be as low as 30 people. Bringing in more people hardly ever reveals important issues that would have otherwise been overlooked.

Influencing each other

There is also limited value in recruiting participants to make up a sample that is representative of the population as a whole, often a stated reason for up-scaling a process. Indeed, 1,000-plus carefully selected individuals may reflect the views of a broader public if they respond individually, but once they gather in a room, mutual influencing instantly undermines the original distribution of views, and with it the value of the meticulously compiled sample. Peer-group challenge and influence is a critical element of genuine public dialogue, so market research conventions and good public engagement practice may come into conflict at this scale. Our work suggests the principles of dialogue are more important than the need for representative samples.

This is not to say we disapprove of all large-scale engagement. There are clearly examples of, and opportunities for, citizens and government influencing each other, especially in the early stages of a decision-making process, and there are a number of models for up-scaling at these ‘scoping’ and development phases. Involvement of this kind is likely to generate buy-in from participants as well as an understanding of dilemmas and trade-offs. Besides obtaining support from more citizens, up-scaling some types of engagement process enables people in more regions to participate, enhances the potential for reaching people beyond those who actively take part and supports the development of active citizenship.

The report is available at www.Sciencewise-erc.org.uk/cms/strategic-work-streams/

Suzannah Lansdell
Suzannah Lansdell is a Dialogue and Engagement Specialist with Sciencewise-ERC and a stakeholder engagement adviser and facilitator
Diane Warburton
Diane Warburton is Evaluation Manager for Sciencewise-ERC and a Senior Partner in Shared Practice
Pippa Hyam
Pippa Hyam is Process Director of Dialogue by Design