by Katherine Mathieson, Chief Executive, British Science Association


At the British Science Association (BSA), we’re big supporters of using research and evaluation to improve the impact of our work. This includes our long-standing CREST Awards programme for 5 to 19-year-olds, which is designed to enable more young people to see science, technology, engineering and maths (STEM) as part of their culture and everyday lives.

In 2015, an independent analysis arranged by Pro Bono Economics found that students who completed Silver level CREST Awards achieved higher GCSE Science scores than students with similar characteristics (including gender, prior attainment, and entitlement to Free School Meals) who hadn’t done CREST Awards.

As far as we know, this was the first time an extracurricular STEM enrichment scheme had been shown to correlate positively with external exam results in UK schools. However, the study couldn’t answer what caused the attainment differences: it might have been the experience of doing the CREST Award, or it might have been something else, such as a more enthusiastic teacher.

This study’s findings prompted the Education Endowment Foundation (EEF) to commission a randomised controlled trial of CREST Awards. The purpose of this trial was to find out whether CREST itself, or some other factor, was causing the difference. To do this, students would be randomised to “CREST” or “no CREST” arms of the trial. The EEF commissioned the research agency NatCen to run the trial and analyse the results, and provided the BSA with funding to recruit schools and give supporting grants to the schools that chose to get involved.

The trial is still ongoing. The students have done their CREST Awards, but the effects on their GCSEs won’t be known until those students have finished their GCSE courses. In the meantime, NatCen has published an interim analysis which includes their findings about changes in attitudes and science knowledge.

NatCen’s study found that students and teachers said the CREST Awards had a positive effect on student confidence and attitudes to school. Teachers and Heads valued the hands-on aspect of CREST, the opportunity to get formal recognition of students’ efforts, and the programme’s strong reputation. Students and teachers reported that taking part in CREST made science topics more interesting and relevant for students, and that the students learned transferable skills like problem-solving, time management and teamwork. Teachers said they were more likely to incorporate inquiry-based learning methods (which underpin the CREST approach) in their regular teaching. Most teachers who took part (85% of survey respondents) said they would recommend CREST to other teachers.

As expected, there was no change to students’ science knowledge. CREST Awards cover a very wide range of subjects, including many topics outside the science curriculum, so it’s unlikely that a CREST Award would improve performance on a standard science knowledge test. Also, the test was given to students before they had finished their CREST Awards and before their increased engagement would have had time to affect their science attainment.

Some concerns have been raised about the way the trial was conducted, and we hope these can be resolved – either before the final analysis is completed or, if it’s too late for that, to help design and carry out similar trials in future. Resolving them will enable the trial to answer the original question of what causes the difference between CREST students and non-CREST students: is it some aspect of the CREST Award experience, or is it something else?

One of the challenges has been deciding what counts as ‘doing a CREST Award’. For the purposes of this analysis, the researchers simply asked students whether they had done a CREST Award; those who said yes were included in the ‘intervention’ arm of the trial. But most of those students had not submitted their project to the BSA for assessment, so it’s not possible to know whether they had the same experience, or achieved the same level, as other CREST Award students – such as the approximately 50,000 students who achieve a CREST Award each year.

There are a few other anomalies in this trial. For unknown reasons, the drop-out rate was higher than in other education RCTs – and much higher in the ‘intervention’ group, where over a quarter of students dropped out. A possibly related factor is that CREST Awards are, of course, available to anyone who wants to do them, so some of the teachers or students in the ‘control’ arm of the trial might have done CREST Awards anyway. Although this was prohibited in the initial conditions that participating teachers signed up to, teachers may have forgotten, or been confused by the different CREST levels and topics that are available – for example, four students in the control arm inadvertently revealed that they had done CREST Awards in their responses to the researchers’ attitudes survey.

This interim report raises some interesting questions that might point the way for future research. For example, there is a hint that students who do CREST Awards outside lessons might see a bigger benefit to their science attainment than students who do CREST Awards within lessons. We are currently testing this hypothesis through a Gatsby-funded project, as part of their Practical Science programme. There are also hints of self-selection: students who wanted to take part in the trial seemed more likely to share other characteristics too. And although students and teachers reported that CREST led to increased student confidence, statistical analysis of the students’ survey responses found the difference was too small to be statistically significant – so more work may be needed to understand what affects student confidence.

Schools with more experience of running STEM engagement activities found CREST easier to run than schools with less experience, suggesting that, as a sector, we need to focus the most resources on the schools with the least experience (not the schools with the biggest appetite or the most disadvantaged students).

These and many other fascinating and important questions will be the subject of future studies. We at the BSA look forward to seeing the full findings of this trial – and of many others – as they offer us a valuable tool for improving, updating, sharpening, understanding, and maybe even ceasing the programmes we offer as part of our goal for more young people to see science as part of their culture and lives.

While we await the final report from this trial, there are several steps we will take in the meantime:

  1. We need to do more to support schools with the least experience (and we will use our networks to encourage others in the STEM sector to do so too).
  2. We will ensure that the CREST submissions process is as supportive as possible in enabling all teachers and students to complete and submit their STEM projects for CREST Awards, and we will explore the reasons for attrition.
  3. We aim to investigate whether – and why – doing CREST Awards outside lessons has a bigger impact on attainment than doing them as part of lessons.
  4. We will strive to learn more about what affects student confidence.

We are committed to making continual improvements to our programmes and projects, and will use the findings from this study and others to help us improve the CREST model so it can benefit more students in future. To continue the discussion, get in touch: [email protected] or @CRESTAwards