People & Science

A publication of the British Science Association

01/09/2014


Bad science and bad politics

Scientists need scrutiny. The mainstream press does a good job of holding those in public office to account. But power, spin and questionable financial practices are not just the preserve of the political elite. Scientific endeavour, combined with ambition and financial and political agendas, can be a potent mix that gives rise to all sorts of poor practice.

Examining Tamiflu

Spin coupled with political and research agendas, for example, can cost a country millions. Back in 2009, during the influenza pandemic, the Cochrane Collaboration, a network of independent academics, was commissioned by the NHS to look at the evidence on the benefits and risks of using Tamiflu, a drug the UK had spent around £500m on to treat all those infected in the outbreak. This shouldn’t have been a difficult task for seasoned researchers.

But when the Cochrane group went about surveying the medical literature, not all of the trials they knew had been conducted on the drug’s effects in healthy people had appeared in the medical press. This prompted the group to ask questions that would be difficult to answer using a traditional scientific approach. Why were the studies unpublished? And who exactly had access to the data? So they enlisted the help of the BMJ and Channel 4 News to track down the data.

Journalism and research

The underlying principle behind good journalism isn’t vastly different from that of academic research. Both professions formulate a question or hypothesis and set about investigating and exploring it from every angle. While the approaches differ, the question underpinning both is: why?

The BMJ has found that, working in tandem, academics and journalists can pool their skills. So when the manufacturer, Roche, produced less reliable observational studies rather than the clinical trial data the Cochrane group wanted, the story for Channel 4 News might have had to end there. However, using resources at the disposal of medical journals, the BMJ turned to an epidemiologist for statistical help. Roche said the observational data it sent showed that Tamiflu reduced complications in otherwise healthy people; our peer-reviewed analysis found the effect on healthy adults was limited at best. Statistics held the key.

Investigative medical journalism

The BMJ has a long history of investigative journalism dating back to the 19th century. The editor at the time, Ernest Hart, published a series of controversial articles describing baby farming, a notorious Victorian practice in which unwanted infants were taken in to be nursed in exchange for payment but were instead neglected and often killed by ‘foster’ parents who profited from their neglect. Fast forward 150 years, and the subject matter of BMJ investigations may not be as gruesome as maimed babies; conflicts of interest, spun science and buried bad news have become the topics du jour.

When investigating the misuse of science, it’s essential that the story is done robustly, particularly when those being investigated have deep pockets and large teams of lawyers. Bad science can’t be debunked by bad science.

There’s a tendency for the media to view medical science as test tubes, bleeping machines and blood tests. I’ve been asked by TV companies how we can ‘show’ or ‘disprove’ something using a single case study. But these TV experiments should only be used to add colour; a sample size of one can only paint a limited picture.

The cross-over between journalism and science has gone further still. The internet has opened up novel sources of information, offering journalists the opportunity to gather data and process it to spot trends and generate stories, an approach that has been dubbed ‘data journalism’. It’s essentially what science researchers have been doing for years.

Enquiry into sports equipment

When the BMJ was told by visiting sports doctors last year that there was much distortion of science in sport, we decided that the only way to disentangle the ‘myths’ peddled by various manufacturers was to take a methodological approach to data journalism. So we teamed up with the Centre for Evidence-Based Medicine at Oxford University to analyse the performance claims made by advertisers for a broad range of sports products in the top 10 UK and US fitness magazines.

We assessed the evidence behind 431 performance-enhancing claims in adverts for 104 different sports products, including sports drinks, protein shakes and trainers.

If the evidence wasn’t clear from the adverts, we contacted the companies for more information. Some, like Puma, did not provide any evidence, while others, like GlaxoSmithKline, the makers of Lucozade Sport, provided hundreds of studies. The latter should be applauded for trying to be transparent. The research formed the backbone of a Panorama programme, which unpicked some of the claims made by the companies in the BMJ/Oxford research.

In summary, only three (2.7 per cent) of the studies the team was able to assess were judged to be of high quality and at low risk of bias. The researchers describe this absence of high-quality evidence as ‘worrying’ and call for better research in this area to help inform decisions.

Hydration dogmas

But how good were these studies? An accompanying analysis dissected the studies that were sent; it found that they were limited in design and that many had methodological flaws. In the ultimate example of science spin, studies had been designed to give a specific outcome. It’s perhaps little surprise that participants in a study perform better when given a drink containing carbohydrate and water if they have been fasted overnight.

So given the limitations of the evidence sent to Oxford, why are the public and athletes plied with all manner of hydration dogmas with such gusto? Prehydrate; drink ahead of thirst; train your gut to tolerate more fluid; your brain doesn’t know you’re thirsty – the list goes on. It’s what the science says, consumers are told.

An accompanying investigation told a familiar story. It found that the scientists advising sports organisations weren’t entirely independent; that those organisations received money from sports drinks manufacturers; that the results of studies had been over-interpreted; and that the resulting guidance had been adopted uncritically by other health groups and policy makers.

As is so often the case with published research, the moral of the story is: don’t read the introduction and conclusion; read the results and methodology instead. Draw your own conclusions and then see whether the researchers’ interpretation has been spun.

Demonisation

Sports scientists who thought the benefits of hydration had been oversold told of bullying and vilification at conferences; they were demonised if they questioned the science behind the received wisdom.

Shortly after publication, the investigation was attacked on Twitter by the likes of the American Beverage Association and sports scientists, who claimed it was biased. We were all asked by an angry Canadian professor of kinesiology (the scientific study of human movement) to produce our conflict-of-interest statements. He couldn’t believe that we might not have been on the payroll of someone somewhere. It is amazing how controversial a blend of sugar and salt can be.

So far, the BMJ has ‘investigated’ a range of medical scandals and, without wanting to sound nihilistic, there seems to be a pattern. It includes issues that you might expect to see in the mainstream press (with the science equivalent in brackets): buried or fabricated reports (data); bad news going unannounced (unpublished studies); financial conflicts of interest; ambition; lobbying; spin in reports (research papers and accompanying promotional material); and competition for re-election to Parliament (ever-decreasing research budgets), to name but a few.

So just as the principles underpinning journalistic and scientific inquiry are not a million miles apart, scandals involving science are not so very different from a front-page political story.

Deborah Cohen
Assistant editor of the BMJ. Find her on Twitter @deb_cohen