
November 8, 2019

We write this as the UK is in the midst of yet another General Election, with conflicting opinion polls abounding. But whenever you’re reading this, you’ll probably have received a bucketload of PR emails by the time you finish. And about half will likely include some kind of research results.

But polls are tricky. How do you know the line in front of you is actually trustworthy, especially when you’re a desk reporter expected to churn out eight or more stories a day?

“The challenge is that even reputable papers will, in the heat of the battle, do things like The Guardian reporting on Ed Miliband moving ahead [in the 2015 polls] by one percentage point,” says Ben Page, Chief Executive of market-research company Ipsos MORI.

Speaking at the launch of new Market Research Society (MRS) and IMPRESS guidance on how journalists can better report polling and survey data, he also points to more recent examples, such as The Telegraph excluding ‘don’t know’ answers to suggest that PM Boris Johnson had public support for shutting down Parliament over Brexit. At the time, he tweeted that their reporting was “taking the piss”.

But while there will always be “editors with a bee in their bonnet”, as Stephen Barnett of the University of Westminster puts it, much of the time getting polls wrong is more subtle than that. It’s about a lack of information in a time-pressured environment. So, what steps can journalists take to try to get it right?

‘It’s Basically PR Driven’

A PR car, driving its way to poll town. (Image Credit: Liam Pozz / Unsplash)

Firstly, the main thing for journalists to realise is that poll and survey reporting goes far wider than just election time. “The fact is, it’s basically PR driven,” explains Stephen. “You’ll get dozens of PR emails into your inbox [every day] and then half of them will have some kind of survey or poll”.

Want to get better at this kind of stuff? You can download the full, free guide from IMPRESS and MRS here, which you’re more than welcome to stick on your wall. Excellent stuff.

Basically, regardless of whether it’s a poll about Jacob Rees–Mogg or something fluffy about tea and coffee, the vital thing is to apply the same degree of scepticism to each.

“I’m not expecting journalists to have the training to be a social scientist,” continues Stephen, “[but think about] who commissioned it – there’s a reason they’re doing it. It’s not easy, but if you can just apply a little bit of critical analysis.”

In almost all cases, the starting point should be the technical note, which should be included with every set of survey or poll findings. If it’s not there, go back and ask for it. It will include things like the sample size, who conducted the poll, how participants were selected, when the poll took place, and how the research was conducted.

The roadmap to reporting polls responsibly, put together by IMPRESS and MRS.

As a general guideline, you essentially want to know:

  • That the poll was conducted by a recognised and reputable organisation. If you’re not sure, you can check if they’re a member of the Market Research Society.
  • That they spoke to at least 1,000 people. If you’re trying to portray public opinion, you want a sample of around this size, because the smaller the sample, the bigger the margin of error (see the sketch after this list). Obviously, some groups you might want to survey have fewer than 1,000 members, such as MPs, but always take a look at the number.
  • That the people they asked are representative of the audience being described – you should see the phrase “representative sample” in the technical note. In short, you wouldn’t get a good sample of over-75s if you only found people through online adverts.
  • That the research wasn’t self-selecting. For a poll to mean anything statistically, the sample needs to be chosen by the researchers; if respondents opt themselves in, the results will be skewed.
  • Whether the questions are biased. This is a statistical term, but it basically means questions that lead people to answer one way or another. Look at things like whether participants had to pick from a list, what options were available, and whether any question was double-barrelled.
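
As promised above, here’s a minimal sketch of how sample size drives the margin of error. It assumes a simple random sample at the usual 95 per cent confidence level – real pollsters use weighting and quotas, so treat the exact figures as illustrative – and the `margin_of_error` helper is ours for demonstration, not a function from any polling library.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error, in percentage points, for a
    proportion p estimated from a simple random sample of n people."""
    return z * math.sqrt(p * (1 - p) / n) * 100

for n in (250, 500, 1000, 2000):
    print(f"n = {n:>4}: +/- {margin_of_error(n):.1f} points")

# n =  250: +/- 6.2 points
# n =  500: +/- 4.4 points
# n = 1000: +/- 3.1 points
# n = 2000: +/- 2.2 points
```

At 1,000 respondents the worst-case margin is roughly plus or minus three points, which is why that figure comes up so often. Note, too, that quadrupling the sample only halves the margin – so a 2,000-person poll isn’t twice as trustworthy as a 1,000-person one.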

It might seem like a lot to look out for, but in reality, all of this should be included in the technical note at the bottom of the release – and if it’s not, then it’s time to start asking questions.

“Pick up the phone,” advises Cordelia Hay, an Associate Partner at Britain Thinks. “Most of the time they’ll be happy to speak to a journalist. It’s in their interests to get it presented accurately,” she adds, stressing that “no pollster wants to be embarrassed.”

“My advice, to be honest, to any journalist, is to think ‘I’ve got the headline, I’ve decided it’s interesting, now send me the tables’,” adds Sir John Curtice, Senior Research Fellow at NatCen Social Research. “That’s the kind of discipline we need.” Whatever you do, make sure you address any worries pre-publication.

A Snapshot In Time

Polls and surveys are just a snapshot in time – they don’t give you the wide view. (Image Credit: Marco Xu / Unsplash)

If you’ve ticked off all of the basic points, you’re already in a pretty good place. However, the main thing is to remember that polls and surveys simply can’t tell you everything. “What we keep coming back to is the tension between the bigger picture and the detail,” says Cordelia.

“[It’s basically] the headline versus what’s really going on. [Polls] don’t translate very well onto seat predictions, and we don’t think that’s said often enough in the public domain. A shock poll makes for a good headline, [but we need] more time and space for nuance”.

It’s also vital to remember that polls only show us one moment in time – they can never claim to show the whole picture. To put this in context, think about a general election poll conducted a month before polling day: there’s little chance many people will even know who they’re going to vote for yet, so it would be unwise to suggest it is a definitive piece of research.

Make sure you understand the numbers. If you’re reporting on polls, know the margin of error and what is actually statistically significant, nail the difference between percentage points and percentages, and understand why it’s unwise to compare results across different polls. If you want more on this stuff, take a look at this from Journalist’s Resource and the Data Journalism Handbook.
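
Here’s a quick worked sketch of two of those checks, under the same simple-random-sampling assumption as the snippet above (the `moe` helper is again ours, purely for illustration):

```python
import math

def moe(n, p=0.5, z=1.96):
    """Approximate 95% margin of error, in percentage points."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# Percentage points vs percentages: a party moving from 30% to 33%
# gained 3 percentage points -- but that's a 10% rise relative to
# where it started.
old, new = 30.0, 33.0
print(f"{new - old:.0f} points, or {(new - old) / old:.0%} relative")

# Comparing two polls of 1,000 people each: the noise on the *change*
# is roughly sqrt(2) times a single poll's margin -- about 4.4 points.
# So a 3-point shift between the two polls sits within the margin of
# error, not a statistically significant movement.
print(f"+/- {math.sqrt(2) * moe(1000):.1f} points on the change")
```

The same logic is part of why comparing polls from different companies is even riskier: on top of sampling noise, each firm weights its samples differently, so apparent movement can simply be a difference in method.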

It can be tricky in a busy newsroom to think beyond what’s in front of you, but if you think something is genuinely interesting, look at how you can move the story on or add context. Could you look at the history of the issue, or are there case studies which help set it out? Could you go to a focus group?

It might sound like a lot to think about, but in reality, it’s a series of simple checks, followed by a look at the wider picture. As Lord Lipsey concludes: “If you just get three points out of [this] then you get rid of 90 percent of problems. There are only a few basics you need to drum in.”

Feature Image Credit: Jason Condriet / Unsplash