researchED 2015 – factchecking claims isn’t just about accuracy
Thursday 24 September 2015
At the ITV general election leaders’ debate back in April, Nick Clegg claimed:
“If we want to make sure that our own youngsters get the jobs…we’ve got to train them up. Over the last five years we’ve got two million more people starting apprenticeships”.
He’s right that there was an increase of two million, but these new apprentices don’t necessarily represent better qualified youngsters. Break the data down by age and the biggest increase in starts was among those aged over 25, who accounted for four in ten of the new starts. Apprenticeship starts for the over 25s more than tripled over the period, while starts for the under 19s rose by just 3%.
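To see how that kind of breakdown works, here’s a minimal sketch with invented figures: only the overall pattern (over-25 starts more than tripling, under-19 starts up about 3%) mirrors the real data, and the absolute numbers are made up for illustration.

```python
# Illustrative only: invented figures showing how the same headline rise in
# apprenticeship starts looks once broken down by age. The absolute numbers
# are made up; only the pattern of growth rates follows the article.

starts_before = {"under 19": 100_000, "19-24": 110_000, "over 25": 45_000}
starts_after  = {"under 19": 103_000, "19-24": 160_000, "over 25": 150_000}

print(f"Overall: {sum(starts_before.values()):,} -> {sum(starts_after.values()):,} starts")
for group, before in starts_before.items():
    after = starts_after[group]
    print(f"  {group}: {before:,} -> {after:,} ({(after - before) / before:+.0%})")
```

The headline total rises sharply, but the growth is concentrated in the older age groups, which is exactly the detail the aggregate figure hides.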
Full Fact, the independent factchecking charity, worked with NFER in the run-up to the election to factcheck claims just like this. Alongside Full Fact’s own model of live factchecking, we produced impartial briefings for voters on the issues we thought would be important to them in deciding their vote. As is often the case in education, the inspiration came from another country, in this case Norway, where the national statistics body publishes neutral briefings on key topics in the run-up to general elections.
At this year’s researchED conference, we gave attendees a toolkit of things to look out for when checking claims, based on some of what we worked on during the election.
Factchecking claims isn’t just about the accuracy of the specific statistics being referenced. Politicians—and others—often use statistics as shorthand for an entire argument. By factchecking these individual claims we seek to promote an informed debate and expose these wider arguments to greater research scrutiny.
One useful factchecking tool is to ask yourself what question a statement is answering. The Education Secretary Nicky Morgan commented in April that “academies are outperforming the old council-run schools”. On the face of it, the question this answers is: how ‘good’ are academies? Refining it based on the statement, the question becomes something like: how do the exam results or Ofsted judgements of academies and local authority schools compare? But over two thirds of academies became academies because they were already performing well (they were ‘converted’ to give them more freedom), so it wouldn’t be surprising if they had particularly good results. The real question we might want to ask is therefore: is the academies model more successful than the local authority model? Did converting to an academy make these schools perform better than they would have done otherwise?
We can’t measure something that didn’t happen, so that question is difficult to answer directly. What we can ask instead is: have academies improved more than similar schools which didn’t become academies? That gets much closer to the answer we really want.
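As a rough illustration of that kind of comparison (a simplified sketch, not NFER’s actual methodology, with invented schools and scores), you could match each academy to the non-academy with the closest starting point and compare how much each pair improved:

```python
# Simplified sketch of a matched comparison; the data and the matching rule
# are invented for illustration. NFER's published analysis is far more
# careful about how 'similar' schools are chosen.

# Each school is a (prior_score, later_score) pair, e.g. a % benchmark measure.
academies = [(42, 55), (38, 50), (60, 63)]
non_academies = [(41, 47), (39, 44), (58, 62), (70, 72)]

def extra_improvement(treated, comparison_pool):
    """Mean improvement of treated schools beyond their closest-matched peer."""
    gaps = []
    for prior, later in treated:
        # Match on starting point so we compare like with like.
        match_prior, match_later = min(comparison_pool,
                                       key=lambda school: abs(school[0] - prior))
        gaps.append((later - prior) - (match_later - match_prior))
    return sum(gaps) / len(gaps)

print(f"Academies improved {extra_improvement(academies, non_academies):+.1f} "
      "points more than matched similar schools")
```

The design choice that matters is the matching: comparing each academy with a school that started from a similar position, rather than with all schools at once, is what separates “academies have good results” from “becoming an academy improved results”.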
Analysis by NFER gives us the answer we’re really after. In 2013, secondary sponsored academies (which are mostly low performing before becoming academies) did seem to improve more than similar local authority schools over a period of two to four years, while secondary converter academies (which are mostly high performing before transferring to academy status) generally didn’t seem to improve any more than similar schools. The picture for sponsored academies was more nuanced in 2014, when the findings were also affected by changes to the way GCSE performance is measured. So while the Education Secretary’s claim might have been accurate, it doesn’t mean the academies model is a definite success. There’s more detail on these comparisons and other findings in our election briefing and on NFER’s website.
Another question to ask is whether a claim is based on a comparison of like with like. Labour said in their manifesto that they would “end the wasteful and poorly performing free schools programme”. But we can’t reliably compare the performance of free schools with that of local authority schools. Very different numbers of each kind of school have been inspected by Ofsted: just 136 free schools, against over 17,000 local authority schools. We likewise have very little performance data: only 21 free schools have published key stage two results, and only 10 have published GCSE results.
On top of the small numbers, we only have inspection results for just under half of the 304 open free schools, so we don’t know whether those results will be typical of the remaining half; by contrast, we have results for a large proportion of local authority schools. Free schools also receive an Ofsted judgement regardless of whether they have pupils in every year group, and many don’t yet, because they’re new schools and tend to fill up year by year. So their judgements aren’t based on the experience of pupils in every year group, whereas those for other schools are.
So it’s too early to say how well free schools are performing compared to other schools.
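To see why those sample sizes matter, here’s a minimal sketch of the uncertainty involved. The ‘good or better’ rate used below is invented purely for illustration (only the sample sizes come from the figures above); the point is how much wider the margin of error is for 136 inspections than for 17,000.

```python
import math

def margin_of_error(share, n, z=1.96):
    """Approximate 95% confidence half-width for a proportion
    (normal approximation to the binomial)."""
    return z * math.sqrt(share * (1 - share) / n)

# The 70% rate is invented for illustration; only the sample sizes
# (136 free schools vs over 17,000 local authority schools inspected)
# come from the article.
for label, n in [("free schools", 136), ("local authority schools", 17_000)]:
    moe = margin_of_error(0.70, n)
    print(f"{label}: a 70% 'good or better' rate would carry a margin of "
          f"roughly +/-{moe:.1%} with n={n:,}")
```

With 136 inspections the estimate could plausibly be several percentage points out in either direction, which is one more reason a like-for-like verdict on free schools is premature.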
For us, the 2015 general election project was just the start. Now we’ve shown the model works (we successfully rebutted and corrected claims from the major parties and newspapers, and achieved an on-air correction on Newsnight within an hour of a claim being made) we want to make it bigger and better for 2020. If you’d like to find out more about our work in the meantime, visit us at fullfact.org or follow @FullFact.