The fourth estate are a little annoyed with the DCSF

Does the DCSF publish school league tables? No, they don’t. The media do. And they seem to have got their collective support knickers in an unjustified and disingenuous twist.

It is a mark of the incredulity and anger among education correspondents that they have collaborated in getting their organisations to write to the DCSF and the UK Statistics Authority.

A joint letter of complaint to DCSF chief statistician Malcolm Britton has been signed by senior editorial figures at the BBC, national newspapers and the Press Association news agency, whose tables are used by numerous regional and local news organisations.

And here’s the best bit …

With less than 24 hours’ preparation time, it will be much more difficult to produce any meaningful analysis of the information and to ensure there are no errors.

Doh! Why not take a little longer than 24 hours then? Why not take a week? Or a month?

Oh. I get it now.

There is a publication race to get the ‘best’ league table supplement out, which no doubt has a bearing on newspaper sales/website hits.

This complaint has very little to do with parents’ access to the information and much more to do with Sunday supplements. I don’t believe that quality of data is necessarily the real issue. I think they are more concerned that the level playing field a week’s embargo used to bring to the publishing deadlines has been blown apart by this initiative. Now they are all worried that their competitors may steal a march on them, rush out a supplement very quickly, and grab a large tranche of the sales that would normally follow.

But I think we also need to ask ourselves a serious question:

Do examination results data actually tell us anything useful about our schools?

I will always contend that they do. And actually this is one area where the DCSF (and I’m not a great fan of theirs) and its predecessors have done a lot of genuinely useful work.

Because they don’t just publish the raw results data. They actually add value to the results data by analysing the results against a number of indicators. These include:

  • prior attainment
  • gender
  • special needs
  • ethnicity
  • Income Deprivation Affecting Children Index (IDACI), a postcode-based deprivation index.

It is these value-added scores that give real meat to the data. And they are quite often ignored by the press, because they are only available for the state sector. And, as everyone knows, newspaper analyses are usually about saying how badly state schools are doing compared to the independent sector.
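The idea behind those value-added scores can be sketched in a few lines: predict a school’s results from its contextual indicators, and treat the residual (actual minus predicted) as the value the school adds. This is a minimal illustration with entirely made-up data and a simple linear model; the department’s real contextual value-added methodology is far more elaborate.

```python
import numpy as np

# All data below is synthetic; indicator names echo those in the post
# (prior attainment, deprivation, special needs) but the numbers are invented.
rng = np.random.default_rng(42)
n_schools = 200

prior_attainment = rng.normal(50, 10, n_schools)   # mean intake score
deprivation = rng.uniform(0, 1, n_schools)         # IDACI-style index
pct_sen = rng.uniform(0, 0.3, n_schools)           # share of pupils with special needs

# Synthetic exam results driven by the indicators, plus school-level noise
results = (0.8 * prior_attainment
           - 15 * deprivation
           - 20 * pct_sen
           + rng.normal(0, 3, n_schools))

# Fit a linear model: expected result given the school's context
X = np.column_stack([np.ones(n_schools), prior_attainment, deprivation, pct_sen])
coef, *_ = np.linalg.lstsq(X, results, rcond=None)
predicted = X @ coef

# Value-added: how far each school beats (or trails) its contextual prediction
value_added = results - predicted
print(round(float(value_added.std()), 2))
```

The point of the residual is exactly the one made above: a school in a deprived catchment can post modest raw results yet a strongly positive value-added score, which is why ignoring these figures flatters the independent sector.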

What is even more interesting is what analysis the department doesn’t do, or at least doesn’t publish. Factors such as:

  • school size (do small secondaries perform better than large ones?)
  • religious affiliation (is it true that faith schools are better?)
  • elapsed time since last Ofsted visit
  • new headteacher (or other senior leaders)
  • recent capital investments
  • teacher/pupil ratio

all would, could and should have a bearing on results. Much of that information is available either implicitly or explicitly from the DCSF’s own data. It would be very interesting to see what Freakonomics-like trends could be extracted from the data in the hands of an uninvolved analyst.
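To show what such an analysis might look like, here is a minimal sketch testing one of the questions above: does school size correlate with results? The data is entirely synthetic; a real analysis would join the department’s own school-level datasets instead.

```python
import numpy as np

# Hypothetical illustration only: synthetic schools with a size and a result.
rng = np.random.default_rng(0)
n = 300
school_size = rng.integers(400, 2000, n)   # pupils on roll (made up)
results = rng.normal(55, 8, n)             # mean exam score, generated independently of size

# Pearson correlation between size and results
r = float(np.corrcoef(school_size, results)[0, 1])

# Because the synthetic results were drawn independently of size,
# the correlation here should be near zero; real data might differ.
print(round(r, 2))
```

The same three lines of correlation code would answer the faith-school and pupil–teacher-ratio questions too, which is rather the point: the barrier is access and inclination, not analytical difficulty.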

So when the teaching unions complain about the inequity of the league tables, what they are implying is not that the government should stop publishing league tables (which it doesn’t publish anyway) but that it should stop publishing the data, full stop.
