Yet over the last few years, the proportion of outstanding, good, satisfactory and inadequate Centres has remained very much unchanged. Here are a couple of screenshots showing how it is presented on Ofsted's useful DataView:
Compare that to the outcomes for individual Children's Centres in the last 30 days, and you'll see a very striking difference:
What might be going on? The usual health warnings apply. The DataView statistics are much more robust, covering a year at a time, whereas my graph only relates to 23 inspections in the last 30 days.
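As a rough sense-check on that sample-size caveat, here is a minimal Python sketch (not the calculation behind my graph) of how likely such a lopsided month would be if the inspections were simply a random draw from centres judged at the long-run rate of roughly 70% good or outstanding. The exact split of the 23 recent inspections is not given above, so the figure of 6 good or outstanding judgements is an illustrative assumption.

```python
from math import comb

# A rough sense-check, not the post's actual calculation: if the long-run
# share of good/outstanding judgements really were about 70%, how likely is
# it that a random sample of 23 inspections would contain only a handful of
# good/outstanding outcomes? The value of k is an illustrative assumption,
# since the recent split is only given approximately.

n = 23      # inspections in the last 30 days
p = 0.70    # long-run share judged good or outstanding (per DataView)
k = 6       # assumed number of good/outstanding in the recent sample (~26%)

# Cumulative binomial probability P(X <= k) for X ~ Binomial(n, p)
prob = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))
print(f"P(at most {k} of {n} good/outstanding, if the true rate is {p:.0%}): {prob:.1e}")
```

On those assumptions the probability comes out vanishingly small, so the small sample does not, by itself, seem to explain the gap - though the caveats below still apply.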
Additionally, there seems to be a mismatch between Ofsted's list of recent inspections and the actual availability of reports. To take a random example, the site reports that Hamworthy and Turlin Moor Children's Centre was inspected in the last month, on 17th December:
But when you click through, the most recent available report is from 2011:
So, perhaps there is a lot of missing data in my analysis, and that is distorting the statistics I have generated.
Additionally, there are now two sets of Children's Centre inspections - standalone, individual centres, and groups of centres - because Ofsted has changed its inspection framework for Centres. And because inadequate centres are re-inspected more quickly than the rest, a single month's selection is more likely to show poorer outcomes. So the statistics are not easy to compare, and the results of my Excel spreadsheet might be flawed.
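For completeness, the spreadsheet work behind my graph is essentially a count of judgements by grade. Here is a minimal sketch of that tally in Python, assuming the recent inspection outcomes have been exported to a CSV file; the filename and the "Judgement" column name are illustrative rather than Ofsted's actual format.

```python
import csv
from collections import Counter

# Minimal sketch of the tally behind the graph, assuming the recent
# inspection outcomes have been saved to a CSV file with a "Judgement"
# column. The filename and column name are illustrative, not Ofsted's
# actual export format.
judgements = Counter()
with open("recent_children_centre_inspections.csv", newline="") as f:
    for row in csv.DictReader(f):
        judgements[row["Judgement"].strip()] += 1

total = sum(judgements.values())
for grade, count in judgements.most_common():
    print(f"{grade}: {count} of {total} ({count / total:.0%})")
```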
All the same, it is a very striking difference. For three years, roughly 70% of Children's Centres have been judged good or outstanding; now more than 70% are judged to require improvement or to be inadequate. It seems unlikely that Children's Centres have got so much worse in such a short period of time, which raises a simple question: what is the explanation for this dramatic change?