
“With data, as with social work practice, context is key.”

Having talked last month about his love of his new role at the What Works Centre, Michael changes tack to discuss something he seriously dislikes: the misuse of data and statistics.

Statistics hold a privileged place in many people’s minds – either a positive one, because a graph or a figure lends an air of “truth” to a piece, making it seem more robust or more serious, or a negative one, because they adhere to Disraeli’s belief that there are three kinds of lies: “Lies, Damned Lies, and Statistics”.

For me, the truth is somewhere between the two viewpoints. Statistics are a powerful and valuable tool for understanding the world around us, and for surfacing the voices of those who cannot otherwise be heard. Without statistics, we wouldn’t know that cigarettes or asbestos cause cancer. We wouldn’t know, for example, that children in care make better educational progress than children in need, or that the earlier, and longer, a child is in care, the better their educational attainment tends to be – both conclusions from the Rees Centre’s work for the Department for Education last year.

The power of statistics is limited, however, by the data available, and our ability to understand it. Most data are descriptive, which is valuable, but cannot tell us what to do to improve the world without a nuanced understanding of the lived reality behind the data, and theories about the way the world works.

So, I’m a fan of data and statistics, but recognise their limitations. These limitations are much greater, however, if they’re misused, or misunderstood, by the people using or interpreting them.

Earlier this month, The Sunday Times published an article drawing a link between young people in pupil referral units (PRUs) and violent crime, and arguing that PRUs are therefore ‘schools for crime’. It is certainly the case that young people educated in PRUs are more likely to commit crime than young people not in PRUs, but the report conflates correlation with causation – and it is very likely that third factors lead both to violent crime and to ending up in a PRU.

Interpreting correlational data as causal means we miss information about what’s really going on – and the same logic would let us draw a whole range of other, equally unjustified conclusions. Children in PRUs are about four times as likely to be in care as the general population, so could we say that PRUs cause children to be taken into care? More facetiously, 78% of children in PRUs are male. Do PRUs cause maleness? Obviously not.
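The confounding at work here can be sketched with a toy simulation. All of the numbers below are invented for illustration – they are not the real PRU or offending rates. A hidden “adversity” factor raises the chance of both PRU placement and later offending, while PRU placement itself has no causal effect at all; a large correlation still appears.

```python
import random

random.seed(1)
N = 100_000  # simulated young people

pru_offend = pru_total = other_offend = other_total = 0
for _ in range(N):
    # Hidden third factor: affects BOTH outcomes. All probabilities invented.
    adversity = random.random() < 0.20
    in_pru = random.random() < (0.30 if adversity else 0.02)
    offends = random.random() < (0.25 if adversity else 0.03)  # note: no effect of in_pru
    if in_pru:
        pru_total += 1
        pru_offend += offends
    else:
        other_total += 1
        other_offend += offends

rate_pru = pru_offend / pru_total
rate_other = other_offend / other_total
print(f"Offending rate, PRU group:   {rate_pru:.1%}")
print(f"Offending rate, everyone else: {rate_other:.1%}")
# The PRU group offends at several times the rate of everyone else,
# even though PRUs cause nothing in this model - adversity does all the work.
```

The correlation is real; the causal story read into it is not. Swapping which outcome you condition on would “show”, just as spuriously, that offending causes PRU placement.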

Another major challenge in using statistics well is understanding the limitations of the data. A report that got attention towards the end of last year found that the number of children taken into care at birth had doubled in 10 years. This is a striking statistic, but without understanding the level of need in those cases, the number of referrals coming in, and a multitude of other factors that sit outside the data used, it’s not clear what conclusion we should draw: whether this should be lauded as early intervention to avoid harm, or condemned as unnecessary state intervention in family life.

Similar statistics, on the rate of progression to university for care leavers, paint a bleak picture: only 6% attend, compared to almost 50% in the general population. Careful analysis by Neil Harrison, from Oxford’s Rees Centre, however, shows that this figure is misleading because of the data it omits, and that the real figure is more like 12% – still low, but twice as high.
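How can a rate double just by adding back omitted data? A toy calculation shows one possible mechanism – the cohort size, the numbers, and the age cut-offs below are all invented, not Harrison’s actual figures: if a measure only counts people captured within a narrow window, everyone who qualifies outside that window vanishes from the numerator.

```python
# Invented illustration of an omitted-data effect on a headline rate.
cohort = 1_000            # hypothetical cohort of care leavers
counted_early = 60        # captured by the narrow measure
counted_later = 60        # qualify, but fall outside the measurement window

narrow_rate = counted_early / cohort
fuller_rate = (counted_early + counted_later) / cohort
print(f"Narrow measure: {narrow_rate:.0%}; fuller measure: {fuller_rate:.0%}")
```

Same cohort, same people, two defensible-sounding headline figures – which is why the definition behind a percentage matters as much as the percentage itself.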

Finally, we need to remember that ‘what gets measured gets done’, and the distorting effect that statistics can have if carelessly used. Because “white working class boys” perform worst of all groups educationally, we’ve seen discussion, and research, shift towards this group and away from “white working class girls”, who perform better, but not by much. If data about one group – looked after children, for example – are easier to come by than data about others, such as young people with child protection plans, then we run the risk of focusing our attention where the data quality is highest, rather than where the need is greatest.

Statistics can help us illuminate the world and make navigating it easier, but we must always remember their limitations, and work to overcome them.