It’s been a month since I started work at the What Works Centre for Children’s Social Care, and a month since I stated my intention to “Start as I mean to go on”, so I thought it was a good time to check back in. As my one-month anniversary at the WWC lines up with Valentine’s Day, it also feels appropriate to say that I’m loving it.
One of the most common questions I get asked when speaking to practitioners, policymakers and families is, in various forms, “What is this what works thing?”. I suspect that each of the other what works centres has its own definition, but here is mine.
For something to be “What Works”, for me, it needs to tick four boxes: impact, nuance, usefulness and empowerment.
For something to be “What Works”, it needs to give us a sense of impact. This means it has to answer questions like “If I pull this lever, what will happen?” in reasonably precise terms. With this as our goal, we’re interested in research that lets us talk about the effect of one thing on another – and not just the correlation between two things. To this end, the What Works network is very supportive of Randomised Controlled Trials (RCTs), which are the “Gold Standard” for answering this kind of question, and which let us say things like: “When young people are given breakfast at school, their grades go up by five marks on average”.
Our enthusiasm for RCTs should not be mistaken for zealotry, however. In looking for impacts, we’ll sometimes need to make use of what are called “Quasi-Experimental” approaches – which use more elaborate statistical techniques to get at the same answers as we’d find from an RCT – because it might be quicker and easier, particularly once a programme has already been rolled out. Very often a quasi-experimental approach will make a good prelude to an RCT.
We also need to recognise that an RCT is not always a good place to start – if a programme is new, a lighter touch pilot study, like those that we’re running for our change projects, is an important step on the road to impacts, to help partners understand how to run the project effectively. Which brings us to…
At its most basic, an RCT produces a statistical estimate of the size and direction of the impact of something – a 5% increase in this, or an 8% reduction in that. This is great, but it obviously misses a lot of the story underneath. What really is that something whose effect we know? How was it implemented, what level of support did it have, was it more effective for some groups than for others? If you wanted to take an intervention and try it somewhere else, you’re going to need to know a heck of a lot more than just the headline result. For something to really be “What Works”, the “What” part of that question needs to be as important as the “Works” part, and that means that the quantitative RCT should (almost) always have qualitative and process evaluation built into it and standing tall beside it when we come to talk about the results. The world is complicated, and we ignore that at our peril.
The world is complicated, but telling practitioners, managers and policymakers that isn’t giving them anything new and, importantly, isn’t giving them anything they can use. This is quite a simple point – if you’ve done a piece of research and you can’t identify who could actually make use of its findings, it isn’t What Works.
Finally, and most importantly, our work and our research need to be empowering. This movement isn’t about ivory tower academics, or Whitehall policy wonks, telling children’s social workers how to do their job through a lens of science. Whether you’re a social worker deciding what early help a family might benefit from, a manager trying to decide how to support your staff, or a Director of Children’s Services deciding how to structure your service, empowerment is about making sure that there’s a strong evidence base to help you make those decisions armed with as many facts as possible. You can always ignore us, and take your own road, but if you ever need us, we’ll be here.