Last week we were at the National Children and Adult Services Conference in Bournemouth. As well as attending the conference and talking with people visiting the stand of our partner, the Social Care Institute for Excellence, we were pleased to host a session on the morning of the last day of the conference.
The session, which was about bringing children and families closer to decision-making in social work, focused on our two pilot projects – Social Workers in Schools and Devolved Budgets – whose interim evaluation reports we published back in August, and whose final reports will be published in March.
In the session, three speakers with first-hand experience of the projects – Stockport’s Chris McLoughlin, Lambeth’s Michelle Hayden-Pepper, and Hillingdon’s Julie Kelly – each gave their own insights into how the projects are going so far, and each shared powerful stories of the work that’s being done.
In Hillingdon, one of our devolved budget authorities, the additional budgets made available are being used for a huge variety of purposes – from a meal at Nando’s, to home improvements, to group activities like a stay on a farm to engage young people and give their parents a bit of a break.
In Lambeth, meanwhile, social workers placed in schools are able to build relationships with young people and teachers, and to demystify social work by making social workers just one of many professionals that young people see in their lives.
Alongside the powerful stories from Michelle and Julie, we were also able to share a short video of reflections from young people involved in the pilot, whose experiences of their social workers involving them in decisions were very positive.
After the warmth and positivity of the session, and with all three speakers saying that they “wouldn’t go back” to their previous ways of working, one audience member asked in the Q&A whether the evaluation is really needed.
This is a good question, and an important one to ask. Having been an evaluator for most of my adult life, I know it can sometimes be tempting to say that things need evaluation without thinking about why. Here are a few reasons, which I hope I managed to convey at the conference:
Lots of things don’t work
There are far more excellent ideas than there are effective interventions. Many ideas for interventions, loved by their creators and liked by the people who receive them, simply don’t have the effects we expect, because the world is a complex and messy place where sometimes other things get in the way. Without evaluation, it’s hard to sort the excellent and effective ideas from the merely excellent ones.
Beyond the binary ‘does it work or doesn’t it’ question, there are questions of whether something works better than an equally promising alternative, for whom it works, and when. Evaluation is needed to help us answer these sorts of questions.
There are different forms of evidence
At the conference, we presented the qualitative insights of people leading an intervention, and the impressions of some young people. These are valid and important forms of evidence – if the young people hated the intervention, or the social workers resented and resisted it, we might well think twice before continuing. But they are not the only forms of evidence. Good quantitative analysis that tries to establish the causal impacts of an intervention is needed too. Different people will be convinced by different forms of evidence, and sensible policy should consider the widest possible set of information.
Things aren’t as obvious as they seem
It might seem obvious that Devolved Budgets and Social Workers in Schools ‘work’, but in reality it’s a little less clear-cut. First, each of the three local authority partners involved in each project is taking a very different approach – something we might not expect to see if it really were ‘obvious’ that the projects worked in a particular way. Relatedly, after the session a Director of Children’s Services from a local authority outside our pilots came up to me to express her concerns about the project, and to make sure that our evaluation was looking for some key risks that she thought might arise. We’re speaking again soon to make sure that these are picked up, but it was very far from obvious to her that these pilots are a good idea.
Independence is important
With the best will in the world, we can’t claim to be completely impartial observers of the things we’ve come up with ourselves. That’s why we commission independent evaluators – in the case of these projects, Cardiff University. In some cases, especially for more formative interventions, full independence might not be possible or sensible, but it should be our aspiration if people are to believe that something really works.
Six is a small number
The pilots are currently being run in six local authorities – three for each of the programmes. That leaves 146 local authorities that aren’t doing either and, for each programme, 149 that aren’t taking part in it. Without evaluation that’s able to draw together the data, the story of what’s happening, and which elements work best, effective ideas might struggle to make it to scale.
We hope that the findings of these evaluations, which we will share in March 2020, will provide food for thought for our colleagues in children’s social care, and will ultimately help improve outcomes for children, young people and families.