Using evidence to improve – sharing our learning to date


Deputy Chief Executive | @JayEeeEnn

We talk a lot about evidence at Dartington, and about using evidence ‘in new ways’. We try to put this in practical terms without too much jargon, but we know it still sounds pretty abstract. What do we actually mean? Jenny North explains how our new report tries to bring it all down to earth.

The need for a new approach

One of the approaches at the heart of our attempts to use evidence in new ways is rapid-cycle design and testing. We developed this method to address a common problem: funders and delivery organisations who want to understand whether services ‘work’ rush into long-term, expensive impact evaluations. Too often this happens before organisations are ready, and the result is a no-impact finding. As researchers, we were familiar with the experience of realising early in such trials that several factors were undermining impact, but not being able to do much about it – improvement wasn’t the aim of the game, and besides, trial designs typically depend on the service under evaluation holding ‘a steady state’.

We know the value of randomised controlled trials and the evidence they produce: when done right, for the right reasons at the right time, they make enormous contributions to foundational bodies of knowledge. But when the Dartington Service Design Lab was established in 2017 (as a new iteration of the Dartington Social Research Unit), we knew that trials weren’t where we wanted to concentrate our efforts. At the time we were talking to delivery organisations who felt the same way – either trials had already shown that they were not having the impact they hoped for, or they were developing a new service they wanted to test from the start, but not via multi-year trials.

Developing our method

We believed that shorter, lower-stakes ‘tests’ could help. What these organisations needed was information on implementation, attendance, and outcomes, as close to real-time as possible, that could help them understand what was going well or not so well, and make decisions in response to this information. Over time they would iron out problems in their service and build on strengths. Evidence could help them serve people even better and develop their service to a point where a more traditional impact evaluation was appropriate. We were able to draw on the work of other evaluators, particularly in the health field, to develop an improvement-focussed approach.

However, our interest, and that of our partners, was not just in the testing. An intentional and inclusive approach to designing what was to be tested came first. The Design step in our method is usually most intensive at the beginning of a project – we use theory of change tools but also draw on the relevant external evidence base and service design techniques. We focus on the importance of clear decision-making in Design, but also on hearing the voices of those who matter – children, young people and families, volunteers and staff.

Success factors

Our new report tells the story of how we combine Design and testing in a rapid-cycle method and the lessons we’ve learnt. We have shared the five steps of the cycle (figure below), hopefully without jargon and with insight into what is required to make each step a success. Some of the ingredients for success come down to the ingenuity and flexibility of the research team (most commonly an external partner like Dartington, though the role can sit in-house where capacity allows): we have frequently had to flex research plans, identify new measures, and adapt data collection. Genuine engagement and co-design with those using the service is also crucial. But just as important as both of these is the culture, capacity, and leadership of the team delivering the service being tested.

Fig 1: The rapid-cycle design and testing method in steps.

For the delivery team, change is baked into the process: the assumption of undertaking a rapid-cycle design and testing approach is that you will make changes to your delivery, often at an unusual pace. This can be practically challenging – and it can be emotionally difficult to let go of old ways of working. Communication and support are essential in mitigating these challenges. Staff may also not be used to having real-time data on their own delivery, as well as their colleagues’; at first, this can be difficult, and it needs to be handled sensitively. Beyond the immediate delivery team, this may well be an organisation’s first experience of an evaluation that actively looks for problems as well as successes. Making the case for this approach, maintaining commitment and energy throughout the project, and sharing the results openly are all challenges leaders must meet to make these projects work.

We believe rapid-cycle design and testing makes evidence timely, relevant, and useful for organisations. Even services with proof of impact from very robust evaluations will undergo future adaptation – and it’s imperative that communities, staff, and funders know whether those adaptations can be delivered well, and what effect they seem to have. For organisations on their way to external evaluation, this is also an invaluable step – and one which will allow them to serve children and young people better now.

The full report can be read here.