Evidence and Impact: Policy context

Derren Hayes
Tuesday, November 26, 2019

Despite the political parties pledging billions of pounds of additional investment in public services during the election campaign – and assurances from ministers in the previous government that the era of austerity was over – the need for projects and organisations working with vulnerable children, young people and families to demonstrate that their interventions are an efficient and effective use of resources has never been greater.

The Youth Endowment Fund tackles youth crime by supporting projects that work with vulnerable children. Picture: Africa Studio/Adobe Stock

This is the case for local authorities, other public bodies, voluntary organisations and independent providers. Providing evidence that what is done has a beneficial impact on outcomes for children and young people may be crucial for the development of a service or continuation of an organisation.

Having a strong evidence base to demonstrate impact is becoming increasingly important for children’s services organisations when applying for funding and showing they are performing to the high standards expected by funders, policy makers and inspectorates. Theories of change to underpin an approach, outcomes frameworks to capture and interrogate information about interactions with service recipients, and toolkits and resources to embed the principles of an approach more widely are now commonplace across the sector.

Here, experts from the early years, education, youth work, social care and public health sectors explain how the government, councils and other public bodies are using evidence-based approaches to shape policy and funding decisions, and the impact this is having on how statutory and voluntary providers and practitioners adapt how they work to better meet the needs of vulnerable children and families.

The role of evidence in education policy

By Tom McBride, director of evidence, Early Intervention Foundation

In 2013, the National Audit Office (NAO) published a report on government evaluation that was roundly critical. Among other things, it revealed that there were no plans to evaluate at least £49bn of major project expenditure, meaning that the impact of all this public investment would remain unknown and unknowable. If the NAO were to repeat its assessment today, I don’t know what that figure would be, but my unevidenced hunch is that it would be lower now than it was just six years ago.

From where I sit, it seems that the significance of evidence in policy making has increased and it is becoming less common for government to roll out new policies or initiatives without putting in place plans to evaluate their impact.

When the history of the parliament that has just ended is written, I doubt that much will be made of the quantity or ambition of social policy making. The lack of a meaningful majority and a certain B-word has limited what can be achieved. But it has seen what, in my view, is a welcome continuation of a trend towards research and evaluation playing a far bigger role in the way social policy is made in this country. With all the noise elsewhere, it’s possible that this gradual but important shift may have been missed by many.

This shift has not occurred by accident: indeed, there have been many initiatives to increase the role of evaluation in social policy making. Perhaps the most famous of these was the establishment of the Nudge Unit within the Cabinet Office, a move to bring insights from behavioural science and an increased use of randomised controlled trials (RCTs) into government processes. This unit lives on today as the independent Behavioural Insights Team.

What Works Centres

Meanwhile, the What Works Network – a collection of independent organisations charged with conducting research into social policy issues, of which EIF is part – continues to grow in number and influence. One of the most prominent What Works Centres is the Education Endowment Foundation (EEF), which, since 2011, has been responsible for more than 150 RCTs in schools, and for promoting the use of evidence among teachers and school leaders via its online toolkit. New to the club is the Youth Endowment Fund (YEF) – a partnership between EIF, Impetus and the Social Investment Business – a What Works Centre dedicated to tackling youth crime and antisocial behaviour through the rigorous evaluation of interventions that work upstream of youth crime with vulnerable children and young people. Between them, EEF and YEF have more than £300m of government funding, representing a serious commitment to building the evidence base in crucial areas of social policy.

Government departments

Away from the What Works Centres, the government appears to be increasing its commitment to rigorous evaluation. The Department for Education is directly funding RCTs in areas such as mental health prevention and early years professional development. The Ministry of Housing, Communities and Local Government has invested heavily in an ambitious data-linking project to allow it to robustly evaluate the impact of the Troubled Families Programme across a range of key outcomes. The Department for Work and Pensions is also committed to evaluating its national Reducing Parental Conflict programme.

There remain many areas of government policy that would benefit from greater engagement with research and evidence. As we set out in our 2018 report, Realising the Potential of Early Intervention, there are significant gaps in the children and families evidence base, including what works to support parents with substance misuse problems or those experiencing damaging levels of parental conflict. Filling these gaps urgently requires significant and strategic funding of research and evaluation. However, we should remember that big steps have been taken over the past decade and serious resources have been committed to building the evidence base – something to be celebrated in these turbulent times.

Youth groups improve evidence gathering

By Kelly Bradshaw-Walsh, director of research, design and insight, The Centre for Youth Impact

Effective impact and evaluation practice in the youth sector should ultimately lead to positive changes for young people. That’s why we do what we do. It’s why we, at the Centre for Youth Impact, are working with youth organisations to develop and test approaches that provide actionable insights to improve practice. While beliefs about, and approaches to, evaluation still vary across the sector, we are seeing small but significant shifts towards greater confidence and comfort in using data, and an increased focus on learning and reflection. We’re seeing this in five key areas:

  1. Moving beyond a focus on outcomes
    Youth organisations are starting to see the benefits of looking beyond outcomes and embracing more holistic approaches to evaluation that help to better understand “process”: what happens, why and how. Important learning can be gained from data on how, and which, young people engage with provision; and young people’s experiences, such as how safe or respected they feel when taking part. Objective measures of the quality of practice, and the setting in which youth provision takes place, can form the basis for a continuous cycle of reflection and improvement – a core part of the approach being tested in the Youth Programme Quality Intervention (YPQI) pilot.
  2. New ways of working with data
    Collecting different types of data, and using powerful analytical techniques previously unused in the sector, allows new insights into the relationships between what youth organisations do, who they work with and what happens as a result (see research section). This enables us to start answering questions such as: do young people who experience high-quality settings experience better outcomes? And are we supporting all young people equitably? (A minimal sketch of this kind of analysis follows this list.)
  3. Shared evaluation
    It’s easy to focus on evaluation as providing powerful learning for individual organisations, but when data is available at a sector level, it opens up new opportunities for understanding collective quality and impact. Take-up of common frameworks across the sector has been slow in the past, though the Youth Investment Fund, a £40m investment by the Department for Digital, Culture, Media and Sport and the National Lottery Community Fund, has provided an opportunity to develop a shared theory of change and evaluation approach across 90 open access youth organisations. Alongside this, the centre has developed a new outcomes framework for young people in consultation with partners in the sector.
  4. Young people at the centre of evaluation
    Young people have always been at the centre of practice, but evaluation is often seen as being for external purposes. Putting young people at the centre of evaluation means approaches that simultaneously build greater insight and strengthen their experience of youth provision. It calls for youth organisations to actively listen to young people to inform everything they do, including how, when and where they work, and to focus this listening on key elements of their theory of change. The Listening Fund is currently supporting organisations to develop their capacity to listen to young people and act on what they hear.
  5. Aligning evaluation with practice
    For many organisations with which we work, evaluation is no longer an add-on, conducted by one member of staff in order to “prove” their worth to funders. It has become an embedded, whole-organisation approach, focused on learning to improve the support they provide to young people.
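
As a minimal illustration of the kind of analysis described in point 2, the sketch below asks whether young people in higher-quality settings show better outcomes, controlling for where they started. The data, column names and model are hypothetical, invented for illustration rather than drawn from the centre’s actual work, which would use larger samples and account for how young people cluster within providers.

```python
# Hypothetical sketch: does setting quality relate to outcomes?
# All data and column names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

# One row per young person: an observer-rated quality score for the
# setting they attended, plus before/after outcome measures.
df = pd.DataFrame({
    "quality_score":  [3.1, 4.2, 2.8, 4.5, 3.9, 2.5, 4.8, 3.3],
    "outcome_before": [10, 12, 9, 11, 13, 8, 12, 10],
    "outcome_after":  [12, 16, 10, 17, 16, 9, 18, 12],
})

# Regress the post-programme outcome on setting quality, controlling for
# the baseline score so we compare change rather than starting position.
model = smf.ols("outcome_after ~ quality_score + outcome_before", data=df).fit()
print(model.summary())

# A positive coefficient on quality_score would be consistent with
# higher-quality settings being associated with better outcomes.
# This is association, not proof of causation.
```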

While these developments in evaluation practice are exciting, they are not yet the norm. They mark a move towards a better understanding of need, engagement, quality and impact that will lead to more informed funding and policy decisions related to young people. However, seeing the benefits of these changes in the longer term will require sustained energy and commitment at a systemic level.

Evidence and impact in social care

By Michael Sanders, executive director, What Works for Children’s Social Care

Changes are happening in the way that governments make some important decisions that affect young people. In 2011, the Department for Education established the Education Endowment Foundation (EEF) with a £125m endowment to narrow the attainment gap between young people from disadvantaged families and their peers, through the use of evidence. This experiment has fared better than anyone expected – the EEF has become the world leader in producing impactful education research, and we’ve begun to see its recommendations steering behaviour in classrooms and in Whitehall, as well as inspiring a generation of What Works Centres along the same lines.

What Works for Children’s Social Care is one such centre, aiming to bring the same approach to the children’s services side of the DfE’s work, but we are really continuing a journey that has already begun and is gathering pace.

The DfE’s Children’s Social Care Innovation Programme spent £200m across a wide range of projects to try to encourage new and innovative practice in the social care sector. Each project was accompanied by an evaluation, conducted by an independent group of researchers. Although most of these evaluations did not look at the impact of the new ideas, they helped us to learn a lot about what was promising practice, and what the barriers were to successful implementation.

Scaling up innovation programme projects

Leading on from this, we’ve partnered with the DfE on two projects that aim to take the next step with some of the most promising ideas. The Strengthening Families, Protecting Children programme is scaling up three whole system changes – Leeds Family Valued, Hertfordshire Family Safeguarding and North Yorkshire’s No Wrong Door. These projects, which will be rolled out across 18 local authorities in the next few years, will be evaluated using the most robust possible methods, so we can learn more about their impact in these new places. The second programme, Supporting Families, Investing in Practice, is seeing the scale-up of three programmes – Family Group Conferences, Family Drug and Alcohol Courts and the Mockingbird Family Model – to more than 50 local authorities, supporting thousands of families, again while rigorous evidence of impact is being produced.

We’re also aiming to build the pipeline of evidence around grassroots practice through our PINE (Practice in Need of Evidence) programme. Here, we’re working with professionals to help them build evidence around their own activities, and their own new ideas, hoping to foster some of the next generation of inspiring practice that gets taken up across the system.

These ideas are at an early stage, but everything must start somewhere. We are confident that the answer to “what works” in children’s social care will come from within practice, and we hope that PINE will allow those who know practice best to shape its future.

With a topic as important as the future of our most vulnerable children and young people, it is vital that we are making the best policy possible, and investing in programmes and interventions that are shown to work.

EXPERT VIEW
Why children’s charities need an improvement agency

By Dan Corry, chief executive, New Philanthropy Capital

The charity sector has a lot of regulators. It has the Charity Commission. It has the Fundraising Regulator. Specific kinds of charities might be regulated by other sector bodies such as Ofsted or the National Institute for Health and Care Excellence. But regulation only gets you so far. Does the sector have a body that pushes it to be better and more effective, and so to help more people, families, children and communities?

Some people think this is one of the roles of the Charity Commission, and perhaps in the past this was the case. But in recent years, with budgets tight, the commission has focused more on enforcement than improvement.

Others might argue that it is the role of sector bodies like the National Council for Voluntary Organisations and the Association of Chief Executives of Voluntary Organisations to drive improvement in charities. They do good work in this area already, but as membership organisations they will never push too hard on impact and improvement, lest they offend their members.

For these reasons, we think there needs to be an independent civil society improvement agency, one that would hold a mirror up to the sector and share and promote best practice to improve its effectiveness. What might this look like in practice, and how could it help children’s charities?

First, I want to be clear I’m not advocating some kind of top-down target or inspection-driven approach. We know this has produced mixed results in the public sector and, in any case, is totally inappropriate in the voluntary sector. But I do believe there is a half-way house, where the sector takes more collective responsibility and challenges itself to improve, to learn and to innovate to be the best that it can be.

This needs to go beyond rallying cries for better behaviours, important though these are for inspiring action and creating space for it. We need to create an institution that has some bite and sharpness, while respecting the capacity of different charities to be involved.

At its most light touch, it could share good practice and undertake analysis of data collected by the Charity Commission and others to identify trends and issues both at national and subnational levels. It could use this data to, for instance, identify where geographically the children’s charity sector is operating, where it is strong and where it is weak. It could convene peer networks among children’s charities focused on improvement and create opportunities for the sector to learn from charities working on other causes as well.
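
By way of illustration only, the sketch below shows the sort of simple aggregation such an agency could run. The Charity Commission does publish downloadable register data, but the rows and field names here are invented for the example.

```python
# Hypothetical sketch: where is the children's charity sector operating,
# and where is it thin? All rows and field names are invented.
import pandas as pd

# Imagined extract: one row per registered children's charity.
register = pd.DataFrame({
    "charity_number": [1001, 1002, 1003, 1004, 1005, 1006],
    "region": ["North West", "London", "London",
               "North East", "London", "North West"],
    "latest_income": [120_000, 2_400_000, 85_000,
                      60_000, 950_000, 300_000],
})

# Count charities and sum income per region: a crude map of sector
# strength and weakness at a subnational level.
by_region = register.groupby("region").agg(
    charities=("charity_number", "count"),
    total_income=("latest_income", "sum"),
)
print(by_region.sort_values("charities", ascending=False))
```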

Crucially, an agency would organise and oversee peer reviews of charities. This has been found to be a powerful, inclusive, bottom-up approach to improvement in other sectors, but it is largely absent in the social sector, where it is most needed given the lack of available benchmarking and other comparative data, and the heterogeneity of the sector. Peer reviewers for children’s charities would mostly be drawn from the same sector.

Doing as much social good as possible should be the motto of every charity. An agency would enable that to happen more often than it does at present.

EXPERT VIEW
What is the future role of evidence in shaping services?

By Tim Hobbs, chief executive, Dartington Service Design Lab

We’ve seen many challenges in the adoption of the evidence-based practices identified by the What Works Centres: what has worked in one place is far from guaranteed to work in a different place, time or context, and people don’t always like having models of practice imposed on them.

We’ve also seen a large rise in human-centred design approaches, “co-production” or “co-design”. These approaches, which place people at the heart of a service design process, can help address some of the “contextual fit” and engagement challenges that some evidence-based practice faces. But what about the evidence here? There is a risk that these approaches do not build on existing bodies of knowledge and evidence and, as a consequence, don’t have the impact they could.

So what is it to be then: evidence of impact or human-centred design?

It is, of course, not an either/or question. The future of commissioning and evidence-based practice is one that straddles both these worlds. It is a design and commissioning process in which those people using, delivering and commissioning services work together and engage with the latest research and evidence. This way, we can have greater confidence that people will not only engage with services, but that the services will also have a positive impact.

As such, here are a few important trends that we see developing:

  • Smarter use of data to better understand and focus services on needs. Our Three Circles report, released earlier this year, demonstrated that too often there is a fundamental mismatch between needs and services. Smarter use of new and existing data can sharpen the segmentation of needs, informing service design and co-ordinated commissioning efforts (a brief sketch of this idea follows this list).
  • Identifying “common elements” of practice, rather than specific interventions. Rather than spending significant amounts of money on evaluating specific, “named” interventions and services – which often struggle to be replicated – we should instead be targeting evaluation efforts on identifying mechanisms, or what implementation scientists refer to as “common elements”.
  • Improving, not proving. If commissioners and service providers could be sufficiently confident about the key effective mechanisms of practice, attention could shift from trying to evaluate and prove impact, to instead optimising or improving practice related to local needs, contexts and systems. Careful, planned and adaptive implementation is key. We expect to see a rise in more nimble, adaptive methods of learning – such as the rapid cycle design and testing that we’ve used and written about extensively on our website (see practice example).
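
To make the first of these trends concrete, here is a minimal sketch of needs segmentation using clustering. The indicators, values and choice of three segments are assumptions made for illustration; they are not Dartington’s method.

```python
# Hypothetical sketch: clustering young people on need indicators so that
# services can be designed around distinct groups rather than an average.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Invented need indicators, e.g. from assessment or survey data.
needs = pd.DataFrame({
    "emotional_need":   [2, 8, 7, 1, 9, 3, 8, 2],
    "behavioural_need": [1, 7, 8, 2, 6, 2, 9, 1],
    "family_conflict":  [3, 9, 2, 1, 8, 2, 3, 2],
})

# Standardise so no single indicator dominates the distance metric.
scaled = StandardScaler().fit_transform(needs)

# Three segments is an assumption; in practice the number would be chosen
# using fit statistics and whether the segments make sense to practitioners.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaled)
needs["segment"] = kmeans.labels_

# Average need profile per segment: the starting point for matching
# services to distinct groups of need.
print(needs.groupby("segment").mean())
```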

At Dartington Service Design Lab, we’re increasingly working in these ways, bridging the divide between research, policy and practice. We’re not alone – indeed some What Works Centres are on the same path – and we need more people from the evidence-based and human-centred sides to embrace each other’s perspectives, and bring both to bear on issues affecting children and young people.

FURTHER READING

  • Youth Investment Fund learning and insight paper, Matthew Hill, Karen Scanlon and Ed Anderton, NPC, April 2019
  • Ten steps for evaluation success, Kirsten Asmussen, Lucy Brims and Tom McBride, Early Intervention Foundation, March 2019
  • Evaluation of Program Quality and Social and Emotional Learning in American Youth Circus Organization Social Circus Programs, C Smith, L Roy, S Peck and C Macleod, David P Weikart Center for Youth Program Quality at the Forum for Youth Investment, 2018
  • Four challenges of measuring the charity sector’s impact, John Davies, National Council for Voluntary Organisations, October 2018
  • Quality-outcomes study for Seattle Public Schools summer programs, 2016 program cycle, C Smith, L Roy, S Peck, C Macleod and K Helegda, David P Weikart Center for Youth Program Quality at the Forum for Youth Investment, 2017
  • User voice: Putting people at the heart of impact practice, NPC, November 2016
  • Impact measurement in the NEET sector, New Philanthropy Capital, September 2012
  • Catalyst framework of outcomes for young people, Young Foundation, October 2012
