
Our Impact


At Barrow Cadbury Trust we believe it is more important to do the things which most need to be done, rather than the things which are easy to measure.

Evaluating our impact, and that of the work we support, is of course important; we are custodians of charitable money and should be accountable for using it wisely. We don't take that responsibility lightly. But we also don't shy away from tackling difficult and enduring problems: problems which are structural, systemic and complex. This kind of work requires risk-taking, patience and usually a long time frame.


Working Collectively

Our approach to evaluating impact is to view it cumulatively and often collectively. We ask ourselves: what have we, and the ecologies of partners we work with on these complex issues, achieved?

None of us works on persistent social injustices in a vacuum. When challenging the status quo we do not progress evenly; sometimes the external environment gets worse despite our best efforts. This does not invalidate the work; if anything, the opposite is true.

Our model of working is to build (and join) alliances for specific change; we are not a generalist 'broad reach' foundation. Each individual grant is therefore part of a jigsaw of activities, and the impact achieved by each contributes to the wider collective impact. This is very different from funding a thousand unrelated pieces of work and wanting to know the impact of each.



Our Approach

The approaches and methods we use are varied and largely bespoke. They range from full randomised controlled trials and formative academic evaluations to very light-touch narrative outcomes reporting.

We try to make life straightforward for our partners. We know how time-consuming and frustrating it can be to put together reports which feel more like 'serving the funder' than serving the work. Many of us have been in that position ourselves!

Evaluation is discussed during the negotiation of the funding relationship, and a broad-brush framework is put in place. We are interested in what did not work as well as what did, and we see impact reporting as a learning exchange, not an exam. As most of our partnerships are long-term, 'reporting' is an iterative conversation rather than a stand-alone document.