
Meg Attwood explains how enlisting evaluation consultants helped arts-based Community Interest Company Light Box gain credibility and stay on track

You see it day to day so you know that your arts project has a beneficial impact on health and wellbeing. But, with limited resources, how do you prove it in a way that is credible to funders and partner organisations? How can you use evidence to inform the shape of your future work? And how do you evaluate in a way that is creative and reflects the nature of your organisation?

These were some of the questions facing artists Lucy Duggan and Lucy Barfoot when, in 2009, they founded the Community Interest Company Light Box with the aim of sharing their positive mental health message through arts-based, public-access ‘Happiness Workshops’.

Following a pilot programme, Light Box asked arts in health consultants Willis Newson to help them evaluate the next, more ambitious phase of their project in 2011. As Willis Newson’s Research and Evaluation Lead, I was impressed by their enthusiasm for evaluation and their commitment to a robust approach.

Working in partnership with Professor Norma Daykin from the University of the West of England, Willis Newson delivered a guided approach to evaluation – training, mentoring and consultancy – that developed the team’s knowledge and skills and embedded an evaluation framework within their everyday practice.

An initial meeting with the Light Box team (now a team of three, having brought art history teacher Kathryn John on board early in 2011) and their key stakeholders sought to understand the priorities for the evaluation. Lucy Duggan voiced their concerns: “I didn’t really understand about evidence... I had ideas about [big] outcomes, like reducing reoffending.” However, it became clear that not only could they not demonstrate these “big” outcomes, but they were not expected to do so. The stakeholders were interested in softer outcomes, such as improvements in wellbeing and perceptions about stigma – outcomes that Light Box could reliably measure.

With our support Light Box brought creativity to the fore, designing evaluation methods that were robust, but echoed the playfulness of their workshops and their focus on ‘Positive Psychology’. This creativity encompassed a variety of things, from bouncy balls, luggage labels and glass jars to a colourful and beautifully illustrated feedback form. Even the validated scale (Warwick-Edinburgh Mental Well-being Scale) used to measure the change in participants’ wellbeing had a positive focus. This wasn’t form over function – each of the methods enhanced the quality and credibility of the evaluation.

And the results? Not many feedback forms get a 99% response rate. Light Box’s did. The data demonstrates the positive impact of the project on participants and makes a strong case for continued funding. Results are communicated in a robust and credible evaluation report, an important advocacy tool. The report also highlights important learning that has already been implemented in the next phase of the project – for example, the need for targeted promotional activities to engage black and minority ethnic communities.

There have been other benefits, such as improved credibility. Light Box now has information which can motivate volunteers by demonstrating the value of their input. And another legacy – a detailed evaluation toolkit – means they can confidently do it all again next time, without our help.

Even for projects with small teams and limited resources, a robust evaluation is possible. A guided approach offers a sustainable evaluation solution and one that can be tailored to fit your project and the priorities of your stakeholders.
