
As the Centre for Cultural Value launches a free online evaluation course, Dawn Cameron explains how evaluation can help organisations understand the real value of what they do, not just what others want to hear.

Theatre company Slung Low's Flood Part 4. They have openly shared how they navigated conflicts in leadership. Photo: Slung Low, Malcolm Johnson

When I was studying for my master’s in Social and Public Policy at Leeds, I learned some basic principles of research strategy and design that I still use in my practice as a consultant. Things like: What questions are you seeking to answer? Can they be answered through this particular intervention? What is knowable?

Fast forward several years when, as coordinator of a leadership programme, I had my first experience of being evaluated. The evaluator was a brilliant woman who could not have been more inclusive, collaborative and accommodating. But the first time I read her evaluation all I saw were criticisms. Worse, I received those criticisms as personal slights.

It’s not personal

I had to have a word with myself. At the time, I thought that if the project was flawless, I would know it was successful. I didn’t consider that projects are made up of many parts, some of which are human beings.

I had to separate my ego from the practice she observed. Once I did, I found she saw things I didn’t and her viewpoint – being removed – lent a particular weight to her observations.

It's easy to agree we should talk about failures and learn from them. But it’s easier said than done, especially when it feels personal. As this case study from theatre company Slung Low makes clear, failure is easier to manage when there’s a sense of collective responsibility.

Evaluation can help us shift to a more collective mindset – that we’re all pulling in the same direction towards a common goal. There are bound to be challenges and mistakes along the way.

Facing up to criticism

I often reflect on a project we evaluated some years ago. On the whole, it was fairly positive but it did uncover some serious issues. The evaluation report suggested ways they could be addressed.

We met to discuss it, tears were shed and that was that. I don’t know who the report was shared with, but I’d guess circulation was very limited.

A few years later, the organisation found itself in dire straits. The issues raised in the report had not been addressed; they festered and then worsened. People were held responsible and the organisation effectively collapsed.

I can’t know what impact sharing the outcomes would have had, but I do know that not sharing them and not addressing serious systemic failures put the organisation at risk.

Evaluation in the open

When it comes to showing and sharing the outcomes of evaluations, my preference is for them to be published as widely as possible, confidentiality and privacy considerations notwithstanding.

This doesn’t happen as often as it should. But it’s also important to acknowledge there are times when being authentic and open can be extremely difficult in the context of diminishing arts funding (particularly revenue funding).

Organisations, especially smaller ones, are increasingly expected to deliver an array of public policy imperatives, while often being dependent on a short-term, freelance workforce. This can lead to increased pressure to develop projects that deliver to funders’ priorities rather than to organisations’ strengths and individuals’ needs and interests.

How do we sustain and integrate learning when organisational infrastructure can feel very fragile? All the more reason for evaluation to be central to our work and learning.

It’s encouraging that funders are calling for more open practices of evaluation. For example, Holly Donagh from the Paul Hamlyn Foundation addresses this in a recent episode of the Centre’s Reflecting Value podcast.

Of course, this doesn’t have to mean that we’re open about everything to everyone, but neither should it mean that evaluation can’t be a fruitful and rich learning journey for everyone involved.

A process, not an endpoint

I really enjoy being able to point organisations in the direction of excellent practice I’ve seen and heard about; we have much to learn from each other.

To find out more, the Centre has developed a free-to-access online course: Evaluation for arts, culture and heritage: principles and practice. It builds on the Centre’s co-created Evaluation Principles and aims to help organisations build evaluation skills and confidence, with support from some of the experts in the field.

The course will be available from Monday 18 September, and you can register now.

Dawn Cameron is a freelance researcher, project manager and evaluator. (With thanks to Emma McDowell from the Centre for help to inform this article.)

www.culturalvalue.org.uk | www.armstrongcameron.com
@valuingculture

This article, sponsored and contributed by the Centre for Cultural Value, is part of a series supporting an evidence-based approach to examining the impacts of arts, culture and heritage on people and society.


Comments

Really interesting to read this article, thank you...and it prompted a couple of questions! First, wondering whether it’s ‘the other way round’ in terms of sticking to privacy and confidentiality, notwithstanding the importance of sharing learning as widely as possible? Second, wondering whether the role of the evaluator is gathering multiple experiences and perspectives alongside observation and other data collection, with a focus on the project/programme aims etc, so there’s a whole picture to analyse, in terms of capturing impacts and learning?