Measuring engagement

New research from Scotland shows that engagement is different from satisfaction, and needs measuring differently, as Heather Maitland reports.


For decades, marketers have been trying to understand their customers’ experiences. Mason Haire first talked about ‘selling the sizzle not the steak’ [1] in 1950. And it’s not just marketers. Retailers worry about their customers’ emotional experiences when shopping because they know that improving the ambience of the shop means people buy more. [2] Designers worry about how people feel when they use their products because they want to create things that work. [3] There’s a whole research field academics have named ‘funology’ that tries to understand what makes a computer game engaging. [4] Producers of political documentaries have borrowed techniques from researchers who test TV ads so they can measure the impact of their films on people’s political beliefs. [5] There’s a big debate going on among the geeks who design social networking sites about how to measure the depth of engagement that people have with their sites rather than just their usability. [6] Mark Ghuneim from Wiredset identifies four levels of engagement, from bookmarking at the shallowest level to creating a fan community at the deepest. [7]

Scottish arts organisations requested this topic at their cultural summit in February. For us, the facts about the art are the steak. The sizzle is how our audiences, visitors and participants experience them. We tend to measure satisfaction instead: were people bored during the workshop or appalled by the toilets? This is important. If the answer is yes, they are unlikely to come back. But we can’t stop there.

In its Quality Framework, the Scottish Arts Council talks about engagement: “We assume that every arts organisation has artistic, social and financial objectives, and is interested in how the process of public engagement is contributing to the achievement of those aims.” According to this framework, quality arts organisations demonstrate a commitment to ensuring that audiences, visitors and participants are ‘meaningfully engaged’ with the artistic work the organisation creates, produces or presents. [8]

The psychologist Csikszentmihalyi [9] coined the term ‘flow’ to describe the perfect experience. He says it only happens when what you are doing is challenging, your concentration is so complete that all your awareness is wrapped up in the activity, you are in control yet lose yourself in what is going on, and you lose track of time. This sounds like a summary of what happens when you experience a great piece of art.

So how do we measure this kind of engagement? Museums are at the forefront of understanding how their visitors engage with exhibits. They regularly measure visitor experience to evaluate the design and layout of exhibitions, and whether their educational objectives are being met. Research into how audiences engage with performing arts experiences is harder to find. Performing arts organisations seem to be more interested in trying to collect a little bit of information – a name and address – from everyone who comes in through their doors. This enables us to look at our audiences’ purchasing behaviour, and the postcode tells us a little about what they are probably like. Museums and galleries take a very different approach. They take a sample of visitors, usually fewer than 300, and find out what they are like: their behaviour, their experience and their attitudes and beliefs. [10]

Felicity Woolf’s Partnerships for Learning has dozens of ideas for evaluating people’s experiences of the arts, including creating a graffiti wall at the end of an exhibition, taking time-lapse photographs of workshops, asking children to draw how they feel after seeing a play and inviting comments via video box. [11] They sound much more interesting than a questionnaire. But Woolf advises caution. It’s too easy to end up with 11 hours of video footage that no-one will ever watch or a folder crammed full of kids’ drawings. They may be cute, but can you tell how the children felt?

We need to do more than simply collect responses. We must be able to organise and interpret the information, then put it into context so that we can draw conclusions and use them to make decisions. The starting point is our organisation’s artistic, social, financial and educational objectives. From these we can shape the questions we want to answer. These questions will tell us exactly what we need to know about our audiences’ experiences and the most effective way of finding it out.

But we must be careful that we are not restricting our understanding of visitors, audiences or participants to what we think they ought to be doing and feeling. This is particularly important in evaluating education activities. It can be entirely appropriate to measure whether participants have learned what we intended them to. But this approach treats learning as a finite process that is completed once the educational activity is over. It also tends to measure whether they have learned facts rather than whether they have grasped concepts or gained new skills.

Sometimes, we need to take a more open approach and find out what experience our participants, visitors or audiences are actually having. This kind of open-ended research is just as rigorous and reliable as the kind that focuses on evaluating whether the arts organisation’s intentions have been achieved. It is not anecdotal or subjective. The challenge it presents is to find a framework to help us make sense of the information we collect.