New research from Scotland shows that engagement is different from satisfaction, and needs measuring differently, as Heather Maitland reports.

For decades, marketers have been trying to understand their customers’ experiences. Mason Haire first talked about ‘selling the sizzle not the steak’1 in 1950. And it’s not just marketers. Retailers worry about their customers’ emotional experiences when shopping because they know that improving the ambience of the shop means people buy more.2 Designers worry about how people feel when they use their products because they want to create things that work.3 There’s a whole research field academics have named ‘funology’ that tries to understand what makes a computer game engaging.4 Producers of political documentaries have borrowed techniques from researchers who test TV ads so they can measure the impact of their films on people’s political beliefs.5 There’s a big debate going on among the geeks who design social networking sites about how to measure the depth of engagement that people have with their sites rather than just their usability.6 Mark Ghuneim from Wiredset identifies four levels of engagement, from bookmarking at the shallowest level to creating a fan community at the deepest.7

Scottish arts organisations requested this topic at their cultural summit in February. For us, the facts about the art are the steak. The sizzle is how our audiences, visitors and participants experience it. We tend to measure satisfaction instead: were people bored during the workshop or appalled by the toilets? This is important. If the answer is yes, they are unlikely to come back. But we can’t stop there.

In its Quality Framework, the Scottish Arts Council talks about engagement: “We assume that every arts organisation has artistic, social and financial objectives, and is interested in how the process of public engagement is contributing to the achievement of those aims.” According to this framework, quality arts organisations demonstrate a commitment to ensuring that audiences, visitors and participants are ‘meaningfully engaged’ with the artistic work their organisation creates, produces or presents.8

The psychologist Csikszentmihalyi9 coined the term ‘flow’ to describe the perfect experience. He says this perfect experience only happens when what you are doing is challenging; when you are concentrating so hard that all your awareness is wrapped up in what you are doing; when you are in complete control and yet lose yourself in what is going on; and when you lose track of time. This sounds like a summary of what happens when you experience a great piece of art.

So how do we measure this kind of engagement? Museums are at the forefront of understanding how their visitors engage with exhibits. They regularly measure visitor experience to evaluate the design and layout of exhibitions, and whether their educational objectives are being met. Research into how audiences engage with performing arts experiences is harder to find. Performing arts organisations seem to be more interested in trying to collect a little bit of information – a name and address – from everyone who comes in through their doors. This enables us to look at our audiences’ purchasing behaviour, and their postcodes tell us a little about what they are probably like. Museums and galleries take a very different approach. They take a sample of visitors, usually fewer than 300, and find out what they are like: their behaviour, their experience and their attitudes and beliefs.10

Felicity Woolf’s Partnerships for Learning has dozens of ideas for evaluating people’s experiences of the arts including creating a graffiti wall at the end of an exhibition, taking time lapse photographs of workshops, asking children to draw how they feel after seeing a play and inviting comments via video box.11 They sound much more interesting than a questionnaire. But the author advises caution. It’s too easy to end up with 11 hours of video footage that no-one will ever watch or a folder crammed full of kids’ drawings. They may be cute, but can you tell how the children felt?

We need to do more than simply collect responses. We must be able to organise and interpret the information, then put it into context so that we can draw conclusions and use them to make decisions. The starting point is our organisation’s artistic, social, financial and educational objectives. From these we can shape the questions we want to answer. These questions will tell us exactly what we need to know about our audiences’ experiences and the most effective way of finding it out.

But we must be careful that we are not restricting our understanding of visitors, audiences or participants to what we think they ought to be doing and feeling. This is particularly important in evaluating education activities. It can be entirely appropriate to measure whether participants have learned what we intended them to. But this approach treats learning as a finite process that is completed once the educational activity is over. It also tends to measure whether they have learned facts rather than whether they have grasped concepts or got new skills.

Sometimes, we need to take a more open approach and find out what experience our participants, visitors or audiences are actually having. This kind of open-ended research is just as rigorous and reliable as the kind that focuses on evaluating whether the arts organisation’s intentions have been achieved. It is not anecdotal or subjective. The challenge it presents is to find a framework to help us make sense of the information we collect.

Heather Maitland is a freelance consultant in marketing and audience development.
t: 01949 843161; e: hmaitland1@aol.com

1 Haire, Mason, ‘Projective Techniques in Marketing Research’, Journal of Marketing, 14, (1950), pp 649–656
2 Richins, Marsha L., ‘Measuring Emotions in the Consumption Experience’, Journal of Consumer Research, 24 (1997) and Machleit, Karen A., ‘Emotional Response and Shopping Satisfaction: moderating effects of shopper attributions’, Journal of Business Research, 54 (2001), pp 97–106
3 Battarbee, Katja, Co-experience: understanding user experiences in social interaction, (2004, Helsinki: University of Art and Design), academic thesis, p 120
4 Battarbee, Katja (2004), p 55
5 Henry, Leonard, ‘Measuring Audience Response’, Jump Cut: a review of contemporary media, 26, (1981), pp 68–69
6 Campbell, Anita, ‘Start Measuring Audience Engagement’, (2007), consulted 11/2/08 at www.sellingtosmallbusinesses.com/start-measuring-audience-engagement/ and Bridger, Steve, ‘Engagement is Not Made to Measure’, consulted 11/2/08 at http://www.nfp2.co.uk/2007/01/29/engagement-is-not-made-to-measure/
7 Ghuneim, Mark, et al, ‘Terms of Engagement: measuring the active consumer’, consulted 10/3/08 at http://www.wiredset.com/root/archives/008589.html
8 Scottish Arts Council, Quality Framework: guidelines for arts organisations, (2007), p 7
9 Csikszentmihalyi, Mihaly, Flow: The Psychology of Optimal Experience, (1991, New York: Harper)
10 Allard, Michael, ‘The Evaluation of Museum-School programmes: the case of historic sites’, Museums, Media, Message, ed. Eilean Hooper-Greenhill (Abingdon: Routledge, 1995), p 235 and Economou, Maria, ‘Evaluation Strategies in the Cultural Sector’, Museum and Society, 2 (2004), pp 30–46
11 Woolf, Felicity, Partnerships for Learning, (1999, the Regional Arts Boards and the Arts Council of England)