
Looking to measure the impact of their work in hospitals, Air Arts found that pre-existing evaluation methods didn’t work. Laura Waters describes their bespoke approach.

Photo: An occupational therapist supporting a music workshop on the stroke unit

Air Arts is the arts charity for Derby Teaching Hospitals. It was established as a visual arts programme in 2007, alongside the construction of the Royal Derby Hospital, to help create a welcoming environment and improve wayfinding in the new building.

The programme has since developed across all artforms and delivery styles, with a strong focus on participation by patients and staff. Our initial participation work was well received, but it felt more like a nice add-on than an intrinsic, valued part of the healthcare offer. To make our work really effective, sustainable and embedded, we needed to develop it in partnership with patients and staff, responding to their needs and fitting in with their routines.

We designed ‘Engage’, a three-year programme funded by Arts Council England, as an intensive programme of development to research and test theories of participation and to develop models of engagement with staff and patients to aid wellbeing.

Lead artists met with nursing staff to find departments willing and able to support the project over a period of several months. They then spent time getting to know each department to ensure their project would fit seamlessly into the ward, and to identify the main issues, which ranged from anxiety and boredom to the need to encourage patients to use the dayroom and spark conversation. The Engage models were then designed in partnership with patients and staff to tackle these issues, and the weekly sessions took place over several months at the bedside, in dayrooms and in outpatient clinics, as one-to-one or group sessions.

We were keen to find evidence of the impact on wellbeing of the staff and patients, and to record and analyse the changes that were taking place. Every session was monitored for numbers, age, gender and ethnicity, and qualitative feedback was gathered through observation, interview and comment cards.

This gave us information on how the work was being received, who was taking part and how well it was embedded into the service, but we still wanted some more formal quantitative measures of impact on wellbeing.

Use of questionnaires

We considered a variety of formal wellbeing and mood scales, but concerns were raised that these were inappropriate for the hospital environment for the following reasons:

  • The questions were too lengthy, detailed and in many cases inappropriate for patients in hospital. One in four of our patients suffer from dementia, so detailed questionnaires are simply not an option.
  • Some questionnaires would take too long to complete when interactions with patients were very short due to severity of illness or clinical interventions.
  • No single set of questions seemed to fit all patient groups.
  • Finally, and most importantly, it was felt that using a questionnaire during a creative interaction would fundamentally alter the nature of that interaction by focusing on the end result rather than the process, potentially reducing or even negating the impact of the creative activity.

There did not appear to be a pre-existing method of data collection that could be universally applied, so we worked with the patient experience team and came up with two simple questions: “How did this activity make a difference to your day?” and “Would you recommend creative activity like this to friends and family?”

This worked well as an interview or a written response, and it could easily be included as a natural conclusion to the activity without the artist needing to switch into evaluation mode. We recorded a 100% recommendation rate from patients, with staff and patients reporting a positive impact on their mood and hospital experience. Staff told us that the project gave them the incentive, and even permission, to use artistic activities as a way of caring for patients, enabling them to care more holistically.

Isolating the effects

This is all great, but it doesn’t necessarily translate into a significant impact on levels of wellbeing, recovery rates or staff morale. So we looked at patient experience statistics, which showed that during the Engage residencies, overall NHS Friends and Family Test scores for the wards had gone up and staff absence figures had gone down.

We were cautious about drawing any far-reaching conclusions, as it is difficult to isolate the effects of our project from the many other factors influencing these scores, particularly as the purpose of this work was to embed it into the very system from which we would need to isolate it in order to measure it.

With a more formal study, it may be possible to demonstrate real cost benefits through a decrease in staff absence or a shortened average length of stay as a result of creative interventions. But even if we proved this for one project, would it be transferable to the rest of our work, and would it be useful? While I am keen to provide evidence of our work in a more formal way, I am mindful that this should not distract us from our real purpose, which is to enhance the hospital environment and improve patient and staff experience.

A middle path

In the main, our funders are already convinced of the value of the work through our qualitative evidence, and cynics will remain cynical even in the face of more solid evidence. For us to spend time on in-depth research and gathering hard evidence, we would have to divert time and resources away from precious delivery, and it may not even yield a very meaningful result.

An added difficulty for an acute hospital is that a patient’s average stay is only three to four days, so our hospitals do not lend themselves to in-depth study or gathering longitudinal evidence.

A middle path is to ensure our qualitative evidence is well presented, detailed, well analysed and well communicated, leaving us time and resources to get on with the job of delivery.

In spite of the practical difficulties in evaluation and research, arts in health research in general is a rapidly growing field, with many excellent studies now being published. We can make the most of this by combining our qualitative evidence with references to relevant studies to demonstrate the overall value and impact of arts in health, and to give our projects a more academic and rigorous underpinning.

Inspiration to continue

I will finish by sharing a patient quote from one of our residencies, which encapsulates the overall response from Engage, and inspires me to continue the work: “I wouldn’t mind coming to hospital if every day was like this.”

Laura Waters is Arts Programme Manager at Derby Teaching Hospitals NHS Foundation Trust.
www.airarts.net
Tw: @air_arts


Comments

Thanks for sharing Laura. Sounds like you came up with a solution that was fit for purpose in quite a challenging environment. As a researcher and evaluator often working in diverse and challenging environments, I am very much of the belief that the comfort and agency of the participant/artist comes first and you have to build the most meaningful and rigorous evaluation framework from there. Working in hospitals and dementia care environments especially so.

What I find most challenging, however, is the ongoing discourse that qualitative research and data is somehow less valid, or the poor cousin of quantitative approaches. Having worked in a public health and medical research environment which completely understood the value and rigour of methodologically robust qualitative approaches, I don't know why the arts sector continues to see quantitative research as somehow more 'solid', 'harder', 'rigorous' etc. Quantitative approaches can be totally rubbish, whereas well-thought-through qualitative approaches can be amazing and provide all the 'evidence' anyone might need.

I find the idea of what counts as 'evidence' in a lot of arts evaluation quite lazy, regressive and patriarchal. The discourse, of course, is continued by trustees, chairs, funders etc. who know nothing about social science research asking for more and more of the 'hard stuff'.