Arts Council England is forging ahead with plans to impose a standardised system for measuring artistic quality on its NPOs, despite a lukewarm sector response and warnings that this will require a “quantum change” in organisational attitudes to data. 

Photo of a Hull Truck performance. ACE will be using standardised measures to assess quality at larger NPOs.
Photo: Hannah Nicklin (CC BY-NC-SA 2.0)

A national system for evaluating the quality of artistic work is to become compulsory for many of Arts Council England’s (ACE) National Portfolio Organisations (NPOs), despite an evaluation of a pilot scheme revealing widespread concerns about the approach.

An independent evaluation of the scheme has found arts organisations to be sceptical about the value of a standardised approach to measuring quality, and to have serious concerns about the administrative burden associated with it.

ACE is pressing ahead with a roll-out of the scheme despite the feedback. Organisations across all artforms will be “required to use a specified system to complete an agreed number of evaluations each year and support each other by providing peer reviews”. This will be mandatory for NPOs receiving more than £250k a year in regular funding, while those funded below this level will be encouraged and supported to use it.

Quality Metrics

The controversial Quality Metrics pilot, led by the agency Counting What Counts Ltd, has attempted to develop a “meaningful measure” of artistic quality that yields consistent and comparable findings across different artforms and types of organisation.

The pilot programme gave 150 NPOs the chance to gather opinions on the perceived quality of their artistic work, based on a set of standardised dimensions, referred to as ‘Quality Metrics’, which were originally drawn up in a sector-led exercise by a Manchester-based consortium of arts organisations.

Participants in the pilot ‘triangulated’ their research by combining survey data collected from the public and arts sector peers with their own self-evaluations. They used a new digital ‘platform’, Culture Counts, to manage the process and handle the data.

Evaluation findings

The independent evaluation of the pilot programme was conducted by Canadian consultants Nordicity in partnership with a UK-based consultant whose name Arts Council England has refused to release.

Their evaluation report reveals that arts organisations do not view all the metrics as appropriate measures of quality. It concludes that, rather than adopting the standardised measures, they want a “bespoke, tailored approach” that aligns with their individual artistic objectives and allows them to “select metrics that measure what matters”.

The evaluation is at odds with earlier research by John Knell, Director of Counting What Counts Ltd, and former ACE Director of Research Catherine Bunting, who claimed that their study had “proved that the cultural sector is capable of generating a clear consensus on outcomes and standardised metric dimensions to capture the quality of their work”.

Concerns

Pilot participants were found to be “broadly positive” about the idea of using the Culture Counts platform when cost was not a consideration. But the majority of organisations said they would be unwilling to pay to use it or would only pay up to £100 per year for it – a fraction of the £2,000 a year fee that Counting What Counts Ltd is proposing to charge.

Organisations involved in the trial reported a catalogue of concerns, the most significant of which was the administrative burden of implementing the system. Recruiting peers to evaluate their work proved to be the biggest challenge, with many organisations giving up on this entirely. Only 921 of the 2,250 peer evaluations that should have happened actually took place.

Among the 11 factors listed as likely to deter the sector from adopting the Quality Metrics system were:

  • the current survey design functionality and the accessibility of the Culture Counts software system
  • the lack of collaboration between receiving and presenting venues and the administrative burden placed on touring organisations
  • a lack of confidence in the reliability and validity of the data
  • confidentiality concerns over data ownership
  • limited capacity for delivering the programme, especially for non-ticketed venues and those with combined arts programmes featuring many ‘one-off’ events.

Some participants were also concerned about ‘survey fatigue’ as audiences are faced with a growing number of questionnaires.

Slow process

The evaluation pointed to a widespread lack of organisational expertise among NPOs in interpreting and using data gathered from the surveys. It concludes that “a quantum change” in organisational attitudes towards using data and assessing quality would be required for the programme to be successful.

The consultants said: “Organisations believe that suitably assessing quality across the portfolio is not going to be a quick win, and certainly not resolved through one national test phase.”

Their view is contrary to that of the Culture Counts team behind the trial. Announcing their own report, which includes details of the data output from the Quality Metrics trial, they said: “The project has resoundingly confirmed that funded arts and cultural organisations, if offered the right tools and support, can self-drive large-scale evaluation activity within a very short time frame. They have also displayed a willingness for new ways of engagement with peers and audiences about the quality of their work.”

Ploughing on

ACE has announced: “The trial has given us confidence that there is sufficient support within the sector to examine the quality of work in an exciting, contemporary way, enabling us to now support a wider roll out of the quality metrics.” It said the system “will help us all understand and talk about quality in a more consistent way”.

The service provider for the roll-out will be chosen in a competitive process. The successful contractor will be appointed by the middle of 2017, with a view to supporting all NPOs to adopt the framework during the 2018-22 funding period.

Some NPOs will be required to sign up to the Quality Metrics programme as part of their funding agreements. An ACE spokesperson told AP that any financial implications for NPOs will be “determined ahead of NPO guidance being published in October”.

Mounting costs

The future cost of the initiative – and whether any of the costs will fall to arts organisations themselves – remains unclear. An ACE spokesperson told AP that they intend to specify a proposed budget upper limit for the tender process to “help ensure value for money”.

Public investment in the Quality Metrics programme has already topped £700k, including £75k to develop the metrics; £300k to develop the Culture Counts approach; £27.5k to develop a parallel set of metrics suitable for participatory work by, with and for children and young people; and £300k for the recent NPO trial.

Sector reaction

Some arts organisations that will be affected by the roll-out of the Quality Metrics programme appear to be unaware of the plans, which have been announced in a blog post by ACE Executive Director of Arts and Culture, Simon Mellor. Few have been willing to go on the record with their views. 

One arts professional, who has been involved in both the Culture Counts national trial and responded to the evaluation of the programme, told AP: “I would be concerned with reducing artistic quality to a statistic – it is an important measurement but one that is often personal. To evaluate quality of experience requires a deep understanding of the context of the work. I would hope that the diversity of the audience is taken into account.”

His views have been echoed by an NPO that has not been involved in the trial, but will be required to adopt the system. A spokesperson told AP: “One person’s idea of quality is completely different from the next. Context is everything and this looks like a blunt instrument that will add cost but not a great deal of value. It appears that there is a value to ACE in reducing its reliance on experienced relationship managers actually going out to carry out assessments, in favour of an automated tick box culture that will miss the nuances and surprises that are generated when you think and programme outside of the box.”

Author: Liz Hill

Comments

I am confused... Culture Counts and Counting What Counts both have John Knell as a director and the platform is up and running. When ACE says the contract will be put out to competitive tendering for a service provider, how can it? Who else is going to take on another framework (and would they have to pay Culture Counts a licence?)? Surely it's entirely anti-competitive, because the system will belong to the two companies that have so benefited from much of the £700k of public money that ACE has spent on this - all of which was doled out without due process or tendering or anything else. How is any of this legal? How does it conform with competition law or fit with Nolan standards for awarding contracts? And if Counting What Counts Ltd stand to profit from the system and can charge what they like, is it not in breach of competition law to make NPOs sign up to a system without choice? AP - I think you should be looking into the structures behind these companies and then making some FOI requests to ACE. John Knell is a great chap and good at what he does, but this really seems to be rigging the market in favour of one private company. This is nearly three quarters of a million quid on something that hasn't worked and is hated - how does ACE get away with it? Why do we let them get away with it by not having the balls to speak up and speak loudly?