The biggest ever invitation to tender issued by Arts Council England is for a contract to deliver a standardised system for quantifying artistic quality, which 300 larger arts organisations will be required to use.

Arts Council England (ACE) has published a £2.7m invitation to tender – its biggest ever contract – for the delivery of a monitoring system that will measure the artistic quality of its National Portfolio Organisations’ (NPOs) work.

ACE is pressing ahead with the system despite serious concerns raised following a pilot project last year that tested a similar system among 150 NPOs. An independent review of the pilot found that arts organisations wanted a more flexible system aligned with their individual artistic objectives, and ACE’s announcement that the system would be rolled out provoked anger and disbelief on social media.

Using the system will be mandatory for around 300 of ACE’s largest NPOs, and a further 600 will be encouraged to use it. All will be supported to adopt and implement it.

Standardised quality measurement

The tender seeks a service provider to deliver a digital platform that will enable NPOs across all art forms to collect audience responses to a standardised set of research questions, known as Quality Metrics, which ask audiences to rate their arts experiences on a numeric scale. The system, which will operate throughout the new four-year NPO funding round, will also use the same scale to record peer assessments and each organisation’s assessment of its own artistic quality.

ACE is in the process of defining Participatory Metrics to measure the quality of arts organisations’ participatory work. The winning bidder will also support the ongoing development of this work and ultimately incorporate the new metrics into the digital platform.

Compulsory involvement

For all the NPOs in ACE’s 2018-22 portfolio that receive £250k a year or more in core funding, use of the system will be compulsory. Around a further 600 NPOs will be deemed ‘optional users’, and the service provider will be required to support them to adopt the system. Should take-up of the system be low among NPOs for whom it is not compulsory, ACE “may require a number of other significant grant recipients to take their place”.

Each organisation will have to conduct a minimum of four audience surveys each year and submit the data collected to a central pool. Participating organisations will have access to their own data, but it will also be reported to ACE for quality monitoring and decision-making purposes, giving the funder a “comprehensive and consistent view of quality across the portfolio”. The system will be linked to ACE’s current Artistic and Quality Assessment programme, which it will ultimately replace.

Although NPOs won’t be charged a fee to be linked into the Quality Metrics framework, the supplier will be permitted to charge them for activity beyond the standard specifications of the scheme, such as running more evaluations or collecting more responses than required, adding bespoke questions or requesting additional data analysis.

Metrics in doubt

ACE has spent over £700k since 2013 on developing and testing the 12 statements around which the new Quality Metrics system will be constructed. These were drawn up by a group of Manchester-based arts organisations as generic quality measures that could be applied across all artforms and contexts. But ACE is now questioning their suitability and no longer views them as final.

The core quality metrics – currently under review

Self, peer and public:

  • Concept: it was an interesting idea
  • Presentation: it was well produced and presented
  • Distinctiveness: it was different from things I’ve experienced before
  • Challenge: it was thought-provoking
  • Captivation: it was absorbing and held my attention 
  • Enthusiasm: I would come to something like this again
  • Local impact: it is important that it’s happening here
  • Relevance: it has something to say about the world in which we live
  • Rigour: it was well thought through and put together

Self and peer only:

  • Originality: it was ground-breaking
  • Risk: the artists/curators really challenged themselves
  • Excellence: it is one of the best examples of its type that I have seen

Following concerns raised by arts organisations participating in the pilot study, consultants have been appointed to investigate the metrics’ suitability “for the diverse audience and participant groups our National Portfolio Organisations work with”. Their investigation is testing the language of the metrics statements among arts and cultural audiences, including participants in different age groups, those with disabilities or other additional or complex needs, those from different socio-economic backgrounds, and those for whom English is not their primary language.

Policing the system

An independent evaluation of the pilot found that arts organisations were unconvinced that standardised metrics are appropriate for assessing the quality of their work and instead want a context-specific “bespoke, tailored approach” that aligns with their own artistic objectives. It also noted a catalogue of concerns about the logistics of the approach, including the administrative burden it places on organisations; the difficulty of recruiting peer evaluators; the need for close cooperation between venues and touring companies; and the impact of “survey fatigue” on audiences, staff and the reputation of their organisations.

ACE told AP that anecdotal evidence and feedback from the pilot suggest that the potential for cheating is “something the sector is conscious of”. Therefore, the winning bidder is being required to monitor NPOs’ use of the system, report back to ACE on their compliance and put “procedures in place to prevent users ‘gaming’ the system”. They are also being tasked with “upskilling NPOs in their use of data, encouraging greater use of the framework and supporting NPOs to develop effective peer review networks”.

Confidentiality and data ownership

The question of who will own the database generated through the Quality Metrics research has not been resolved. ACE told AP: “The details of this will be agreed through contract negotiation with the preferred service provider.” Consequently, it is uncertain whether the supplier will be permitted to use the data to profit from its unique position by selling additional services to the sector or to ACE. ACE has confirmed, however, that “the service provider will not be able to commercialise the IP [Intellectual Property] generated through the contract”.

Odds-on favourites

Although the closing date for responses to the tender is not until 6 March, it seems unlikely that other potential suppliers will be in a position to compete effectively with Counting What Counts Ltd, the company that ran the grant-funded £300k pilot study which informed the development of the tender document.

John Knell, Director of Counting What Counts Ltd, has been involved with the development of Quality Metrics from the very start, but until now he and his company have been funded by grants, and there have been no opportunities for other suppliers to bid for the work.

The tender document also specifies that the Quality Metrics system must be ready for the chosen provider to start “engaging and supporting organisations” to use it from July 2017. Other potential suppliers are at a significant disadvantage under this timetable, given that ‘Culture Counts’ – the digital platform used by Counting What Counts Ltd in the pilot – has been developed and tested for England’s NPOs over several years.

The Audience Agency (TAA), the national arts development agency supported by ACE, is likely to be a partner in the winning bid team, as the specification requires the service provider to link the data generated with TAA’s AudienceFinder data analysis tool.

Reactions

Making a standardised quantitative system mandatory marks a U-turn in ACE’s attitude towards assessing artistic quality. Former Chief Executive Alan Davey went on record to explain why ACE had “got rid of the previous box-ticking approach” to measuring success. “We’ve learned from experience that over mechanistic, usually instrumental, measurements do not begin to describe the true value of the arts,” he said.

ACE’s new Chair, Nick Serota, also appeared to express disbelief at the Quality Metrics plans when they were first announced, but ACE has now clarified his position: “…whilst Nick may not be in favour of box-ticking this is not what Quality Metrics is. In short, we can confirm that Nick is in favour of the plans.”

Others are more sceptical. The CEO of one organisation likely to be required to use the Quality Metrics system said: “This is a lot of public money to invest in the hope that it might deliver something tangible for ACE and the sector. There appear to be many unanswered questions over the project’s aims, viability, and integrity, which hopefully will be addressed before it goes live.

“The track record of public sector investment in technology based solutions isn’t great, so one wonders how much return on investment it will bring before it is invariably replaced by a new idea or a solution to the age old problem of how you quantify the impact and value of creativity.”

Another NPO director said: “Leaving aside the issues that colleagues have raised about the appropriateness of such a method of ‘measuring’ art, the cost of this is just staggering. ACE seems to have been convinced by someone to spend a huge amount of money on something that could be done for very little with a set of standardised questions and a Google form. I can’t see what the money is actually going to be spent on.

“If this ultimately is delivered by the people who suggested it in the first place then it smells very bad. Just think of all the great art that could have been produced with the £3m spent on this. I give it a two out of ten.”

Read Editor Liz Hill’s view: Why Quality Metrics is a really bad idea

Author: Liz Hill