Researcher Stephen Pritchard raises concerns that the latest evaluation of ACE’s Creative People and Places programme was based on fatally flawed methodology.

Arts Council England’s (ACE) Creative People and Places (CPP) funding programme aims to “increase attendance and participation in excellent art” in areas of “least engagement”. It recently released two commissioned reports evaluating CPP’s performance. A New Direction (AND) were commissioned to coordinate an evaluation based on meta-analysis produced by Ecorys; The Audience Agency (TAA) were commissioned to produce audience profiling and mapping. Unsurprisingly, self-congratulation ensued, with AND Partnerships Director, Holly Donagh, claiming CPP was producing “powerful results” that could become “a model for the future of arts engagement”.

The trouble is that both reports are fatally flawed. Both rely solely on data from CPP projects, and both are produced by organisations with vested interests. Neither report seems to think it might be useful to garner first-hand responses from people who took part in events in these places, nor does either attempt to analyse the perceptions, knowledge and understanding of CPP projects among people in these communities who may not know about, or intentionally did not participate in, CPP events. This leaves the reports, and the data upon which they are based, looking inherently biased. Furthermore, AND's "logic model" and "theory based approach" are both weak and focused solely on outcomes and outputs. Like TAA's report, which reduces audiences to flippant categories such as 'Trips and Treats', 'Facebook Families' and 'Dormitory Dependables', this turns people into "segments", numbers and pretty little graphs.

How can an evaluation about increasing participation among groups labelled as low engagers fail to provide sufficient demographic data to enable a detailed analysis of age, gender, ethnicity, socio-economic background and disability/illness? The data was often of mixed quality, widely different in terms of proportions of responses, inconsistent and, in some cases, missing. And, worryingly, the limited demographic data collected about "previous arts engagement" represented only 19% of total events, and data about "socio-economic background" was collected in only 4% of cases!

The projects did not have a consistent approach to collecting, analysing and disseminating data and, although excellence was deemed particularly important, the 21 projects could not agree on a definition of it! In almost every case, it was found that the projects did not consider or address ethical issues seriously, or at all – they were not following ethical standards, and so neither can the meta-analysis built upon their data. This, considered alongside the lack of methodological clarity with which the wildly varying data was collected by the various project consortia, makes the data almost impossible to verify and therefore deeply unreliable. Indeed, without detailed critical ethnographic research into CPP projects and their wider contexts, the reports reflect a one-sided and probably biased account of CPP – one which ticks the boxes and leaves everyone singing Hallelujah before quietly sneaking off in the hope that the data and methodology are not questioned.

But it is not just the questionable nature of the data that is problematic. ACE's choice of agencies to produce the reports means their evaluations are highly likely to be biased. Both TAA and AND are National Portfolio Organisations (NPOs). Both are classed as 'Sector Support Organisations', with TAA funded from the Grant in Aid budget and AND from Lottery funds. TAA joins the National Portfolio in the 2018-22 round with annual funding of £750k; AND is an existing NPO and will remain so in the next round with funds of £1.5m per annum. And, whilst TAA has a national remit, AND is funded only for work in London.

Furthermore, TAA’s Chair, Sheila Healy, is also Chair of ACE South West Area, a member of Tate St Ives Advisory Council, a member of the HEFCE Catalyst Fund panel, and a director of Situations and Kneehigh Theatre. Another of TAA’s board, Andrea Nixon, is Executive Director of Tate Liverpool, and a board member of the Crafts Council and North-West Museums Group. However, it is even more difficult to accept the decision to commission AND to evaluate CPP because AND is a consortium partner with the Creative Barking and Dagenham CPP project. Interestingly, AND are based in Shoreditch, not Barking and Dagenham.

It is incredibly difficult to accept the neutrality of two NPOs with vested interests in CPP and ACE producing these two reports about CPP on behalf of ACE – the programme's initiator and funder. It is therefore of little surprise that the reports celebrate seriously suspect data as "evidence" that CPP is having positive impacts. (The AND report mentions "evidence" 95 times!)

If the arts want to make claims about evidence, then they really should commission some professional, unbiased academic research rather than waste funds on inadequate ‘grey literature’. But these reports function in the same way that hypnopaedia works in Brave New World: if sufficiently repeated, people will accept them as undeniable truths. So, undeniably, progress is lovely. Progress is lovely. Progress is lovely. Progress must be lovely.

Stephen Pritchard, PhD Researcher
Northumbria University


Not sure I agree academic research is necessarily the best way forward for arts projects. A lot of smaller arts organisations are increasingly concerned that academic institutions are receiving more and more of the available funding. See the £3 million awarded to academics in business schools to look at teaching the arts resilience – when most smaller arts organisations and their communities know ten times more than anyone about that subject. See also the recent fiasco over the AHRC supporting a group of white, middle-class, middle-aged researchers to teach the arts how to be more diverse.

Hi CT! Your comments about academic research are relevant but do highlight some particularly catastrophic examples. My point is how the "research" is USED; how it's disseminated and the claims made about how the "evidence" (a word used 95 times in the report) legitimises a state-initiated programme based on seriously questionable intentions. My critique of the CPP report is founded on three key arguments (although word count prevented me from addressing them fully in the article): 1) I am vehemently opposed to most quantitative measurement of the arts, humanities and social sciences – particularly that which makes empirical claims and is used to further the ideology of positivism; 2) I am fervently opposed to CPP and find its claims and raison d'être belittle and undermine communities rather than "empower" them; 3) I believe the report actually reproduces the hierarchical, "we know best" (when it is patently obvious they do not) system of the arts and much "grey literature" which fails to involve people – meaning it contradicts the very stated aims of CPP. I explain in my critique that I support critical ethnographic research and suggest that academic research of this specialist, highly qualitative and participatory nature would be best suited to a detailed and rich analysis of CPP. I suspect such an approach would reveal the limitations and inadequacies of CPP and potentially undermine all of its overblown claims. I believe that a critical ethnographic approach at the core of a critical participatory action research project would produce not only great, grounded research but would also involve participants and communities in designing and undertaking the research. I believe this approach would reflect the stated aims of CPP, rather than, as the present report does, contradict them. The CPP research does not involve participants or community members, it does not use a critical methodology, etc.
This is problematic because CPP claims to be an "action research project" based upon "participation" at every level and with communities taking control of, or at least co-producing, every aspect of the project. A critical participatory action research project with an interdisciplinary team led by community members and participants, rather than directors, etc., would fit these aims. The present report does not. It is for these reasons that I claim the report is fatally flawed. Hope this opens up the debate and makes my position clear?

I couldn't agree more with Stephen about the importance of appropriate methods for researching participative and socially engaged arts – both of which necessitate an expanded approach, one capable of being ethnomethodologically viable and ecologically valid. Stephen, is there a full version of the report I could read? Please get in touch.

My response – apologies for iPhone issues earlier – was to your unquestioning claim that academia was the best place for this evaluation to come from. As someone who left academia because of its increasing focus on earning money, even if that means taking money that should more rightly be with small, grassroots arts organisations, I am equally sceptical about the Arts Council and others placing the funding for evaluation in the hands of large academic institutions. These are not two isolated examples, as you suggest. Universities are increasingly elitist, capitalist business models and offer little in the way of challenge to the cultural hegemony. And they are just as willing to toe the line of their paymasters. So, yes, put the focus on participants. But trust that artists and the communities they work with might be able to evaluate their own projects. And give them these massive funding tranches. In solidarity.