
AHRC’s investment will support artificial intelligence programmes to create unified virtual collections of the UK’s museums, archives, libraries and galleries.

Woman looking at artefacts in the British Museum. AI will create a digital database of the British Museum's founding artefacts. Photo: Matteo Vistocco

Five new projects worth £14.5m will consolidate the UK’s heritage assets digitally and uncover new, diverse histories.

The Arts and Humanities Research Council (AHRC) is investing in the research and development of artificial intelligence (AI) to connect the UK’s cultural artefacts and historical archives.

Details of the major funding programme, which aims to dissolve barriers between the UK’s culture and heritage collections, were announced on Tuesday (September 21). 

AHRC Executive Chair Christopher Smith called it “the most ambitious phase of research and development [involving AI technology] ever undertaken” in the culture and heritage sector.

It is the largest investment to date in UK Research and Innovation’s £18.9m programme, Towards a National Collection.

Launched in 2019, the five-year programme aims to open up digital access to the UK’s heritage collections globally, according to programme director Rebecca Bailey.

“Eventually everyone will have the ability to access an outstanding trove of stories, imagery and research linking together the limitless ideas and avenues in our national collections,” she said.

AHRC’s investment will support five ‘Discovery Projects’ designed to develop physical, digital and organisational solutions for bringing the UK’s disparate collections together.

Involving 15 universities, 63 heritage collections and institutions, and more than 120 researchers and collaborators, the programme is the largest of its kind anywhere in the world.

British Museum Director Hartwig Fischer, a member of the 19-person steering committee, said the programme will strengthen Britain’s global leadership in digital archiving.

“I cannot wait to see what happens when we bring all this talent and dedication together to build the new future for our shared national collection.”

Re-imagining history

Two projects aim to modernise existing collections, challenging practices around how heritage is recorded and valued.  

Our Heritage, Our Stories will scale up its community-generated digital content and Transforming Collections will address structural inequalities in its collections to amplify marginalised voices.

Through AI, the projects will uncover patterns of bias and potentially ‘hidden’ connections within collections.

Tate Director Maria Balshaw said the museum, which is a partner of both Our Heritage, Our Stories and Transforming Collections, will “generate new knowledge for and with the public”.

“We see this as a brilliant opportunity to work closely with our academic and sector partners in a programme that will have a huge and positive impact for audiences in the UK and internationally.”

Collaborative research

The remaining projects will involve digital researchers working alongside historians and heritage professionals to create new archives.

One Discovery Project, Unpath’d Waters, will create a centralised database of the UK’s marine heritage, whilst The Sloane Lab will curate a digital catalogue of the British Museum’s founding artefacts.

The Congruence Engine focuses on industrial history, using AI techniques to explore the textiles, energy and communications sectors.

It will use machine learning and natural language processing to create and refine datasets and provide routes between records and digital objects.

British Museum Director Fischer believes the results will “allow us to explore what the digital future for our organisations can and should be”.

Users will be able to “search across collections cared for in different parts of the UK, to pursue their passion for knowledge and understanding, discover their own pasts and answer their own questions,” he added.
