Making digital connections to create immersive work played to all our strengths, Sarah Ellis writes.
Audience of the Future started in 2018 as a large-scale initiative funded by UKRI (Innovate UK) to explore the future use of immersive technologies in the creative industries. This month the programme will culminate in a live performance made specifically with immersive technologies for digital distribution. It shares much of the R&D we’ve explored over the past few years and responds to the pandemic by showing how we moved from live performance to finding ways to connect and create togetherness in people’s homes during lockdown.
A desire to learn
The programme aimed to undertake R&D and demonstrate it through a performance that reached a wide range of audiences. Working with real-time technologies enabled a uniqueness of form and expanded the theatre-making toolkit available to our artists and community.
2018 feels like a long time ago. Back then, we were evaluating the impact of our 2016 production of The Tempest and thinking about how we could take that forward. In that production, we had started exploring the potential of immersive technologies in live performance by creating a digital avatar of Ariel using real-time motion capture and game engine technology. Audience of the Future felt like the perfect opportunity to build on that and develop this work more widely.
We realised we could not only use this technology ourselves, but explore it with a wider range of artists, arts organisations, technologists, practitioners and producers. With them, we wanted to take a more diverse approach to these new tools and technologies. How could we create a consortium that worked across all forms of live performance and enabled companies to learn from each other? We set out to connect with a range of partners and identify a series of R&D challenges beyond those we had discovered ourselves. Together, we formed a group of people with different skills, strengths and expertise but a shared set of questions, objectives and curiosities.
Playing to our strengths
By researching and developing immersive technologies in real world contexts, we were able to create new intellectual property and test new commercial models. We were also able to build a base of evidence around technical innovations to support decision making in future projects.
It became clear we could accelerate a lot of work already under way in our sector and provide impact beyond the consortium organisations. A collaborative model began to emerge, bringing together a group of amazing people who could apply this approach through a sector-wide lens rather than an organisational one. It allowed us to work with freelancers and people with new expertise.
What we quickly learnt was that this was not so much about technology as about people – understanding the varied ways of working we each had, and how to be open and connect with each other. We agreed to share what we learned with each other and our communities. We also agreed everyone could develop their own strands of R&D and present work throughout the project in their own right.
Partners like Manchester International Festival presented Skepta’s Dystopia987 – an immersive experience using projection mapping and layers of storytelling that redefined the notion of a performance space. Using their expertise and artistic vision, they connected with other partners for support, recognising each organisation’s strengths and where we could learn from each other.
Putting it into practice
By March 2020 we were on track to create a virtual performance using real-time technologies in Stratford-upon-Avon. When the first lockdown happened, we had to stop production completely. We then commissioned a piece of audience research that highlighted how our audiences craved togetherness and liveness, as well as the scale of digital inequity in people’s homes. It also highlighted the opportunities for the consortium to support each other through this time and ensure we could continue to develop our work.
Our audience will now experience this performance digitally. Recognising the digital divide, we’ve made it accessible on desktops, laptops and tablets. If audiences want to get a little closer to the performance, they can sign up to be a ‘firefly’ – an experimental form of interactivity we’re using to explore whether our performers and audiences can achieve a sense of co-presence at a distance.
The work will be performed in a mo-cap studio created specifically for this event. Some of the technologies we are using are already known to the arts and other industries. What is unique is how we’re blending them together to achieve togetherness and liveness. Real-time motion capture technology, face-tracking technology and digitally designed characters and environments combine to create a whole new setup for digital distribution. This includes a bespoke web player and graphics technology that creates layer upon layer of interactive connection.
In moving towards a digital distribution model, we’re testing what can work on the technology people have to hand. This has proved to be a big challenge but an important one in breaking down some of the barriers around this kind of work. By bringing together technologies that are often used in isolation, we can break down silos and find something new. The purpose of collaboration is to stay open and see what you didn’t expect. The legacy of this collaboration will be to share what we have learnt – whether that’s the technology and how it’s used, or our understanding of audiences and of potential commercial models working at scale. What we have found so far is that connection through collaboration has brought people together and created something we could never have achieved on our own.
Sarah Ellis is Director of Digital Development at the Royal Shakespeare Company.