
Filmed theatre is still a relatively new medium, sitting somewhere between film and live theatre. So how do we develop the visual language to take it to the next level? asks Mike Richardson.

Wrist-based bio-monitor

We’ve all had that experience: a moment where time falls away, our bodies cease to exist, and our complete and undivided attention is held fast by the magic of story. Your moment may have been ignited by a passionate monologue from the lead at centre stage, or the precious seconds before the dramatic cliff-hanger of the newest, hottest Scandi crime drama. Or perhaps by the thrashing melody of your favourite death-metal band’s electric guitar. Regardless of the cause, we have all had a moment; a moment of total immersion.

Immersion can come from any narrative medium, but live theatre perhaps captures audiences in a way that is different from others, and difficult to reproduce. Anyone who has both sat in an auditorium to watch a performance and sat at home on the sofa to watch a recording or a live stream will know they are chalk and cheese: practically incomparable. Why is this? And how can we best translate the properties of the live stage to the digital screen?

As part of a team of cognitive psychologists, I have been trying to understand the differences between in-theatre and remote audiences’ internal states in order to improve the filmed theatre experience. It’s a collaborative project between Bristol Old Vic and Complicité, and researchers from the University of Bristol and the University of Bath.

Calculating audience synchrony

We are conducting a large-scale research project comparing audience immersion in the theatre and over the live stream of Drive Your Plow Over the Bones of the Dead, directed by Simon McBurney. The project, funded by MyWorld, uses bio-monitors and motion-tracking to record audience members’ heart rates, skin temperature, movement and other physiological responses while they watch the show.

Using this data, we can calculate the level of ‘audience synchrony’: how similar different audience members’ internal states are. Previous research strongly suggests that the more similar audience members’ collective heart rates are, the more immersed the audience is in the content.
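To make this concrete, here is a minimal illustrative sketch (in Python) of how a single synchrony score could be computed from heart-rate recordings. It is not the project’s actual analysis pipeline: the array shapes, the sample data and the choice of average pairwise correlation as the summary measure are all assumptions made for illustration.

    import numpy as np

    def audience_synchrony(heart_rates: np.ndarray) -> float:
        # heart_rates: one row per audience member, one column per time
        # point, all sampled at the same moments during the show.
        # Returns the average pairwise Pearson correlation: close to 1
        # when everyone's heart rate rises and falls together, close to
        # 0 when responses are unrelated.
        corr = np.corrcoef(heart_rates)                # member-by-member matrix
        pairs = corr[np.triu_indices_from(corr, k=1)]  # each unique pair once
        return float(pairs.mean())

    # Hypothetical data: four audience members sharing a 'story-driven'
    # response, plus a little individual noise.
    rng = np.random.default_rng(0)
    shared = np.sin(np.linspace(0, 3, 600))            # e.g. ten minutes at 1 Hz
    audience = shared + 0.1 * rng.standard_normal((4, 600))
    print(f"Synchrony: {audience_synchrony(audience):.2f}")  # close to 1

Averaging pairwise correlations is only one simple way of summarising synchrony; the point is that a whole audience’s bodily responses can be reduced to a single number that rises when people respond together.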

This may sound abstract or complex, but the logic is actually very simple. If audience members are all immersed in a narrative, they may forget their bodies, their minds focused on that exact point in the story. Distractions fall away or, in psychological terms, they are experiencing total cognitive immersion in an identical stimulus. It stands to reason that their minds and bodies will be doing roughly the same thing.

Alternatively, if an audience is bored, some might be thinking about the dinner they just had, what ice cream they want in the interval, or whether they recognise that actor from EastEnders. Other research shows that the less an audience moves, the more immersed they report being; the more they fidget, the more distracted they are.

The use of wrist-based bio-monitors

Using these physiological measures has clear upsides. The simplest way to find out whether an audience is immersed is to ask them, but the very act of asking completely breaks the immersive experience. Imagine how engrossing Lady Macbeth’s final soliloquy might be with a researcher sat next to you asking you to rate your feelings every thirty seconds.

Using modern wearable technology, we can record the internal states of multiple audience members at once, with no more hassle than putting on a watch. These wrist-based bio-monitors also give us a better sense of the performance as a whole: if we rely on what people report afterwards, past research demonstrates that audiences mostly remember the beginning and end of narratives (we call this the primacy and recency effect), whereas continuous recording captures every moment in between.

This measure of in-theatre synchrony can be tracked and time-locked to the run time of the show, allowing researchers to see the moments when the audience is more or less immersed. This gives us our control group, our ground truth: what it is ‘like’ to be in the theatre.
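As a sketch of what time-locking looks like in practice (again illustrative, reusing the hypothetical audience_synchrony function above, with window and step sizes chosen arbitrarily rather than taken from the project), the same score can be recomputed in a sliding window across the recording, producing a synchrony curve that runs alongside the show.

    def synchrony_over_time(heart_rates, window=120, step=30):
        # With one sample per second, window=120 and step=30 give one
        # synchrony value every 30 seconds, each computed over the
        # preceding two minutes of the show.
        n_samples = heart_rates.shape[1]
        times, scores = [], []
        for start in range(0, n_samples - window + 1, step):
            times.append(start)
            scores.append(audience_synchrony(heart_rates[:, start:start + window]))
        return times, scores

    # Peaks in 'scores' mark the moments when the audience was most in
    # step; the same curve computed for the live-stream audience can then
    # be laid alongside the in-theatre curve for comparison.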

We can then compare what it is like to watch the live stream of the show. Does the remote audience experience immersion in the same places as the in-theatre audience? Where do the in-person and remote audiences differ in physiological synchrony? And crucially, is there anything we can do to make the live stream feel more immersive, more like being in the theatre?

Testing live outcomes on virtual audiences

There’s another part to this research: an experimental production. Using this measure of physiological synchrony, Bristol Old Vic and Complicité are adapting the shooting of the live stream night by night, informed by the anonymous bio-monitor data. In this way we can investigate the digital visual language used in filmed theatre and test its effect on the remote audience. We’re testing over one hundred people in the theatre and over three hundred people watching at home.

Through this work we hope to increase theatre accessibility and to create something new: live-streamed shows that are as immersive as possible. For those who can’t get to the theatre, or can’t go as often as they’d like, or for me on my sofa on a Tuesday night, can we bring the magic of theatre into the home in a better way?

This project is the first of its kind, but we hope it will inspire other theatre makers to consider new ways they can bring the wider audience into their shows. While we can’t share any concrete findings just yet, an early peek at the data shows that the in-theatre audience are more synchronised at key narrative moments in the play. I won’t spoil those moments though; you’ll just have to go and see the play for yourself. 

Dr Mike Richardson is a post-doctoral research associate in the Dept. of Psychology at the University of Bath. 
m.richardson@bath.ac.uk
www.bath.ac.uk/research-groups/human-computer-interaction-the-create-lab/
