
Freedom Studios recently filmed and broadcast a performance live from multiple mobile phones. Imran Ali explains how.

[Photo: live editing on a tablet]

In October last year I joined Freedom Studios as its first Technologist-in-Residence. Alex Chisholm, the Co-Artistic Director, and I share an interest in science fiction and the impact of digital technology on art and culture. Our hope was to use my residency to explore the impact of various emerging technologies – virtual reality, augmented reality, storytelling platforms, open data – on theatre and on live performance.

Tajinder Singh Hayer’s North Country, a post-apocalyptic tale set in a future Bradford recovering from a plague, offered a rich, timely and provocative story universe on which to build a few digital experiments. The play is essentially a series of direct addresses from three characters, taking place across several decades and in various locations. The story seemed readily re-mixable by space, place and perspective.


Storycasting for a digital audience

Our first experiment involved restaging a scene using a 360º camera from the point of view of an unseen, off-stage character. Providing this novel perspective on the story, outside the live performance itself, provoked some interesting questions.

As the company began to consider touring options, the possibility of restaging the play for a digital audience became increasingly compelling. We began to speculate about capturing and streaming a live performance, using a service like the live-streaming app Periscope.

One possibility included orchestrating a flash mob, providing multiple perspectives and views to audiences watching on their smartphones. Another involved a performance where each character’s perspective would be available as a discrete stream, with viewers able to follow and switch between characters at will.

We settled on the latter, but with us, rather than the audience, retaining control of the narrative flow. This meant we could proceed without altering the actors’ performances or the script too much, while preserving the play’s narrative pace.

In contrast to NT Live’s streaming of theatrical performances, we would target the mobile audience of Facebook Live, building on our own social media following. Actors would film and light themselves using smartphones; a director would switch between those phones from another device; we’d stream live via 4G or WiFi to Facebook and our audience would join from their own phones or computers.

And so our notion of storycasting was born: architecting and directing a story from multiple live streams, locations and time periods. In essence, the component elements of a story could be live streamed and then assembled into a narrative – in this case by the play’s directors, but perhaps by viewers in the future.

The challenge of live editing

North Country was originally staged in the round in an abandoned Marks & Spencer store. For our storycast to be similarly immersive, we chose a derelict building in Bradford’s Little Germany quarter. Though fire-damaged, full of dead pigeons and lacking power, broadband and heat, we found it offered a range of ready-made dystopian looks.

With the lead time for a fixed-line broadband connection running to several weeks and no locally accessible WiFi, we were reliant on mobile data for live streaming. Fortunately, local 3G and 4G signal coverage was strong and a quick test live stream to Facebook Live from various points in the building worked well.

We later sourced Switcher Studio, an app that turns iPhones and iPads into a live editing studio. Using a trio of iPhones as remote cameras, we fed their footage into an iPad, from which we could switch camera angles, live streaming the combined footage to Facebook via a WiFi hotspot provided by a 4G phone.
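For the technically curious, the sketch below illustrates the kind of encode-and-push step that sits underneath any such app. It is only an illustrative example – on the night, Switcher Studio handled capture, switching and streaming for us – showing a local test clip being pushed to a Facebook Live RTMP(S) ingest point using ffmpeg driven from Python; the ingest URL and stream key are placeholders of the kind you would normally copy from Facebook’s live video tools.

# Illustrative sketch only (not the setup used in the performance):
# push a local test clip to a Facebook Live RTMP(S) ingest point
# with ffmpeg, driven from Python. The endpoint and stream key are
# placeholders; real values come from Facebook's live video tools.
import subprocess

INGEST_URL = "rtmps://live-api-s.facebook.com:443/rtmp/"  # assumed/placeholder endpoint
STREAM_KEY = "YOUR-STREAM-KEY"                            # placeholder

command = [
    "ffmpeg",
    "-re",                  # read input at its native frame rate, simulating a live feed
    "-i", "test_clip.mp4",  # local footage standing in for a camera
    "-c:v", "libx264",      # H.264 video, as Facebook Live expects
    "-preset", "veryfast",
    "-b:v", "2500k", "-maxrate", "2500k", "-bufsize", "5000k",
    "-g", "60",             # keyframe roughly every two seconds at 30fps
    "-c:a", "aac", "-b:a", "128k", "-ar", "44100",
    "-f", "flv",            # FLV container over RTMP(S)
    INGEST_URL + STREAM_KEY,
]
subprocess.run(command, check=True)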

Each character would deliver their dialogue straight to the front-facing camera of their phone. Not only could we switch between the actors, we could also select either the front or rear camera of each phone, giving us the option of showing a character’s point of view.

Complications with sound

We planned to move the actors through the building’s five storeys as the story progressed, but our 4G WiFi hotspot would only work if all the other devices were on the same floor. The crew would therefore have to follow the actors through the location, staying quiet and out of shot of all three cameras.

By default, Switcher Studio’s remote iPhone camera feature could only capture footage in landscape mode, not portrait, making the phones more awkward for the actors to hold one-handed for long periods. And with the phones running continuously for over an hour, each needed a portable USB battery attached, adding further weight.

Our ‘switcher’ iPad seemed to be the only device recording sound, as the remote iPhone cameras weren’t sending audio to the switcher to be mixed. The app’s developers told us this capability was still in development; in the meantime we would need to mic each actor and feed their output through a separate audio interface (an iRig PRE) into our switcher device. So we fitted each actor with a radio mic and monitored sound levels as we followed the actors and crew through the building.

At several points in the story all three actors were required in the same shot, so an additional crew member operated the camera, with each actor using some very rapid hand signals to tell the camera operator when they were about to speak. One character also had two very quick costume changes to make in between shots of the other two characters.

Though our early challenges were largely technical, ultimately the larger challenge was smooth camera transitions between characters, keeping the crew out of shot, moving through the location – and doing everything live. Fortunately, several complex rehearsals along with a brave cast and crew meant the final performance went without a hitch.

Future possibilities

Although North Country was suited to this form of delivery, the central question it raised was whether, by streaming all the component elements of a performance (characters, places, time periods), we could allow viewers to reassemble them on their own terms – a kind of participatory live streaming.

We’re already starting to imagine a future version of North Country, where viewers can select viewpoints, follow characters, locations and timelines of their choosing, perhaps even contributing and curating their own story elements. Indeed, we’re already considering the possibility of a ‘storycasting’ kit for performers to adapt existing works or perform new pieces.

We believe this kind of storycasting offers some compelling paths for innovative storytelling in theatre, beyond simple streaming and perhaps towards a more open-source remixing of live theatre.

Imran Ali is Technologist-in-Residence at Freedom Studios.
www.freedomstudios.co.uk
