Almost prescient in its timing, the 5G Festival project sprang into life in 2019, just as the Covid-19 pandemic was shaping up to impact the world and decimate the live entertainment industry.
Coupled with growing calls to action over environmental concerns, an exploratory venture into the world of remote entertainment technology could not have been more welcome.
The initiative was conceived to test the viability and potential of a live immersive collaboration platform for artists, based around the ability of 5G to transmit video and audio with ultra-high bandwidth and sufficiently low audio latency to permit real-time interaction of musicians as if working in the same physical space. From the outset, however, and by their own admission, those involved were unsure whether it could be made to work.
Any remaining reservations were put to rest recently with the staging of the first 5G Festival live event, bringing together musicians playing at the Brighton Dome with others some 60 miles distant, on opposite sides of London at Metropolis Studios and the Virgin Media O2 Arena’s Blueroom. The event followed a series of trials between the Dome and Metropolis – initially with just three artists – testing the 5G immersive video streaming platform that the project partners were developing, and presented it to a select public audience for the first time.
Led by Musical Director Kojo Samuel, the event featured a new group called The Remotes, with Newton Faulkner, Memorial, Jesse Appiah and Sylvia Mwenze live in Brighton Dome, Lola Young and Natalie Lindi live at the Virgin Media O2, and Sipprell and Pearl Harts live at Metropolis Studios. They were supported by a house band comprising Adam ‘Smiley’ Wade on drums, Aleksey Lopez on acoustic guitar, Mitch Jones on keys, and Jesse Appiah and Sylvia Mwenze on backing vocals (at the Brighton Dome), with guitarist Ross Chapman at the O2 Blueroom. In all, 22 musicians took part across the three venues – nine at the Dome, seven at Metropolis and six at the Blueroom.
The Dome sound system had been expanded from its usual stereo configuration to provide immersive sound for the evening. The Brighton Dome’s Founders Room, meanwhile, hosted a live relay of a separate performance taking place at Metropolis Studios – again using an immersive loudspeaker set-up.
Aiming to bring all aspects of immersive audio to the live, studio and broadcast markets, the 5G Festival (itself part of the DCMS 5G Testbeds and Trials Programme, 5GTT) promises to revolutionise the live music industry through its use of the 5G network, creating new commercial opportunities for arts and entertainment, and giving artists and audiences new ways to interact with each other. Alongside these ambitions, Metropolis has built the highest resolution Dolby Atmos certified studio in the UK, which also played its part in the 5G Festival event.
Led by digital technology centre and 5G specialist Digital Catapult, the initiative partners include the Warner Music Group; the Brighton Dome & Brighton Festival (working with Brighton 5G testbed partner Wired Sussex); the Virgin Media O2 Academy venues; Metropolis Studios; immersive audio and live streaming specialist Sonosphere; the Audiotonix group (comprising Allen & Heath, Calrec Audio, DiGiCo, DigiGrid, Klang Technologies, Solid State Logic and Sound Devices); and digital technology companies Mativision (5G, 360° immersive live streaming and distribution) and LiveFrom (blockchain ticketing).
Digital Catapult Chief Technology Officer Joe Butler describes what has been achieved through this demonstration as being ‘a blueprint for the industry’. ‘This project is very much of this moment, and reflects the direction of travel of the industry,’ he argues.
The day prior to the 5G Festival event, the consortium hosted presentations, first at Metropolis and then at the Virgin Media O2 Blueroom, with a trip on an electric 5G smart bus – able to look in live via wireless 5G at the rehearsals taking place – to carry the invited audience between them. The presentations laid bare the workings of the project, as well as many of its achievements and discoveries, and identified possible future objectives.
At the Blueroom, the presentations made use of The Portal, a white cubicle that brought members of the team at the Brighton Dome into the room in real time as 3D projections, able to speak and field questions as if present.
Mativision, an expert in delivering demanding 360° VR live streaming projects, provides the central resource for all video and audio assets through its Common Service Platform (CSP). Everything is uploaded here and drawn down wherever required – the audio elements comprising more than 200 channels using the AES67/ST 2110 standards. For the demonstration, synchronisation was provided by GPS clock sources at each location.
For the event, the CSP was located at the Brighton Dome, and supported by additional edge CSP processing located at Metropolis. Leased lines between the venues were used to interconnect the processing.
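To give a sense of the audio bandwidth this transport carries, a back-of-envelope calculation can be made from the quoted channel count. The article gives only the figure of more than 200 channels; the 48kHz sample rate, 24-bit depth and uncompressed payload assumed below are typical AES67 defaults rather than confirmed project figures.

```python
# Rough payload estimate for a 200-channel AES67/ST 2110-30 audio transport.
# Sample rate and bit depth are typical AES67 defaults (assumed, not quoted
# by the project); the result excludes IP/UDP/RTP header overhead.

SAMPLE_RATE = 48_000   # Hz - AES67 baseline rate
BIT_DEPTH = 24         # bits per sample (L24 payload)
CHANNELS = 200         # channel count quoted for the event

def audio_payload_mbps(channels: int, rate: int, bits: int) -> float:
    """Raw uncompressed audio payload in megabits per second."""
    return channels * rate * bits / 1e6

payload = audio_payload_mbps(CHANNELS, SAMPLE_RATE, BIT_DEPTH)
print(f"Raw audio payload: {payload:.0f} Mbit/s")  # ~230 Mbit/s before headers
```

Even before packet overhead, the audio alone approaches a quarter of a gigabit per second – a useful reminder of why leased lines, rather than the public internet, carried the inter-venue links.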
Working with and around this platform, the project’s partners are able to bring their respective technologies to bear on the collective effort. In the case of the audio, the Audiotonix group holds a strong hand through its various areas of expertise – including DiGiCo for live mixing, Calrec for AoIP interfacing and AES67-to-Madi conversion, and Klang for immersive in-ear monitoring.
Essential to the workings of the audio – and to the success of the entire venture – is the issue of latency. If the audio transport delay between venues exceeds the capacity of the musicians to work comfortably together, it simply doesn’t work. ‘Latency is a real-world issue that we face daily,’ observes Metropolis Brand Director, Gavin Newman.
Part of the project’s research, then, was to establish this limit and design a network that can operate within it. Simple in principle, this uncovered a couple of surprising results. Video latency was also tested to determine whether the performers could take visual cues from each other remotely.
In an earlier trial, the delay on the musicians’ monitoring was progressively increased to determine the failure point. ‘We tested the musicians with different increments of latency, by adding latency each time they played the same song,’ reports Sonosphere Commercial Director, Jamie Gosney. ‘We got it up to 40ms before we broke them and they couldn’t play together anymore. We then dropped that back to 25ms, and they were able to play together like they were in the same room. For the showcase, we managed to get the latency down to an 8ms round trip.’
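Those figures can be put into perspective with a simple physics estimate: over the roughly 60 miles between Brighton and London, the speed of light in fibre accounts for only a small fraction of the 8ms round trip, with the remainder consumed by conversion, processing and buffering. The sketch below assumes a direct fibre run and a typical fibre refractive index of 1.47 – both illustrative assumptions, not project measurements.

```python
# Back-of-envelope latency budget for the Brighton-London link.
# The 8 ms round-trip figure comes from the article; the fibre distance
# and refractive index below are illustrative assumptions.

C = 299_792_458            # speed of light in vacuum, m/s
FIBRE_INDEX = 1.47         # typical optical fibre refractive index (assumed)
DISTANCE_M = 60 * 1609.34  # ~60 miles Brighton to London, in metres

def fibre_round_trip_ms(distance_m: float) -> float:
    """Round-trip propagation delay over fibre, in milliseconds."""
    one_way_s = distance_m / (C / FIBRE_INDEX)
    return 2 * one_way_s * 1000

propagation = fibre_round_trip_ms(DISTANCE_M)
remaining = 8.0 - propagation  # budget left for codecs, mixing and buffers
print(f"Propagation alone: {propagation:.2f} ms of the 8 ms round trip")
print(f"Remaining budget:  {remaining:.2f} ms for processing and buffering")
```

On these assumptions, raw propagation is under 1ms of the round trip, which underlines that the engineering battle was fought in the network equipment and audio processing chain rather than in the distance itself.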
Delays were least tolerable when monitoring in mono, better tolerated in stereo, and significantly better still with Klang’s immersive implementation. Monitoring levels could also be run lower with the immersive set-up, as perceived loudness was higher.
Alongside immersive monitoring, the performers were also able to use lightweight AR glasses (sourced in China specifically for the project) to bring distant players into their space to aid visual communication when performing.
Described by Gavin Newman as ‘monumental and epic’, this impressive demonstration of what has been achieved to date is not the end of the 5G Festival consortium’s aims. Anticipating a further two years’ development, there is still much to investigate and achieve. As noted by Audiotonix’s Group Chief Technology Officer, Neil Hooper, in his presentation, ‘5G is still evolving and is going to change what we do’.
The ‘future’ agenda necessarily involves further testing and refinement, including improving automated audio configuration systems. For use in other settings, the 5G Festival set-up would have to be reconfigured to meet their specific components and requirements. Making this quick and straightforward would widen its likely application base and reduce the cost of deployment. Among other developments, it is anticipated that synchronisation can be brought under the 5G umbrella, eliminating the need for local GPS clocks – a need that adds complication and expense, and could exclude smaller venues from being able to bring the new technologies onboard.
These and other advances will be necessary before a commercial solution can be achieved. On its own, the footprint occupied by the equipment deployed at the Brighton Dome would likely exclude commercial viability. But with a further 18 months to two years’ work, the 5G Festival team are confident that these and other hurdles can be cleared.
The prospect of a commercially effective system opens many doors and raises many questions. For live concerts, it would make possible both remote collaborations and the ability to present gigs across multiple venues – offering the possibility of shorter tours being available to larger audiences, and in venues that do not reduce the band to insignificant, distant figures, and free from the compromises of stadium sound. In prospect, each concert potentially has global reach.
Rehearsals too could be conducted remotely, rather than assembling all of the musicians in a single space. Both of these applications would also reduce the carbon footprint of touring. For music creation too, partnerships between distant musicians would come without the complication, expense and carbon penalty of travel.
‘We all know that there is a commercially viable product here,’ states Mativision’s David Jacklin. And work has begun to identify what that might be…
‘The way we approached the commercialisation options was to first segment the main three markets for this technology,’ says Warner Music Group’s Tiago Correia. ‘At its core, we believe there are three monetisable audience groups – the consumer audience, musicians and songwriters, and venues and promoters.’
With several target groups already surveyed as part of the consortium’s thinking, the audience at the 5G Festival was also invited to offer its opinions through a questionnaire presented after the event. Topics included the value added by a ‘content rich’ concert experience brought about by the technology, and readiness to pay for it in various settings.
‘The 5G Festival has shown us the tangible impact of technology on our experience of live music, both how we could make it and consume it remotely,’ observes Digital Catapult CEO, Jeremy Silver. ‘This project has been about elevating and enhancing live music for all involved. None of us wants to replace the authenticity of the real life experience, but if we've learned one thing in the past two years, it’s that remote can be meaningful too. 5G has enabled real-time remote jamming between musicians for the first time. Combined with augmented reality, this opens up super exciting potential for reducing our carbon footprint and making global music collaborations work.’
‘We love to challenge conventional thinking,’ asserts Jamie Gosney. ‘We believe that now is the time that stereo gives way to immersive audio, as mono did to stereo. Now everybody can have the best seat in the house.’