Gorillaz: Song Machine Live

 

use case // 2020

Overview

 

Hosted by LiveNow, the band played to fans in Europe, Asia, and the US with three live performances over one weekend in 2020. The production was the group’s most technically ambitious to date and required a remote software solution to bring the creative vision to fans across the three continents. Creative Directors at Block9 trusted Lewis Kyle White of Pixels and Noise and Technical Lead Scott Millar to raise the bar on what audiences could expect from a live-streamed virtual concert experience.

Challenge 

 

The band’s crew were locked in a Covid-19 bubble, with much of the production’s technical team based elsewhere. The performance featured a 14-piece band and a handful of live guests in the bubble, including Peter Hook, Kano, Slowthai, and Robert Smith. Artists such as Beck and Elton John performed as holograms that were created using AR techniques rendered in Notch.

 

Working with nine cameras, six of which would carry AR elements, the show’s content creators were dotted around the world, leaving White and Millar to build a virtual stage in Notch using the 3D data they had to hand.

 

The duo had to find a way to tell the show’s story effectively, with virtual and human band members interacting seamlessly in real time. The delivery of blended animation, AR and live performance was a big step up in the band’s production values at that time.

Key Stats

 

  • Three live performances over one weekend

  • Six AR cameras delivering virtual content, including Elton John’s holographic performance

  • Band and crew working remotely from around the world due to Covid-19

Client Quotes

“SP was the crucial backbone of our entire content system on this globally watched production. Since the success of Song Machine Live, I’ve used it many times, and will continue to do so in my future projects.”

 

– Lewis Kyle White, Pixels and Noise

Solution

 

The team constructed a working virtual environment built around the band members. Having heard good things from industry peers about Stage Precision’s software tools – which were in early beta in 2020 – White and Millar decided to load the whole show into SP.

 

SP was initially engaged as a pre-programming tool; as the show developed, it was also used for camera calibration, asset management, and data collection and distribution. The cameras were calibrated in world space.

 

The process began by adding the camera information to the virtual stage. White then built the rigs, creating a 3D camera jib that would move in unison with the jib in the physical world. The rig was loaded into 3D packages, including Cinema 4D, and the same process was repeated offline in the traditional way, using multiple still frames rendered from different angles. From that point the information could be previewed, essentially generating a system to pre-programme the entire show in SP.
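
To make the idea of a virtual jib mirroring its physical counterpart concrete, here is a minimal, purely hypothetical sketch in Python. The actual system was built in SP, Notch and Cinema 4D; the axis names, geometry and numbers below are illustrative assumptions rather than anything taken from the production.

import math

def jib_to_camera_pose(pan_deg, tilt_deg, arm_length_m, base_height_m):
    """Map physical jib axis readings to a virtual camera pose.

    Hypothetical illustration only: the real rigs were modelled in
    SP / Cinema 4D, and the axis names and geometry here are assumed.
    """
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)

    # Simple forward kinematics: the camera sits at the end of a jib arm
    # that pans around its base and tilts up or down.
    x = arm_length_m * math.cos(tilt) * math.cos(pan)
    y = arm_length_m * math.cos(tilt) * math.sin(pan)
    z = base_height_m + arm_length_m * math.sin(tilt)

    # The virtual camera copies the physical head's pan/tilt, so AR
    # elements stay locked to the live image during a move.
    return {"position": (x, y, z), "pan_deg": pan_deg, "tilt_deg": tilt_deg}

# Preview one frame of a pre-programmed move.
print(jib_to_camera_pose(pan_deg=30.0, tilt_deg=-10.0,
                         arm_length_m=4.0, base_height_m=1.5))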

 

The team utilised robotic cameras built into the rigs. Because the cameras only moved on known, true axes, the 3D objects would change based on their real-life counterparts.

 

Going back and forth between the technical team and the stage, show and set designers, White and Millar were able to develop an AR system using Notch as the render engine. The show ran through disguise media servers, and any 2D content was fed from disguise into 3D space and then back.
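
That round trip – taking flat 2D content into 3D space and back out through a tracked camera – is, at heart, a standard projection and un-projection step. The sketch below is a generic illustration of that idea and not the disguise or Notch pipeline itself; the camera intrinsics, pose and stage-floor plane are assumed values.

import numpy as np

# Assumed pinhole intrinsics for a 1920x1080 camera (illustrative only).
K = np.array([[1500.0,    0.0, 960.0],
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])

def project_to_screen(point_world, cam_pos):
    """Project a 3D world point into 2D pixel coordinates.
    Simplification: the camera has identity rotation (looks along +Z),
    so world -> camera is just a translation."""
    p_cam = np.asarray(point_world) - np.asarray(cam_pos)
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                      # perspective divide

def unproject_to_plane(pixel, cam_pos, plane_z=0.0):
    """Cast a pixel back into the scene and intersect the stage floor
    (the plane z = plane_z), i.e. place flat 2D content into 3D space."""
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    t = (plane_z - cam_pos[2]) / ray[2]          # distance along the ray
    return np.asarray(cam_pos) + t * ray

cam = np.array([0.0, 0.0, -5.0])                 # camera 5 m from the floor plane
screen = project_to_screen((1.0, 0.5, 0.0), cam)      # 3D point -> 2D pixel
print(screen, unproject_to_plane(screen, cam))        # 2D pixel -> back to 3D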

 

During previz in the lead-up to the show, Millar set up a VPN to bring the team into a virtual workspace. While they were all still in lockdown in London, White could run SP over the VPN connection directly into a disguise operator’s house; the operator, in turn, ran the disguise and timeline content and shared the GUI output back over the web for the creative team to programme the show.