Gamers 8
use case // 2022
Overview
Gamers8, one of the world’s biggest eSports and gaming events, was held in the Kingdom of Saudi Arabia (KSA) from July – September 2022, with elite tournaments taking place alongside a series of attractions, concerts and entertainment shows in Riyadh Boulevard City.
“It’s a mixture of competitive eSports – which attracted players from all over the globe – alongside arts, culture, and live music,” said Leon Herche, Head of Creative Production at visual design house bright! studios.
“These are multichannel events, so you always have one main arena where the larger games are happening, then other streams where smaller matches are being played sitewide at the same time.”
Brought to fruition by the team behind Gamers Without Borders, Gamers8 saw some high-profile battles take place over an eight-week period, including Dota 2, Rocket League, Fortnite, PUBG MOBILE and Tom Clancy’s Rainbow Six Siege.
bright! studios was engaged to assist with bringing these tournaments to the next level by creating and running augmented reality (AR) visuals, giving an extra layer of entertainment to remote viewers who weren’t able to experience the atmosphere first-hand.
In addition, bright! was commissioned to build an advanced media control workflow for the tournament festival that could automate the larger aspects of this highly complex project. bright! also had to maintain a variety of on-site LED screens and handle all video playback via a disguise-based media server system.
Challenge
The main technical challenge for bright! was how to implement AR workflows for the Gamers8 main stage and for the English-language commentator studio. This meant managing a complex, six-camera AR system.
“All of the tournaments are covered in both English and Arabic,” said Herche. “So, you have hosts in the studio and others in separate rooms, providing commentary during the matches followed by an analysis directly afterwards.”
The team needed to establish a Stage Precision (SP)-based show automation workflow across all the different tournaments, including API (application programming interface) integration for the games themselves, which could communicate various pieces of in-game information to the technologies and people involved.
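The article does not publish the integration itself, but the pattern Herche describes – reading live match data from a game’s API and turning it into show triggers – can be sketched roughly as follows. The endpoint URL, JSON fields, and OSC addresses below are hypothetical placeholders, not the actual Gamers8 implementation.

```python
# Minimal sketch: poll a (hypothetical) tournament API and forward
# in-game events as OSC cues that a show-control system can consume.
# Requires: requests, python-osc. All addresses and fields are illustrative.
import time
import requests
from pythonosc.udp_client import SimpleUDPClient

MATCH_API = "https://example.com/api/match/123/state"  # hypothetical endpoint
osc = SimpleUDPClient("127.0.0.1", 9000)                # hypothetical show-control listener

last_score = None
while True:
    # e.g. {"score": [1, 0], "leader": "Team A"} -- placeholder structure
    state = requests.get(MATCH_API, timeout=2).json()
    if state.get("score") != last_score:
        last_score = state.get("score")
        # Fire cues so downstream systems (graphics, AR, switcher) can react
        osc.send_message("/game/score_changed", last_score)
        osc.send_message("/game/leader", state.get("leader", ""))
    time.sleep(0.5)
```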
Key Stats
Gamers8 featured a total prize pool of $15 million (USD) across five titles
More than 1,000 activities and attractions were available at the event, such as themed installations, immersive experiences, and live music
It took 35 bright! studios staff and freelancers to help deliver the festival’s technical visual production and AR requirements
Solution
Using SP software as the common ground across all of their various duties, the team built a system that could read data from each of the different games and send trigger commands to the video switchers, AR elements, live-rendered graphics in Notch, and disguise timeline trigger points, as well as handle inputs from standard graphics systems such as Expression or Resolution.
Herche continued: “Key elements such as the disguise media servers and Resolume video playback, procedurally generated live visuals using Notch and Unreal Engine for perspective backgrounds, custom game APIs acting as trigger events, and the AR elements with various tracking system sources, plus automated show cues, meant that we needed to unite these systems in one manageable software application.
“From start to finish, Stage Precision served as the glue that held all of these different technologies together,” he added.
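The routing logic itself isn’t shown in the article, but a trigger fan-out of this kind often boils down to mapping one incoming game event to several outgoing commands. The hosts, ports, and message addresses below are invented for this sketch and are not the systems’ real interfaces.

```python
# Illustrative fan-out: one incoming game event triggers several downstream systems.
# Hosts, ports, and OSC addresses are invented for this sketch.
from pythonosc.udp_client import SimpleUDPClient

targets = {
    "switcher": SimpleUDPClient("10.0.0.10", 8000),  # video switcher controller
    "disguise": SimpleUDPClient("10.0.0.20", 7401),  # disguise timeline triggers
    "notch_ar": SimpleUDPClient("10.0.0.30", 9001),  # Notch/AR graphics host
}

def on_game_event(event: str, payload) -> None:
    """Route a single in-game event to every system that needs to react to it."""
    if event == "round_won":
        targets["switcher"].send_message("/cut/replay_cam", 1)
        targets["disguise"].send_message("/timeline/goto", "round_win")
        targets["notch_ar"].send_message("/ar/celebration", payload)

on_game_event("round_won", ["Team A"])
```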
“We were also distributing data over a web server, creating graphical user interfaces (GUIs) for everybody involved, such as the directors, producers, and technical staff,” said Herche. “It gave everyone an overview of what data was coming out of the game through the API, so they’d know who was playing, who was leading, who was losing, and who would be best to get on stage afterwards for an interview.”
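The GUIs themselves aren’t reproduced in the article, but distributing the parsed game state to directors and producers could be as simple as exposing it from a small web service that browser dashboards poll. The framework choice (Flask) and every field name below are assumptions made purely for illustration.

```python
# Minimal sketch of serving parsed game state to browser dashboards.
# Flask and the field names are illustrative assumptions only.
from flask import Flask, jsonify

app = Flask(__name__)

# In a live setup this would be updated continuously from the game API;
# here it is a static example payload.
match_state = {
    "playing": ["Team A", "Team B"],
    "leader": "Team A",
    "score": [2, 1],
    "suggested_interview": "Team A",
}

@app.route("/state")
def state():
    # Dashboards for directors, producers, and technical staff poll this endpoint
    return jsonify(match_state)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```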
In-person audiences could see the stage animations and effects on the LED screens, but the broadcast studio itself was not visible to the on-site audience, as it was built purely for the broadcast.
“Looking back, I think the amount of API integration and the logical dependencies we handled, while controlling all of the other systems, was pretty ambitious,” said Herche. “Luckily, SP is made for this kind of thing, and the new software we trialled was incredibly useful as well when it came to AR compositing.”
The team were so confident in Stage Precision that they used an alpha version of its new, AR-focussed real-time playback tool, Space Composer, which made it possible to layer AR or real-time rendered graphics over live camera images with a simpler workflow than most standard applications have offered to date.
“It’s the missing link between Shield, SP, and other real-time engines, and it achieves that by combining them seamlessly,” said Herche. “With Space Composer, you’re able to play back Unreal and Notch at the same time, while layering the content. As the software was still in development, we of course built backup scenarios in case we encountered any problems, but they weren’t needed. Although, it was reassuring to know that we could always switch to the old method of using Shield inside of Unreal at any time.”
Client quotes
“Stage Precision enables users to fulfil the whole scope of a complex project in a brilliantly simple way. Without SP, these kinds of jobs would be more complicated, more time consuming, and more costly, because the longer a project takes, the bigger the budget has to be. The general expectation from the client side on Gamers8 was pretty high, especially as they were aiming to deliver one of the largest eSports festivals ever held. From our POV, to have such confidence in the software was truly invaluable.”
– Leon Herche, Head of Creative Production at bright! studios