AOIN Produces 'Extended Reality' Scenes for 'America's Got Talent'

Custom-modeled vending machine for performer Noah Epps, alongside Dekogon’s Arcade Machine UE4 Marketplace assets. UE4 mannequin for scale. (Image credit: AOIN)

SAN FRANCISCO—I was fortunate enough to spend the better part of the pandemic working with some of the most talented and forward-thinking creatives in live television production on season 15 of “America’s Got Talent” on NBC. My company, All of it Now (AOIN), was hired by Fremantle to work with the show’s screens producer, Scott Chmielewski of DMDS7UDIOS, to create and build live XR (Extended Reality) scenes for some of the season’s top contenders.

There were a lot of vendors involved in bringing this very ambitious show to life. DMDS7UDIOS ran everything live on set, while my team at AOIN, together with creative teams at Gravity in Germany and XITELabs in Los Angeles, produced XR content for the season’s signature performances. This was no small task, as each of the season’s seven episodes featured two XR performances. For many of these performances we managed the initial integration and technical handoff of assets to the DMD team, who drove the onsite programming, calibration, integration and real-time rendering out to a state-of-the-art 60-foot-wide curved LED stage.

What made the creation of this much digital content manageable was having multiple design teams on board, working in parallel in Unreal Engine. With source control, we were able to create branches for each designer to work in independently and then merge their respective changes with the larger team, whether the assets or scenes were coming from Gravity, XITE or AOIN. Source control gave us a clear path to produce this quantity and quality of content in the timeframe we needed to hit repeated weekly deadlines.

DMX blueprint for virtual lighting fixtures, programmable by on-set crew (Image credit: AOIN)

Blueprint for DMX-programmable lit spheres used in episode 1522 (Image credit: AOIN)

Each week we would receive creative direction for each XR performance in the form of mood boards and concepts from creative producer Brian Burke and Harriet Cuddeford of the talented Syco Entertainment team. We then had three days to submit our first iterations, and the remainder of the week to refine our scenes for Friday ingest, and on to the live performances just one week later.

These were ambitious visuals. Rather than a few months for each scene, we had five days, and we were constantly figuring out how to reverse engineer the initial creative concepts to make the most of assets we could pull from the Unreal Marketplace and Quixel Megascans, synthesizing them into environments that would look great on television. The Megascans in particular played beautifully on screen; the forest and foliage tools gave us a level of realism in performance visuals that would have otherwise been impossible. One of my favorite performances of the season, Noah Epps in week four, used the Dekogon arcade machines from the Marketplace.


Creating a fully immersive world so that the camera can explore all areas of the scene. (Image credit: AOIN)

The hardest part about this show is that it’s live and performance-based, with constant on-the-fly changes. In some cases we had to adapt scenes that were meant to be shot in XR, or a performer would be moved into a different CG scene. The teams were able to make both minor and major adjustments to scenes in a short amount of time, and sometimes even repurpose scenes from XR into rendered videos for non-XR performances. The tools in Unreal Engine gave us the flexibility to render scenes dynamically and choose between outputting real-time or rendered content.

The sheer volume of content required for this production made it practical to have dispersed creative talent. Our team in San Francisco could be working on butterflies and XITE’s team on modeling, then handing off to Gravity’s team in Germany to work on trees and foliage, all in the same world for a given performance. The collaboration extended to the show’s creative directors, who sat in on daily review sessions where we would share Unreal Engine scenes over Zoom and iterate in real time without any of the creative direction getting lost in translation.

Forest environment for Daneliya’s performance. The LED volume is represented by the semi-transparent grid, which is useful for checking what will be rendered inside the LED volume and what will be rendered as set extension. (Image credit: AOIN)

On set, the DMDS7UDIOS team worked with director Russ Norman to capture performances using four XR cameras and Stype RedSpy Fiber systems for camera tracking. Florian built virtual cameras and other helpful assets into an Unreal template so that he could give us an accurate camera plot, which allowed us to block everything in advance and place assets in scenes more thoughtfully. This tool was very useful and avoided a lot of potential issues by the time it came to live rehearsals. The XR performances were shot on a 60-foot-wide curved LED stage, which, along with a highly reflective floor, posed certain creative challenges. We had to take both the real-world reflections and the virtual-world reflections into consideration on shoot day, and Scott was able to use live DMX data to control hundreds of virtual DMX fixtures and effectively manipulate lighting in the virtual environments.
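To make the idea of driving virtual fixtures from console data a little more concrete, here is a minimal conceptual sketch in Python. It is not the show’s actual Unreal/Blueprint setup; the four-channel fixture profile (dimmer plus RGB), the patch address and the fixture name are all hypothetical. It only illustrates the core mechanic: a lighting console sends 8-bit DMX channel values, and each virtual fixture reads its patched channels and converts them into parameters a renderer could consume.

```python
# Illustrative sketch only -- not AOIN's or DMDS7UDIOS's pipeline. It shows the
# basic idea of live DMX control of virtual lights: read a fixture's patched
# channels from a 512-channel universe and normalize them into light settings.
from dataclasses import dataclass

DMX_MAX = 255  # DMX512 channels carry 8-bit values (0-255)

@dataclass
class VirtualFixture:
    name: str
    start_channel: int  # 1-based patch address; assumes dimmer, R, G, B in order

    def apply(self, universe):
        """Convert the fixture's four raw DMX bytes into normalized parameters."""
        dimmer, r, g, b = universe[self.start_channel - 1 : self.start_channel + 3]
        return {
            "fixture": self.name,
            "intensity": dimmer / DMX_MAX,
            "color": (r / DMX_MAX, g / DMX_MAX, b / DMX_MAX),
        }

# One universe of 512 channels, as it might arrive from the console each frame.
universe = [0] * 512
universe[0:4] = [255, 255, 128, 0]  # hypothetical fixture at channel 1: full intensity, warm amber

for fixture in [VirtualFixture("virtual_key_light", start_channel=1)]:
    print(fixture.apply(universe))
```

In practice this kind of mapping runs every frame for hundreds of fixtures, which is what lets a lighting programmer treat virtual lights much like any other patched fixture on the desk.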

Over 9,500 individual DMX sphere lights for real-time playback and control. Spheres were grouped into 9 DMX-programmable sections for onsite programming and control. (Image credit: AOIN)


Balancing the distribution of lit, unlit and black spheres was a manual process, but it created incredible results. (Image credit: AOIN)
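As a rough illustration of how that grouping might work (the show’s actual sectioning logic isn’t described here, and the round-robin assignment and state weights below are purely hypothetical), thousands of individual sphere lights can be collapsed into a handful of controllable groups, so that a single DMX value per section drives the whole field:

```python
# Rough illustration only -- not AOIN's actual setup. It assigns ~9,500 sphere
# lights to 9 sections so an on-set programmer can drive thousands of lights
# with only a handful of DMX channels. Assignment here is simple round-robin;
# a real scene might group by position or artistic intent instead.
import random

NUM_SPHERES = 9500
NUM_SECTIONS = 9

# Every sphere in a section listens to the same control value, so one
# channel per section is enough to light or black out the whole group.
sections = {s: [] for s in range(NUM_SECTIONS)}
for sphere_id in range(NUM_SPHERES):
    sections[sphere_id % NUM_SECTIONS].append(sphere_id)

# "Balancing lit, unlit and black spheres" could then become a per-section
# choice; the weights below are hypothetical and would be tuned by eye.
random.seed(7)
states = ["lit", "unlit", "black"]
section_state = {s: random.choices(states, weights=[5, 3, 2])[0] for s in sections}

for s, members in sections.items():
    print(f"section {s}: {len(members)} spheres, state={section_state[s]}")
```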

The sheer scale of each of these XR performances, and the ability to create these fantastic worlds, makes this season stand out. The “AGT” team wanted to try this out a few years ago, but the technology wasn’t quite there yet to make it a practical option. Two years later we came back, and amid the pandemic’s social distancing requirements, performances were spread across multiple shoot timelines, which made the production schedule more conducive to working in XR. It was a thrill and an honor to get to work under the brave creative supervision of Brian Burke and Harriet Cuddeford, who came up with incredible concepts to build into XR performances.

This was our first full-season episodic series done entirely in Unreal Engine for the XR performances, and we would have struggled to complete the XR portions of this show in any other way. We were heavily dependent on the ability to support multiple designers and streamline their creative flow with source control, along with having access to premium pre-built assets in the Marketplace. We also needed the ability to roll back when a producer said, “we liked what you did yesterday, can you revert to that version?” or when a performer wanted to go in a different direction creatively on the day of live rehearsals.

Working in Unreal gave us the flexibility to avoid being locked into too many creative variables while delivering scenes that transport the viewer in a whole new way. Producing live XR content in a real-time game engine feels like the future of performance-based television programming.

Danny Firpo is the co-founder of All of It Now Productions.