The Future of Virtual Production is Here

How Extended Reality Group helped bring real-time 3D sets to life with Las Vegas superstars.

Virtual production studio Extended Reality Group (ERG) is known for creating high-tech original experiences and presentations for the likes of Ford and General Motors. Yet nothing compared to the scale of its latest project: the Resorts World Las Vegas “Stay Fabulous” campaign.

Working together with Psyop and Hooray Agency, ERG used Cinema 4D, Unreal Engine and Houdini to help create a short commercial to kick off the “Stay Fabulous” campaign, featuring Celine Dion, Carrie Underwood, Katy Perry, Luke Bryan, Tiësto, and Zedd.

Made with the same type of cutting-edge technology behind The Mandalorian, the spot is the first commercial shot entirely with in-camera virtual production on an LED volume.

Wanting to know more about the development of the 3D sets in the volume, we talked with ERG’s founders and Executive Producers Zack Kingdon and Evan Glantz, as well as Senior Technical Artist Patrick Beery.

Glantz: When we were first approached about this project, we were asked to take delivery of 3D-modeled assets and optimize their geometry and textures for ingest and operation in the Unreal Engine workflow, and for display on an LED volume stage.

We were also tasked with all scene construction, lighting, and final touches, working directly with Creative Director Marco Spier. As the project progressed and the creative scope changed to incorporate additional scenes, our contribution grew to include the creative services needed to design and build the new environments.

Kingdon: No, we have used volumes in the past; recently we’ve done several shoots for Ford Motor Company, General Motors, and a handful of others. But this was definitely a very high-profile project.

Glantz: The unique thing about this project was that we didn’t use an LED floor, and we had practical, physical set pieces. A lot of the other work we’ve done using volumes included some form of LED floor. Similar to a green screen stage, an LED floor combined with a volume allows the virtual world to fully encompass the talent, props, and camera.

For most shoots with an LED floor, the value-add is that the ground and other elements of the environment will all match because they are generated by the same software. A well-known example is Katy Perry’s American Idol performance, where the LED floor was used in tandem with the rest of the LED volume to transport her to different environments that animated and changed throughout the song, all captured in a continuous camera shot.

In the case of this shoot, we had an amazing live-action set design team, who created a more grounded space for the talent to live in and interact with. This was the first time we did something at that scale, blending virtual set pieces with the real set pieces.

Beery: They were so excited to be able to walk in and see the environment. Zedd said it helped him get into the mood. But there were some challenges. For example, Carrie Underwood had some trouble walking on the piano steps in her scene because the set pieces were so jagged.

Beery: Psyop had a team that handled a lot of the base and preliminary modeling for things like the pinball scene. Once we got that, I would take it into Cinema 4D, retexture things, add more detail to work better with lighting, and add supplemental content or objects we thought they might need. We made some of the particle sims, like the Stay Fabulous logo, with C4D and X-Particles, then imported them as vertex animations.
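Converting a baked simulation into a vertex animation means storing per-frame vertex positions in a texture that a real-time shader can play back. The sketch below is an illustration of that general technique in Python/NumPy, not ERG’s actual export tooling; the function names and the normalization scheme are assumptions for the example.

```python
import numpy as np

def bake_vertex_animation(frames):
    """Bake per-frame vertex positions into a 'texture' array:
    one row per frame, one column per vertex, RGB = XYZ.

    `frames` is a list of (num_verts, 3) float arrays, one per
    simulation frame. Real pipelines normalize positions into a
    bounding box so they fit a limited-precision texture; we return
    the bounds so positions can be decoded again.
    """
    data = np.stack(frames)                 # (num_frames, num_verts, 3)
    lo = data.min(axis=(0, 1))              # per-axis lower bound
    hi = data.max(axis=(0, 1))              # per-axis upper bound
    span = np.where(hi > lo, hi - lo, 1.0)  # avoid divide-by-zero
    texture = (data - lo) / span            # normalized to [0, 1]
    return texture, lo, hi

def decode_frame(texture, lo, hi, frame):
    """Recover world-space vertex positions for one frame,
    as a playback shader would."""
    span = np.where(hi > lo, hi - lo, 1.0)
    return texture[frame] * span + lo
```

In a real pipeline the normalized array would be written out as an EXR or PNG and sampled per-vertex in the engine’s material, so the animation plays back on the GPU without any rig or simulation running at shoot time.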

There’s a balance working with these real-time scenes. You may be running 120 frames per second on your computer, but once you plug into all of the different systems and tracking, you may lose half your frame rate. So your scene has to be optimized and running as smoothly as possible.
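The budgeting behind that rule of thumb can be sketched in a few lines. The halving factor and the 24 fps capture target below are illustrative assumptions, not fixed numbers from ERG’s pipeline:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a target rate."""
    return 1000.0 / fps

def effective_fps(standalone_fps, overhead_factor=0.5):
    """Rough rule of thumb from the interview: plugging a scene into
    tracking and display systems can halve its standalone rate."""
    return standalone_fps * overhead_factor

def holds_capture_rate(standalone_fps, target_fps=24.0):
    """Will the scene still keep up with the camera on the volume?
    24 fps is an assumed cinema capture rate for illustration."""
    return effective_fps(standalone_fps) >= target_fps
```

So a scene that runs at 120 fps standalone (about an 8.3 ms frame) still clears a 24 fps capture rate after the assumed overhead, while a scene barely holding 40 fps on the workstation would not.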

Glantz: When it came to the set design companies replicating the things that we created virtually, it was a mixture of sharing models, assets, and screenshots of the pieces they were going to be replicating. There was definitely a decent amount of tech-viz.

Beery: We had some awesome concept sketches from Psyop that really informed the look and gave us great references to perfectly match.

Kingdon: The team that did the practical work was incredibly impressive with how quickly they were able to fabricate the practical set pieces.

Beery: We didn’t really know what the set pieces were going to look like at first, but once we got on site we were able to adjust our virtual set. For example, the piano keys in Carrie Underwood’s scene were a lot more jagged and roughed up than we had expected.

I remodeled them in Cinema 4D very quickly so we could put it all in the scene accurately. There are a few things like that, where the practical doesn’t match one hundred percent, so we just had to go in and quickly make some adjustments, redo the UVs, things like that, so we could import it into Unreal Engine.

Beery: The director is not going to sit there waiting for three hours while you make adjustments. So, honestly, we get things done as quickly as we possibly can. If anything's taking longer than an hour or two, it’s probably something we need to address and find a different solution for. We did get two days of prep, but we had a lot of scenes to go through. And that time is really for the director and director of photography.

Beery: The underwater scene with the boat was a challenge to figure out. We pioneered a new technique that we’ve been calling the LED “Band-Aid.” Essentially, we had to figure out where the camera was going to be placed so we could hide the seam between the volume and the boat.

Glantz: We came up with the concept when we were doing some testing. We put a DJ booth up for some music live streams during the pandemic, and we added a second LED wall as another layer. So you have your main LED wall in the back working as the volume, and then another smaller LED wall in front of the booth. Both screens are tracking and displaying the content and, from the right angle, the camera sees it as one seamless image.

We utilized that technique for the boat scene by having a second screen in front of the boat set piece. That way, we could display the water underneath the boat in a way that would match up with the main volume LED wall behind the boat. We’ve never seen that used in similar productions, and NantStudios and Lux Machina, who are at the forefront of pushing this technology, were really impressed by how well it worked.
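The reason the trick works “from the right angle” comes down to similar triangles: a feature on the far wall must be drawn smaller on the near “Band-Aid” screen so it subtends the same angle from the camera. The sketch below illustrates that geometry for a fixed camera; it is not ERG’s calibration code, and in a tracked setup the engine reprojects the content every frame as the camera moves.

```python
def bandaid_scale(camera_to_near, camera_to_far):
    """Scale factor for content on a near screen so it lines up with
    the far volume wall, as seen from the camera.

    By similar triangles, a feature of size S on the far wall must be
    drawn at S * (d_near / d_far) on the near screen to subtend the
    same angle at the camera. Distances are in the same unit (e.g. m).
    """
    if camera_to_near <= 0 or camera_to_far <= 0:
        raise ValueError("distances must be positive")
    return camera_to_near / camera_to_far

# Illustrative numbers: camera 10 m from the main wall, Band-Aid
# screen 4 m away, so near-screen content is drawn at 0.4x.
scale = bandaid_scale(4.0, 10.0)
```

From any other viewpoint the scale no longer matches, which is why both screens have to be driven by the same tracked scene for a moving camera.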

Beery: The great thing about Unreal is that it’s agnostic. I used a lot of Cinema 4D and Houdini, and someone else on site was using Maya and we were all able to use Unreal with no problems. On set there were really two of us making all the changes to the scenes, me and Bryan Brown from NantStudios. He’s worked in graphics and game design for years, and it was nice to get to work with him.

C4D is our main tool, and it’s so easy to work with because you don’t have to set up so much pre-stuff. You don’t always have the time to do R&D in Houdini. But with Cinema 4D, you can just get going with deformers, falloffs and things like that. It lets us accomplish the things we need to do quickly.

Beery: It’s definitely a bit of back and forth. They bring practical lighting into the volume, and that contributes more to the environment and reflections. It’s pretty easy for us to go in and change a color if we need to match a practical. We plan out our light system knowing that it will change on set. We’re not baking in lights; we keep them dynamic, so we can go in with multi-user editing in Unreal to change them. We don’t have to take the whole scene in the volume down just to change a light.

Glantz: For this particular pipeline we had two lighting directors on site, which really helped. We had someone on site helping control things in the post-process volume so they could listen to the director of photography and help make those changes.

Beery: The editor was editing while we were there shooting, so it’s all a real-time process. I think we had an edit out the day after we wrapped. That’s the beauty of this type of production: a bit of color grading is pretty much the entire post process. Everything else is straight out of camera.


Author

Michael Maher, Filmmaker/Author – Denton, Texas