GDC 2024: Virtual production and in-camera VFX the Lux Machina way

In 2018 virtual production had been in use for a while in the video games world for virtual scouting and lining up shots for cut scenes, but The Mandalorian took it to a new level. The show’s production team pioneered the combination of huge LED screens with camera tracking to produce in-camera VFX, changing the way many films and TV shows would be made. The studio behind this system was Lux Machina, a Vicon customer that’s one of the leaders in the field of motion capture for the entertainment industry.

At GDC 2024, India Vadher-Lowe, Lead Motion Capture Specialist for Lux Machina, which has also worked on blockbuster projects including House of the Dragon and Barbie, explained to an audience at Vicon’s booth what the studio does and how it does it.

“We first developed our pipeline for The Mandalorian, a full production that was built on reflections in armor,” said Vadher-Lowe. Lighting is one of the key reasons for using an LED volume rather than a traditional green screen. By wrapping stages in huge, high-definition LED walls, studios can not only give actors something to react to, but also make sure their footage is correctly lit by the virtual environments on the screens. On productions such as The Mandalorian, with its highly reflective armor, or subsequent jobs such as a Chemical Brothers music video and Apple’s Masters of the Air series, which feature a reflective Airstream trailer and airplane fuselages respectively, getting those lighting effects right is critical.

To make in-camera VFX (ICVFX) work, having rock solid data on the camera’s position is essential.

“The primary focus is to define the exact positional data of the camera,” Vadher-Lowe explained. “You want the position and orientation of the camera within 3D space in relation to that virtual environment. That way, you can seamlessly integrate your real-world footage with your virtual elements. But we’re not just capturing and tracking what the camera is doing. We want to know what the camera is seeing. And that is the camera frustum.

“The frustum is the virtual viewpoint of the camera. Essentially, it’s a truncated cone shape that refers to the area in 3D space that is visible to the camera. And that’s affected by certain parameters, whether that’s the camera’s field of view, the aspect ratio, near and far clipping planes, or the position and orientation of the camera in space.”
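
For readers who want to ground that description, here is a minimal sketch, not Lux Machina’s or Vicon’s code, of how those parameters (field of view, aspect ratio, near and far clipping planes, and the camera’s pose) define the frustum’s extents. The function name and conventions are purely illustrative.

```python
import numpy as np

def frustum_corners(fov_y_deg, aspect, near, far, cam_to_world):
    """Return the eight world-space corners of a camera's view frustum.

    fov_y_deg    -- vertical field of view in degrees
    aspect       -- image width divided by height
    near, far    -- clipping-plane distances along the view axis
    cam_to_world -- 4x4 matrix holding the camera's position and orientation
    """
    corners = []
    for dist in (near, far):
        half_h = dist * np.tan(np.radians(fov_y_deg) / 2.0)  # half-height of the view at this depth
        half_w = half_h * aspect                             # half-width at this depth
        for sx in (-1, 1):
            for sy in (-1, 1):
                corners.append([sx * half_w, sy * half_h, -dist, 1.0])  # camera looks down -Z here
    corners = np.array(corners).T             # 4x8 homogeneous points in camera space
    return (cam_to_world @ corners)[:3].T     # 8x3 points in world space

# Example: a 40-degree lens on a 16:9 sensor, camera at the origin
print(frustum_corners(40.0, 16 / 9, 0.1, 50.0, np.eye(4)))
```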

As Vadher-Lowe put it, it’s not only about the actors’ performance—it’s also about rendering performance.

“If you have a very weighty Unreal scene, you need a considerable amount of processing power to be able to render it out at a higher resolution. By tracking the camera, you’re able to do what we call frustum culling and render out the viewpoint of the camera. You’re essentially rendering only what the camera can see in real time, because anything outside of that is not contributing to the final image.”
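
Frustum culling itself comes down to a visibility test per object. As an illustrative sketch only (Unreal Engine handles this internally, and the plane representation here is an assumption), an object’s bounding sphere is kept for rendering unless it lies entirely outside one of the frustum planes:

```python
import numpy as np

def sphere_in_frustum(center, radius, planes):
    """Conservative visibility test: keep an object unless its bounding
    sphere lies entirely outside one of the frustum planes.

    planes -- list of (normal, offset) pairs describing each plane as
              normal . x + offset = 0, with normals pointing into the frustum
    """
    for normal, offset in planes:
        if np.dot(normal, center) + offset < -radius:
            return False   # fully behind this plane: cull, don't render
    return True            # potentially visible: send it to the renderer

# Hypothetical single-plane example: a near plane at z = -0.1 for a camera looking down -Z
near_plane = (np.array([0.0, 0.0, -1.0]), -0.1)
print(sphere_in_frustum(np.array([0.0, 0.0, -5.0]), 1.0, [near_plane]))  # True: inside the near plane
```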

To facilitate camera tracking, Lux Machina has developed its own active tracking crown. “We’ve designed it to counteract the difficulties that you have working with motion capture and with traditional filming techniques. Your key rule when working with motion capture is that you want to minimize occlusion. Because the cameras are working by line-of-sight, they need to be able to see the object to capture data. 

“You also want an environment that has minimal reflections, because any reflections can cause added noise. When you’re on a film set, everything is reflective and everything causes occlusion, so we’ve built a tracking crown which counteracts that, because we want to be visible to as many cameras as possible in that area.

“So, how do we attach it? We calibrate the nodal point of the lens in relation to where that tracking crown is. It’s really important that we have an accurate calibration of that nodal point, because then you’re able to get the correct parallax between your foreground and your background.”
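
In practice, that nodal-point calibration amounts to solving for a fixed rigid transform between the tracked crown and the lens’s entrance pupil, then chaining that offset onto every tracked frame. The sketch below is hypothetical, the numbers and names are not from Lux Machina’s pipeline; it only shows the shape of that per-frame composition.

```python
import numpy as np

# Hypothetical calibration result: a fixed rigid transform from the tracking
# crown's origin to the lens nodal point. In reality this comes out of the
# calibration step, not hand-typed numbers.
CROWN_TO_NODAL = np.eye(4)
CROWN_TO_NODAL[:3, 3] = [0.0, -0.12, 0.25]   # metres: a little below and in front of the crown

def nodal_point_pose(crown_world_pose):
    """Chain the tracked crown pose with the calibrated offset so the render
    camera sits at the lens nodal point, which is what gives the correct
    parallax between foreground and background."""
    return crown_world_pose @ CROWN_TO_NODAL

# Per tracked frame: take the crown's 4x4 world transform from the mocap system
crown_pose = np.eye(4)
crown_pose[:3, 3] = [1.0, 2.0, 0.5]
print(nodal_point_pose(crown_pose)[:3, 3])    # nodal point position in world space
```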

Vicon’s suite of VFX motion capture tools is an essential part of the process.

Getting the right tool for the job

“My job is really to choose the correct tracking system for the job,” said Vadher-Lowe. “There are so many ways that you can track a camera and you really need to take into account the positives and negatives. With Vicon, you’ve got a quick, low-latency mode. You’ve got Object Tracker, which is really quickly processing grayscale data and throwing it down a low-latency port, and that ensures that we are able to get the data from Shōgun into Unreal as quickly as possible.

“We’ve also got flexibility. I spoke earlier about lens calibration—you have the ability to make changes to that, you can de-rig and re-rig the camera. You also have accuracy. With Vicon you have submillimeter accuracy, which equates to sub-pixel accuracy with LEDs.”
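
That sub-millimeter-to-sub-pixel claim is easy to sanity-check with arithmetic. Assuming, purely for illustration, a 2.6 mm pixel pitch for the LED wall (the pitch is not a figure from the talk), a half-millimeter tracking error maps to only a fraction of one LED pixel:

```python
# Assumed values for illustration only; the pixel pitch is not from the talk.
pixel_pitch_mm = 2.6       # centre-to-centre spacing of LEDs on the wall
tracking_error_mm = 0.5    # a "sub-millimeter" positional error

print(f"{tracking_error_mm / pixel_pitch_mm:.2f} px")  # ~0.19 of an LED pixel
```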

The calibration process is also key to Lux Machina’s workflow.

“We use the Shōgun video camera calibration, and we calibrate it as if we’re calibrating motion capture cameras,” said Vadher-Lowe. “You’re collecting wand samples, and as you collect them it’s able to determine the radial distortion, your principal point, the entrance pupil and the skew of the camera, as well as its position and orientation. It’s a quick process that can take anywhere between 10 and 20 minutes per lens. When you’re working on a traditional film set where you may have a heavy lens package, knowing that you have the ease of a quick lens calibration means that you can get through that very quickly. You’re not holding a film set up.

“We also build our own servers, which are heavily geared towards optimizing real-time tracking, especially in Unreal Engine.

“These are some of the great ways that you can really maximize your camera tracking.”
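
Shōgun’s video camera calibration is proprietary, but the quantities Vadher-Lowe lists above (radial distortion, principal point, position and orientation) are the standard outputs of a lens solve. As a rough analogue only, and not the workflow described in the talk, the same family of parameters falls out of OpenCV’s calibrateCamera when it is fed views of a synthetic checkerboard instead of wand samples:

```python
import cv2
import numpy as np

# Synthetic 9x6 checkerboard observed from ten known poses, standing in for
# the wand samples Shogun collects; the point is only what the solve returns.
pattern = np.zeros((9 * 6, 3), np.float32)
pattern[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2)

true_K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
object_points, image_points = [], []
for i in range(10):
    rvec = np.array([0.1 * i, -0.05 * i, 0.02 * i])   # vary the view so the solve is well posed
    tvec = np.array([0.0, 0.0, 20.0 + i])
    projected, _ = cv2.projectPoints(pattern, rvec, tvec, true_K, None)
    object_points.append(pattern)
    image_points.append(projected.astype(np.float32))

rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, (1920, 1080), None, None)

# camera_matrix: focal lengths and principal point; dist_coeffs: radial (and
# tangential) distortion terms; rvecs/tvecs: per-view orientation and position.
print(np.round(camera_matrix, 1))
```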

For more on the work of Lux Machina, see our recent spotlight in The Standard. To learn more about Virtual Production, click here.

Watch the full library of GDC presentations from the Vicon booth.
