GDC 2024: The magic of Vicon’s real-time retargeting

Bring Characters to Life with Real-time Retargeting

One thing that the Game Developers Conference drives home every single year is the sheer (and growing) complexity of making video games.

“Every time a new technology comes out to address a challenge, we find new challenges as a result,” David ‘Ed’ Edwards, VFX Product Manager for Vicon, told an audience at a motion capture showcase. “There’s this constant cyclical pattern of working out how we make life easier for people making games. Even though motion capture is just one of a whole series of technologies involved with game development, it’s a critical one. It’s one that’s central to a lot of different pipelines. So we’re constantly asking ourselves: how do we address that issue? How do we make creating games easier, simpler and faster?”

That question has many answers, but in recent years a lot of them have come back to real-time streaming.

“We introduced this to the industry because we didn’t want people to have to wait for processing to complete to see what their data looks like,” said Ed. “The reason we introduced props was because we didn’t want people to just imagine what it would look like. All of these things are done to reduce reliance on the phrase, ‘just imagine’.

“We never really want people to say ‘just imagine what this character will look like’. Because when you say that, you introduce ambiguity, you introduce interpretation, you introduce a degradation of confidence in what they were working towards. So one thing we are constantly aiming for is removing that need to ‘just imagine’. We want to show people what they’re working on at that very moment.

“One of the ways we do that is retargeting.”

Retargeting is the process of mapping a human performer’s skeleton onto the skeleton of a character that doesn’t share their bodily proportions, so that the performer can drive that character through performance capture (PCAP).

“What this involves is taking each of the performer’s bones and mapping them to the bones of the character we want to drive,” said Ed. “And in some cases, where perhaps extra bones exist [in a tail, for example], we can just switch those off because we don’t want them to be influenced in any way by the performer’s motion.”
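To make the idea concrete, here is a minimal sketch, assuming a simple bone-name lookup and purely hypothetical bone names rather than Vicon’s actual tools or file formats: each performer bone is mapped to a character bone, and any character-only bones (a tail, say) are simply left out of the map so the performer’s motion never drives them.

    # Minimal sketch of a retarget map: performer bone name -> character bone name.
    # All names here are hypothetical and purely illustrative.
    RETARGET_MAP = {
        "Hips":     "Creature_Pelvis",
        "Spine":    "Creature_Spine01",
        "LeftArm":  "Creature_Arm_L",
        "RightArm": "Creature_Arm_R",
        "Head":     "Creature_Head",
    }

    # Character bones with no performer counterpart (a tail, for example) are
    # simply absent from the map, so they receive no influence from the performer.

    def apply_retarget(performer_pose):
        """Map a performer pose {bone: rotation} onto the character's bones."""
        character_pose = {}
        for performer_bone, rotation in performer_pose.items():
            character_bone = RETARGET_MAP.get(performer_bone)
            if character_bone is not None:
                character_pose[character_bone] = rotation
        return character_pose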

The initial work is done in Shōgun Post, but once the retargeting is set up it can be imported into Shōgun Live.

“With a few magic clicks we see our retarget in real-time,” said Ed. “We’re getting highly accurate tracking. We’re getting everything in real-time. We don’t need to reprocess it. We see our character as they are designed, in the here-and-now.”
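As a rough sketch of what that real-time loop looks like conceptually (the stream and rig objects below are hypothetical stand-ins, not Vicon’s Shōgun Live or DataStream API): each solved frame from the capture stream is pushed through the same retarget mapping and handed straight to the character, with no reprocessing step in between.

    def stream_retargeted_frames(frame_source, character_rig):
        """Drive a character rig live from streamed performer frames.

        frame_source and character_rig are hypothetical stand-ins for a motion
        capture stream and a game-engine character rig; this sketch reuses
        apply_retarget() from the example above.
        """
        while frame_source.is_open():
            performer_pose = frame_source.next_frame()   # {bone: rotation}, blocks until data arrives
            character_pose = apply_retarget(performer_pose)
            character_rig.set_pose(character_pose)       # character updates immediately, no reprocessing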

A technical solution to creative problems

While retargeting is a technically impressive feature, its purpose is creative.

“This is really important for directors, who can now see their idea coming to life,” said Ed. “You’re not reliant on storyboards, you can see what it looks like static and what it looks like in motion. And that’s really important because again, we are reducing your reliance on imagination. From a performer’s perspective, your actor can now see the character that they’re driving and they can map their performance to that.

“These are all important things that mean that as a team, you can creatively work together to create scenes that are indicative of the final product. Previously, you had to wait for it to be done by the render farm; you would have to wait for your FBX to be processed.”

This instant visualization means that changes can be made in the moment, not after the fact.

“If there’s something off with the proportions, or we feel that the performance needs to be modified to better suit the characters as they’ve been created, that can be done in the volume,” said Ed. “It’s not something we need to wait on feedback for, three weeks down the line. It can happen in the space. And, of course, we can also record the data, we can review it later, we can play it back.”

This real-time visualization isn’t only available as pre-visualization in Shōgun Live—it can be streamed straight into Unreal Engine 5, complete with character skins and environments for an even clearer sense of how your animation will look in your final cinematic.

“As you can imagine, you can replay this data with different models, you can bring in a different retarget,” Ed explained. “What it really does is empower creators to make iterative decisions as they go, and do it with confidence as to what the final result will be.”

For more information on how a Vicon motion capture solution can improve your creative output, click here.

Check out the highlights of Ed’s presentation below. Alternatively, you can view the full version along with others from the day here.

Watch the full library of GDC presentations from the Vicon booth here.


Interested in stepping into motion capture? You can reach out using the form below.