Will Smith faces off against a younger version of himself in Ang Lee's action-thriller Gemini Man, setting a new benchmark in VFX with the creation of a completely digital human character. Guy Williams, visual effects supervisor at Weta Digital, explains why the creation of Smith Jr. required more than simply digitally de-aging the actor.
“The crux of the movie is there’s a 50-year-old assassin who wants to retire and is hunted by an assassin who we find out is a clone of himself that’s 30 years younger,” says Guy Williams. “The important part, or spine, of the movie is the existential crisis that two people feel when they realize one is a clone and the other a precursor. So to really play on that dichotomy and the conflict on a mental level, you have to recognise both the 50-year-old assassin and the 20-year-old assassin as being the same person. The idea is to pull you in to the point where you feel their angst and can relate to it more, so we couldn’t just have a different actor play the younger part.
“The question is, why couldn’t we have Will Smith play both roles and just use traditional de-aging techniques? There were two reasons. One was that he’s on frame twice – he’s talking to himself a bunch of times. You would have to do a very complex motion-control setup to achieve that. It’s achievable and has been done in the past, but it really puts a burden on the filming process, and because they also physically fight each other, you couldn’t do that even if you wanted to. So there’s that limitation.
“The other thing was, as amazing as these approaches are, they don’t scale perfectly. If we do a lot of shots, they’re built off the amazing brute strength of individual artists, so if you have to do a lot of work, then you might have to bring in people who aren’t as good as your best people, and you start to spot the differences.
“Ang wanted to shoot 4K, 120fps stereo 3D. He didn’t want it to be merely as good as people have done to date, he wanted to do better. So right off the bat, given the needs of the story and the desires of the director, we knew we were looking at creating a fully digital human.”
Williams goes on to explain that creating such an ambitious digital illusion was not dissimilar to the performance-capture techniques used to create digital characters in films like the Planet of the Apes series. The real challenge, however, was avoiding the ‘uncanny valley’ – that unsettling feeling that arises when something that is designed to look human isn’t entirely convincing.
“The only difference is a talking chimpanzee is something that we kind of understand but don’t totally recognise. Whereas our brains are wired 100 per cent to recognise a human face and the emotion in a human face. You rarely run into the uncanny valley when it comes to animals because we don’t associate with animals like we do with other human beings. So we have to get it 100 per cent right.”
Williams says that VFX testing during pre-production was limited due to the expense and effort involved, but fortunately producer Jerry Bruckheimer and Ang Lee had complete faith that Williams and his team could pull it off.
“You just kind of commit to it,” he says. “But there are a bunch of validation tests where you take photographs from earlier movies and pose our digital character to match those. It’s not so much seeing if we’re able to do it, it’s more knowing we aren’t yet there. It’s to show us where we’re off and keep refining it. Imagine you’re painting a person, and the ground-truth test is having your painting next to the person as you’re painting it, so you can constantly improve your painting to look like the person.”
When selecting early Will Smith performances to use as a reference for creating Smith Jr., Williams says the first Bad Boys and Independence Day were the primary choices, adding “with a smidge of Fresh Prince of Bel-Air, Six Degrees of Separation and Men in Black.”
So was Smith a little freaked out when he saw the finished effect?
“He was always very complimentary,” says Williams. “At one point I had him backstage with no cameras or microphones and I said, ‘just tell me the honest truth – did we achieve the result you’d hoped for?’ And he said, ‘you don’t understand dude, it was incredibly trippy – I’m looking at this and I know that it looks like me, it talks like me, that it is me there, but I know I didn’t do that. It’s very confusing to me!’ He found it very interesting to see himself do something that he had never done before.”