Ang Lee's 'Gemini Man' (Will Smith)

Sounds like everything presented in the trailer was accurate, for better or worse — but better, for me.
 
I think people are just used to cliché action films, but this has the gimmick of one character being CG and being directed by Ang Lee. We probably haven't even seen the crazy **** yet.
 
‘Gemini Man’: Ang Lee On the Challenges of Making His ‘Will Smith Clone Movie’

When Ang Lee introduced the first-ever screening of the full version of his new film “Gemini Man” he focused on the obvious: the many, many technical challenges involved in making a movie unlike any made before. It meant the creation of Hollywood’s first fully digital human character, “Junior” — a motion capture of Will Smith transformed into a 23-year-old version of the actor – while at the same time experimenting with the high frame-rate (120 fps) and 3D cinematography he employed in “Billy Lynn’s Long Halftime Walk.”

Lee said that one of his principal goals was to continue his “pursuit of what digital cinema should be,” but joked that at the same time, he “still had to deliver a Will Smith clone movie.”

“Gemini Man” is, at its heart, a Jerry Bruckheimer action film — but it’s one that Lee constantly reworked, returning to some of the themes that have dominated his career. When he was first sent the script, which Paramount was anxious to shoot soon, Lee returned a nine-page letter proposing how he would approach a page-one rewrite, which he assumed would yield a polite rejection. As he told a gathering of journalists during a post-screening roundtable, he was surprised when the studio said yes, and gave him six months to develop the script.

“The subject matter of nature versus nurture really got to me,” said Lee, speaking with journalists after the screening. “Gemini Man” centers on 51-year-old Henry (Smith), a retiring ace sharpshooter who discovers he’s being hunted by his younger clone. “I’m a cinema person [and] just in my head when I heard the idea… It really makes you wonder about your own existence and what you would tell your younger self,” he said. “I’m old enough to think about that kind of thing.”

Lee made it clear that one of the three main reasons he agreed to the project was the challenge of creating Junior, which he knew from the start couldn’t be accomplished through the kind of de-aging being used for “The Irishman.”

“We do the whole body, not just the face,” said Lee. “We do it from scratch – that’s why I don’t like to call it de-aging. It’s not just a brush-up. Age does more mysteries than just wrinkles. It’s kind of sad what life does to you. Every layer of skin, bones, it’s how you age, your eyes, the enamel on your teeth. Just the subtle changes.”

Lee referred to the creation of Junior as a “leap of faith.” The process was so labor-intensive that he only saw final rendered images of Junior three months ago. One early test involved creating a CG Smith and placing him in a car scene from “Bad Boys,” and viewers — when shown it side-by-side with footage from the actual movie — couldn’t tell the difference. However, the process used for the “Bad Boys” Pepsi challenge was radically different from how the living, breathing, and acting Junior would be created.

“That was for morale,” Lee said of the “Bad Boys” test. “For the real deal, you don’t see it until a good half a year into post. Your two feet are already in. It’s really just a leap of faith. It’s also an attitude. You won’t allow, if it doesn’t work, we shelve the movie, I don’t know.” He laughed. “No. Until you test [it] and people say, ‘OK, oh, we’re watching the movie, we’re not thinking that’s a digital person.’”

According to Lee, to create Junior as he would appear in the final film, the base had to be a motion capture of Smith, which then would be altered, sometimes using footage from older Smith movies as reference. Lee was adamant that a pure CG character, from his perspective, was not possible at this time – it’s impossible to use only references.

“I believe in a cocktail, at least for now, you blend what’s real and what’s not,” said Lee. “Sometimes, you feel like you are doing a poor imitation of God’s work. It’s very challenging. It’s quite enigmatic. So it’s really a mixture of things… we cannot really use his old reference, only in shots [where we were having] trouble. And then we look at his old movies with similar angles … Half of his shots don’t look like him, but no one cares, because [they know it’s Will Smith.] It’s very curious.”

The process was painstaking, leaving Lee scratching his head most days. The final product of Junior does look real, even in a handful of tight medium shots of the character in dramatic scenes opposite Clive Owen, and later with Smith himself. But the young clone’s screen persona is different from the pop culture figure he’s based on: The naive soldier is earnest, with a polite southern demeanor – while looking every bit like a younger Smith, the digital creation is not burdened with carrying any of the charisma and personality that make Smith a movie star.

One of the other major reasons Lee said he wanted to do “Gemini Man” was the opportunity to use both the motion capture character and the high frame rate to create a realistic, but carefully choreographed, fight scene.

“I shot ‘Crouching Tiger,’ and for 20 years I couldn’t crack it. Nobody can crack it. You hurt people,” said Lee. “It’s really dancing, it’s not really fighting. When I flip you over, you let me make that flip, it’s not because I flipped you over. The physics are the opposite of what a real fight is, and plus the strobe [how Lee refers to 24 frames per second], you never see intentions. So it was never realistic.”

The canvas of the action film in general allowed Lee to use the 120fps approach to delineate action far more clearly than is possible in the chaos of a normal 2D action scene. Each movement and facial expression is far more articulated.

To take a crack at a more realistic fight scene, Lee and his team spent nine months in post trying to alter and perfect a four-minute fight scene between the older and younger Smith in hand-to-hand combat. Lee explained that while the overall beats of the fight choreography can’t be altered, he and his team added scrapes, missed punches, and the messiness of fighting in general.

“A realistic [fight] lasts what, two seconds? You see [it on] YouTube. You want drama, so you have to choreograph it. It’s really a paradox,” said Lee. “I still [tried] to change their choreography and worked with stunt [team], but each time I see a bloody nose, and somebody gets hurt, I have to stop. So I still do the traditional Hong Kong style of choreography, but [it’s] more visceral, more thinking of the realism of somebody exchanging punches, and when they’re angry, the drama, the staging, it’s more drama-orientated and more messy, [which makes] it more favorable to the later animation work.”

Lee was extremely pleased with the possibilities of shooting action in the HFR and 3D, which captures action in a different way. He pointed to an impressive action scene early in the movie involving a motorcycle chase as a prime example.

“If it’s a 2D movie, because it’s so strobey, the way to pump energy is horizontal speed, speed and quick cuts, people get used to that,” said Lee, adding that with 120 fps, the amount of detail in the action changes everything. “You get people excited, I think it’s the detail [you] get to see it. I think that’s a new kind of filmmaking, and I think it’s this kind of staging [frontal], in and out of the perspective, a first-person-third-person kind of exchange. It’s a different kind of language, involvement, for both the filmmaker and the viewer.”

In both action and non-action scenes, Lee used the incredible depth of the frame, composing shots that allow the viewer to look far into landscapes and locations. At the same time, he was able to play the foreground and background action against each other by utilizing how the viewer’s eye picks up on any discernible movement at the high frame rate. Watching the movie, it’s clear that he learned a great deal about what worked and didn’t work from “Billy Lynn.” He also experimented with new ways of lighting scenes. The night-time photography is often day-for-night, as the viewer is able to peer endlessly into the detail of the blue night.

Lee has clearly been motivated in recent years by a desire to find the aesthetics and visual language of his stories through new technologies. He’s still learning how it works, but said he was convinced that 2D storytelling doesn’t translate to 3D HFR.

He described the challenge this way: “[Within] a dimensionalized clarity, how do you find what stimulates our eyes [and are you] willing to divulge yourself to a fictional world and join a story?”

He ended his discussion of the film by expressing a desire for larger audiences to experience the film in 3D at the highest frame rate possible. “The theatrical experience,” he said, “should be more than telling a story.”
 
This might be this year’s World War Z. Tons of pre-backlash and talk of it bombing, only for it to be a giant success with decent reviews.
 
Ang Lee On "Cloning" Will Smith and Learning From His Mistakes for 'Gemini Man' [Interview]

You could say that Lee’s recent career has been leading him straight to Gemini Man. The idea of digital creations caught his interest with 2012’s Life of Pi, a survivalist drama in which a CGI tiger played a major role. And of course, there was his 2003 superhero blockbuster Hulk, which may have piqued his interest in experimenting with digital film technologies, but definitely was a precursor to Gemini Man‘s central conflict about duality and bad dads. “[It’s] a little bit of unfinished business from that movie,” Lee conceded. “This is something they try to hide, cover up who they really are. That’s probably me, I don’t know. I do have some father issues and my being a father myself. But that’s just how I make movies.”

That’s still how Lee makes movies: from an emotional, character-driven perspective. As much as his films from the past decade have revolved around new technological innovations, Lee is still a character-first director. “To sell a story in a situation without a person really driving the performance…I don’t think is believable,” Lee said. “And I know it’s not just about science.”

For Lee, even with a digital effects-heavy film like Gemini Man, the science only makes up “10% of my work.” The rest of the work is in making the effects invisible, by adding normal human error. “It looks like somebody shot it and messed it up,” Lee said. “That takes the longest time [to do].” Especially in the case of 3D, which plays a prominent part of Gemini Man, Lee said.


People would be intimidated by 3D and more data and more details. But yes, it’s more scary. Most people wouldn’t try it. But you also have more material to work with — we had 40 times more data, which means we had 40 times more chances to flop or succeed. You just have more to work with. And when you’re in 3D, it’s sharper, your mindset is sharper. Both for the filmmaker and the viewer, you see it from a different place.


Why Will Smith is Not “De-Aged”
One of the things that Lee saw right away was that they couldn’t use simple “de-aging” technology on his star. “You cannot put on makeup, erase wrinkles, and do a different hairstyle and act like a person is younger, or cast Will Smith’s son to play him, and call that a clone,” Lee said. “I think Junior being a CG character is a requirement. It’s a whole package.”

Junior, the younger clone of Will Smith’s character Henry, is a completely CGI character, created through references to Smith’s past films (“part of it is 6 Degrees of Separation, a lot of Bad Boys”) as well as some motion-capture and stunt doubles. Smith would act out both roles. But it was more than simply putting dots on Smith’s face, Lee said.

“We did it from scratch. That’s why I don’t like to call it de-aging; it’s not just a brush-up. Age has more mysteries than just the wrinkles. When we started I was looking at him and thought, ‘Should he look older? Is he too young?’ No. It’s kind of sad what life does to you. Every layer of skin, every bone, it’s just sad how much you age, even the enamel in your teeth, it’s all the subtle changes. It’s very inspiring, actually.”

Smith had previously spoken about Lee’s direction when he was performing Junior’s lines versus Henry’s lines. “Act less good,” Smith recalled of Lee’s advice. But Lee disagrees that this was Smith’s whole approach to the vulnerable, naive Junior. “It’s a lot more complicated than that, it’s not just unlearn what you learned, it takes a lot more.”

In the end, Junior is more than just an amalgamation of past Smith roles, or an older Smith trying to recreate his performance style of his younger days. “His own childhood and also his relation with his father, who was a military guy, it all comes to play,” Lee said.


Making The Action Hit Harder
One of the three main reasons that Lee wanted to direct Gemini Man was to create Junior. The second one, and arguably the one that he most succeeds at, was to best utilize the digital technology to enhance action. The hits land harder, the punches crack louder in Gemini Man, and Lee effortlessly merges his experience with traditional Hong Kong fight choreography with the new potential of digital effects. “[I wanted to make the action] more visceral, more thinking [about] the realism of somebody exchanging punches when they’re angry, the drama, the staging,” Lee said. “It’s more drama-oriented and more messy.”


“I want to do a realistic fight, but choreographed. I’ve been trying to do that since I first shot Crouching Tiger, and for 20 years I couldn’t crack it. Movies being 100-some years old, nobody can crack it. It’s really dancing, it’s not fighting. When I flip you over, you let me make that flip; it’s not because I flipped you over. The physics is the opposite of what a real fight is, and with the strobe you’ll never see intentions of that, so it was never realistic. But a realistic fight lasts two seconds. You want more drama, you have to choreograph it, so it’s really like a paradox.”


His efforts come to a head in two major sequences in Gemini Man: a jaw-dropping motorcycle chase through the bright, vibrant streets of Cartagena, Colombia in which Will Smith throws a motorcycle at Will Smith; and a bone-crunching hand-to-hand fight scene between the two Will Smiths in an underground catacomb. The latter scene took nine months for Lee to finish — “longer than I expected because once you want it to feel real, it’s something else, it takes a course of its own,” Lee said.


Lee wants to continue experimenting with merging digital effects and 3D technology. “It changed the course of action,” Lee gushed.

“Motorcycles, for example, motorcycles in a 2D movie is so strobe-y, the way to pump your energy is to do horizontal speed. So you have the speed and quick cuts. People get used to that. But now you see the performance. You don’t see the strobes, it doesn’t look that fast, what do you do? I think, what gets people excited? I think it’s the detail: a change in a magazine, this that, the strategies, you get to see it. So I think that’s a new kind of filmmaking and also this kind of a staging in and out of the perspective. First person, third person kind of exchange. It’s a different kind of language and involvement for both filmmaker and the viewer.”


“I want to do a next movie [with this technology],” Lee added. “Take the next step and see what could happen with this, when you change the physics of it.”


Learning from his Mistakes
Gemini Man may be Lee’s most existential movie, in part because it’s the fruit of Lee’s own existential crisis, in a way, which followed the release of his first poorly received high-frame-rate experiment, Billy Lynn’s Long Halftime Walk. The 2016 war drama, shot at the extra-high frame rate of 120 fps — the same that Gemini Man is shot in — was critically lambasted, with many calling its visual innovations a distraction. With the reception to that film in mind, Lee said he’s honing something he calls the “digital cinema aesthetic.”

“Could I make it look good, light it differently, not just stripping away the artifice from the old age like Billy Lynn. A lot of people didn’t respond to that very well. They get angry. So I tried to pick a genre movie with a movie star. But I light it, expose it, and art direction it in a different approach. I think that’s…the beauty of digital, I suppose — I was just guessing, I was groping. So that’s the dimension, clarity. But within that, how do you find what stimulates [that part of] our eyes that thinks it’s good looking? And are you willing to devote yourself to a fictional world and enjoy that story? What is that state of mind that requires that it looks pretty?”

In pursuing that “aesthetic,” Lee said that he made mistakes every day. It was almost a year into shooting Gemini Man that he found that he had achieved the digital effects he was striving for. Lee would tweak and perfect various elements — one of the most challenging, he would say, wasn’t the big, epic action sequences, but the more traditional filmmaking techniques like a simple rack focus.

“We’re really used in 2D to the rack [focus]. But you don’t feel the depth. You feel the shift of clarity, you feel — somewhat — the filmmaker direct your attention from here to there. But that feeling, that’s the 3D feeling. The depth feeling, it’s also where you stage the object you’re identifying with, which is him, which is outside your space.

I think the relation with the frame, it’s something quite tricky, quite interesting. Because I think with the frame, psychologically you have your space and their space. When something’s here you feel the pressure, the intensity; when it’s turned around, you feel like you’re yourself doing it — the way you identify with how you stage that object you were looking at, in which way is it something else?”

Lee isn’t done with this technology. He is planning to continue to experiment, to innovate with high frame rates and fully digital characters. But he knows that there’s a point where he has to pull back — a completely CG film doesn’t appeal to him or, he acknowledged, to audiences.

“Somewhere in the back of our head, we know it’s CG, it’s not real,” Lee said. “I believe in a cocktail at least for now: you blend what’s real and what’s not.”
 
With Gemini Man, Ang Lee may just have rewritten how movies are made — and watched

It took two decades to make it to the big screen, and as the credits rolled on the inaugural screening of Gemini Man, it was clear that the movie — or at least director Ang Lee's version of it — couldn't have happened a moment sooner.

Lee has been on the forefront of cinematic progress over the last 30 years, either driving forward or bringing to mainstream cinemagoers new methods of combat (Crouching Tiger, Hidden Dragon), visual effects (Life of Pi), and camera technology (Billy Lynn's Long Halftime Walk). Gemini Man represents a synthesis of all those advancements, a career culmination that he hopes is just the beginning for both him and the film industry. He is a deep advocate for digital cinema because he believes it can change the medium or even spin off into some new kind of visual art form, putting audiences inside a world in ways that open infinite new possibilities for storytellers.

"People think 3D and anything high-tech is the opposite of art and soul — I don't buy that," Lee told a handful of reporters at a lunch in Manhattan after the movie's inaugural screening. And Gemini Man should be seen in theaters, in 3D — a rarity in an industry flooded with wonky conversions that happen after the fact.

Gemini Man stars Will Smith as Henry Brogan, a retiring corporate assassin who gets caught up in a conspiracy that puts the target on his own back. His assassination has been ordered by his old Marine Corps commander (Clive Owen), now the mega-rich, megalomaniacal CEO of an advanced military contractor firm. Here's where it gets twisted, and the film technology comes in to play: Owen's character, Clay Verris, has been secretly raising a clone made from Henry's DNA, and he sends Junior, the perfect killer, in to take down the older Henry.

Much of the pre-release focus has been on Weta's breathtaking digital rendering of a young Will Smith — a full from-scratch creation, not a de-aging — but Lee's technological advancements are clear long before the midway point at which Junior enters the movie.

Lee shot Gemini Man at 120 frames per second (normal movies are shot at a comparatively glacial 24fps) and with a 3D camera, and both tools fundamentally changed parts of his approach to planning, shooting, and editing scenes. From the very first shot, on a coastal horizon, the difference is obvious — shot with any other camera, the background would be flat, almost like a painting, but with 3D cameras and the frame rate, the dimensions are clear, and you can feel the distance between different things on the screen, as if you could reach out and grab them.

That's because the 3D provides a different Z-axis for the cinematographer, which means that depth can be captured. This isn't the kind of gimmicky 3D you find at a theme park show, where it feels as if a dinosaur might pop out at you — Lee isn't trying to shock, but immerse. It's most clear during fight and chase scenes, when you understand the depth between objects and feel like you're in the middle of the action.

Gunshots and explosions are less cinematic and more visceral, because there's so little artifice; forcing everyone to watch enough scenes like that might take the abstraction out of the gun control debate and scare people into backing a ban on assault weapons again.

Lee says they had to light, block, and shoot fight scenes in very different ways, both to adjust to the technology and to take full advantage.

"Given the real illusion of 3D, you can actually put things outside of the frame which is psychologically your space, so I think if I put something in your face, that's just more intense and it feels like you're doing something yourself," Lee explained to SYFY WIRE at the lunch. "So it includes both over-the-shoulder and point-of-view shots, in a language that we never really get. We have over-the-shoulder shots, but the staging is different.

"Within the frame is third-person to me, and this is first-person to me, so you really have that kind of movement, so I don't think you need to cut as much to direct how people see it," he continued. "You're in the movie, watching somebody, getting hit at [yourself]. It's just a different experience."

Credit: Paramount Pictures

The other part of the equation was the 120 frames per second, which presented Lee with the opportunity — and, again, the challenge — of a far more light-hungry camera. On Billy Lynn, they learned that they had to entirely change the way they lit their shots, because the camera needed so much more light. They shot a battle sequence in Billy Lynn under natural light, in broiling Morocco, but this time around, they used as many artificial lights as possible.

The sheer detail caught by the 120fps also meant that they required far fewer camera and editing tricks to suggest kinetic action.

"Just imagine if you're involved in a fight or at close range, watching somebody really fighting in that corner, without strobe, with two eyes coming to an angle when your mind is sharper, you go deeper," he said. "How do you want to stage it? So you don't just use speed to pump adrenaline, and horizontally you actually get to see detail and get involved that way to feel the excitement."

Lee, who brought Wuxia films to the mass American public, asked his actors and stuntmen to engage in what he calls "messy fighting," a highly technical name for throwing down with less rehearsed choreography.

"We rehearsed with stuntmen for a long time, and I tried to change their habits, and I hurt them quite a bit, bloody noses but for real," Lee said. "We did that instead of what everybody uses when making movies, which is dancing, choreography. You count the number, you make the same wah hah hah noise, so you can hear each other and save an accident. And we broke the rhythm up."

The harder contact was rendered in CGI (the bloody noses were not meant to make it on screen), and the bigger stunts were done with more traditional techniques, given the size of the movie and cast he was dealing with. He was already asking enough of his actors, who were required to feed the data-starved 3D, high-speed camera in ways that far exceed their normal gigs.

"[With these cameras] you read through people," Lee said. "They cannot fake it — they have to fake it different, rather. They have to upgrade their skills, really have to get a visceral feeling and every take, I have to hit them with different thoughts to distract them so they can react to something, so they look alive instead of performing well. They have to be a lot more genuine, a lot more complex. I think it's a wonderful world, when you do the digital face and a body study. It's like a microscopic study of what drama is. What movement, what emotion, how to make contact with emotion, what triggers things and what age does to you, cell by cell."

Source: Paramount

The digital recreation of a younger Smith is entirely convincing about 95 percent of the time, at least on the high-end projector used at the first screening — ironically, it's the high frame rate, and the need for more details, that dipped the occasional close-up the slightest bit into the uncanny valley. But given how well-known Smith is — perhaps the world's biggest movie star — it's an impressive accomplishment, especially because, as Lee suggested, it's Smith's intangibles that separate him from most other actors.

"One of the hardest things, if not the hardest thing, in the animation was how do you get the secret of him getting paid the big bucks: Will Smith's charm," Lee said. "Whether he's crying, he's angry, or menacing. There's a lot of menacing scenes, but you still love him. What is the secret? How do you animate that? How do you capture that? He cannot play that in his 50s, he now has a different charm. And you cannot retrieve from his old movies. You can use it as a reference, but what drives it? What final touches of that? It's mostly in the lighting, not just texture."

The hope is that Gemini Man will do well enough, and engage enough people with its groundbreaking visuals, that more movies can be made with the technology. And for Lee, who has become a pioneer in the field, the goal is to bring those movies to places where they are best displayed.

"My dream is, I hope this movie does some business, and other filmmakers will join in and will develop this thing, and bring people to the big theater," Lee said. "I would like to imagine single theaters come back with a big screen, different seat layout, more immersive. I would like to see people go back to the temple, spiritually, of the theater. That's my dream."
 
Really detailed article on the time and effort used to bring Junior to life

The Game-Changing Tech Behind 'Young' Will Smith in 'Gemini Man'

Another day in the laboratory, and all was quiet apart from the scritch-scratch of styluses on graphics tablets. On dozens of screens, glowing in the low light, were the various components of a human body: the dislocated sphere of an eyeball, the strange topography of skin in extreme close-up, a thicket of sprouting hair follicles. At one particular coffee-cup-strewn workstation, an artist studied the essence, the key to the team's efforts—a clip of the '90s sitcom The Fresh Prince of Bel-Air.

It was 2018, and the crew at the New Zealand-based visual effects studio Weta Digital was hard at work manufacturing Hollywood's hottest new talent, ahead of his big-screen debut a year later. He's a new species of actor, with unswerving focus, superhuman strength, and total commitment to the role. He doesn't take breaks or require the services of hair and makeup. And he doesn't need a trailer, since he lives on a hard drive. They call him Junior or, sometimes, “the asset”: the most ambitious computer-generated human ever made for a movie. He's also the spitting image of a 23-year-old Will Smith.


In June of this year, in a postproduction facility in Manhattan, a crew member shows off the nearly complete asset. Up onscreen is a shot of the real-life, 49-year-old Will Smith as he looked on the set of Gemini Man, wearing a facial-capture headset, his face and neck specked with tracking dots. The film, set to be released in October, is a sci-fi action thriller directed by Ang Lee that follows a retired assassin, Henry (Will Smith), who finds himself in the sniper-scope sights of another, younger assassin (digital Will Smith), who has been forged out of Henry's own DNA. It's a story of a man trying to outwit himself, of weather-worn wisdom pitted against cocky youth. It's also a cautionary tale about humans hubristically meddling with awesome tech.

A click of the mouse and suddenly Smith is transformed ... well, “beyond recognition” isn't the right phrase. The squinting eyes, the jut of the chin, the precise tessellation of the lower lip and upper lip stay the same. But the eyebrows thin, the cheekbones and jawline firm up, the brow unfurrows—headset and facial hair vanishing along with more than a quarter-century of wear and tear. It's one hell of a face-lift.

The Gemini Man screenplay, originally written by Darren Lemke, has been kicking around Hollywood in various forms for around 20 years, waiting for the tech to catch up to its central conceit. At different times, Harrison Ford, Jon Voight, and Mel Gibson were expected to star, and Disney did some promising experiments with digital face technology in the early 2000s before abruptly aborting the project. In 2012, the director Joe Carnahan shared a trailer he mocked up as a pitch for the job, of Gran Torino-era Clint Eastwood facing off against Dirty Harry-era Clint Eastwood. (It's still available to view on YouTube.) Even then, the feeling in the visual effects industry was: Not yet.

Is it time now? Bill Westenhofer thinks so. As Gemini Man's lead visual effects supervisor, he oversaw the combined efforts of around 500 artists at six visual effects studios, including Weta Digital, who were charged with conjuring Junior. Westenhofer, who is 51 years old, has the physical presence of a set builder rather than a pixel pusher, with the easygoing air of someone who's made a living out of realizing impossible things. “My job,” he says, “is like coming to the edge of a cliff with a sheet and some rope, jumping off, and building a parachute on the way down.”

In the mid-'90s, after completing a master's degree in computer science at George Washington University, Westenhofer began his career at the Los Angeles-based visual effects studio Rhythm & Hues. He carved out a niche working with animals: the chatty critters of Babe: Pig in the City and Stuart Little, which led to Cats & Dogs and the beastly besties of The Golden Compass. Rhythm & Hues' work on the majestic lion Aslan, from The Chronicles of Narnia: The Lion, the Witch and the Wardrobe, got the studio the gig for Life of Pi, which would prominently star a digital Bengal tiger. “Our process was just to believe it was possible,” Westenhofer says, “to get from something that's a little pathetic-looking to being completely convincing and real.”

Life of Pi earned Westenhofer and the Rhythm & Hues team an Academy Award in 2013. The recognition was bittersweet: Eleven days before the ceremony, the studio had filed for bankruptcy. Accepting his award, Westenhofer was in the middle of calling for more recognition of visual effects workers as artists when he was drowned out by the menacing strains of the Jaws theme. Accepting his own Best Director trophy later that evening for Life of Pi, Ang Lee neglected to mention the crucial role visual effects played in the film—but the director didn't forget about Westenhofer. In May 2017, Lee came aboard the Gemini Man project; Westenhofer, a roving freelancer, signed on three months later.

In the years since, there have been considerable advances in the software, the hardware, and the effects industry's underlying understanding of human physiology. But a speaking, interacting digital human in a starring role is still a huge leap from the relatively fleeting digital human cameos of Rogue One and other films—and a far more fearsome undertaking than a 500-pound cat. To pull it off would take a multiyear, brute-force effort. “I was confident we could push the technology the rest of the way there,” Westenhofer says. Plus, there was a gigantic dare involved. “A digital person is something that's been spoken about since I started visual effects 26 years ago,” he says. “It's the holy grail.”

When the Gemini Man teaser trailer dropped in April, set to a moody rendition of “Forever Young,” many assumed Smith had simply been “de-aged” in the same fashion as several performers in recent Marvel movies (most notably Samuel L. Jackson's Nick Fury in Captain Marvel). Those radically rejuvenated visages were achieved through after-the-fact photographic manipulation—extreme airbrushing of the actors as they appeared on set.

But Lee shot Gemini Man at ultrahigh resolution and an ultrahigh frame rate, part of his mission to create cinema that looks and feels more like reality. At that crispness of image, even an actor's stubble can be distracting onscreen; a digitally dermabraded Junior would have looked as if he were wearing kabuki makeup. Instead, Junior would have to be pure data, manufactured from the ground up, driven by the real Will Smith through motion capture.

In director Ang Lee’s new movie, Gemini Man, retired assassin Will Smith is hunted by his younger self.

The young Will Smith of Gemini Man is not the Fresh Prince you probably remember. He is physically bulkier, having been raised in a militaristic household by a shadowy character played by Clive Owen. Also, he lacks the signature mustache—the discussions over that decision were “intense,” Westenhofer says. And Smith is now a better, more serious actor than the kid in the fluorescent cap who jeered “Smell ya later!”

In an interview, the human Will Smith seems to be, if anything, invigorated by the passage of time. “Man,” he tells me, “life is delicious.” He was intrigued that the behind-the-scenes process of making Gemini Man mirrored the cloning plotline of the film itself. “I've always loved science fiction, and what's really interesting on this one is that there's a certain amount of science fiction within the film becoming science fact outside of it,” Smith says. “The guys are actually doing it.”

Back in early 2018, Weta scanned Smith with an array of high-resolution cameras while he performed a predetermined set of facial calisthenics, compiling a vast database of the full expressive repertoire of Smith's face. A model of Smith's body was fitted with a core digital skeleton and muscle system, the biomechanical chassis Weta uses for nearly all of its computer-generated humanoids, creating a basic digital replica—or clone, if you will—of modern-day Will Smith. In the primordial nothingness of Weta's hard drives, Junior was born. The next task: reverse-engineer by hand that digital 49-year-old into the shape and form of a 23-year-old.
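The article doesn't detail Weta's facial pipeline, but scan-derived expression databases like this are commonly used to drive a blendshape rig, in which any facial pose is a weighted mix of captured expression deltas layered onto a neutral face. A minimal illustrative sketch (toy data, not Weta's actual system):

```python
def blend_face(neutral, deltas, weights):
    """Combine a neutral face mesh with weighted expression deltas.

    neutral: list of [x, y, z] vertex positions for the resting face
    deltas:  one list of per-vertex displacements per captured expression
             (each displacement is expression vertex minus neutral vertex)
    weights: one activation value per expression, typically in [0, 1]
    """
    face = [list(v) for v in neutral]
    for w, shape in zip(weights, deltas):
        for i, d in enumerate(shape):
            for k in range(3):
                # Each expression adds its displacement, scaled by its weight.
                face[i][k] += w * d[k]
    return face

# Toy example: a two-vertex "face" with one captured expression ("smile").
neutral = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
smile = [[0.0, 0.1, 0.0], [1.0, 0.2, 0.0]]
deltas = [[[s[k] - n[k] for k in range(3)] for s, n in zip(smile, neutral)]]

half_smile = blend_face(neutral, deltas, [0.5])  # halfway to the smile pose
```

A production rig layers hundreds of such shapes, but the arithmetic at the core is this simple weighted sum.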

For visual reference, the team amassed stills and clips of early- to mid-'90s Will Smith. Bad Boys, Six Degrees of Separation, and, of course, later episodes of Fresh Prince of Bel-Air were pored over like sacred texts. The team unsentimentally scrutinized old childhood photographs and even sourced a relatively crude facial cast of the actor that came from Independence Day.

Around mid-2018, the team tried an experiment they called the Pepsi Challenge, dropping the Junior-in-progress into the middle of a scene from Bad Boys, which was made in 1995, and animating it manually. It was a little wobbly, but it convinced them they were on the right track.

Throughout the rest of 2018, Junior continued to evolve into something resembling fleshy, organic life, one facial and bodily component at a time. One team modeled Smith's teeth—not only the enamel on the outside but the dentin on the inside—while another worked on the way the lips compress. They mapped out the pores, so that the skin creased and wrinkled along its natural fault lines. They studied the balance of the pigments of the skin and the particular way black skin refracts light at a subsurface level. They even chiseled away at the skull. And they spent a lot of time gazing deeply into Smith's eyes: the sclera, the cornea, the inky film on the inside of the eye called the choroid. They modeled how the cornea interacts with the iris and how the iris interacts with the pupil. They modeled the conjunctiva, the thin transparent membrane that envelops the surface of the eye. Then there was an ocular breakthrough. “We've always done a relatively spherical eye,” says Guy Williams, one of the heads of the Junior team at Weta, who reports to Westenhofer. “We realized that a real eye, when it's sitting inside the socket, is actually squished into shape.” So, yes, they proceeded to squish Will Smith's eyes.
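One standard ingredient behind the kind of skin and eye shading described above is Fresnel reflectance, which governs how much light bounces off a surface versus penetrating it to scatter beneath the skin or inside the eye. Schlick's well-known approximation captures the effect; this sketch is illustrative (with an assumed refractive index), not Weta's actual shader:

```python
def schlick_fresnel(cos_theta, n1=1.0, n2=1.38):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view ray and surface normal
    n1, n2:    refractive indices of the two media (1.38 is roughly
               the cornea; skin is similar)
    Returns the fraction of light reflected at the surface; the remainder
    refracts inward, where subsurface scattering takes over.
    """
    r0 = ((n1 - n2) / (n1 + n2)) ** 2  # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

# Head-on, only about 2.5% of light reflects off the surface...
head_on = schlick_fresnel(1.0)
# ...but at a grazing angle, most of it does.
grazing = schlick_fresnel(0.05)
```

That angle dependence is part of why eyes and skin read as "wet" or "waxy" when the balance is even slightly off.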

The team at Weta was building on its own extensive research on the dynamics of skin, blood flow, breathing, and more, which the effects artists largely taught themselves. “God knows we're not saving lives here or anything like that,” Williams says. “But you have to treat it almost like medicine. You have to be clinical and objective about it.” Cold dispassion meant representing Smith's face honestly in its human imperfection, however slight. “If he has a quirk about his face—and I won't go into those out of respect—you have to have that in there.” (Look out for Smith's one misaligned mandibular lateral incisor, dutifully replicated in CG.) As the months wore on, Williams says, the refinements and adjustments became comically pedantic. “There are times in this job where you have a roomful of mature, middle-aged people sitting around a monitor, looking at a screen, talking about the amount of shine a pimple should have.”

The team eventually got Junior to a point where he looked eerily authentic, at least in still images, but he remained an empty vessel in want of a performer to animate him—a “soulless corpse,” as Williams put it. Ultimately it was up to Smith to make Junior move. For scenes featuring both Henry and Junior, Smith shot what the crew called the A side (playing Henry) on a real-world set, and then shot the B side (playing Junior) months later on a motion-capture stage in Budapest, wearing bulky headgear while an array of cameras recorded his every movement, large and small. “It's really difficult with that amount of technology,” Smith says. “Rigs on my head. The camera has to be really close. You can't move a lot.”

Still, Junior wasn't ready for his close-up. As sophisticated as motion capture is, and despite the massive trove of measurements taken of Smith's every gesture and movement, it still cannot record the full richness and depth of human behavior—the subcutaneous subtleties and minute movements, the microexpressions, the difficult-to-pinpoint qualities that comprise humanness. So, right up until August of this year, Weta's animators studied Smith's emoting frame by frame, beat for beat, and then tried to manually massage and coax the same out of Junior, while striving to maintain the illusion of unposed spontaneity. “What we're finding is that to get past this uncanny valley it's not really one single thing,” Westenhofer says. “It's the symphony of all the different things happening at the same time. If any one of them fails, you're left with something that doesn't look real.”

With the difference in features and proportions, there's an element of creative interpretation in the work, but the idea is that, viewed side by side, the digital model of Junior would replicate Smith's original performance in all its physiognomic nuance. For some sequences, that fine-tuning took as long as 12 months; rendering a Junior shot alone took hours or days. The stakes were high. This was, as Westenhofer said, the holy grail, and a failure to produce a believable Junior would rain down mockery. “I can't go into the finances of it,” Williams says, “but it was as expensive as any other asset we've ever done.” Lee has been less circumspect, happily divulging at a press event earlier this year that the digital Will Smith cost twice as much as the real one (though how much the real one costs, the studio wouldn't say).

By early 2019, Westenhofer felt they had produced sequences that finally felt in sync. Then, the dawning realization—it's alive. “It's an exciting feeling,” he says. “When that first render comes out, there's a little bit of this sense of ‘I've created life.’”


Double Take
Synthespian, virtual actor, vactor, cyberstar, de-aged, youthified: Whatever you want to call them, a new breed of performer is on the rise. They're increasingly digital, and increasingly lifelike. –Caitlin Harrington

For years, digital humans were little more than disposable alternatives to flesh-and-blood actors, mercilessly dispatched to cop a mauling from a T. rex in Jurassic Park, be drowned in Titanic, or get blown up in Pearl Harbor. The 2001 film Final Fantasy: The Spirits Within was notable for attempting to have digital humans in starring roles. But it wasn't until 2008's The Curious Case of Benjamin Button, making use of new photographic techniques to study and simulate human skin, that a digital human pulled off a sustained and convincing performance: an aged Brad Pitt, computer-generated from the neck up. Still, the movie's director, David Fincher, is said to have complained at the time that the tech “sandblasts the edges off of the performance.”

Tron: Legacy followed, with a young, somewhat glassy-eyed Jeff Bridges; a passable digital Arnold Schwarzenegger stomped into Terminator Genisys. Some of the most important technical developments, though, were made in the service of nonhuman characters. Weta's work with Caesar and the simian cast of the new Planet of the Apes films, as well as the orcs of 2016's Warcraft (overseen by Westenhofer), achieved expressive performances, thanks in part to “path tracing,” a more sophisticated simulation of how light bounces, penetrates, and reflects off surfaces and materials. When the actor Paul Walker died during production of Furious 7, Weta was able to complete his performance. In the film, more than 250 shots of a digital Paul Walker were integrated with existing footage. Gemini Man represents the next step in the evolution of the technology. (Martin Scorsese's The Irishman, slated for release by Netflix in the fall, is expected to feature Robert De Niro, Al Pacino, and Joe Pesci wearing their digital—not merely de-aged—younger mugs like hockey masks.)
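Path tracing, mentioned above, estimates how light bounces by averaging many random samples rather than following a fixed set of rays. A toy illustration of the core Monte Carlo idea, estimating the light an upward-facing surface patch receives from a uniformly bright sky (an assumed textbook setup, nothing like production renderer code):

```python
import math
import random

def sky_irradiance(samples=100_000, sky_radiance=1.0, seed=1):
    """Monte Carlo estimate of the irradiance on an upward-facing patch
    under a sky of uniform radiance.

    Directions are sampled uniformly over the hemisphere (for which the
    cosine of the elevation angle is itself uniform in [0, 1)), so each
    sample's contribution is divided by the pdf of 1 / (2 * pi).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()  # uniform hemisphere sampling
        # Incoming radiance, weighted by the cosine falloff, over the pdf.
        total += sky_radiance * cos_theta / (1.0 / (2.0 * math.pi))
    return total / samples

# The analytic answer is pi * sky_radiance; the estimate converges to it
# as the sample count grows, which is the essence of path tracing.
estimate = sky_irradiance()
```

Real path tracers apply the same estimator recursively at every bounce, with far cleverer sampling; the noise you see in early renders is exactly this averaging still converging.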

In Rogue One, the craggy countenance of Peter Cushing proved less of a problem for the artists at Industrial Light & Magic than the pure-as-snow visage of a young Carrie Fisher. But even more fundamentally, moviegoers have a tendency to regard any CG actor warily, waiting for the anomalous jolt, the telltale glitch. As Smith himself says, “You're always trying to see some part of it that could break the suspension of disbelief.”

Smith's mind, at least, is sufficiently blown by Weta's achievement in Gemini Man. “There's a completely digital 23-year-old version of myself that I can make movies with now,” he said at a screening of the movie in July. “I'm gonna get really fat and really overweight. ‘Use Gemini Junior!’” The audience laughed. But really, credible human performances will get easier, cheaper, and more efficient to counterfeit, and when that happens, there ought to be no shortage of job opportunities for a perennially ageless Will Smith. Not just for more stories involving clones, time travel, identical twins, and time-traveling identical clone twins, but also for dramatic roles that mostly eluded him in his twenties (or, you know, a Fresh Prince reboot).

“If there's a role in a movie that I would have been perfect for 25 years ago, and if we have a photorealistic 25-, 27-year-old version of myself—it's like, why not?” Smith tells me later. He would personally love to see Junior in a romantic comedy, he says, perhaps alongside a digitally resurrected performer from a bygone era. “I had this weird dream about five years ago. I saw a movie with me and Audrey Hepburn. I don't know where that came from, but I was like, oh my God. And now the technology exists!”

Smith predicts, too, that he could more easily assume different body types; instead of physically transforming for roles, as he did for 2001's Ali, he could wear a beefier digital body. Or the likeness of a historical figure. And who's to say that the Will Smith avatar always needs to be piloted by the actual Will Smith? He could authorize another actor to do the job or entrust the performance entirely to the director and animators. Already, there are discussions in the industry about whether machine-learning software—an evolution of the AI software that was developed to create convincing “intelligent” crowds in the Lord of the Rings films—could be programmed to analyze past Will Smith performances and simulate a new one.

By unsettling coincidence, ersatz humans are on the rise outside of the movies. In the past couple of years, “deepfake” technology, inexpensive and accessible to anyone, has been used to create bogus videos of everyone from Scarlett Johansson and Tom Cruise to Barack Obama. In June, Smith unwittingly starred in a deepfake of his own, a viral video of his face transplanted onto the body of Cardi B being interviewed on The Tonight Show.

Compared with the two years' worth of work done by hundreds of professionals, the quality of the Cardi B video, along with other deepfakes, is awfully crude. Still, the drive to fool the viewer is the same, as is the worry: Weta's breakthroughs give filmmakers the ability to depict performers, living or deceased, saying or doing things they never said or did. That might not seem like such a problem in the film industry, where creating illusions is the whole point. But it does suggest that performers will have to start being vigilant about safeguarding the rights to their image if they submit to a scanner.

For the rest of us, these technologies continue to stretch the boundaries of believability—onscreen and off. “The public needs to be educated as to the kinds of things they can expect,” says Darren Hendler, director of the Digital Human Group, a special department inside the visual effects studio Digital Domain. Decades ago, people were alarmed to realize that photographs could be retouched or manipulated and could no longer be seen as an unbiased record of reality. As Hendler puts it, “We're all going to have to change our understanding of what we believe is real.”

More than 500 visual effects artists spent two years digitally building a young version of Will Smith.

On my last visit to the postproduction facility in Manhattan, Ang Lee stops by to check how the shots are coming along. In a theater equipped with projectors originally built for military flight simulators, we watch the scene where Henry first comes face to face with Junior. “You don't know ****,” Junior says, his digital eyes welling with digital tears. “Kid,” Henry replies, “I know you inside and out and backwards.”

Lee is a restless filmmaker who has moved from genre to genre. He also invests digital filmmaking and visual effects with special significance. “People call it technology,” he says. “**** no. It's art.” It's an art that he hopes to see at the service of better, more human stories. “What we do here is basically imitating God's work,” he says—echoing the divine theme of the quest for the holy grail. “The creation of something that looks alive, that looks like it has a mind of its own.”

With only a couple of months left for postproduction, Lee isn't entirely satisfied with the scene we just saw; it's humbling, he says, to spend two years puzzling over why a one-second shot of a digital human just isn't jelling. “If you think about it, being here, being healthy, just being alive—that's a miracle. God's work,” he says again.

“That isn't really fair,” Westenhofer chimes in. “We only had two years. God had 13 billion years.”

The paradox is that Junior isn't intended to be a distractingly dazzling effect but, rather, an invisible one. The goal is for average moviegoers to forget, at least temporarily, that they're watching a CG actor and be immersed in the storytelling. “The ultimate success,” Westenhofer says, “is to work ourselves out of any recognition.”

One way or another, there'll be considerable talk about Junior—debates about his realness and hushed predictions about his future. Paramount, the studio behind Gemini Man, declined to clarify Smith's rights regarding his digital counterpart. Similarly, Weta was not able to disclose what would happen to the amassed petabytes of precious Junior data. The actor, for his part, confessed he didn't even get to keep a copy of Junior on a thumb drive. “Unfortunately, the digital companies still control the assets,” Smith says. “Legally I control my likeness; they control the actual information.” For now, Junior will lie in a patient, dreamless sleep, entombed in some vast antipodean storage system until his next big role. When he's inevitably plugged in and powered up again, good as new and fresh as ever, maybe for a Gemini Man sequel or Breakfast at Tiffany's 2, perhaps he'll think to himself: Life is delicious.
 
Even though this isn't de-aging, I'm way more impressed with Junior than young De Niro from The Irishman.
 
