How ‘The Creator’ Crafted a Bold Vision of the Future on a Fraction of a Normal Blockbuster Budget

Filmmakers used prosumer cameras and super-wide lenses and designed VFX after principal photography

Behind the scenes on “The Creator.” (Photo: 20th Century/Regency)

Gareth Edwards’ “The Creator,” in theaters now, is a rare kind of studio sci-fi blockbuster: it looks like it cost a fortune, but the “Rogue One” director and his filmmaking team’s indie spirit and approach to production kept the budget of the 20th Century Studios film under $100 million. Here’s how they did it.

The film is set in a distant sci-fi future where a war against artificial intelligence rages on. The U.S. is almost completely anti-AI, while New Asia, a region of Southeast Asia, still allows it. A former special operative (played by John David Washington) is tasked with tracking down the AI’s secret weapon: a little kid (Madeleine Yuna Voyles) who has the power to control technology.

While the plot will remind you of any number of movies and television series that have come before, the way “The Creator” was made is genuinely novel. By using prosumer cameras, shooting in a “spontaneous” style and applying visual effects after principal photography, the filmmakers made a movie rumored to have cost only $80 million look better than some of the biggest-budget movies this year, such as “Indiana Jones and the Dial of Destiny,” which cost around $300 million.

“Pretty much everything we did was very unconventional,” Greig Fraser, one of the film’s cinematographers and the Oscar winner for Best Cinematography for his work on “Dune,” told TheWrap.

Dynamic Duo

The production utilized two cinematographers, with Oren Soffer joining Fraser. Edwards operated the camera most of the time, Soffer told TheWrap. “The entire process of discovering the visuals of the film, deciding what’s in frame, what isn’t in frame, all of that was designed and planned extensively in prep to be spontaneous and reactive on set, and to create an environment for Gareth and the actors to explore,” Soffer said. In order to “create these opportunities,” according to Soffer, the entire team had to “break the conventions of filmmaking a little bit.”

The idea of a big-budget sci-fi extravaganza embracing a loose, improvisational feel is wholly unorthodox. Most films of this size and scale require endless storyboarding and what is known as pre-visualization (or pre-viz), where the movie’s major set pieces are mapped out in a computer ahead of time. Soffer said that Edwards was guided by two principles – to make a movie that felt big “and not spend $300 million,” and to “create a film that feels real and authentic and immersive for an audience.”

In order to accomplish this, the team hit on a different approach: they would design all of the visual effects in the post-production window. Whereas a traditional film would use concept art and storyboards, the filmmakers “shot the film as if it was all on-location with real people and real actors.” Then they edited the entire film before making post-production decisions about visual effects. “A lot of decisions about who becomes a robot, where the ship goes and what the cityscape looks like was all designed in post,” Soffer said.

“We knew what the story was, we knew what the beats were, we knew what that component was clear from day one,” said Jay Cooper, who served as the visual effects supervisor for Industrial Light & Magic. “It’s not like Gareth went out and improvised and we came back and made a movie. But in terms of the level of design and thoughtfulness that went into things, and working with the photography once it came back, and finding places where we could add to it in such a way that we maximized what was there, those decisions were definitely made after initial photography.”

In one harrowing sequence where the American military forces deploy a bomb robot to destroy the bridge to an AI village, the filmmakers decided in post-production which parts of the village would be knocked down, as well as the design of a “huge, oppressive U.S. tank,” Cooper said.

There was also, according to Andrew Roberts, the on-set VFX supervisor for ILM, another key difference in shooting: minimal use of green screen. “We really tried to keep as small a footprint as possible so that it didn’t distract the actors,” Roberts said.

Soffer described the shooting style as “spontaneous”: sets and locations were lit to look as if they weren’t lit at all, allowing a little more breathing room than on a traditional big-budget production.

The two cinematographers overlapped constantly and have been working on the movie together through the final color grading process, Fraser said. The prep for “Dune: Part Two” unexpectedly encroached on production of “The Creator,” which necessitated some “overlap” between them, he explained of the unusual decision to have two cinematographers on the project.

Fraser was one of the key figures responsible for designing “the volume,” a virtual soundstage that projects finished visual effects onto a screen that the camera picks up. Fraser first developed the tech for “The Mandalorian,” but it’s now used on any number of blockbusters from Marvel movies to “The Batman” (which Fraser shot). Cooper even referred to him as “the grandmaster of the volume.”

While the movie prioritized reality and concocted its visual effects long after filming was complete, Cooper said the filmmakers utilized the volume for two sequences – one in an airlock and another in a biosphere. (Both are on the NOMAD – a giant, hovering space station routinely deployed to take out the AI threat.)

Ultra-Wide Aspect Ratio and Prosumer Sony Cameras

Another element of the production that makes the film feel different is its ultra-wide 2.76:1 aspect ratio, which gives the movie wide scope and makes it stand out from the boxier approach of some recent films. (Other films that have used the aspect ratio include 2015’s “The Hateful Eight” and the 1959 classic “Ben-Hur.”)

“When we were looking at cameras, often a lot of cameras have a sensor size where if you de-squeeze them with a traditional two times anamorphic, which is what we were using, they end up doing something super wide. And very few films ever present in that aspect ratio,” Fraser said.
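The arithmetic behind Fraser’s observation is straightforward: a 2x anamorphic lens optically squeezes twice the horizontal field of view onto the sensor, so de-squeezing the image doubles the capture area’s native aspect ratio. A quick sketch (the sensor dimensions below are illustrative examples, not the production’s actual capture settings):

```python
def desqueezed_ratio(sensor_w_mm, sensor_h_mm, squeeze=2.0):
    """Effective aspect ratio after de-squeezing anamorphic footage:
    the horizontal field of view is multiplied by the squeeze factor."""
    return (sensor_w_mm * squeeze) / sensor_h_mm

# A roughly 1.38:1 capture area with a 2x anamorphic squeeze
# yields the film's ultra-wide 2.76:1 frame.
print(round(desqueezed_ratio(33.1, 24.0), 2))  # → 2.76
```

The same math explains why so few films present this wide: any near-4:3 capture area shot with 2x anamorphic glass de-squeezes to well past the standard 2.39:1 widescreen frame.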

Even though the images were never designed to be seen that wide, Edwards “was responding to that organic nature of what lenses do.” Quickly, Edwards decided he wanted to shoot the entire movie that way, using super-wide lenses on consumer-grade cameras. (The team ultimately chose the Sony FX3, Sony’s entry-level 4K Cinema Line camera, which can be purchased for less than $4,000 at your local Best Buy.) “We were like, ‘Yeah, you go girl,’” Fraser said. Previously there had been distribution-related restrictions on such wide aspect ratios, restrictions that no longer exist. “Will it look like that on an airplane? I don’t know. I doubt it. But at the very least we have a theatrical version that’s to Gareth’s vision,” Fraser said.

Super-wide movies that involve heavy visual effects are infrequently made because, historically, anamorphic footage is much tougher to wrangle. Steven Spielberg had shot anamorphic on “Hook” with cinematographer Dean Cundey, but the pair couldn’t on “Jurassic Park” because the visual effects team, wielding then-cutting-edge computer-generated imagery, told him that shooting flat would be easier.

“ILM really came to the party and was embracing the way Gareth wanted to shoot it on the format that he wanted to shoot it on, and respecting his vision as a director wholeheartedly,” Fraser said. “They’ve taken unorthodox working practices, unorthodox footage, and an unorthodox story, and ran with it and made some of the finest VFX that I’ve seen.”

“I want to say 75, 80% of the movie is shot with the same lens,” Cooper said. “The problem when you’re shooting anamorphic is the lens barrel distortion is pretty significant, and it’s hard to turn it into a flat image to add your CG. By reducing the number of lenses, he simplified that process for us.”
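The workflow Cooper is alluding to is standard in VFX: undistort the anamorphic plate with a radial lens model so straight lines are straight, composite the CG on the flattened image, then re-apply the distortion so the CG inherits the lens’s character. A minimal sketch of that round trip, using a one-term radial model (the model and coefficient are illustrative, not ILM’s actual pipeline or a measured lens profile):

```python
def distort(x, y, k1):
    """Apply one-term radial distortion to a normalized image
    coordinate; k1 < 0 produces barrel distortion."""
    r2 = x * x + y * y
    f = 1 + k1 * r2
    return x * f, y * f

def undistort(x, y, k1, iters=10):
    """Invert the model by fixed-point iteration: find the
    undistorted point whose distortion lands back on (x, y)."""
    xu, yu = x, y
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        f = 1 + k1 * r2
        xu, yu = x / f, y / f
    return xu, yu

# Flatten a plate coordinate, place CG in the flat frame, then
# re-distort so the CG matches the photographed lens distortion.
k1 = -0.08  # illustrative coefficient only
xu, yu = undistort(0.6, 0.4, k1)
xr, yr = distort(xu, yu, k1)  # round-trips back to (0.6, 0.4)
```

Fewer lens types means fewer distortion profiles to solve and track, which is the simplification Cooper describes.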

Since everything was being designed after the fact, the wide image informed the look of spaceships and space stations. “We made choices in terms of the shape of different things based on this aspect ratio,” Cooper said. The NOMAD, the terrifying space station of doom, for instance, is “longer and more squat” than it would have been if the movie had been shot taller or flatter.

While “Rogue One” saw Edwards stepping into George Lucas’ sandbox, “The Creator” finds the filmmaker going to Lucas’ roots — reworking what’s possible in the realm of cinema. The question now is who will follow in Edwards’ footsteps, and if he just sneakily launched a new era of blockbuster filmmaking.

