During production on the Argentinian Netflix sci-fi series “El Eternauta,” the filmmakers were looking to use visual effects to create a building collapse in Buenos Aires. But there was one major problem: the project’s budget had already been committed elsewhere and, more pressingly, time was running out to finish it.
So they turned to Eyeline Studios, the streamer’s virtual production and research unit, and Scanline, the visual effects studio it acquired in 2021, to employ generative AI to complete the effects-heavy scene.
“We budgeted them and we saw that if we had to do this sequence using traditional visual effects, it would have been like 10x more,” Kevin Baillie, Eyeline’s Head of Studio, said in an exclusive interview with TheWrap. “Creatives had a need that would have been impossible, time-wise and budget-wise, because all the budget had already been committed on the show anyway and they were out of time. That was 10 days before the project had to be done.”
In addition to using over 35 real filming locations in Buenos Aires and more than 25 virtual production stages, “El Eternauta” marked the first Netflix global title to use generative AI, a fact that co-CEO Ted Sarandos shared during the company’s July earnings call, noting the sequence was also completed 10 times faster. His comments, which caused a stir in Hollywood, reflected the reality that studios were putting AI to use faster than expected.
Now, after 36 years of operation, Scanline — whose work has been seen in everything from “Godzilla x Kong: The New Empire” and “The Batman” to “Andor” and “Stranger Things” — is merging into the Eyeline brand as the streamer looks to continue pushing the boundaries of what’s possible in visual effects.
Moving forward, Eyeline will focus on three core divisions — Visual Effects, which will be responsible for the work done by Scanline; Studios, which focuses on the application of tools such as virtual production, volumetric capture and generative AI; and Labs, which conducts and publishes research in areas such as machine learning and computer vision that can later be applied to current VFX tools. The rebrand includes a new logo and website.
“We want to modernize the business,” Eyeline CEO Jeff Shapiro said. “Eyeline is synonymous with all of this really innovative R&D work, so we wanted to make sure we used that name going forward.”
The move is designed to provide a one-stop shop for creative teams and reflects the need to more tightly integrate tools like volumetric capture and generative AI into the production process. It will also eliminate confusion for filmmakers over whether they need to go to the Eyeline or Scanline team to address challenges and figure out the best VFX tools to use on their series and films.
“Technology innovation is happening so quickly, both in the broader industry and also internally, that with the two separate brands it was easy to fall into ‘are we doing it the old way or are we doing a new way?’,” Baillie said. “By having everything be Eyeline, it really encourages people to look at not who’s doing what, but what does a filmmaker want, what do they need and how can we use the best tool for the job and really operate like one team.”
“We’re able to give filmmakers a space to experiment, try new things, see what sticks and really help elevate the core strategy for Netflix, which is making really high quality, long-form content,” Shapiro said.
Inside Eyeline Studios
Netflix most recently used virtual production and volumetric capture — in which an actor’s performance is recorded (or “captured”) by virtual cameras and then typically rendered in CG — on “Wednesday” Season 2 for the creation of Professor Orloff, whose disembodied but very much alive head, played by Christopher Lloyd, rests in a jar.
It also used Scanline’s Light Dome technology, which can replicate any real-world lighting condition, on “Happy Gilmore 2” and Kathryn Bigelow’s upcoming Netflix political thriller “A House of Dynamite.”
Upcoming projects that will use Eyeline Studios’ technology include “Stranger Things” Season 5, “The Witcher” Season 4, “Wake Up Dead Man: A Knives Out Mystery,” “The Boys” Season 5, “Spider-Noir” and “Daredevil: Born Again” Season 2.
Scanline has previously collaborated with third parties such as Autodesk and The Foundry on its VFX work. Though Netflix and the Eyeline team did not specify which generative AI-powered tools were used on “El Eternauta,” Baillie said the division does not build any custom AI-powered technology in-house and is using commercially available video diffusion models through enterprise licenses.
Netflix declined to disclose which AI companies it partners with, though Bloomberg previously reported that the streamer was testing tools from Runway back in July. A Runway spokesperson declined to comment.

“There’s three big ones that are out there in the world and we’ve tested most of them. The special sauce isn’t in the way in which you make video, images and prompts move, it’s in the way in which you bring the ingredients in,” Baillie said. “It’s like a gourmet kitchen. We’re using all the products you can go buy at the supermarket. The measure of ingredients that we’re putting into the models is really a fine-tuned workflow that’s been developed over the last couple of years.”
Shapiro acknowledged that the generative tool set is “very broad and evolving extremely fast,” which is where the team of researchers and scientists in Eyeline’s Labs division comes in.
“Their job is to go out and learn about all the white papers that are out there in the world and come back to us with novel approaches that other people have developed. So that’s the way we can stay on the competitive edge,” he said. “In some instances, they’ll actually publish their own research for the academic community to see how they can improve the ways in which you can enhance color or movement or things that the commercial tools don’t have yet.”
Some of Eyeline’s research has already been added to its tools, such as FlashDepth, a real-time video depth estimation tool that is used to create accurate, high-resolution “depth maps” — grayscale images that represent the distance of objects in a scene from a viewpoint — for streaming video.
“There’s nobody else out there that has a research team that is this close to the teams actually driving the creation of the content,” Baillie added.
Scanline’s global expansion
Scanline was founded in 1989 in Munich and rapidly expanded around the world over the next three decades into Los Angeles, Vancouver, Montreal, Stuttgart, London and Seoul, working on 10 to 15 big theatrical films a year before being integrated into the Netflix ecosystem.
“When Netflix bought this business, I don’t think it was prepared for the amount of demand that got put on it right away,” Shapiro said. “It was not very large. They were spread out, but a lot of work got put on the business and they had to figure out how to juggle and prioritize all of it. There was a lot of ‘how do we reconcile the demand with our capabilities and supply’ at the time. So that’s been a huge learning for me leading the organization.”
Though the pair declined to disclose how much money has been invested in the studio since the acquisition, they said Netflix’s support has been “significant.” In 2022, Scanline expanded into Mumbai and is doubling down on India with a new Hyderabad production hub that opened in March.
In 2023, Sarandos also said that Scanline and Eyeline would invest $100 million in their facilities and team of more than 130 VFX artists in Korea over the next six years, on top of the streamer’s previously announced $2.5 billion investment in the region.
“There’s a very rich set of stories to tell out there, from anime reboots to live action to really big spectacle projects with some technical challenges,” Shapiro said. “We’re hiring and excited about our growth in Korea and India.”
Making what’s next
Most of Eyeline’s current work has been elevating and swapping out tools that are already in the traditional visual effects pipeline, as well as helping pre-production in areas such as look and character development.
“The old traditional processes of doing this took a long time, you’d get very little iteration out of it,” Shapiro said. “Sometimes there’s too much choice now, because we have infinite choice. So our team is specially designed to help pull out a great idea and show it on screen so that they can get a feel for it and react to it.”
But Eyeline is also focused on finding solutions to the industry’s unsolved problems.
“Everything we do in terms of tools that we make or innovations that we push is in response to an actual storytelling need a filmmaker has,” Baillie added. “We’re not a company that’s going to show filmmakers some tech and be like ‘make a story around it.’ It’s the other way around.”

Though Eyeline prioritizes work on Netflix series and films, Shapiro said they’re still open to working with third parties, so long as the project provides an opportunity to deploy new technology or elevate their workforce and capabilities.
“It speaks to our slogan as part of this rebrand: make what’s next. It’s really finding the filmmakers and the projects that will help us make what’s next in a way that’s relevant in the industry and will give our teams and audiences as much satisfaction as possible,” Shapiro said. “It also speaks to our new logo. There’s a little chunk missing out of the middle E and that’s meant to represent the space that’s open for a filmmaker to complete the picture. That’s a very important part of figuring out what projects to really dive into.”
Despite fears about AI in Hollywood, the pair argued the current tools and capabilities available are not at the stage of producing professional long-form content on their own — nor will they be a permanent replacement for human creativity.
“Actors, cinematographers, production designers are so key to everything that we do. These are trades that are the soul of everything,” Shapiro said. “So as we look at new technologies, we’re making sure that we’re leveraging them in ways that are as authentic as they were 130 years ago when people started making movies. The places to look to change are how we can give filmmakers more scope and allow creatives to be more artistic and less mechanical.”
When asked about OpenAI’s Sora 2, Shapiro said it’s “just another video generation tool” out there. He added that Eyeline is “continually evaluating what is best-in-class” for creators to use and that there are “many different tools and software out there that help achieve the desired result.”
Baillie added that VFX artists have already been using machine learning and computer vision for decades and that generative AI is just one more tool that can enable filmmakers — especially in regional markets with smaller budgets — to bring their vision to the screen in ways that wouldn’t have otherwise been possible.
“I’ve been very surprised by how much the narrative that Hollywood is cooked with AI and all that stuff has actually sunk in with people,” he said. “What we see through all of the work that we do is that it’s so far from the truth. Whether it’s volume capture that doesn’t involve AI or traditional visual effects or things that are generative, these are tools that need to be driven by real artists with real taste and real artistic ability behind them. That will be true for as long as I can see.”