Christopher Nolan gave his thoughts on a variety of hot-button AI issues in a recent chat with Wired, including the question of whether he’d be championing artificial intelligence in filmmaking.
He reiterated that he is the “old analog fusty filmmaker” sort of creative, but stressed that even if he’s not the one to push AI to its logical extremes, the technology is ripe for use in filmmaking.
“The whole machine learning as applied to deepfake technology, that’s an extraordinary step forward in visual effects and in what you could do with audio,” he said. “There will be wonderful things that will come out, longer term, in terms of environments, in terms of building a doorway or a window, in terms of pooling the massive data of what things look like, and how light reacts to materials. Those things are going to be enormously powerful tools.”
He did highlight one example in his filmmaking where AI VFX tech could be useful: more easily painting out the wires that are attached to actors for hazardous stunt work.
Nolan also addressed some common fears surrounding AI. He said he was optimistic about its utility as a tool, but reiterated that society would need to keep viewing it as a tool, so that the person wielding it shoulders responsibility for how it is used.
“If we accord AI the status of a human being, the way at some point legally we did with corporations, then yes, we’re going to have huge problems,” he said. He called it an “issue” that terms such as “algorithm” and “AI” are being used by companies as a way to dodge responsibility for their actions.
“The biggest danger of AI is that we attribute these godlike characteristics to it and therefore let ourselves off the hook,” Nolan said.
He also called out tech companies’ invitations for regulation as “very disingenuous,” noting that elected officials face fundamental knowledge gaps on these specialist, high-tech issues. His point was that asking governments to step in is a way for inventors to pass the buck and proceed on the basis that no one knew enough to tell them not to.
You can read the full interview here.