“Terminator” writer-director James Cameron fears that his classic film about artificial intelligence nearly destroying humanity could end up coming true if the real-world version of AI is ever connected to weapons systems.
At the same time, the “Avatar” director insists he doesn’t think AI could ever actually replace people, at least not artists.
Cameron’s comments come from a lengthy new interview with Rolling Stone, where he said in part, “I do think there’s still a danger of a Terminator-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defense counterstrike, all that stuff. Because the theater of operations is so rapid, the decision windows are so fast, it would take a superintelligence to be able to process it, and maybe we’ll be smart and keep a human in the loop.”
“But humans are fallible, and there have been a lot of mistakes made that have put us right on the brink of international incidents that could have led to nuclear war,” he added.
Cameron also explained that he considers “superintelligence,” meaning a version of AI that does not yet exist, an existential threat alongside climate change and nuclear weapons.
“We’re at this cusp in human development where you’ve got the three existential threats: climate and our overall degradation of the natural world, nuclear weapons, and superintelligence,” he said.
“They’re all sort of manifesting and peaking at the same time,” Cameron said, though he also wondered if “superintelligence is the answer,” adding that he’s “not predicting that, but it might be.”
Cameron also told the magazine that “people make a little too much about me predicting artificial intelligence being a bad thing, especially when associated with nuclear weapons. But we exist in that world right now, and whether a superintelligence can help us or whether it gets weaponized and put in charge of our missile defense because it can react much faster than we can, who knows? We could be entering that world as we speak.”
Despite those dire warnings, Cameron rejects at least one of the biggest claims made by AI boosters: that the technology could make human labor obsolete.
Cameron explained that while he’s “leaning into teaching myself the tools of generative AI so that I can incorporate them into my future art… I utterly reject the premise that AI can take the place of actors and take the place of filmmakers and all that sort of thing. So we always have to approach any technology as being potentially dangerous and potentially helpful.”