Christopher Nolan wants Oppenheimer to be a warning to Silicon Valley

by Janice Allen

Around the time J. Robert Oppenheimer learned that Hiroshima had been bombed (alongside everyone else in the world), he began to deeply regret his role in creating the bomb. At one point, during a meeting with Truman, Oppenheimer wept and expressed that regret. Truman called him a crybaby and said he never wanted to see him again. And Christopher Nolan hopes that when Silicon Valley audiences see his movie Oppenheimer (in theaters July 21) and his interpretation of those events, they also see something of themselves.

After a screening of Oppenheimer at the Whitby Hotel yesterday, Christopher Nolan joined a panel of scientists and Kai Bird, one of the authors of American Prometheus, the book Oppenheimer is based on, to talk about the film. The audience was mostly scientists, who chuckled at the film's jokes about the egos of physicists, but there were also a few reporters, including myself.

We listened to all-too-brief debates about the success of nuclear deterrence, and Dr. Thom Mason, the current director of Los Alamos, talked about how many of the lab's current employees had cameos in the film because so much of it was shot nearby. But toward the end of the conversation, the moderator, Chuck Todd of Meet the Press, asked Nolan what he hoped Silicon Valley would learn from the film. "I think what I'd want them to take away is the concept of accountability," he told Todd.

“Applied to AI? That is a terrifying possibility. Terrifying.”

He then clarified: "When you innovate through technology, you have to make sure there is accountability." He was referring to a wide range of technological innovations that Silicon Valley has embraced while those same companies have refused to acknowledge the harm they have repeatedly caused. "The rise of companies over the past 15 years bandying about words like 'algorithm' without knowing what they mean in any meaningful, mathematical sense. They just don't want to take responsibility for what that algorithm does."

He continued: "And applied to AI? That is a terrifying possibility. Terrifying. Not least because as AI systems enter the defense infrastructure, eventually they'll be charged with nuclear weapons, and if we allow people to say that that's a separate entity from the person who is handling, programming, and deploying that AI, then we're doomed. It has to be about accountability. We have to hold people accountable for what they do with the tools they have."

While Nolan wasn't referring to any specific company, it's not hard to guess what he's talking about. Companies like Google, Meta, and even Netflix rely heavily on algorithms to gain and retain audiences, and that dependency often has unforeseen, sometimes horrific consequences. Probably the most notable and truly terrible was Meta's contribution to the genocide in Burma.

“At least it serves as a cautionary tale.”

While an apology tour is all but guaranteed in the days after a company's algorithm has done something terrible, the algorithms themselves persist. Threads even launched with an exclusively algorithmic feed. Occasionally companies might give you a tool to disable it, as Facebook did, but these black-box algorithms remain, with very little discussion of all the possible bad outcomes and plenty of discussion of the good ones.

"When I talk to the leading researchers in the field of AI right now, they literally refer to this as their Oppenheimer moment," said Nolan. "They're looking to his story to say, what are the responsibilities of scientists developing new technologies that may have unintended consequences?"

“Do you think Silicon Valley thinks that now?” Todd asked him.

“They say they do,” Nolan replied. “And that is,” he chuckled, “that’s useful. At least that’s what the conversation says. And I hope that thought process will continue. I’m not saying that Oppenheimer’s story offers easy answers to these questions. But at least it serves as a cautionary tale.”
