As it heads into the Oscars, what Oppenheimer gets right — and wrong — about the threat of nuclear weapons.
My wife and I went to see Oppenheimer on opening weekend in July, and I wore my best Los Alamos-themed costume. We opted for the biggest screen we could find, and as the Trinity test explosion rose endlessly upward, I was glad we had sprung for the more expensive IMAX ticket for the sheer spectacle. I could guess, even then, that this film was destined for this weekend’s Academy Awards, where it will compete for 13 Oscars, including Best Picture and Best Director.
But then the final scene rolled around, showing Los Alamos lab director J. Robert Oppenheimer’s vision of rows and rows of intercontinental ballistic missiles (ICBMs), the rockets zooming upward, and then the Earth from space, mushroom clouds blooming above the cloud layer and fire spreading across the globe.
As the strings rose to their final jarring crescendo, I found myself unexpectedly and uncontrollably bawling my eyes out. I sat in my chair at the London Science Museum cinema, my shoulders shaking, speechless with tears, trying to gasp out to my wife between sobs what on earth was going on with me. “It’s all still real,” I told her. “The weapons are still there. In 12 minutes they could kill us and everyone we love. The entire Northern Hemisphere would be gone. We’d all starve to death. Five billion people could die.”
I’ve worked at the University of Cambridge’s Centre for the Study of Existential Risk for the past seven years, where we study risks that could lead to human extinction or civilizational collapse — and how to prevent them. Most of our conversations at the pub across the road from the office are about climate change, engineered pandemics, artificial intelligence, and yes, nuclear weapons.
The most common question I get asked is “Doesn’t working on this make you all depressed?” I normally answer with a joke, but the truth is that while those conversations are normally academic, abstract, distanced, and intellectual, Oppenheimer hit me emotionally.
The USA and Russia each have around 1,500 warheads deployed and ready to launch. Each of these is a hydrogen (or thermonuclear) bomb, roughly 20 times more destructive than the weapons produced by Oppenheimer’s Manhattan Project that destroyed Hiroshima. Many sit atop ICBMs in land-based silos (in the US, in Montana, North Dakota, Wyoming, Nebraska, and Colorado) that are kept ready for “launch on warning.”
What this means is that if a possible incoming nuclear strike is detected, the president, while being rushed to the White House bunker, has perhaps 12 minutes to decide whether to launch all of these missiles. Unlike nuclear missiles on submarines, which are designed to evade a first strike, this is a use-it-or-lose-it situation: the silos will get blown up at the end of those 12 minutes. What could happen next is unimaginable, save for the scientists who have imagined it.
You may remember the theory of “nuclear winter” from the 1980s. Proposed by Carl Sagan and other US and Soviet scientists, it suggested that the smoke from burning cities could rise up into the atmosphere and block out the sun. Scientists learned during World War II that if conditions are right, a burning city after a bombing can become a firestorm, a freak weather phenomenon where the heat of the fire sucks in air from surrounding areas in a great wind, fueling the fire like a bellows.
The nuclear winter theory takes this further: Destroying cities with a hydrogen bomb might also lead to firestorms. That great wind could carry millions of metric tons of black carbon soot from these burnt cities far up into the stratosphere. Once there, above the rain clouds, that soot might stay suspended for up to a decade. If so, it would act like a massive volcanic eruption or asteroid impact, blotting out the sun and reducing crop yields.
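To make that mechanism concrete, here is a minimal sketch in Python of the story the modelers tell: a one-off soot injection that decays only slowly (there is no rain in the stratosphere to wash it out), dimming the sun for as long as it stays aloft. The 150 teragram injection is the full-scale US-Russia scenario used in the recent literature; the residence time and dimming coefficient are placeholder values of my own, not outputs of any real climate model.

```python
import math

# Toy sketch of stratospheric soot after a full-scale nuclear war.
# 150 Tg is the injection used for the US-Russia scenario in the
# recent literature; the other two parameters are illustrative
# placeholders, not values from any actual climate model.
SOOT_INJECTED_TG = 150        # teragrams of black carbon lofted into the stratosphere
RESIDENCE_TIME_YEARS = 4.0    # hypothetical e-folding time for soot settling out
DIMMING_PER_TG = 0.004        # hypothetical fraction of sunlight blocked per Tg aloft

for year in range(11):
    soot_aloft = SOOT_INJECTED_TG * math.exp(-year / RESIDENCE_TIME_YEARS)
    dimming = min(1.0, soot_aloft * DIMMING_PER_TG)
    print(f"year {year:2d}: ~{soot_aloft:5.1f} Tg aloft, ~{dimming:.0%} of sunlight blocked")
```

Even as the soot settles out, the sun stays dimmed for years on end, long enough to wreck several consecutive growing seasons.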
Scientists have since revisited the theory with today’s much more powerful physics simulations and climate models. Their conclusion?
In a 2022 Nature Food paper, researchers led by Lili Xia found that a full-scale nuclear war could reduce crop yields by between 50 and 90 percent and kill 2 billion to 5 billion people. At the top end, this would mean north of 90 percent of the populations of the US, Europe, Russia, and China starving to death. Because of global climate patterns, much of this would be concentrated in the Northern Hemisphere, unfortunately where I and most of my family and friends live.
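The arithmetic behind numbers like that is brutally simple. Here is a deliberately naive version: assume the surviving food supply feeds a proportional share of the world’s population. The 50 to 90 percent yield-loss range is from the paper; the proportionality assumption is my own crude simplification. The paper’s actual model is far more sophisticated, accounting for things like trade, stored food, and livestock feed being redirected to humans, which is partly why its top-end toll comes out lower than this naive calculation’s.

```python
# Naive famine arithmetic: assume post-war food supplies feed a
# proportional share of the world's population. The 50-90 percent
# yield-loss range is from Xia et al. 2022; the proportionality
# assumption is my own crude simplification, not the paper's method.
WORLD_POPULATION_BILLIONS = 8.0

def naive_starvation_toll(yield_loss: float) -> float:
    """Billions who starve if food feeds a proportional share of people."""
    return WORLD_POPULATION_BILLIONS * yield_loss

for loss in (0.5, 0.9):
    print(f"{loss:.0%} yield loss -> ~{naive_starvation_toll(loss):.1f} billion starving")
```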
This is not a closed debate: The Future of Life Institute, which also studies existential risks, recently awarded about $4 million to 10 university groups to explore this in more detail. But it is a scenario that has to be taken seriously. And it was this scenario that left me sobbing and gasping at Oppenheimer’s end.
Oppenheimer as a warning
Since Oppenheimer’s release, the film’s director, Christopher Nolan, has actively drawn attention to the continued dangers of nuclear war. Nolan spoke on just this subject at the Bulletin of the Atomic Scientists’ annual Conversations Before Midnight, right before the main event of the evening: joint headliner yours truly (my co-recipient of the Rieser Award, Christian Ruhl, had talked me out of making too many jokes about Nolan). In his BAFTA acceptance speech last month, Nolan thanked his cast (standard), his crew (classy), and his producers (shrewd), but ended by thanking a different group. He said:
Our film ends on what I think is a dramatically necessary note of despair. But in the real world there are all kinds of individuals and organizations that have fought long and hard to reduce the number of nuclear weapons in the world. Since its peak in 1967, they’ve done it by almost 90 percent. Of late that’s gone the wrong way. And so in accepting this I want to acknowledge their efforts and point out that they show the necessity and the potential of efforts for peace.
In 1991, as the Cold War came to a sudden end, the Doomsday Clock, created by the Bulletin more than 70 years ago to draw attention to the threat of nuclear holocaust, was set back to 17 minutes to midnight. The plan for nuclear arms negotiators from there on was clear: keep negotiating bilaterally to bring the US and Russian stockpiles down from their all-time high of over 60,000 combined to around 200 warheads each, on par with most other nuclear states, then move to multilateral negotiations to get the numbers as close to Global Zero as possible while preserving the possibility of deterrence. But as Nolan noted with marvelous English understatement that night at the BAFTAs: “Of late that’s gone the wrong way.”
From the Anti-Ballistic Missile Treaty to the Intermediate-Range Nuclear Forces Treaty to the Open Skies Treaty, the key arms control pacts that constrained Russia and the US have been torn up in the 21st century. The only nuclear treaty left, the New START Treaty, expires automatically on February 5, 2026. And last year, Russian President Vladimir Putin suspended Russia’s participation in the treaty, including its inspections regime.
Nuclear risk experts warn that we are entering a new arms race. China is building hundreds of new silos in its northern deserts and could be increasing its number of operational warheads from around 500 to around 1,000. All nuclear weapon states are in the middle of nuclear “modernization”: replacing old warheads, ICBMs, bombers, and submarines, and making tweaks like better fuzes to control blast timing in ways that will make the warheads more damaging. New technologies, such as highly maneuverable hypersonic missiles or the integration of AI into nuclear decision-making, could threaten strategic stability.
Meanwhile, the nuclear risk reduction community is not doing well. The biggest funder in the field, the MacArthur Foundation, pulled out in 2021, declaring that its “Big Bet” had not paid off. MacArthur had provided around half of all the non-government funding worldwide on nuclear policy. Other funders such as Longview have stepped up, but have not been able to plug this gaping hole.
One major benefit of Oppenheimer, especially if it takes home Oscar gold this weekend (as it deserves to), is that it will continue to draw attention to the bizarrely neglected area of nuclear risk.
What Oppenheimer gets wrong about the Manhattan Project
Oppenheimer is a film that affected me deeply and has done great good. Will I really be so churlish as to criticize it? Yes, I will. Not over its depiction of Florence Pugh’s Jean Tatlock, its choice not to include Japanese perspectives, or its being “too long,” but over two subjects it didn’t touch on.
First, the true story of the production of the bomb.
Like most films and TV shows about the Manhattan Project, Oppenheimer focuses far too much on Oppenheimer and his Los Alamos scientists. The film never shows Oak Ridge, Tennessee, or Hanford, Washington. But these production plants, for enriched uranium and plutonium respectively, accounted for more than 80 percent of the Manhattan Project’s budget and most of its staff. It is these locations and people that actually produced the bomb. The Nazis had smart scientists. What they didn’t have was the US’s vast industrial and financial resources. Yet in the film, these remarkable achievements are reduced to marbles that Cillian Murphy’s Oppenheimer places in a bowl to track the production of the nuclear fuel.
It was these industrial workers who built the bomb. But after the war, it was the privileged genius scientists who were on the front cover of Time magazine, and it is their stories that shaped how we remember the Manhattan Project.
This misremembering has important modern implications. It is production that determines “breakout time”: how long until a state has enough nuclear material for a bomb. Arms control regimes like the Iran nuclear deal and the Non-Proliferation Treaty focus on monitoring and controlling production facilities, as did the 1981 Israeli airstrike on Iraq’s Osirak reactor and the 2010 Stuxnet cyberattack on Iran’s enrichment plant at Natanz. It is production that should be our focus.
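To see why, it helps to look at how analysts actually estimate breakout time: it is essentially the ratio of the fissile material needed for a bomb to the rate at which production facilities can make it. Here is a minimal sketch in Python, using the IAEA’s 25 kg “significant quantity” for weapons-grade uranium and a round 200 separative work units (SWU) per kilogram; the capacity figures are hypothetical, chosen only to show the shape of the calculation.

```python
# Toy breakout-time estimate: how long a state needs to enrich one
# bomb's worth of weapons-grade uranium. 25 kg is the IAEA
# "significant quantity"; ~200 SWU/kg is a round figure for enriching
# natural uranium to ~90 percent. The capacity numbers below are
# hypothetical, for illustration only.
SIGNIFICANT_QUANTITY_KG = 25
SWU_PER_KG = 200

def breakout_days(swu_per_year: float) -> float:
    """Days to produce one significant quantity at a given enrichment capacity."""
    swu_needed = SIGNIFICANT_QUANTITY_KG * SWU_PER_KG
    return swu_needed / (swu_per_year / 365)

for capacity in (5_000, 10_000, 20_000):  # installed SWU per year (hypothetical)
    print(f"{capacity:>6} SWU/yr -> ~{breakout_days(capacity):4.0f} days to one bomb")
```

Halve a state’s enrichment capacity and you double its breakout time. That is exactly the lever that the Iran deal’s centrifuge caps, the Osirak strike, and Stuxnet all pulled.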
Second, the film presents the US as desperately behind the Nazis, who have, according to Oppenheimer, an “18-month head start” in the nuclear arms race, and suggests that “in a straight race, the Germans win.” In the film, the Nazis lose only because German physicist Werner Heisenberg “took a wrong turn” by choosing heavy water rather than graphite as a moderator, as revealed by Kenneth Branagh’s Niels Bohr at Christmas 1943.
But we’ve known for decades that none of this was true: The Nazi leadership had decided against a serious bomb program after a presentation to armaments minister Albert Speer in June 1942.
The Nazis had indeed explored launching their own Manhattan Project: the “Uranverein” nuclear weapons program led by Heisenberg. But Heisenberg and Germany’s military planners predicted that it would be a major investment that would only pay off in two to three years. The Nazis could not spare such a massive investment of people and raw materials like steel. They had a much smaller economy, huge shortages, and a greater need for shells and tanks. And they couldn’t wait for 1944 or ’45; they needed a breakthrough on the Eastern Front right then.
This was not a technical error, but a strategic decision. The US was at no risk of losing the race, as no other great power (the Soviet Union, Imperial Japan, or Nazi Germany) rushed toward the bomb during the war. Moreover, in a straight race, there was no possible way for any of them to compete with the vast industrial and financial might of the US.
The film could have explored this tragic mistake while keeping its laser focus on Oppenheimer. The official historian of the Manhattan Project stated that at the end of 1943 Oppenheimer was explicitly told by Gen. Leslie Groves, the director of the atom bomb program, that the Nazis had abandoned their early program — and Oppenheimer just shrugged.
The Manhattan Project’s lost conscience
Most egregiously for me, the film doesn’t show Joseph Rotblat, the only scientist to resign from the project. Rotblat was a Polish refugee whose wife, Tola Gryn, was murdered in a Nazi concentration camp; if anyone had cause to want the bomb built, it was him. Yet when D-Day made clear that the Nazis would lose, and Groves told him that the focus of the project had always been the Soviets, Rotblat resigned.
In 1957, he organized a conference in the small lobster fishing village of Pugwash in Nova Scotia. The Pugwash Conferences would go on to spread key ideas for arms control agreements on nuclear testing, limiting warheads, and banning biological weapons. Rotblat and Pugwash shared the 1995 Nobel Peace Prize — one more Nobel than Oppenheimer ever received.
Again, this misremembering has important modern implications. The lesson of this tragic mistaken arms race shouldn’t be “never race.” It should be “make sure you know whether or not you’re actually in a race.” This is a lesson we have consistently failed to learn — the US mistakenly thought there was a “missile gap” in the late 1950s, and the Soviets mistakenly thought they were in a biological weapons arms race in the early 1970s. In the coming years, states may mistakenly believe they are in a race to develop powerful advanced artificial intelligence systems.
The “individuals and organizations that have fought long and hard to reduce the number of nuclear weapons” that Nolan paid tribute to (like the Bulletin of the Atomic Scientists, Joseph Rotblat, and the Pugwash Conferences) weren’t really shown in his remarkable, magnificent, affecting film. Indeed, Oppenheimer himself never supported them. We’re in a tough spot, facing the possibility of a new nuclear arms race. We need to learn from these successful arms controllers, rather than from Oppenheimer’s failures.