President Obama had made it clear in advance that he would not apologize when he became the first sitting US president to visit Hiroshima in May 2016. His position followed that of eleven prior administrations and was vocally supported by China and Korea (Japan’s principal war victims) and by many US veterans’ groups.
But Hiroshima is complicated: apology or no, Japan was officially happy to host the president, and many expressed gratitude that he could acknowledge the pain and horror experienced by civilian casualties of the 1945 bombings. Obama concluded with a prayer for “a future we can choose, a future in which Hiroshima and Nagasaki are known not as the dawn of atomic warfare but as the start of our own moral awakening.”
The long delay and great delicacy of this visit speak volumes. American presidents have never hesitated to visit other major battlefields of World War II, especially sites of Allied triumph. Nuclear weapons are different, in the scale of their destruction, the health and environmental effects of radiation, and the risk of reprisals. And Hiroshima was not a site of great American courage. Though committed in wartime for the purpose of defeating an aggressive and brutal Japanese empire, the atomic bombings of Hiroshima and Nagasaki were massive, poisonous attacks on very soft civilian targets.
After Hiroshima, the Cold War rivals somehow both built and refrained from using nuclear arsenals that really could have destroyed large parts of humanity. The superpowers’ leaders feared reprisals, it is true, but their abstinence was not just due to the “balance of power” or isolated cases of restraint. A long series of leaders came to realize, in the manner of Clausewitz, that there would be no political point to victory if nuclear weapons were used: even the “victors” would lose, and become pariahs to the rest of the world. So the US and USSR, and their allies, made a series of agreements to reduce the risk that anyone would use the weapons. Even Ronald Reagan, who’d once suggested that “we could pave the whole country [of North Vietnam] and put parking strips on it,” became a “nuclear abolitionist” in the words of the historian John Lewis Gaddis. As President, Reagan negotiated some of the most sweeping arms reductions of the Cold War.
These were acts of self-interested self-restraint, made between parties that continued to disagree about much else, in service of larger goals. Perhaps more important, these actions signaled that nuclear weapons were undesirable (ruinously expensive, difficult to manage, unusable), at least to countries that aspired to participate in the global economy. And that may be the Cold War’s most important lesson for the future.
The story of President Truman’s decision to drop atomic bombs is as necessary and irresistible as Pandora’s Box. We are riveted by its ethical questions: whether it was right or wrong; whether it was necessary or proportionate to US war aims; how many US lives were saved by the sacrifice of two or three hundred thousand Japanese people; whether it violated international treaties against the targeting of civilians and the use of chemical weapons (reading “chemical” to include “radiation”); whether the US should have demonstrated the bomb first, given a clearer warning, or clarified its surrender demand with respect to the status of the Japanese emperor (who was ultimately allowed to remain as head of state, subject to US occupying authority). Unfortunately, as with the Smithsonian’s ill-fated 50th anniversary exhibition, the public discussion in the US often gets reduced to two groups talking past each other, as though respect for military service and avoiding unnecessary civilian casualties were mutually incompatible.
But Hiroshima’s most pertinent lesson is actually this: no future leader will face a situation like Truman’s. The unprecedented decision about whether to drop atomic bombs on Japanese cities was made in wartime by an intelligent and humble but completely unprepared man who had known about the weapon for less time than it takes to teach the first semester of college physics, during which time he also happened to be employed as President of the United States. No future leader will write on such a blank slate.
When he became President in April 1945, Truman was not even well-prepared to follow the course he publicly promised and intended: to follow Roosevelt’s own policies and plans as closely as possible. Indeed, he had barely been part of the administration. He had only become Vice President that January; he had met with President Roosevelt just a couple of times after the 1944 election; FDR hadn’t even invited him to the Yalta summit with British Prime Minister Churchill and Soviet Chairman Stalin in February 1945. (Truman’s lack of preparation is a scandal of its own. Roosevelt’s inner circle, including the sainted Eleanor, were well aware that the President had been forced to take a six-week “vacation” in the spring of 1944 due to serious heart disease, and kept his condition from public scrutiny through the 1944 election. Even if electoral considerations were a sufficient justification for this non-disclosure (a highly debatable proposition), they would provide no excuse for failing to prepare a new Vice President of the same party to take on his most important responsibilities.)
Among other things, Truman had not lived through the history of the Manhattan Project. As Albert Einstein’s famous 1939 letter to President Roosevelt had suggested, this vast initiative had always been motivated by the fear that Nazi Germany, which had the expertise and resources, would build a bomb first. An American weapon could deter the Germans from using theirs. But Germany fell in May 1945, before either side had succeeded. Regarding Japan, which was understood to have no nuclear program, the best indication of FDR’s thinking was a rather inscrutable summary of a private discussion with Churchill in September 1944, to the effect that the bomb “might perhaps, after mature consideration, be used against the Japanese, who should be warned that this bombardment will be repeated until they surrender.” We don’t, of course, know what Truman or his advisors would have made of this statement if they had seen it. In hindsight, however, it is hard to miss Roosevelt’s hedging: “might,” “perhaps,” “after mature consideration,” and, most notably, that the Japanese “should be warned.”
The Manhattan Project was cloaked in great secrecy, improbably so, with its massive tasks divided among three very remote major sites (Los Alamos, New Mexico; Hanford, Washington; and Oak Ridge, Tennessee) and dozens of others. Only a few top-level scientists and military officers knew with certainty that the project was to build an atomic bomb. Truman wasn’t among them. (Ironically, Stalin had high level spies inside the Project, so that when Truman finally “revealed” the bomb’s existence to Stalin at Potsdam in July 1945, it only served to reinforce Stalin’s lack of trust.)
In the event, Truman first learned of the bombs in a forty-five-minute briefing from Secretary of War Henry Stimson and the Manhattan Project’s military leader General Leslie Groves on April 25, 1945, almost two weeks after he’d been sworn in. Coming in as late as he did, Truman chose to defer to the process then in place, postponing consideration of political and foreign-policy implications. His direct advisors were defense leaders. Stimson convened a panel of eight civilian officials and a separate four-person scientific advisory panel, which met for about a month in May 1945. These groups featured no voices of caution. Churchill likewise recalled there was “unanimous, unquestioned agreement” about using the bomb to compel Japanese surrender.
The White House announcement of the bombing, delivered while Truman was still steaming back from the Potsdam conference, reflected this military focus.
Sixteen hours ago an American airplane dropped one bomb on Hiroshima, an important Japanese Army base. That bomb had more power than 20,000 tons of TNT. It had more than two thousand times the blast power of the British “Grand Slam” which is the largest bomb ever yet used in the history of warfare.
The Japanese began the war from the air at Pearl Harbor. They have been repaid many fold. And the end is not yet. With this bomb we have now added a new and revolutionary increase in destruction to supplement the growing power of our armed forces. In their present form these bombs are now in production and even more powerful forms are in development.
It is an atomic bomb. It is a harnessing of the basic power of the universe. The force from which the sun draws its powers has been unloosed against those who brought war to the Far East. . . .
This was a remarkable statement for many reasons, including its righteous and threatening language, its factual inaccuracies, and its low-key delivery. But: talk about burying the lead! The technology that will forever hang over civilization merits no mention before the third paragraph.
The statement focused instead on winning the current war, intimidating the enemy, and reiterating the enemy’s aggression and the justifications for violence. The inaccuracies crept in, apparently for these very purposes. The bomb actually targeted and hit the center of a major city, not a nearby army base. The US had no actual production capability for atomic bombs; just two were then available, each with a different experimental design. The Hiroshima device also didn’t employ nuclear fusion like the sun, although fusion weapons were already on the drawing board.
Truman and his team did have legitimate strategic objectives: to end the Japanese war; to do so before the Russians entered and demanded influence or territorial advantage. And, as Truman’s bombing announcement repeatedly emphasized, the US had a clear casus belli. But the leadership punted on the really difficult question of what the decision might mean for the world’s future safety. His statement went no further than to say that he would make some proposals to the US Congress on that subject, as though the rest of the world might have nothing to say about it.
Americans naturally celebrated Japan’s sudden surrender. About the bombs, however, there were a few voices of concern from the first. “All thoughts and things are split,” wrote James Agee in Time magazine. Conservative commentator David Lawrence condemned the bombing of thousands of civilians in US News. “[W]e shall not soon purge ourselves of the feeling of guilt which prevails among us. Military necessity will be our consistent cry in answer to criticism, but it will never erase from our minds the simple truth that we, of all civilized nations, though hesitating to use poison gas, did not hesitate to employ the most destructive weapon of all times indiscriminately against men, women and children. What a precedent for the future we have furnished to other nations . . .”
A year later, The New Yorker devoted an entire issue to John Hersey’s “Hiroshima.” A classic of what would later be called “New Journalism,” Hersey’s piece followed six residents of the city through the day of the bombing and afterwards. Beyond personalizing the attack on specific civilians, Hersey spotlighted the effects of radiation which, as Lawrence had perhaps intuited, bore a stark resemblance to the chemical weapons that had been condemned by treaty since World War I. In the wake of the strong public response to the New Yorker article, the retired Stimson was drafted to write a semi-official response for Harper’s, soberly walking through the decision-making process that led to the bomb’s use against Japan. Stimson’s article was also well-received and, though somewhat self-serving, performed a public service by opening the door on what had been a top-secret decision of monumental public importance. However, Stimson offered no comfort for the future; after Hersey’s article, no one could claim ignorance of the consequences.
The culture soon reflected these technological fears. Apocalyptic tales have been a religious and literary staple from the beginning of time, but now these fears could be made specific. Uncontrollable technology, driven by witless and/or power-hungry scientists, became an enduring plot device for drama and satire. Among the first was the Japanese movie “Godzilla” (1954), in which the monster’s power came from nuclear fallout. This was practically a current event: radiation from a 1954 US thermonuclear test in the South Pacific sickened the crew of a downwind Japanese fishing boat, one of whom died months later. Later: “The Blob” (1958), a cult classic with Steve McQueen and some comically bad special effects involving carnivorous goo; the dark comedy “Dr. Strangelove” (1964), with Peter Sellers playing three roles including the title’s mad-bomber physicist; “2001: A Space Odyssey” (1968), featuring HAL, a mutinous spaceship computer; “The China Syndrome” (1979) with Jack Lemmon and Jane Fonda, an almost too realistic foreshadowing of the Three Mile Island nuclear power plant accident; “Blade Runner” (1982), in which no one can tell the humans from the artificial “replicants”; “Jurassic Park” (1993) with aggressive dinosaurs re-created genetically rather than from nuclear waste.
But nuclear weapons and other exotic quasi-scientific speculation weren’t necessary to promote mass fear. The famous air raids upon London and Pearl Harbor at the outset of World War II had been answered, on a much larger scale: US and British air forces had dropped incendiary bombs in carefully designed circular patterns to create unearthly firestorms in Dresden and Tokyo, which left tens of thousands of civilians dead. These “conventional” attacks inspired their own line of antiwar literature, like Kurt Vonnegut’s Slaughterhouse Five and Isao Takahata’s film “Grave of the Fireflies.”
The French New Wave film “Hiroshima, Mon Amour” (1959) perhaps best combines these themes of fear, guilt and individual powerlessness in the nuclear age. The film opens with images of the atomic bomb’s radiation victims, rarely shown in the US, and bright scenes of peace demonstrations, part of the real worldwide movement for a nuclear test ban in the late 1950s. Two lovers - a French actress, in Hiroshima to film a “peace movie,” and a Japanese architect - recount their World War II experiences in simple repetitive phrases, stories they have never told their spouses because war no longer makes sense in the new “normal” world. While he was serving elsewhere in the Japanese army, the man’s family and hometown were obliterated by a horrifying, humiliating and unimaginable event. In the war's aftermath, the actress was ostracized by ugly crowds in her pretty little French town because of her teenage relationship with a German soldier during the occupation. In the film’s last scenes, the lovers name each other after their hometowns, Nevers and Hiroshima, tying together the fates of now powerless peoples.
The comparison of the conflicted war histories of France and Japan shows that an ordinary person’s life (simply living in the Japan or France of the time) may tempt fate in extraordinary ways. When the man asks the woman about her reaction to the bombing of Hiroshima, she says she was amazed that they, the Americans, built it, and she was amazed that they used it. Amazed, in other words, at what the US had proved capable of, both in achievement and in destruction. And if that was the reaction of a liberated ally, how might a potential enemy respond?
The reduction of nuclear risk has two general components: reducing the number of opportunities in which anyone might consider using the weapons, and reducing the likelihood that, in each case, the actor will decide to use them. It’s vitally important to do both for a rudimentary statistical reason: even events with a low likelihood in each case become likely to occur with enough chances. For example, if we (arbitrarily) assume there is a 10% probability that a nuclear state will use nuclear weapons in any given confrontation (90% chance of non-use), and if each such decision is independent of all others, after just the seventh standoff, it would be more likely than not that someone would have dropped a bomb on at least one occasion. By the 15th, the likelihood of at least one bombing approaches 80%.
To be sure, these assumptions are wrong. I presume that the likelihood of non-use in any given situation is much higher than 90%. Moreover, each decision is far from independent of the others, because national leaders have been well aware of history and precedent. But the principle remains: the greater the number of chances that someone will use a nuclear bomb, the greater the likelihood that at least one will be used. And the wider the number of nations with nuclear capacity, the greater the number of potential confrontations.
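The compounding described above is easy to check. Here is a minimal sketch in Python, using the text's purely illustrative 10% per-confrontation figure (not a real estimate of nuclear risk):

```python
def prob_at_least_one_use(p_use: float, n_confrontations: int) -> float:
    """Probability that at least one of n independent confrontations
    ends in nuclear use, given a per-confrontation probability p_use."""
    return 1 - (1 - p_use) ** n_confrontations

# With a 10% chance per standoff, the odds of at least one use
# pass 50% at the seventh standoff and near 80% by the fifteenth.
for n in (6, 7, 15):
    print(n, round(prob_at_least_one_use(0.10, n), 3))
# → 6 0.469 / 7 0.522 / 15 0.794
```

The same arithmetic runs in reverse: keeping cumulative risk low requires shrinking both the per-event probability and the number of events, which is the essay's point.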
In the shadow of Hiroshima, the nuclear strategies of successive Cold War leaders all evolved in a consistent way. The more they learned, the more reluctant they became to engage in brinksmanship, and the more willing they were to look for other solutions to conflict, often against the advice of their own more hawkish advisors.
When deciding whether to fund the vastly more powerful thermonuclear device known as the “super,” in 1949, President Truman reasoned that if the US could build the device, it must. “We had got to do it -- make the bomb -- though no one wants to use it. But . . . we have got to have it if only for bargaining power with the Soviets.” While this was quite the opposite of what peace advocates wanted, it was also a far cry from the Truman who often claimed that he’d never lost a moment of sleep over the decision to bomb Japan. The super was now part of the strategy of containing the Soviet Union, an effort to prevail without starting World War III. Winston Churchill, who had known about the Manhattan Project from the beginning and voiced no objections to the atomic bombing of Japan, came to agree that the thermonuclear devices were unusable. But he felt that the “element of equality” with the Soviet Union offered reason for hope.
At the outset of the Korean War, Truman initially fell back upon a reflexive answer to a reporter’s question about nuclear weapons, deferring to military leadership by saying that everything was on the table. He promptly backtracked from that stance, not wishing to expand that particular war. (As would be the case in Vietnam, there was fierce disagreement between those who didn’t want to risk open war with the Soviet Union and China, and those who felt victory was worth that risk.) Months later, US General Douglas MacArthur, field commander in Korea, proposed dropping a couple of dozen atomic bombs along the Chinese-North Korean border, to leave a toxic wasteland that would isolate and prevent reinforcement of the Chinese and North Korean troops. MacArthur may have been posturing. His horrifying escalation proposal, along with numerous public disagreements with the Truman cabinet that bordered on insubordination, essentially forced Truman to relieve him. (MacArthur’s public didn’t care, showering him with praise and tickertape parades on his return. Truman, more unpopular than ever, dropped out of the 1952 presidential race.)
Dwight Eisenhower, probably the most experienced general ever to occupy the White House, underwent a similar transformation. In the words of Jim Newton, a sympathetic biographer: “Faced with the awesome implications of the Soviet Union’s ability to wage nuclear war, Eisenhower changed. The nuclear enthusiast of 1953 had become a more sober leader by 1956.” Eisenhower had once said that the military could use nuclear weapons “exactly as you would use a bullet or anything else.” Later he rejected multiple staff policy proposals that would condone the limited use of nuclear weapons, seeing such decisions as having unique international political risks as well as the risk of annihilating escalation. “We must now plan to fight peripheral wars on the same basis as we would fight a general war.” The historian John Lewis Gaddis interprets this as a reflection of Clausewitz’s dictum that war must serve political ends: there could be no political end in the end of civilization. By making every decision all or nothing, “maximum massive retaliation,” Eisenhower believed that both sides would refrain from nuclear war.
This policy seemed crazy to many (see “Dr. Strangelove”), but it took the twin Kennedy-Khrushchev crises of Berlin in 1961 and Cuba in 1962 to show that limited nuclear war might have been even riskier.
When the Berlin Wall went up overnight in 1961, the US conventional forces then stationed in Germany had no realistic hope of holding back the Red Army should it seek to take over the entire city. President Kennedy sought alternatives to Eisenhower’s massive first strike policy. A September 1961 report by Carl Kaysen, a Harvard economics professor then working for the Defense Department, laid out a plan that, if executed without mistakes, gave the US “a fair probability of achieving a substantial measure of success” in degrading Soviet nuclear capabilities, at a cost of between half a million and a million Soviet casualties. Allowing that the Soviets might not see this as a show of US restraint, Kaysen estimated that Soviet retaliation on American cities might produce losses of five to ten million people. Although some administration officials were appalled, Kennedy approved backup plans that included, in their final stage, the use of tactical nuclear arms.
That crisis receded, but just months later, Khrushchev approved the secret installation of short- and medium-range missiles in Cuba, weapons that could easily reach New York and Washington.
When US surveillance discovered the missile installations in the fall of 1962, Kennedy did not object through back channels as Khrushchev had expected. Instead he took to the airwaves to make a public demand for withdrawal of the missiles, intentionally leaving himself little room to back down, and therefore raising the stakes for Khrushchev. Against the advice of his more hawkish advisors, who advocated immediate bombing of the installations, Kennedy imposed a “quarantine” of Cuba, a clever relabeling of what amounted to a blockade, which would have been an official act of war. Even with delicate phrasing, the parties remained a step away from the brink.
Fortunately, Soviet resupply ships turned back. But four Soviet submarines, carrying nuclear-tipped torpedoes, were ordered to evade the “quarantine” and supplement forces already deployed in Cuba. The subs’ commanders did not have clear instructions about what to do in the event of a confrontation, so it is also extremely fortunate that they did not fire when the US Navy forced them to surface. Later, Kennedy displayed similar restraint when a US U-2 plane was shot down. In these extremely tense two weeks, a deal was struck: removal of the missiles from Cuba, a US pledge not to invade Cuba, and a secret US pledge to later remove intermediate-range missiles from Turkey (which submarine-based missiles would effectively replace).
This terrifying episode convinced Robert McNamara, Defense Secretary to Presidents Kennedy and Johnson, of the futility of even limited use. McNamara had initially believed that the US and USSR might abide by rules of war without nuclear weapons, but after Cuba he saw there was no controlling what might happen in a crisis. He converted to an explicit proponent of “mutually assured destruction,” effectively Eisenhower’s policy.
A similar evolution took place on the Soviet side. After the Cuban crisis, Khrushchev came to the view that a nuclear war, once started, could not be limited. He sharply rebuked Castro for suggesting a pre-emptive strike: “There will always be a counterstrike. . . . Only a person who has no idea what nuclear war means . . . can talk like this.” This did not make Khrushchev’s comrades any happier about his improvised saber-rattling.
While on a Black Sea vacation in 1964, “crazy Nikita” was summoned back to Moscow by his well-organized deputies. It was to his credit, Khrushchev thought, “that they were able to get rid of me simply by voting. Stalin would have had them all arrested.” The new leadership under Leonid Brezhnev prized order and discipline; they would be more careful to avoid high-stakes conflicts with the West.
The physicist Leo Szilard, who had urged Einstein to write his 1939 letter to President Roosevelt, also wrote a letter to President Truman in July 1945. Now that Germany had surrendered, Szilard asked Truman not to use the bomb in Japan unless Japan had refused detailed public terms of surrender, and only after considering that “a nation which sets the precedent of using these newly liberated forces of nature for purposes of destruction may have to bear the responsibility of opening the door to an era of devastation on an unimaginable scale.”
Truman probably did not see Szilard’s letter. If he had, he probably would have handed it off to the advisory committee, led by the Secretary of War. And the truth is that the funding and resources needed for scientific and development work would always put the government in control of the projects. Exactly this had happened with the Manhattan Project itself; General Groves effectively decided when and how the bomb would be used.
After the war, several prominent groups advocated for international bodies to regulate nuclear technology. The most prominent was a report by Dean Acheson, then US Under Secretary of State, and David Lilienthal, chair of the Tennessee Valley Authority, one of the largest electricity producers in the country. Their report advocated, among other things, sharing scientific knowledge about nuclear energy and establishing a system of inspections by a United Nations agency to ensure that the technology would be used peacefully. The American financier Bernard Baruch, whom President Truman had appointed to represent the US at the UN, modified the plan in its last stages, leading to its rejection by the Soviet Union. But Stalin, though without his own nuclear weapons at the time, might have declined to participate in any such plan, just as he declined to join postwar international economic recovery programs.
The idea did not die. In a 1950 letter to the United Nations, Niels Bohr, a Nobel laureate in physics, advocated for international transparency to reduce the chance of nuclear war. “The ideal of an open world, with common knowledge about social conditions and technical enterprises, including military preparations . . . will . . . obviously be required for genuine co-operation on progress of civilization . . .”
A few years later, after both the US and USSR had detonated fusion bombs thousands of times more powerful than the Hiroshima device, Albert Einstein and the philosopher and mathematician Bertrand Russell (each an international celebrity in his own right) issued a manifesto calling for nothing less than the abolition of war. Short of that, “the abolition of thermo-nuclear weapons, if each side believed the other had carried it out sincerely, would lessen the fear of a sudden attack in the style of Pearl Harbour . . .”
The internationalists, scientists and others, were accused of being naïve, or worse, in matters of foreign policy, especially during the frosty 1950s. There was reason for skepticism, too. After World War I, the victors had tried to put arms controls in place. The Versailles treaty forbade Germany from re-arming, chemical weapons were banned, and the major powers entered into the Washington Naval Treaty limiting major warship construction (freezing advantages then enjoyed by the US and Britain). But by the mid-1930s, Germany had renounced Versailles and Japan had pulled out of the Washington Naval Treaty.
The imposition of terms by the World War I victors was not a completely fair comparison to a multilateral treaty. In any case, regarding nuclear weapons, the internationalists understood the risk-reduction problem exactly right; they were just ahead of their time. By preventing the spread of arms, reducing stockpiles and requiring inspections, their proposals would have limited the number of opportunities for the use of nuclear weapons. And the logic first seen by these scientists in the 1940s and 1950s would gradually, and with other labels, be adopted by alarmed Cold War leaders after the Cuban crisis.
Beginning with the Limited Test Ban Treaty (1963), long a goal of peace groups, the nuclear club agreed to some mutual rules. The nuclear nonproliferation accord (1968) forbade its signatories from assisting new countries in acquiring weapons, obviously a critical step toward reducing the number of opportunities for escalation. It also guaranteed the existing nuclear powers their exclusivity, but that did not make it a bad idea for the rest of the world. In 1972, the US and Soviet Union signed two treaties: the Strategic Arms Limitation Treaty (SALT), which placed limits (rather high limits) on the parties’ long-range land- and sea-based missiles; and the anti-ballistic missile (ABM) treaty, which banned certain defensive weapons which could, if successfully deployed, remove one side’s fear of counterattack, the kind of one-sided situation that Truman enjoyed in 1945.
It is important to remember that much of this difficult diplomatic work took place while war raged between the superpowers’ respective client states in Vietnam. Negotiations were at all times very difficult and often broke down. A collapse of trust in the late 1970s, due in part to Soviet adventurism in a period of America’s post-Vietnam hangover (and in part to high oil prices), sidelined and threatened to reverse progress on arms reduction. Indeed, President Reagan’s announcement of the practically nonexistent “Star Wars” missile defense initiative in the early 1980s went against the spirit of the ABM treaty. But the anti-communist crusader was also an extreme skeptic of nuclear weapons. In the end the US and USSR, and later Russia, cooperated for many years to reduce their arsenals. The danger and cost of practically unusable weapons was ultimately a subject about which the enemies could agree, and they found that they needed each other in order to reduce their arsenals.
Eisenhower arrived at a valid solution to the nuclear game-theory problem in the two-power setting, seeing that mutual destruction deterred any use. But the “naïve” scientists had seen the more general solution.
President Trump dramatically rejected this approach in 2018 by withdrawing the US from the 2015 agreement between Iran and six world powers, under which Iran agreed to discontinue its nuclear arms program for an extended period in exchange for a lifting of economic sanctions. He hoped this harder line would result in a more favorable political outcome in the region (presumably regime change in Iran). His flamboyant unilateralism, often dismissive of allies, was very widely criticized in other contexts, but a significant number of more conventional analysts also agreed with his abandonment of the Iran deal. These observers generally argued that the treaty would reward Iran’s political and military interventions on behalf of its regional allies, while merely delaying Iranian nuclear capabilities.
In the context of the consistent evolution of the views of Cold War leaders, this position strikes me (and many others) as short sighted: it elevates a regional foreign policy goal regarding Iran above the decades-long multinational effort to reduce nuclear weapons risk. For one thing, a hard-line position is not the only path to positive change in the region: peace may also come in a series of small steps that build trust on issues of consensus. The US did not need to settle every score with the USSR in order to find areas of agreements about nuclear weapons. Of much greater concern is the American willingness to step back from the global consensus toward nuclear risk-reduction. Now any autocrat can cite US inconsistency as a reason to ignore international norms.
The concern is not simply about this particular bet, or the next one. It’s the number of bets, the number of bettors, and their understanding of the Cold War leaders’ experience with nuclear weapons. Winning nine out of ten is a losing proposition.
Still, to use Truman’s words, the end is not yet. President Biden’s administration is attempting the difficult task of reviving the Iran treaty in an atmosphere of even less trust than before. Perhaps, like the Cold War leaders before them, the parties may find mutual benefit in standing down.