Twelve years ago, in October 1999, the Senate rejected the Comprehensive Test Ban Treaty. Only 48 senators supported the treaty, falling short of the 67 required. In the debate, serious doubt was expressed about whether the U.S. nuclear arsenal could be kept safe, secure and reliable without nuclear explosive tests.
At the time, six former secretaries of defense in Republican administrations wrote a letter saying that if the test ban were to be ratified, "over the decades ahead, confidence in the reliability of our nuclear weapons stockpile would inevitably decline, thereby reducing the credibility of our nuclear deterrent."
The six former secretaries also said the relatively young Stockpile Stewardship Program, started in the Clinton administration, "will not be mature for at least 10 years" and could only mitigate, not eliminate, a loss of confidence in the weapons without testing. Although the treaty was rejected, the United States has continued to abide by the test ban.
Today, many fears voiced in the Senate debate have not materialized. That is the core message in an important report issued last week by a nine-member committee of the National Research Council. The panel, which focused on technical issues in the treaty, said the stockpile stewardship program "has been more successful than was anticipated in 1999."
The panel concluded that "the United States is now better able to maintain a safe and effective nuclear stockpile and to monitor clandestine nuclear-explosion testing than at any time in the past."
That is quite a milestone, and one that I have heard from other sources as well. Bruce T. Goodwin, principal associate director for weapons programs at the Lawrence Livermore National Laboratory, told me in an interview last year, "We have a more fundamental understanding of how these weapons work today than we ever imagined when we were blowing them up."
The Stockpile Stewardship Program includes surveillance of the weapons -- taking them apart and checking them -- as well as non-nuclear experiments and periodic life-extension programs for the existing weapons.
There is also a massive supercomputing program to simulate nuclear explosions, which has advanced by leaps and bounds since the 1990s. According to the committee's report, the computing capability available to weapons designers "has increased by a factor of approximately one hundred thousand" since 1996. I wrote a story about this for The Washington Post in November. What I found in talking to scientists at Livermore is that they are using some of the world's most capable computers to create realistic models of what happens inside a nuclear explosion, when tremendous pressures and temperatures squeeze metals, including uranium and plutonium, to set off the nuclear blast.
The computer simulations produce a virtual window into what happens in an explosion. "This is millions of times finer than you could ever do in a nuclear test," Goodwin told me. "You could never see this process go on inside a nuclear explosion."
Such progress depends, in part, on the use of hard data from past nuclear explosions. And while the computer simulations are impressive, they must be validated by modern laboratory experiments. All this is expensive: state-of-the-art supercomputers, advanced laboratory facilities, a modernized infrastructure, and recruiting and retaining the best and brightest workforce. The committee said funding each of these is essential. But it seems a relatively small price to pay for an end to U.S. nuclear explosions.
We really have come a long way since 1999.
David E. Hoffman is a Pulitzer Prize-winning author and a contributing editor to Foreign Policy.