Trinity to Trinity
70th Anniversary of the Trinity Test
1945 - 2015
Join us for the Director's Roundtable Discussion
July 16, 2015
8:30 – 10:15 am
This journey from Trinity to Trinity begins with the New Mexico desert night sky turning instantly to day at 05:29 am on July 16, 1945. An eyewitness recalled, “The effects could well be called unprecedented, magnificent, beautiful, stupendous, and terrifying. The lighting effects beggared description. The whole country was lighted by a searing light with the intensity many times that of the midday sun. It was golden, purple, violet, gray, and blue.”
It was the Trinity Test: the world’s first nuclear detonation. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos National Laboratory (and other Manhattan Project sites). It took them less than two years to change the world.
From the Director
This year, the Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age but also gave rise to today's advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos.
"Highly accurate 3D computing is a Holy Grail of the Stockpile Stewardship Program’s supercomputing efforts. As the weapons age, 3D features tend to be introduced that require highly accurate 3D modeling to understand. This is a great challenge reminiscent of the one faced by the Manhattan Project. The challenge then was to build the first nuclear weapon that works. Now our challenge is to understand how and why a weapon works well enough to confidently predict its performance."
Director Charles McMillan
Punched Cards to Petaflops
Los Alamos National Laboratory’s Key Role in the Evolution of Computing
National security science at Los Alamos National Laboratory, and nuclear weapons science in particular, has provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project, there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.
Vacuum Tube Era 1946–60
The eminent Hungarian mathematician John von Neumann introduced Los Alamos to the world's first general-purpose electronic digital computer, the ENIAC (Electronic Numerical Integrator And Computer). The ENIAC was designed to make calculations for the Army's artillery firing tables to help gunners improve accuracy. But first, the ENIAC was used to perform calculations for the design of the hydrogen bomb. A revolutionary concept articulated by von Neumann, the stored program, in which a computer holds its instructions electronically in its own memory, became the basis for all modern computing systems; the ENIAC itself was later converted to run in this mode.
In the 1950s, digital-electronic computing technology became less expensive, more reliable, and more powerful, and commercially produced computers gradually displaced the ENIAC's custom, handmade descendants.
The Laboratory purchased its first commercial computer, an IBM 701, in 1953. This opened a new era in Los Alamos computing, dominated by commercial machines and custom computers developed jointly with corporate partners. Throughout much of the 1950s and '60s, the Laboratory managed to double its computing capacity every two years. The most significant advancement during this era was the development of transistors.
IBM Era 1960–64
To meet the growing computing needs of the nuclear weapons program, the Laboratory jointly developed with IBM the Stretch, IBM's first transistorized supercomputer. The Stretch, delivered in 1961, held the title of world's fastest computer into the mid-1960s. Using this technology, IBM began to focus less on government contracts and more on building computers for thousands of commercial clients.
Control Data Corporation (CDC) Era 1964
Needing a new partner, Los Alamos looked to CDC to develop ever more powerful machines for increasingly complex weapons calculations. CDC delivered by producing the world's first supercomputer, the model 6600. The 6600s were the first computers capable of performing a million floating-point operations per second (a megaflop). The even faster CDC 7600 models soon supplemented them.
Cray Era 1976–89
Seymour Cray, the CDC designer who led the development teams that produced the 6600 and 7600, left CDC to start his own company in 1972. Cray Research completed its first design, the revolutionary 160-megaflop Cray-1, in 1975 and delivered it to Los Alamos the following year.
The Cray-1 used integrated circuits and an innovative Freon cooling system to keep the giant machine from overheating. Cray also used revolutionary "vector" processing, which enabled the Cray-1 to process information far more efficiently than any other computer of its day. During the 1980s, the Laboratory purchased additional Cray computers, most notably the X-MP. The X-MP, which used multiple "vector" processors, reigned as the world's fastest computer from 1982 to 1985.
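The idea behind vector processing can be seen in a loose modern analogue. In the sketch below, NumPy's array operations stand in for the Cray-1's vector instructions: one operation is issued over a whole array of numbers rather than one element at a time. This is an illustration of the concept, not a model of Cray hardware.

```python
import numpy as np

# Scalar processing: one element handled per operation, the way a
# conventional (non-vector) processor of the era worked.
def scalar_add(a, b):
    out = [0.0] * len(a)
    for i in range(len(a)):
        out[i] = a[i] + b[i]
    return out

# Vector processing: a single operation applied to entire arrays at once.
# NumPy dispatches this to hardware SIMD units, the modern descendants
# of vector registers like the Cray-1's.
def vector_add(a, b):
    return a + b

a = np.arange(1_000_000, dtype=np.float64)
b = np.ones(1_000_000, dtype=np.float64)
result = vector_add(a, b)
```

Both functions compute the same sums; the vector form simply expresses a million additions as one operation, which is what made the Cray-1 so much faster on the long arrays typical of weapons calculations.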
Stockpile Stewardship Era 1989–present
As the 1980s drew to a close, Los Alamos continued to drive the evolution of supercomputing. The Lab partnered with Thinking Machines Corporation to develop the massively parallel Connection Machine series, which focused on quantity over quality: using thousands of microprocessors (not more powerful ones) to perform numerous calculations simultaneously. This took Lab computing into the gigaflop zone (one billion floating-point operations per second) by 1990.
But then the Cold War came to an abrupt end in 1991, and nuclear weapons testing ceased in 1992. To ensure the continued safety, security, and reliability of the nation's nuclear deterrent, President Bill Clinton and Congress created the Stockpile Stewardship Program in 1994. In lieu of underground nuclear weapons testing, the law called for "an increased level of effort for advanced computational capabilities to enhance the simulation and modeling capabilities of the United States with respect to the detonation of nuclear weapons."
Suddenly, it was necessary to rapidly develop supercomputers powerful enough to replace real-world nuclear testing with virtual testing, which meant eventually being able to create simulations in 3D. Increasing speed was important, but to be truly useful, 3D simulations also needed high resolution and accuracy. To achieve high-fidelity 3D simulations, computing would need to make incredible technological leaps: from gigaflops to teraflops (trillions of operations per second, reached in 1999), to petaflops (quadrillions, 2008), to exaflops (quintillions, coming soon).
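The ladder of speeds described above climbs by a factor of a thousand at each step, following the standard SI prefixes. A quick numeric sketch (the era labels in the comments come from the text above):

```python
# Floating-point operations per second at each milestone named in the text.
FLOPS_SCALE = {
    "megaflops": 10**6,   # millions      (CDC 6600 era)
    "gigaflops": 10**9,   # billions      (Connection Machine, ~1990)
    "teraflops": 10**12,  # trillions     (1999)
    "petaflops": 10**15,  # quadrillions  (2008)
    "exaflops":  10**18,  # quintillions  (coming soon)
}

# Each rung is a thousandfold jump over the one below it:
ratio = FLOPS_SCALE["petaflops"] // FLOPS_SCALE["teraflops"]
```

So a petaflop machine performs in one second what a teraflop machine needs roughly a thousand seconds to do, which is why each generation of stockpile-stewardship simulation demanded an entirely new class of computer.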
Learn more about Trinity today: Trinity: Advancing Predictive Capability for Stockpile Stewardship
Trinity: Supercomputing into the Future
The nation's supercomputing capabilities have evolved in response to the needs of the Lab's nuclear weapons program in delivering a safe, secure, and effective nuclear deterrent. Today, as the Trinity Test marks its 70th anniversary, the need for 3D simulations has brought the Trinity supercomputer to Los Alamos. Trinity, expected to be the second or third fastest supercomputer in the world, will make complex 3D simulations of nuclear detonations with increased fidelity and resolution practical. The first phase of Trinity's installation began in June.