Los Alamos National Laboratory

Delivering science and technology to protect our nation and promote world stability

Turning Big Data into fast data for nuclear weapons simulations at the exascale

Solving the roadblock for tomorrow’s exascale supercomputing
March 25, 2013

Los Alamos is preparing in earnest for the exascale computing environment of the future.


Managing Editor: Clay Dillingham
We’ll be able to watch the simulation as it’s happening and intervene if we see something that needs changing. This is truly big data becoming fast data.

Big Data, fast data for the exascale


  • The new Los Alamos burst buffer technology solves the mismatch between exascale supercomputer speed and data storage capabilities
  • Thus Big Data turns into Fast Data—data that becomes useful the moment it’s generated.

Big data is everywhere. Massive sets of digital data are being collected or generated for science, medicine, astronomy and cosmology, national security, cybersecurity, situational awareness for our warfighters, social networking, financial markets, and more.

Perhaps the most daunting are the datasets that will be generated by the national laboratories’ next generation exascale supercomputers as they simulate complex physical systems that are difficult or impossible to test—nuclear weapons being a prime example.

Los Alamos hopes to perform the first high-resolution 3D simulations of nuclear weapon detonations at the exascale by 2020, but those simulations must be cost-effective. The Laboratory knew back in 2006 that the big data generated by exascale simulations would create data bottlenecks and make the old way of doing supercomputing unaffordable. It needed to innovate, and the result was burst buffer technology.
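The core idea of a burst buffer can be sketched in a few lines: the simulation dumps each checkpoint into a fast staging area and immediately resumes computing, while a background process drains the staged data to slower permanent storage. The toy model below (a hypothetical illustration, not Los Alamos code) uses an in-memory queue to stand in for the fast buffer and an artificial delay to stand in for the slow parallel file system.

```python
# Minimal sketch of the burst-buffer concept (hypothetical, illustrative only):
# checkpoints land in a fast staging buffer; a background thread drains them
# to slow "permanent storage" while the simulation keeps running.
import queue
import threading
import time

class BurstBuffer:
    """Toy burst buffer: fast in-memory staging, asynchronous drain."""

    def __init__(self):
        self._staged = queue.Queue()   # stands in for fast near-node storage
        self._drained = []             # stands in for the parallel file system
        self._drainer = threading.Thread(target=self._drain, daemon=True)
        self._drainer.start()

    def checkpoint(self, data):
        # Fast path: the simulation only pays for a quick hand-off.
        self._staged.put(data)

    def _drain(self):
        # Slow path: trickle staged checkpoints out to "disk" in the background.
        while True:
            data = self._staged.get()
            time.sleep(0.01)           # simulated slow-storage latency
            self._drained.append(data)
            self._staged.task_done()

    def flush(self):
        # Wait until everything staged so far has reached permanent storage.
        self._staged.join()
        return list(self._drained)

bb = BurstBuffer()
for step in range(5):
    bb.checkpoint(f"checkpoint-{step}")  # simulation continues between dumps
print(bb.flush())
```

A real burst buffer would use solid-state storage near the compute nodes rather than memory, but the design point is the same: the simulation's burst-rate writes are decoupled from the sustained rate the file system can absorb.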
