Turning Big Data into fast data for nuclear weapons simulations at the exascale
- Clay Dillingham, Managing Editor
Big Data, fast data for the exascale
- The new Los Alamos burst buffer technology resolves the mismatch between exascale supercomputer speed and data storage bandwidth.
- Big Data thus becomes fast data: data that is useful the moment it is generated.
Big data is everywhere. Massive sets of digital data are being collected or generated for science, medicine, astronomy and cosmology, national security, cybersecurity, situational awareness for our warfighters, social networking, financial markets, and more.
Perhaps the most daunting are the datasets that will be generated by the national laboratories’ next generation exascale supercomputers as they simulate complex physical systems that are difficult or impossible to test—nuclear weapons being a prime example.
Los Alamos hopes to run the first high-resolution 3D simulations of nuclear weapon detonations at the exascale by 2020, but those simulations must be cost-effective. As early as 2006, the Laboratory recognized that the big data generated by exascale simulations would create data bottlenecks and make the old way of doing supercomputing unaffordable. It needed to innovate, and its answer was burst buffer technology.
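The idea behind a burst buffer can be sketched in a few lines: a fast intermediate tier absorbs a simulation's bursty checkpoint writes so compute nodes can resume work immediately, while a background process drains the data to slower long-term storage at its own pace. The sketch below is purely illustrative (it is not Los Alamos code); the class and method names are invented, and an in-memory queue and list stand in for the fast and slow storage tiers.

```python
import queue
import threading

class BurstBuffer:
    """Toy model of a burst buffer: fast ingest, asynchronous drain."""

    def __init__(self):
        self._pending = queue.Queue()   # stand-in for the fast tier (e.g. SSDs)
        self.archive = []               # stand-in for the slow parallel file system
        self._drainer = threading.Thread(target=self._drain, daemon=True)
        self._drainer.start()

    def write_checkpoint(self, data):
        """Fast path: enqueue and return at once, so compute isn't stalled."""
        self._pending.put(data)

    def _drain(self):
        """Background path: migrate checkpoints to the slow tier."""
        while True:
            item = self._pending.get()
            if item is None:            # sentinel: stop draining
                self._pending.task_done()
                break
            self.archive.append(item)   # stand-in for a slow, durable write
            self._pending.task_done()

    def close(self):
        """Wait for all buffered checkpoints to reach the slow tier."""
        self._pending.join()
        self._pending.put(None)
        self._drainer.join()

# A simulation emits a burst of checkpoints without waiting on slow storage.
bb = BurstBuffer()
for step in range(5):
    bb.write_checkpoint(f"checkpoint-{step}")
bb.close()
print(bb.archive)
```

The essential point the sketch shows is the decoupling: `write_checkpoint` returns immediately, and only `close` waits on the slow tier, mirroring how a real burst buffer lets exascale compute nodes outrun the file system.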