Los Alamos National Laboratory


Two Roads to Next-gen Computing

Overcoming the fault vulnerability that goes hand in hand with increasing computing power
October 1, 2017

As computer chips get smaller, they become more susceptible to radiation-induced errors. The future of computing will have to address this vulnerability one way or another.

We can either triple-check everything or accept some inexact calculations—or both.

Moore’s Law famously asserts that the number of transistors that can be manufactured per unit area on a computer chip doubles every year or two: every year when the law was first postulated in the 1960s, and every 18–24 months now. Historically, this has meant a steady increase in processing power, memory, and other measures of computer performance. While there have been many false predictions of the end of Moore’s Law, at least one obstacle now appears unavoidable: greater transistor density on a chip brings greater sensitivity to radiation-induced faults.
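To see how quickly that doubling compounds, consider a back-of-the-envelope projection. The sketch below is illustrative only; the starting count (roughly that of an early-1970s microprocessor) and the two-year doubling period are assumptions, not chip data.

```python
# Illustrative compounding of Moore's Law: a count that doubles every
# `doubling_period` years grows as N(t) = N0 * 2**(t / doubling_period).
# Starting count and period are assumptions for illustration only.
def transistor_count(years_elapsed, start_count=2_000, doubling_period=2.0):
    """Project a transistor count after `years_elapsed` years of doubling."""
    return start_count * 2 ** (years_elapsed / doubling_period)

# 2,000 transistors, 25 doublings over 50 years -> about 67 billion.
print(f"{transistor_count(50):,.0f}")  # 67,108,864,000
```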

To address this impending crisis, Los Alamos computer scientists are working on two fronts. One is to develop rigorous, advanced systems for error detection and correction. The other is to develop inexact calculation methods for certain resource-intensive operations—such as image processing, social networking, searching, and sorting—that do not require perfect, deterministic computations.
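To make the "triple-check everything" idea concrete, here is a minimal sketch of triple modular redundancy (TMR), a classic fault-tolerance technique in which the same computation runs three times and the majority result wins, so a single radiation-induced fault is outvoted. The fault injection is simulated, and the function names and fault rate are illustrative assumptions, not Los Alamos code.

```python
import random

def flip_a_bit(value):
    """Simulate a single-event upset by flipping one random low-order bit."""
    return value ^ (1 << random.randrange(32))

def faulty_add(a, b, fault_rate=0.05):
    """Addition that occasionally suffers a simulated radiation-induced fault."""
    result = a + b
    if random.random() < fault_rate:
        result = flip_a_bit(result)
    return result

def tmr(compute, *args):
    """Run `compute` three times and return the majority result."""
    results = [compute(*args) for _ in range(3)]
    for r in results:
        if results.count(r) >= 2:  # at least two of the three copies agree
            return r
    raise RuntimeError("no majority: more than one copy faulted")

print(tmr(faulty_add, 40, 2))  # almost always 42, even when one copy faults
```

The second road can be sketched just as simply. Assuming a task where a close answer suffices, as in image processing or ranking, one can estimate an average from a small random sample of the data rather than touching every element. The synthetic data and sampling rate below are hypothetical.

```python
import random

def approx_mean(values, sample_fraction=0.01, seed=None):
    """Estimate the mean of `values` from a small random sample."""
    rng = random.Random(seed)
    k = max(1, int(len(values) * sample_fraction))
    return sum(rng.sample(values, k)) / k

# A synthetic "image": one million pixel brightness values.
pixels = [random.randint(0, 255) for _ in range(1_000_000)]
exact = sum(pixels) / len(pixels)
approx = approx_mean(pixels)  # reads only 1% of the pixels
print(f"exact={exact:.2f}  approx={approx:.2f}")  # typically within ~1
```

Neither toy captures the hardware-level work described in the article; they only illustrate the trade at the heart of both roads: redundancy buys correctness at triple the cost, while sampling buys speed at the price of a small, bounded error.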
