The Next Frontier in Computing

I’m John Sarrao, the associate laboratory director for Theory, Simulation and Computation at Los Alamos National Laboratory. Exascale is the next frontier in computing; I think it’s where we’re going in the next generation of computing. Exa stands for ten to the eighteenth (10^18). Importantly, it’s exascale, not exaflop. For many years, we’ve measured computers by how fast they are. Exascale says there’s more to it than that: how we use those exascale computing resources for a variety of mission challenges goes to how we do our business.
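
To put “ten to the eighteenth” in perspective, here is a short back-of-the-envelope sketch in Python; the workload size and machine rates below are illustrative assumptions, not figures from the interview.

    # How long a fixed workload takes at gigascale, petascale, and exascale rates.
    # All numbers are illustrative assumptions for scale, not benchmarks.
    EXA = 10**18       # exascale: 10^18 floating-point operations per second
    workload = 10**21  # a hypothetical job needing 10^21 operations

    rates = {
        "100-gigaflop laptop": 1e11,
        "petascale system (10^15 flop/s)": 1e15,
        "exascale system (10^18 flop/s)": float(EXA),
    }
    for name, flops in rates.items():
        seconds = workload / flops
        print(f"{name}: {seconds:,.0f} s (~{seconds / 86400:,.1f} days)")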

The National Strategic Computing Initiative is a framework for how we think about computing for decades to come. The first element of that is the Exascale Computing Project. It’s the specific means by which we will get an exascale computer. In addition to the Exascale Computing Project, there’s an aspect of the initiative called the Exascale Computing Initiative. The goal of ECI is to use large-scale simulation resources in places where they’ve not been used before, for example, to attempt to solve cancer using computing resources. Finally, the third element of NSCI is what’s called Beyond Moore’s Law Computing: how do we envision those elements of computing that go beyond the current generation? So, in the framework of NSCI, you have the specific pursuit of exascale computing platforms, you have using exascale and other large-scale simulation well, and you have imagining what comes next after those technologies play out. That whole suite of activities, taken together, defines computing for the country for the next several decades.

I think there are several aspects of Los Alamos that make us unique in the space of high performance computing. It’s a significant differentiator that we think about national security science challenges rather than focusing singularly on computing, and that sets us apart from our peers. Our goal is not to be a great computing laboratory; our goal is to be a great national security science laboratory that relies on computing. That’s not meant to diminish the importance of computing in any way, but it means that computing has to fit into an overall landscape directed at the mission.

2 Comments

  • Gene Mccall

    Computer speed now depends on ways of structuring problems to fit the computer. Not all problems can be structured in the proper way. In fact, as the speed goes up through adding parallel processors, the number of solvable problems goes down. We may wind up with an incredibly fast computer that cannot solve a single important problem. Moore's law has not applied here for quite a while.

  • Gene Mccall

    As I said, the term "scale" is pretty vague. Why not stick with exaflop, if that is what you mean? Another question to be answered is the average speed at which the computers run, day by day. That depends on how parallelized the solutions are. Eventually, the communication system that puts it all together at the end will be the limit. That is still a nanosecond per foot.
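
The two limits raised in these comments, the serial fraction of a problem and signal propagation delay, can be made concrete with a minimal sketch; the 1% serial fraction and the cable length below are assumed for illustration.

    # Amdahl's law: the serial fraction of a problem caps parallel speedup,
    # and signal propagation (~1 ns per foot) bounds communication latency.
    def amdahl_speedup(serial_fraction: float, n_processors: int) -> float:
        """Speedup when serial_fraction of the work cannot be parallelized."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

    for n in (10, 1_000, 1_000_000):
        print(f"{n:>9} processors, 1% serial: {amdahl_speedup(0.01, n):6.1f}x")
    # Even a million processors give only ~100x if 1% of the work is serial.

    NS_PER_FOOT = 1.0  # a signal in a cable covers roughly one foot per nanosecond
    print(f"100-foot interconnect hop: >= {100 * NS_PER_FOOT:.0f} ns latency")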
