In November 2012, the semiannual TOP500 supercomputer rankings gave their highest rating to a machine built at Oak Ridge National Laboratory in Tennessee. Aptly named Titan, the machine had a peak performance of more than 27 × 10¹⁵ floating-point operations per second, or 27 petaflops. It was an enormous computing resource for researchers in government, industry, and academia, and as the pinnacle of supercomputing, it helped boost pride in the United States' high-performance computing community.
Photo: Oak Ridge National Laboratory
Wired in: A technician installs cables for the internal data network of the Summit supercomputer at Oak Ridge National Laboratory, in Tennessee.
One advantage the all-new supercomputer will bring to Oak Ridge is a significant improvement in energy efficiency. Summit should be able to run researchers' simulations 5 to 10 times faster than Titan while using only about twice the power, typically around 15 megawatts. Fortunately, that power will come from the Tennessee Valley Authority's robust electricity grid. Others may find it harder to power a modern supercomputer, Bland notes. "Go to your local power company and ask, 'Where can I plug in my 15-megawatt computer?' and see what they tell you," he says wryly.
Although Summit is the top performer, it is not the only U.S. supercomputer in its class to enter service in 2018. A supercomputer named Sierra, expected to exceed 120 petaflops of peak performance, will be completed at Lawrence Livermore National Laboratory, in California. Argonne National Laboratory was also slated to bring a new supercomputer online in 2018, one offering 180 petaflops of peak performance. But the Illinois laboratory's plans for this machine, called Aurora, were delayed until 2021 in an attempt to expand its capabilities and make it the first U.S. "exascale" supercomputer (1,000 petaflops, or 1 exaflop).
These huge numbers refer to peak performance, but real-world applications use only a fraction of that potential. The oft-cited Linpack benchmark typically achieves about 75 percent of a supercomputer's peak, Dongarra says. "Our little secret is that most real applications run at about 3 percent."
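To make those fractions concrete, here is a minimal arithmetic sketch using the percentages Dongarra cites. The 200-petaflop peak figure is a hypothetical round number chosen for illustration, not a specification of any machine in the article.

```python
# Illustrative only: how much of a supercomputer's peak performance
# typical workloads actually deliver, per the fractions quoted by Dongarra.

PEAK_PFLOPS = 200.0        # hypothetical peak performance, in petaflops
LINPACK_FRACTION = 0.75    # Linpack reaches roughly 75% of peak
REAL_APP_FRACTION = 0.03   # most real applications reach roughly 3% of peak

linpack_pflops = PEAK_PFLOPS * LINPACK_FRACTION    # 150.0 petaflops
real_app_pflops = PEAK_PFLOPS * REAL_APP_FRACTION  # 6.0 petaflops

print(f"Peak:        {PEAK_PFLOPS:.0f} petaflops")
print(f"Linpack:     {linpack_pflops:.0f} petaflops")
print(f"Typical app: {real_app_pflops:.0f} petaflops")
```

The gap between the second and third lines is the point of the article's next paragraph: a machine benchmarked at 150 petaflops may deliver only single-digit petaflops to an actual simulation.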
Clearly, finding clever ways to improve actual performance matters as much as the number of peak flops theoretically available, and Oak Ridge's supercomputing specialists are putting plenty of energy into that effort as well. Joseph Oefelein, who will use Summit in his studies of combustion physics and chemistry at Georgia Tech, sums it up succinctly: "This game doesn't boil down to saying you have the fastest computer."
This article appears in the January 2018 print issue under the title "U.S. Intensive Computing Counter Attack."