What’s next for the world’s fastest supercomputers

To use Frontier, approved scientists log in to the supercomputer remotely, submitting their jobs over the internet. To make the most of the machine, Oak Ridge aims to have around 90% of the supercomputer’s processors running computations 24 hours a day, seven days a week. “We enter this sort of steady state where we’re constantly doing scientific simulations for a handful of years,” says Messer. Users keep their data at Oak Ridge in a storage facility that holds up to 700 petabytes, the equivalent of about 700,000 portable hard drives.
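Both figures are easy to sanity-check. Here is a minimal Python sketch, assuming 1 terabyte per portable drive, a typical consumer size that the comparison doesn’t specify:

```python
# Sanity check of the figures above (assumptions noted inline).

# Storage: 700 petabytes expressed as portable drives, assuming 1 TB each.
capacity_tb = 700 * 1000                  # 1 PB = 1,000 TB
print(f"{capacity_tb:,} one-terabyte drives")                   # 700,000

# Utilization: compute hours per processor per year at the ~90% target.
busy_hours = 0.90 * 24 * 365
print(f"~{busy_hours:,.0f} busy hours per processor per year")  # ~7,884
```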

While Frontier is the first exascale supercomputer, more are coming down the line. In the US, researchers are currently installing two machines that will be capable of more than two exaflops: Aurora, at Argonne National Laboratory in Illinois, and El Capitan, at Lawrence Livermore National Laboratory in California. Beginning in early 2024, scientists plan to use Aurora to create maps of neurons in the brain and to search for catalysts that could make industrial processes such as fertilizer production more efficient. El Capitan, also slated to come online in 2024, will simulate nuclear weapons to help the government maintain its stockpile without weapons testing. Meanwhile, Europe plans to deploy its first exascale supercomputer, Jupiter, in late 2024.

China purportedly has exascale supercomputers as well, but it has not released results from standard benchmark tests of their performance, so the computers do not appear on the TOP500, a semiannual list of the fastest supercomputers. “The Chinese are concerned about the US imposing further limits in terms of technology going to China, and they’re reluctant to disclose how many of these high-performance machines are available,” says Dongarra, who designed LINPACK, the benchmark that supercomputers must run to be ranked on the TOP500.
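That benchmark, run today in its distributed form as HPL (High-Performance Linpack), times how quickly a machine solves a huge dense system of linear equations and converts the result into a sustained floating-point rate. A toy single-machine sketch in Python with NumPy illustrates the measurement; real HPL runs span the entire system and are heavily tuned:

```python
import time
import numpy as np

# HPL-style measurement in miniature: time a dense solve of A x = b,
# then convert to a flop rate using the standard operation count for
# LU factorization plus the triangular solves: 2/3 * n**3 + 2 * n**2.
n = 4000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)       # LU factorization + solve, double precision
elapsed = time.perf_counter() - start

flops = (2 / 3) * n**3 + 2 * n**2
print(f"{flops / elapsed / 1e9:.1f} gigaflops sustained")
```

An exascale machine sustains 10^18 such operations per second on this test, millions of times what a typical laptop manages on the toy problem above.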

The hunger for more computing power doesn’t stop with the exascale. Oak Ridge is already considering the next generation of computers, says Messer. These would have three to five times the computational power of Frontier. But one major challenge looms: the massive energy footprint. The power that Frontier draws, even when it is idling, is enough to run thousands of homes. “It’s probably not sustainable for us to just grow machines bigger and bigger,” says Messer. 

As Oak Ridge has built progressively larger supercomputers, engineers have worked to improve the machines’ efficiency with innovations including a new cooling method. Summit, the predecessor to Frontier that is still running at Oak Ridge, spends about 10% of its total energy on cooling itself. By comparison, only 3% to 4% of Frontier’s energy consumption goes to cooling. The improvement came from cooling the supercomputer with water at ambient temperature rather than chilled water.
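At this scale, those percentages translate into megawatts. A rough Python sketch, assuming a total draw of about 20 MW (Frontier’s reported power is in that range, though the exact figure varies with load):

```python
# Cooling overhead at the two quoted rates, applied to one assumed total
# so the numbers are comparable: about 20 MW (an assumption; Frontier's
# reported draw is in this range and varies with load).
total_mw = 20.0
rates = {"Summit's ~10% rate": 0.10, "Frontier's ~3-4% rate": 0.035}

for label, rate in rates.items():
    print(f"Cooling at {label}: {rate * total_mw:.1f} MW")

# The ~1.3 MW difference is roughly the average draw of 1,000 US homes
# (assuming ~1.2 kW per home).
```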

Next-generation supercomputers would be able to simulate even more scales simultaneously. For example, with Frontier, Schneider’s galaxy simulation has resolution down to the tens of light-years. That’s still not quite enough to get down to the scale of individual supernovas, so researchers must simulate the individual explosions separately. A future supercomputer may be able to unite all these scales.
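The obstacle is the cube law of three-dimensional grids: halving the cell size multiplies the cell count by eight, so each factor-of-ten jump in resolution costs a factor of a thousand in memory and compute. A back-of-envelope Python sketch with illustrative numbers (a galaxy disk roughly 100,000 light-years across and supernova-scale detail around 0.1 light-year; these are not the values of Schneider’s actual simulation):

```python
# Why resolving individual supernovas inside a whole-galaxy run is so
# expensive: in 3D the cell count scales as (box size / resolution) ** 3.
# All numbers here are illustrative assumptions.
box_ly = 100_000                 # rough diameter of a galaxy's disk

for resolution_ly in (10, 1, 0.1):
    cells = (box_ly / resolution_ly) ** 3
    print(f"{resolution_ly:>5} ly resolution -> {cells:.0e} cells")

# 10 ly  -> 1e+12 cells: feasible on an exascale machine
# 0.1 ly -> 1e+18 cells: a million times more, far beyond Frontier
```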

By simulating the complexity of nature and technology more realistically, these supercomputers push the limits of science. A more realistic galaxy simulation brings the vastness of the universe to scientists’ fingertips. A precise model of air turbulence around an airplane fan circumvents the need to build a prohibitively expensive wind tunnel. Better climate models allow scientists to predict the fate of our planet. In other words, they give us a new tool to prepare for an uncertain future.
