Supercomputer Just Created the Largest Space Simulation Ever
Last month, a team of researchers put what was then the world’s fastest supercomputer to work on a very big problem: the behavior of the universe’s atomic matter and dark matter.
The supercomputer is called Frontier, and the team used it to run the largest simulation of the universe yet. The simulation is large enough to match the volumes covered by surveys from observatories with large telescopes, something that had not been possible until now. It sets a new benchmark for simulating all of the universe’s contents, from the matter we can see to the invisible matter that interacts with ordinary matter only through gravity.
What exactly did the Frontier supercomputer calculate?
Frontier is a state-of-the-art exascale supercomputer, capable of performing more than a quintillion (a billion billion, or 10^18) calculations per second. In other words, it’s a beefy machine fit for a big job: simulating the physics and evolution of both the known and the unknown universe.
“If we want to know what the universe is up to, we need to simulate both of these things: gravity as well as all the other physics, including hot gas and the formation of stars, black holes, and galaxies,” said Salman Habib, director of the computational science division at Argonne National Laboratory, in a release from Oak Ridge National Laboratory. The astrophysical “kitchen sink,” in other words.
The matter we know, the stuff we can see, from black holes and molecular clouds to planets and moons, makes up only about 5% of the universe’s contents, according to CERN. A much larger share of the universe is known only through the gravitational effects it appears to exert on ordinary (or atomic) matter. That invisible portion is called dark matter, a catch-all term for the particles and objects thought to account for about 27% of the universe. The remaining 68% of the universe’s composition is attributed to dark energy, which drives the universe’s accelerating expansion.
How does Frontier change our understanding of the universe?
“If we were to simulate a large chunk of the universe surveyed by one of the big telescopes, such as the Rubin Observatory in Chile, you’re talking about looking at huge chunks of time: billions of years of expansion,” Habib said. “Until recently, we couldn’t even imagine doing a simulation this large except in the gravity-only approximation.”
In the image at the top of this article, the panel on the left shows the evolution of the expanding universe over billions of years in a region containing a cluster of galaxies, and the panel on the right shows the formation and motion of galaxies over time within one section of that region.
“It’s not only the sheer size of the physical domain, which is necessary to make direct comparison with modern survey observations enabled by exascale computing,” said Bronson Messer, director of science at the Oak Ridge Leadership Computing Facility, in the laboratory’s release. “It’s also the added physical realism of including the baryons and all the other dynamic physics that makes this simulation a true tour de force for Frontier.”
Frontier is no longer the world’s fastest supercomputer
Frontier, one of several supercomputers operated by the Department of Energy, has more than 9,400 CPUs and more than 37,000 GPUs. It resides at Oak Ridge National Laboratory, although the recent simulations were run by researchers at Argonne National Laboratory.
The Frontier results were made possible by a supercomputing code, the Hardware/Hybrid Accelerated Cosmology Code (HACC). The roughly fifteen-year-old code was updated as part of the DOE’s $1.8 billion, eight-year Exascale Computing Project, which concluded this year.
The simulation results were announced last month, when Frontier was still the fastest computer in the world. Not long after, though, Frontier was eclipsed by the El Capitan supercomputer as the world’s fastest. El Capitan was benchmarked at 1.742 quintillion calculations per second, with a peak performance of 2.79 quintillion calculations per second, according to a release from Lawrence Livermore National Laboratory.