Machine learning accelerates cosmological simulations

IMAGE: The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled the low-res model to create a high-resolution simulation (right). That simulation captures the same details as a conventional…

Credit: Y. Li et al./Proceedings of the National Academy of Sciences 2021

A universe evolves over billions upon billions of years, but researchers have developed a way to create a complex simulated universe in less than a day. The technique, published in this week's Proceedings of the National Academy of Sciences, brings together machine learning, high-performance computing and astrophysics, and will help to usher in a new era of high-resolution cosmology simulations.

Cosmological simulations are an essential part of teasing out the many mysteries of the universe, including those of dark matter and dark energy. But until now, researchers faced the common conundrum of not being able to have it all: simulations could focus on a small area at high resolution, or they could encompass a large volume of the universe at low resolution.

Carnegie Mellon University Physics Professors Tiziana Di Matteo and Rupert Croft, Flatiron Institute Research Fellow Yin Li, Carnegie Mellon Ph.D. candidate Yueying Ni, University of California Riverside Professor of Physics and Astronomy Simeon Bird and University of California Berkeley's Yu Feng surmounted this problem by teaching a machine learning algorithm based on neural networks to upgrade a simulation from low resolution to super resolution.

“Cosmological simulations need to cover a large volume for cosmological studies, while also requiring high resolution to resolve the small-scale galaxy formation physics, which would incur daunting computational challenges. Our technique can be used as a powerful and promising tool to match those two requirements simultaneously by modeling the small-scale galaxy formation physics in large cosmological volumes,” said Ni, who performed the training of the model, built the pipeline for testing and validation, analyzed the data and made the visualization from the data.

The trained code can take full-scale, low-resolution models and generate super-resolution simulations that contain up to 512 times as many particles. For a region in the universe roughly 500 million light-years across containing 134 million particles, existing methods would require 560 hours to churn out a high-resolution simulation using a single processing core. With the new approach, the researchers need only 36 minutes.
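As a quick sanity check on those figures (assuming the 560-hour and 36-minute timings are directly comparable run costs, which the article does not state explicitly), the implied speedup is roughly 900-fold:

```python
# Speedup implied by the timings quoted above (assumption: the two
# timings measure comparable single-run costs on comparable hardware).
conventional_hours = 560
new_method_minutes = 36
speedup = conventional_hours * 60 / new_method_minutes

# 512x as many particles corresponds to 8x per spatial dimension
# (512 = 8**3), assuming the refinement is isotropic in 3-D.
refinement_per_dim = round(512 ** (1 / 3))

print(f"~{speedup:.0f}x faster, {refinement_per_dim}x finer per dimension")
```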

The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large with 134 billion particles, the researchers' new method took 16 hours on a single graphics processing unit. Using existing methods, a simulation of this size and resolution would take a dedicated supercomputer months to complete.

Reducing the time it takes to run cosmological simulations “holds the potential of providing major advances in numerical cosmology and astrophysics,” said Di Matteo. “Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes.”

Scientists use cosmological simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations then confirm whether the simulations' predictions match reality.

“With our previous simulations, we showed that we could simulate the universe to discover new and interesting physics, but only at small or low-res scales,” said Croft. “By incorporating machine learning, the technology is able to catch up with our ideas.”

Di Matteo, Croft and Ni are part of Carnegie Mellon's National Science Foundation (NSF) Planning Institute for Artificial Intelligence in Physics, which supported this work, and members of Carnegie Mellon's McWilliams Center for Cosmology.

“The universe is the biggest data set there is — artificial intelligence is the key to understanding the universe and revealing new physics,” said Scott Dodelson, professor and head of the department of physics at Carnegie Mellon University and director of the NSF Planning Institute. “This research illustrates how the NSF Planning Institute for Artificial Intelligence will advance physics through artificial intelligence, machine learning, statistics and data science.”

“It’s clear that AI is having a big effect on many areas of science, including physics and astronomy,” said James Shank, a program director in NSF’s Division of Physics. “Our AI planning Institute program is working to push AI to accelerate discovery. This new result is a good example of how AI is transforming cosmology.”

To create their new method, Ni and Li harnessed these fields to create a code that uses neural networks to predict how gravity moves dark matter around over time. The networks take training data, run calculations and compare the results to the expected outcome. With further training, the networks adapt and become more accurate.

The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution models. The other network tries to tell those simulations apart from ones made by conventional methods. Over time, both neural networks get better and better until, eventually, the simulation generator wins out and creates fast simulations that look just like the slow conventional ones.
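The adversarial loop described above can be sketched in miniature. The toy below is not the authors' model (their networks are deep and operate on 3-D particle fields); it only illustrates the two-network mechanism, with a hypothetical linear "generator" that upsamples 4-point signals to 8 points and a logistic-regression "discriminator". All sizes, rates and step counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N_LOW, N_HIGH = 4, 8  # toy stand-ins for low-res / high-res sizes

def make_high_res(n):
    """Smooth random signals standing in for 'real' high-res fields."""
    x = np.linspace(0, 1, N_HIGH)
    freqs = rng.uniform(1, 3, size=(n, 1))
    phases = rng.uniform(0, 2 * np.pi, size=(n, 1))
    return np.sin(2 * np.pi * freqs * x + phases)

def downsample(high):
    """Average neighbouring pairs: the 'low-res simulation' input."""
    return high.reshape(high.shape[0], N_LOW, 2).mean(axis=2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Generator: one linear map low-res -> high-res.
# Discriminator: logistic regression scoring high-res signals as real/fake.
W_g = rng.normal(scale=0.1, size=(N_LOW, N_HIGH))
w_d = rng.normal(scale=0.1, size=N_HIGH)
lr, batch = 0.05, 32

for step in range(200):
    real = make_high_res(batch)
    low = downsample(real)
    fake = low @ W_g  # generator's super-resolution guess

    # Discriminator step: gradient of -log p(real) - log(1 - p(fake)).
    p_real, p_fake = sigmoid(real @ w_d), sigmoid(fake @ w_d)
    w_d -= lr * (real.T @ (p_real - 1) + fake.T @ p_fake) / batch

    # Generator step: gradient of -log p(fake), i.e. try to fool D.
    p_fake = sigmoid(fake @ w_d)
    W_g -= lr * low.T @ ((p_fake - 1)[:, None] * w_d[None, :]) / batch

upscaled = downsample(make_high_res(1)) @ W_g
print("upscaled shape:", upscaled.shape)  # (1, 8): 4 points -> 8 points
```

The alternating updates are the essence of the adversarial game: the discriminator's loss pushes it to separate real from generated samples, while the generator's loss pushes it to make that separation impossible.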

“We couldn’t get it to work for two years,” Li said, “and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn’t tell which one was ‘real’ and which one was ‘fake.'”

Despite only being trained using small areas of space, the neural networks accurately replicated the large-scale structures that only appear in enormous simulations.

The simulations didn't capture everything, though. Because they focused on dark matter and gravity, smaller-scale phenomena, such as star formation, supernovae and the effects of black holes, were left out. The researchers plan to extend their methods to include the forces responsible for such phenomena, and to run their neural networks 'on the fly' alongside conventional simulations to improve accuracy.


The research was powered by the Frontera supercomputer at the Texas Advanced Computing Center (TACC), the fastest academic supercomputer in the world. The team is one of the largest users of this massive computing resource, which is funded by the NSF Office of Advanced Cyberinfrastructure.

This research was funded by the NSF, the NSF AI Institute: Physics of the Future and NASA.

