Atop the summit of Haleakala on the Hawaiian island of Maui sits the Panoramic Survey Telescope and Rapid Response System, or Pan-STARRS1 (PS1). As part of the Haleakala Observatory overseen by the University of Hawaii, Pan-STARRS1 relies on a system of cameras, telescopes, and a computing facility to conduct an optical imaging survey of the sky, as well as astrometry and photometry of known objects.
In 2018, the University of Hawaii at Manoa's Institute for Astronomy (IfA) launched the PS1 3pi survey, the world's largest digital sky survey, spanning three-quarters of the sky and encompassing three billion objects. Now, a team of astronomers from the IfA has used this data to create the Pan-STARRS1 Source Types and Redshifts with Machine Learning (PS1-STRM), the world's largest three-dimensional astronomical catalog.
Their work is described in a paper that appeared in the August 31st issue of the Monthly Notices of the Royal Astronomical Society. The research was led by Robert Beck, a former cosmology postdoctoral fellow at the IfA (now a professor at Eötvös Loránd University in Hungary), and included members from both institutions, as well as Stanford Health Care's Platform Services.
Novel Computational Tools
As they describe in their study, the team began by taking publicly available spectroscopic measurements of the 2,902,054,648 objects studied in the PS1 3pi survey, which provided definitive object classifications and distances. They then fed these to an artificial intelligence algorithm, which sorted the objects into stars, galaxies, quasars, or unsure (it also derived refined estimates of the galaxies' distances).
As Beck described the process in a recent University of Hawaii News press release:
“Utilizing a state-of-the-art optimization algorithm, we leveraged the spectroscopic training set of almost 4 million light sources to teach the neural network to predict source types and galaxy distances, while at the same time correcting for light extinction by dust in the Milky Way.”
The machine learning technique they employed, known as a "feedforward neural network," was key to helping the team accurately determine the properties of different objects and sort them based on their size and photometric redshift. Overall, this process achieved a classification accuracy of 98.1% for galaxies (with distance estimates accurate to nearly 3%), 97.8% for stars, and 96.6% for quasars.
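To make the idea concrete, here is a minimal, self-contained sketch of a feedforward network classifying sources into three types. This is purely illustrative and is not the PS1-STRM code: the data are synthetic stand-ins for photometric features, the network is far smaller than the one in the paper, and all names and hyperparameters here are assumptions for the demo.

```python
# Illustrative sketch only: a tiny feedforward neural network sorting
# synthetic "sources" into star/galaxy/quasar classes. The real PS1-STRM
# network was trained on ~4 million spectroscopically labeled sources.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic photometric features (4 numbers standing in for colors and
# magnitudes), with each class forming its own cluster.
n_per_class = 200
centers = np.array([[0, 0, 0, 0], [3, 3, 0, 0], [0, 0, 3, 3]], float)
X = np.vstack([c + rng.normal(0, 0.5, (n_per_class, 4)) for c in centers])
y = np.repeat(np.arange(3), n_per_class)  # 0=star, 1=galaxy, 2=quasar
Y = np.eye(3)[y]                          # one-hot labels

# One hidden layer (tanh) feeding a softmax output, trained with
# batch gradient descent on the cross-entropy loss.
W1 = rng.normal(0, 0.1, (4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 3)); b2 = np.zeros(3)
lr = 0.5

def forward(X):
    H = np.tanh(X @ W1 + b1)              # hidden activations
    logits = H @ W2 + b2
    P = np.exp(logits - logits.max(1, keepdims=True))
    return H, P / P.sum(1, keepdims=True) # softmax class probabilities

for _ in range(1000):
    H, P = forward(X)
    G = (P - Y) / len(X)                  # grad of cross-entropy w.r.t. logits
    GH = (G @ W2.T) * (1 - H**2)          # backprop through tanh
    W2 -= lr * H.T @ G; b2 -= lr * G.sum(0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)

_, P = forward(X)
accuracy = (P.argmax(1) == y).mean()
print(f"training accuracy: {accuracy:.3f}")
```

On this well-separated toy data the network classifies nearly every source correctly; the real catalog's accuracies (96–98%) come from far messier photometry and a much deeper training set.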
Largest 3D Map Ever!
To date, the largest and most detailed 3D maps of the Universe have been created by the Sloan Digital Sky Survey (SDSS), which was launched in 2012. This survey combines data from the Sloan Foundation 2.5 m Telescope and the NMSU 1-Meter Telescope at the Apache Point Observatory (New Mexico), and the Irénée du Pont Telescope at the Las Campanas Observatory (Chile).
The most recent data release (DR16), the fourth release of the fourth phase of the SDSS (SDSS-IV), contains SDSS observations through August 2018. The final release (DR17) is scheduled for July 2021 and will include all new spectral observations, as well as all final data products and catalogs. However, the SDSS catalog covers one-third of the sky and contains spectra for over three million objects.
In comparison, the PS1-STRM doubles the area surveyed, increases the number of objects tenfold, and covers specific areas that the SDSS missed. As István Szapudi, an IfA astronomer and co-author on the study, noted:
“[A]lready, a preliminary version of this catalog, covering a much smaller area, facilitated the discovery of the largest void in the universe, the possible cause of the Cold Spot. The new, more accurate, and larger photometric redshift catalog will be the starting point for many future discoveries.”
This latest map of the Universe is a testament to how far astronomical instruments and methods have matured in a short span of time. In particular, it has shown how the value of large data sets obtained by multiple telescopes can be multiplied through the addition of machine learning techniques, improved data sharing, and complementary observations.
Ken Chambers, the Pan-STARRS Director and an IfA Associate Astronomer who was also a co-author on the study, indicated that this is only the beginning. "As Pan-STARRS collects more and more data," he said, "we will use machine learning to extract even more information about near-Earth objects, our Solar System, our Galaxy and our Universe."
The Pan-STARRS 3D catalog (approx. 300 GB in size) is now available at the Mikulski Archive for Space Telescopes. Science users can query the catalog through the Space Telescope Science Institute's (STScI) MAST CasJobs SQL interface, or download the entire bundle as a computer-readable table.