In 2018, astronomers stumbled upon a fascinating finding: thousands of black holes likely exist near the center of our galaxy.
The X-ray images that enabled this discovery weren’t from some state-of-the-art new telescope. Nor were they even recently taken – some of the data was collected nearly 20 years ago.
Discoveries like this will only become more common, as the era of ‘big data’ changes how science is done. Astronomers are gathering an exponentially greater amount of data every day – so much that it will take years to uncover all the hidden signals buried in the archives.
Sixty years ago, the typical astronomer worked alone or in a small team. They likely had access to a respectably large ground-based optical telescope at their home institution.
These observations were largely confined to optical wavelengths – more or less what the eye can see. That meant they missed signals from a host of astrophysical sources, which can emit non-visible radiation from very low-frequency radio all the way up to high-energy gamma rays. For the most part, if you wanted to do astronomy, you had to be an academic or eccentric rich person with access to a good telescope.
Old data was stored in the form of photographic plates or published catalogs. But accessing archives from other observatories could be difficult – and it was virtually impossible for amateur astronomers.
Today, there are observatories that cover the entire electromagnetic spectrum. No longer operated by single institutions, these state-of-the-art observatories are usually launched by space agencies and are often joint efforts involving many countries.
With the coming of the digital age, almost all data are publicly available shortly after they are obtained. This makes astronomy very democratic – anyone who wants to can reanalyse almost any data set that makes the news. (You too can look at the Chandra data that led to the discovery of thousands of black holes!)
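As a minimal sketch of what that looks like in practice – assuming the Python package astroquery, and using Sgr A* (the Galactic center) purely as an illustrative target and radius – here is how one might search a public archive. The Chandra data itself lives in the Chandra Data Archive, but the same pattern works for Hubble and other missions served through the MAST archive:

```python
from astroquery.mast import Observations

# Search public observations around the Galactic center
# (target and radius are illustrative choices).
obs = Observations.query_object("Sgr A*", radius="0.05 deg")
public = obs[obs["dataRights"] == "PUBLIC"]
print(f"{len(public)} public observations found")

# List the data products for the first observation and
# download just the science files.
products = Observations.get_product_list(public[:1])
science = Observations.filter_products(products, productType="SCIENCE")
Observations.download_products(science)
```

Anyone with a laptop and an internet connection can run a query like this – no institutional telescope access required.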
These observatories generate a staggering amount of data. For example, the Hubble Space Telescope, operating since 1990, has made over 1.3 million observations and transmits around 20 GB of raw data every week, which is impressive for a telescope first designed in the 1970s. The Atacama Large Millimeter Array in Chile now anticipates adding 2 TB of data to its archives every day.
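A quick back-of-the-envelope calculation puts those two rates side by side (the figures are the ones quoted above):

```python
# Rough comparison of the archive growth rates quoted above.
HUBBLE_GB_PER_WEEK = 20   # ~20 GB of raw data per week
ALMA_TB_PER_DAY = 2       # ~2 TB added to the archive per day

hubble_tb_per_year = HUBBLE_GB_PER_WEEK * 52 / 1000  # ~1 TB per year
alma_tb_per_year = ALMA_TB_PER_DAY * 365             # ~730 TB per year

print(f"Hubble: ~{hubble_tb_per_year:.1f} TB per year")
print(f"ALMA:   ~{alma_tb_per_year:,.0f} TB per year")
```

In other words, ALMA alone adds roughly the contents of 730 one-terabyte backup drives to its archive every year.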
The archives of astronomical data are already impressively large. But things are about to explode.
Each generation of observatories is usually at least ten times more sensitive than the previous one, either because of improved technology or because the mission is simply larger. Depending on how long a new mission runs, it can detect hundreds of times more astronomical sources than previous missions at that wavelength.
For example, compare the early EGRET gamma ray observatory, which flew in the 1990s, to NASA’s flagship mission Fermi, which turns 10 this year. EGRET detected only about 190 gamma ray sources in the sky. Fermi has seen over 5,000.
The Large Synoptic Survey Telescope, an optical telescope currently under construction in Chile, will image the entire sky every few nights. It will be so sensitive that it will generate 10 million alerts per night on new or transient sources, leading to a catalog of over 15 petabytes after 10 years.
The Square Kilometre Array, when completed in 2020, will be the most sensitive telescope in the world, capable of detecting airport radar stations of alien civilizations up to 50 light-years away. In just one year of activity, it will generate more data than the entire internet.
These ambitious projects will test scientists’ ability to handle data. Images will need to be automatically processed – meaning that the data will need to be reduced to a manageable size or transformed into a finished product. The new observatories are pushing the envelope of computational power, requiring facilities capable of processing hundreds of terabytes per day.
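To give a rough sense of what “reducing” image data means, here is a toy sketch – not any observatory’s actual pipeline – that shrinks an image by averaging blocks of pixels, trading resolution for a 16-fold cut in data volume:

```python
import numpy as np

def bin_image(image: np.ndarray, factor: int) -> np.ndarray:
    """Shrink an image by averaging factor x factor blocks of pixels."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor  # trim to a multiple of factor
    trimmed = image[:h, :w]
    return trimmed.reshape(h // factor, factor,
                           w // factor, factor).mean(axis=(1, 3))

# A simulated 4096 x 4096 frame: 4x4 binning cuts the pixel count 16-fold.
frame = np.random.poisson(100, size=(4096, 4096)).astype(float)
small = bin_image(frame, 4)
print(frame.nbytes / small.nbytes)  # -> 16.0
```

Real pipelines do far more – calibration, source detection, catalog generation – but the principle is the same: keep the signal, shed the bulk.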
The resulting archives – all publicly searchable – will contain 1 million times more information than what can be stored on your typical 1 TB backup disk.
The data deluge will make astronomy a more collaborative and open science than ever before. Thanks to internet archives, robust learning communities and new outreach initiatives, citizens can now participate in science. For example, with the computer program Einstein@Home, anyone can use their computer’s idle time to help search for gravitational waves from spinning neutron stars.
It’s an exciting time for scientists, too. Astronomers like me often study physical phenomena on timescales so wildly beyond the typical human lifetime that watching them in real time just isn’t going to happen. Events like a typical galaxy merger – which is exactly what it sounds like – can take hundreds of millions of years. All we can capture is a snapshot, like a single still frame from a video of a car accident.
However, there are some phenomena that occur on shorter timescales, taking just a few decades, years or even seconds. That’s how scientists discovered those thousands of black holes in the new study. It’s also how they recently realised that the X-ray emission from the center of a nearby dwarf galaxy has been fading since first detected in the 1990s. These new discoveries suggest that more will be found in archival data spanning decades.
A black-hole-powered jet of hot gas in the giant elliptical galaxy M87. Credit: NASA, ESA, E. Meyer, W. Sparks, J. Biretta, J. Anderson, S.T. Sohn, and R. van der Marel (STScI), C. Norman (Johns Hopkins University), and M. Nakamura (Academia Sinica)
In my own work, I use Hubble archives to make movies of jets – high-speed plasma ejected in beams from black holes. I used over 400 raw images spanning 13 years to make a movie of the jet in the nearby galaxy M87. That movie showed, for the first time, the twisting motions of the plasma, suggesting that the jet has a helical structure.
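For a rough idea of how such a movie can be assembled, here is a toy sketch – not the actual research pipeline – assuming a hypothetical directory of already-calibrated, mutually aligned FITS frames sorted by observation date:

```python
import glob
import matplotlib.pyplot as plt
import matplotlib.animation as animation
from astropy.io import fits

# Hypothetical directory of calibrated, mutually aligned archival frames.
# Real data would need careful registration and calibration first.
files = sorted(glob.glob("m87_jet/*.fits"))
images = [fits.getdata(f) for f in files]

fig, ax = plt.subplots()
im = ax.imshow(images[0], origin="lower", cmap="magma")
ax.set_title("M87 jet, archival frames")

def update(i):
    # Swap in the next epoch's image to advance the movie by one frame.
    im.set_data(images[i])
    return [im]

anim = animation.FuncAnimation(fig, update, frames=len(images), interval=200)
anim.save("m87_jet.mp4")  # requires ffmpeg on the system
```

The hard part in practice isn’t the animation – it’s aligning hundreds of frames taken with different instruments and pointings to a common grid.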
This kind of work was only possible because other observers, for other purposes, just happened to capture images of the source I was interested in, back when I was in kindergarten. As astronomical images become larger, higher resolution and ever more sensitive, this kind of research will become the norm.