
What the search for alien ‘megastructures’ taught us about seeking life beyond Earth

In 2015, the same year an immense observatory on Earth caught the first ripples in the fabric of spacetime, scientists began toying with a rather far-fetched idea: If intelligent aliens are out there, might they have tried building a scientific megastructure of their own? And if they did, can we find it? Actually, have we already?

Yes, I am being fully serious. It all begins with a fascinating paper entitled “Planet Hunters IX. KIC 8462852 – where’s the flux?”

In this paper, a crew of researchers presented their analysis of data gleaned from NASA’s Kepler telescope. It concerned a star about 1,470 light-years from where you’re sitting named KIC 8462852, or Boyajian’s Star, a nod to the study’s lead author, Tabetha Boyajian. According to the team’s results, Boyajian’s Star seemed to exhibit a bunch of very peculiar dips in light.

Normally, when studying a star from our vantage point in the cosmos, telescopes see dips in starlight whenever something passes between them and the star itself. Imagine you’re staring at a bright lightbulb and someone walks in front of it: the light would appear briefly interrupted. Usually, as you may expect, that dimming comes from an exoplanet crossing the face of its stellar host. But not for Boyajian’s Star.
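For a sense of why those dips raised eyebrows, it helps to know how deep an ordinary planetary transit is. Here’s a minimal sketch of the geometry (my own illustration, not from the study): an opaque body crossing a star blocks a fraction of its light roughly equal to the square of the radius ratio, which for even a Jupiter-sized planet around a sun-like star is only about 1%.

```python
# Minimal sketch: the fractional dip from an opaque body crossing a star
# is roughly (R_body / R_star) ** 2. Values below are standard radii in km.

R_SUN_KM = 695_700       # radius of the sun
R_JUPITER_KM = 69_911    # radius of Jupiter
R_EARTH_KM = 6_371       # radius of Earth

def transit_depth(r_body_km: float, r_star_km: float = R_SUN_KM) -> float:
    """Fraction of starlight blocked by an opaque body crossing the star."""
    return (r_body_km / r_star_km) ** 2

print(f"Jupiter-sized planet: {transit_depth(R_JUPITER_KM):.2%}")  # ~1%
print(f"Earth-sized planet:   {transit_depth(R_EARTH_KM):.4%}")    # ~0.008%
```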

“It’s not a sphere,” Daniel Giles, a postdoctoral researcher at the SETI Institute said during the 243rd meeting of the American Astronomical Society in January. “It’s composed of something like a bunch of panels … it looks like what a megastructure would look like.”

Following that 2015 result, the crowd went wild. News articles, follow-up observations, opinion pieces and even just general chatter started rippling through the astronomy niche. Okay, pause. I’ll save you the trouble and let you know that the ultimate consensus was: No, these weird dips weren’t caused by a massive piece of futuristic alien technology. “It’s probably dust,” Giles said. But here’s the thing.

Related: ‘Dyson sphere’ legacy: Freeman Dyson’s wild alien megastructure idea will live forever

“Signals like this were actually missed in the Kepler data,” Giles explained. In fact, a huge reason the researchers behind the paper found the light-dip anomaly at all was that citizen scientists spotted it by accident while searching for something else.

Or as Giles puts it: “People weren’t looking.”

So, that’s precisely what he and fellow researchers aim to do. Perhaps, they believe, the truth about aliens lies straight in the data — we just have to look for it. But, like, really look.

Enlisting the machines

In short, Giles and his team intend to search for the confusing, mysterious, intriguing and starkly out-of-the-ordinary signals in data collected by NASA’s Transiting Exoplanet Survey Satellite, or TESS. They want to hunt for starlight dips that don’t have a defined shape, a defined depth or even a defined timeframe. The cosmic outliers.

NASA’s TESS mission was designed to identify exoplanets, but that’s no reason not to use its data to better understand stars as well. (Image credit: MIT)

Strange dips like these can be spotted in photometric light curves, which chart a star’s brightness over time. “We’re counting photons,” Giles explained in a nutshell. The kicker, however, is precisely how the team wishes to embark on this anomaly-hunting quest: machine learning.
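Before getting to the machine learning, it helps to picture what one of those light curves actually is. Here’s a toy example with made-up numbers, just to illustrate the idea: photon counts per exposure, normalized so the star’s typical brightness sits at 1.0, with any dip showing up as points below that baseline.

```python
# Toy light curve (made-up numbers): photon counts per exposure, normalized
# to the star's median brightness so dips show up as values below 1.0.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=50_000, size=1_000).astype(float)  # photons per exposure
counts[400:420] *= 0.8                                      # fake a 20% dip, 20 cadences long

flux = counts / np.median(counts)      # relative brightness over time
dips = np.flatnonzero(flux < 0.95)     # cadences noticeably fainter than usual
print(f"dip from cadence {dips.min()} to {dips.max()}, depth about {1 - flux[dips].min():.0%}")
```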

The process is pretty much as follows.

TESS data used in the study is based on the satellite’s view of different sky sectors. Each sector was observed for roughly 30 days at a time; during that scan, TESS took a snapshot of the observed area once every 30 minutes. This eventually led the team to about 60 million light curves ready for analysis, generated for stars brighter than magnitude 14. In the magnitude system, smaller numbers mean brighter objects: a magnitude 0 object is 100 times brighter than a magnitude 5 object, for instance. A full moon goes into the negatives with a magnitude of around -12.6; the sun shines at around magnitude -27. And so on.
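That magnitude math is easy to turn into code. A quick sketch of the scale: every five magnitudes corresponds to a factor of 100 in brightness, so the brightness ratio between two objects follows directly from their magnitude difference.

```python
# The magnitude scale: a difference of 5 magnitudes = a factor of 100 in
# brightness, so the ratio between two objects is 100 ** (delta_mag / 5).

def brightness_ratio(mag_faint: float, mag_bright: float) -> float:
    """How many times brighter the second object is than the first."""
    return 100 ** ((mag_faint - mag_bright) / 5)

print(brightness_ratio(5, 0))        # magnitude 0 vs magnitude 5 -> 100.0
print(brightness_ratio(14, -12.6))   # a magnitude-14 star vs the full moon -> ~4e10
```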

The next step is to organize those light curves en masse based on things like their shapes and periodicities. “We’re processing 60 million different light curves so we need them to be cheap and easy to calculate,” Giles said. “We calculate these cheap metrics and then we run the anomaly detection on it, and this is a density based anomaly detection — we find out what has features that stick out.”
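The talk didn’t spell out the exact tooling, but a density-based outlier pass over cheap per-curve summary statistics could look roughly like the sketch below. Everything here is a stand-in chosen for illustration (scikit-learn’s LocalOutlierFactor and these particular metrics), not necessarily what the team uses.

```python
# Hypothetical sketch of a density-based anomaly pass over cheap light-curve
# metrics; LocalOutlierFactor and these features are stand-ins for illustration.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def cheap_metrics(flux: np.ndarray) -> list[float]:
    """A few inexpensive summary statistics for one light curve."""
    return [
        float(np.std(flux)),                      # overall scatter
        float(np.min(flux)),                      # deepest point (catches dips)
        float(np.ptp(flux)),                      # peak-to-peak range
        float(np.median(np.abs(np.diff(flux)))),  # point-to-point jitter
    ]

rng = np.random.default_rng(1)
curves = [1 + 0.001 * rng.standard_normal(1_000) for _ in range(500)]
curves[42][300:330] -= 0.2    # plant one oddball with a deep, ragged dip

features = np.array([cheap_metrics(c) for c in curves])
labels = LocalOutlierFactor(n_neighbors=20).fit_predict(features)  # -1 marks outliers
print(np.flatnonzero(labels == -1))  # curve 42 should be among the flagged ones
```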

Then, after culling the data down to a manageable size, the team applies more granular techniques that typically take more computational power: the nitty-gritty, difficult-to-do analyses. “We ensure that the behavior actually exists, and is astrophysical and not due to an instrumentation issue,” Giles said.

If something exhibits a recognizable pattern, well, time to go back to the culling stage.

“Finally, we go through manually,” Giles said, “because nothing is better at finding weird stuff than the human eye.”

To find an alien, you might need a human

To be perfectly honest, I was thrilled to hear that something intrinsically human can find strange things like no machine really can. I think it grounds our admittedly wild endeavor of trying to locate intelligent aliens. We’re inherently curious, I suppose, and somehow drawn to lapses in patterns.

“There’s a certain level to which we can use ML methods,” Giles told Space.com, “but ultimately, we need to be able to understand why it is things are happening.”

Maybe a pool full of even the most highly accurate datasets is just that, a pool full of highly accurate datasets, until a human starts parsing through it to make connections a machine hasn’t yet been programmed to recognize.

“For things like anomaly detection, there’s an additional trick,” Giles said. “There’s not a ground truth, so we can’t train something necessarily to find the weirdest stuff, or the stuff that’s the most interesting, because we don’t necessarily know what that’s going to be.”

KIC 8462852 and another bright star for comparison, showing that it has a distinct protrusion to the left (east). (Image credit: Boyajian et al )

Even when it comes to standard robotics that aim to mimic human structure, a limiting step for scientists is decoding the physical laws that dictate the way we move. That’s because, as humans, we don’t really need to know how some aspects of our bodies work. They just work. A few years ago, for instance, one team made a breakthrough in figuring out how our fingerprints affect our grip. You know how, when you wash slippery dishes, you instinctively know how hard to hold them so they don’t fall out of your hands? You’re unconsciously relying on your fingerprints the whole time. But scientists literally had to formulate a new law of physics to convert that instinct into written fact.

There seems to be a similar concern for ML — and artificial intelligence, for that matter — even though the two are technically trainable to come up with some solutions of their own. It’s tough to program a machine to find something that we haven’t found before, because what would we tell it to look for? It’s sort of like how scientists champion the James Webb Space Telescope as the invention that might answer some cosmic questions we never thought to ask.

“There are limits to what AI and ML can do for us, but there are also a lot of opportunities as long as we understand what ML is doing specifically,” Giles said.

Food for thought.

Boyajian’s star in infrared and ultraviolet. (Image credit: IPAC/NASA, STScI (NASA)/Wikimedia Commons)

However, Giles says the team is also trying to search for specific anomalies that are indeed codable. “We have injected nearly 2 million different artificial signals into light curves that don’t have any dip signatures we know about, but still have artifacts in them so they still have behavior going on,” he said.
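An injection-and-recovery test like the one Giles describes can be sketched in a few lines. This is my own simplified stand-in, with invented dip shapes and thresholds, just to show the shape of the exercise: drop an artificial signal of known properties into an otherwise quiet light curve, then check whether the search gets it back.

```python
# Simplified, hypothetical injection-and-recovery sketch: add a known artificial
# dip to a quiet light curve and check whether a toy detector recovers it.
import numpy as np

def inject_box_dip(flux: np.ndarray, start: int, width: int, depth: float) -> np.ndarray:
    """Return a copy of the light curve with a box-shaped dip multiplied in."""
    injected = flux.copy()
    injected[start:start + width] *= (1 - depth)
    return injected

def recovered(flux: np.ndarray, threshold: float = 0.99) -> bool:
    """Stand-in for the real pipeline: did anything drop below the threshold?"""
    return bool(np.any(flux / np.median(flux) < threshold))

rng = np.random.default_rng(2)
quiet_curve = 1 + 0.002 * rng.standard_normal(1_300)   # ~27 days of 30-minute cadences

trial = inject_box_dip(quiet_curve, start=600, width=12, depth=0.03)
print(recovered(quiet_curve), recovered(trial))   # expect False, then True
```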

As for the anomaly results so far? “None of these so far speak to us like they’re megastructures,” Giles said.

“But they are certainly interesting.”
