MANCHESTER, UK – In a provocative new paper, an English scientist wonders if so-called “artificial intelligence” (AI) is one reason why we’ve never found other intelligent beings in the universe. The Fermi Paradox captures this idea: in a universe so vast and so old, how can it be that no other civilization is sending us radio signals?

Could it be that, once civilizations develop AI, it’s a short ride into oblivion for most of them? A catastrophe of this kind, one that reliably stops civilizations before they can spread, is called a Great Filter, and AI is one of the most popular candidates. Could the inevitable development of AI by technological civilizations explain the improbable Great Silence we hear from the universe?

Michael Garrett is a radio astronomer at the University of Manchester and the director of the Jodrell Bank Centre for Astrophysics, with extensive involvement in the Search for Extraterrestrial Intelligence (SETI). Basically, while his research interests are eclectic, he’s a highly qualified and credentialed version of the people in TV shows or movies who are listening to the universe for signs of other civilizations. In this paper, which was peer reviewed and published in Acta Astronautica, the journal of the International Academy of Astronautics, he compares theories about artificial superintelligence against concrete observations from radio astronomy.

Garrett explains in the paper that scientists grow more and more uneasy the longer we go without hearing any sign of other intelligent life. “This ‘Great Silence’ presents something of a paradox when juxtaposed with other astronomical findings that imply the universe is hospitable to the emergence of intelligent life,” he writes. “The concept of a ‘great filter’ is often employed – this is a universal barrier and insurmountable challenge that prevents the widespread emergence of intelligent life.”

There are countless potential Great Filters, from climate extinction to (gulp!) a pernicious global pandemic. Any number of events could stop a global civilization before it goes multiplanetary. For people who follow Great Filter theories more ideologically, settling humans on Mars or the moon represents a way to reduce that risk. (It will be a long time, if ever, before we have the technology to make such settlements sustainable and independent.) The longer we remain only on Earth, the thinking goes, the more likely a Great Filter event is to wipe us out.

Today, AI is not capable of anything close to human intelligence. But, Garrett writes, it is already doing work that people previously didn’t believe computers could do. If this trajectory leads to a so-called general artificial intelligence (GAI), a key distinction meaning an algorithm that can reason and synthesize ideas in a truly human way, combined with incredible computing power, we could really be in trouble. In this paper, Garrett follows a chain of hypothetical ideas to one possible conclusion: how long would it take for a civilization to be wiped out by its own unregulated GAI?

Unfortunately, in Garrett’s scenario, it takes just 100-200 years. Coding and developing AI is a single-minded project, fueled and accelerated by data and processing power, he explains, compared with the messy, multidomain work of space travel and settlement. We see this split today in the flow of researchers into computing fields compared with a shortage in the life sciences. Every day on Twitter, loud billionaires talk about how great and important it is to settle Mars, but we still don’t know how humans will even survive the journey without being shredded by cosmic radiation. Pay no attention to that man behind the curtain.

There are some big caveats, or at least things to keep in mind, about this research. Garrett steps through a series of specific hypothetical scenarios, and he makes huge assumptions. He assumes there is life in the Milky Way and that AI and GAI are “natural developments” of these civilizations. He also leans on the already hypothetical Drake Equation, a way to estimate the likely number of communicating civilizations in our galaxy, several of whose variables we can only guess at.
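For reference, the Drake Equation in its classic form multiplies seven factors, only the first of which astronomers can measure with any real confidence:

N = R* × fp × ne × fl × fi × fc × L

Here N is the number of civilizations in the Milky Way whose signals we might detect; R* is the average rate of star formation; fp is the fraction of stars with planets; ne is the number of potentially habitable planets per such star; fl is the fraction of those planets where life arises; fi is the fraction of those where intelligence evolves; fc is the fraction of intelligent species that produce detectable signals; and L is how long such a civilization keeps transmitting. A scenario like Garrett’s bears most directly on that last term: a civilization snuffed out by its own GAI within a century or two of going on the air would have a vanishingly small L, dragging the whole product toward zero.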
