What if They're Hungry?
Here's an interesting piece of commentary in The Independent arguing that wantonly announcing our presence to (possible) alien civilizations may not be such a smart move:
This is not just a matter for astronomical research involving distant worlds and academic questions. Could it be that, from across the gulf of space, as HG Wells put it, there may emerge an alien threat? That only happens in lurid science fiction films, doesn't it? Well, the threat is real enough to worry many scientists, who make a simple but increasingly urgent point: if we don't know what's out there, why on Earth are we deliberately beaming messages into space, to try and contact these civilisations about whom we know precisely nothing?
It's interesting to note some of the names of those who are raising these questions:
Physicist Freeman Dyson, of the Institute for Advanced Study in Princeton, has been for decades one of the deepest thinkers on such issues. He insists that we should not assume anything about aliens. "It is unscientific to impute to remote intelligences wisdom and serenity, just as it is to impute to them irrational and murderous impulses," he says. "We must be prepared for either possibility."
[S]cientist and science-fiction author David Brin thinks those in charge of drafting policy about transmissions from Earth - ostensibly a body called the International Astronomical Union, which would make recommendations to the United Nations - are being complacent, if not irresponsible. Whatever has happened in the past, he doesn't want any new deliberate transmissions adding to the risk. "In a fait accompli of staggering potential consequence," he says, "we will soon see a dramatic change of state. One in which Earth civilisation may suddenly become many orders of magnitude brighter across the Milky Way - without any of our vaunted deliberative processes having ever been called into play."
Is this something we should be worried about? I note that "alien invasion" isn't even listed as a serious existential risk for humanity on the Wikipedia list linked by the Lifeboat Foundation, although technically I suppose such a threat would fall under the general heading of War and Genocide. Of course, there are some schools of thought that argue that we needn't worry about alien civilizations because they simply aren't there. Stephen and I took a stab at a couple of those a while back.
Generally, I'm inclined to think that those arguments are on the right track. If there are aliens, there should be at least one so far ahead of us that its presence in the universe would have announced itself to us by now. Or there may be several advanced civilizations, but once a civilization reaches a certain level of advancement, it just "drops out" of the universe -- or at least the universe as we understand it. Either way, not much of a threat.
Still, there could be other options. Maybe technological development slows down after a while. Maybe colonizing the galaxy doesn't appeal to some civilizations, but they don't mind going out and eliminating the occasional upstart threat whenever they find one. What about a civilization where some extreme ideology -- political or religious fundamentalism -- combined with advanced technology freezes development at a certain threatening level?
It's all just guesswork, of course. But then that's the thing about alien civilizations. We really don't know anything about them. It's guesswork, with different guesses informed by different assumptions and biases. Right now, we're discovering new planets right and left and our own technological prowess is growing exponentially. We'll no doubt understand the situation a lot better a few years down the road.
Meanwhile, we keep guessing. As we do, I don't see how adding a little caution to the mix could possibly hurt.
Comments
I'm pretty sure if they're out there, it will be our gold they want. And maybe our women.
Posted by: MDarling | June 26, 2007 10:38 PM
Or perhaps they're thirsty and need water. Oxygen and hydrogen may be scarce in their neck of the galaxy.
Posted by: Phil Bowermaster | June 27, 2007 08:48 AM
More seriously, any uncontrolled self-replicating thing, be it natural or artificial, is extremely dangerous over the long term. It would be a lot less worry (and perhaps bother) to just destroy the Solar System in the next few thousand years than it would be to deal with the human/manmade AI menace a few million years down the road.
Posted by: Karl Hallowell | June 28, 2007 02:26 AM
You wouldn't destroy the solar system. You'd just seed the earth with nanobots that subtly sabotage any AI project and divert our research and development to sex, drugs, mass entertainment and the like.
So a clear sign of alien intervention would be if we had the technology to reach the moon, but spent the next 4 decades developing Second Life and the iPod.
Posted by: doctorpat | July 2, 2007 03:16 AM