Singularity Survey Results
Here are the results of our recent survey about the Technological Singularity. The breakdown on the big question was that 94% of you think the Singularity will occur in some form or another, while 6% think it won't. Several readers observed that there is something incongruous about making predictions about the Singularity -- it being the bound beyond which prediction is no longer possible. But that didn't slow too many of you down.
A few random observations:
The vast majority of participants believe that the Singularity is coming relatively soon. More than 60% believe that it will occur within the next 45 years; more than 80% believe that it will occur in the next 70 years.
There was a nice spread over all the different ways listed that the Singularity might start. About 57% believe that it will be intentional; about 43% believe that it will be an accident.
Nobody bit on "enslavement of the human race to machines" as a worst-case outcome. On the best-case side, there weren't many takers for "finding God" or "preserving the environment."
The most popular write-in for the best-case scenario was "all of the above."
More people voted for the Silent Singularity than for any other variety, good or bad.
One poetic participant wrote: "The purpose of thought is thought. The boon of the singularity is the singularity." He or she might be on to something there. When we talk about all the "benefits" of the Singularity, we are no doubt a bit like poor Tom Canty in The Prince and the Pauper -- hiding away the Great Seal of England because it made such a nifty nutcracker.
Finally, I must add a word of recognition for the Singularity skeptic who suggested that God will one day grab us by the lapels and ask us what kind of crack we were smoking. Thank you very much for participating.
Here, then, are the results in their entirety.

Question 2: If you answered no, thanks for your oh-so-brief-and-yet-valuable participation. If you answered yes, what kind of Singularity are we in for?
Happily Ever After: The world suddenly becomes an amazingly better place.
Rise of the Machines: Emerging intelligences destroy us, either intentionally or inadvertently in the pursuit of their own goals.
Missed Flight: We aren't really affected by the new intelligences; they go their own way and leave us undisturbed (and unhelped).
Silent Singularity: The world changes fundamentally, but once it happens it really doesn't seem like that big a deal.
Slow Roll: The Singularity does in fact take place, but takes a long time to kick in.

A) Happily Ever After
B) Rise of the Machines
C) Missed Flight
D) Silent Singularity
E) Slow Roll
F) Other
"Other" responses:
- Apocalypse or Apotheosis.
- Humans augment and combine intelligence with the machine intelligences. The result is a synergy of unknowable characteristics from here.
- #1 or #2, with 50% probability each
- These choices, while not-too-bad, are nonetheless oversimplified. Nanotech-&-robotics will surely come to pass. What is most important is to get the institutions and cultural consciousness right, as well as defensive technologies (such as active shields). Then we might just "live long & prosper..."
- Stephen here. I'm closest to "happily ever after" but I think we'll still have problems - problems that we don't have the imagination to contemplate or deal with now. Or, perhaps they are problems that we can contemplate (the end of the Universe for example) but we're too busy dealing with more immediate problems that, post-singularity, won't be problems anymore.
- I don't want to guess
- Strangeness: Egos probably don't apply. Local elements are optimized for doing theory, in a feedback cycle.
- Don't know, it's just too hard to predict.
- Rise of the Machines (inadvertent) or Happily Ever After, depending on who writes the AI. Rise of the Machines is the default.
- This is an addition to a previous survey, from a few minutes ago.
---Currently, we can only know (with some degree of certainty) about the past. I submit that, according to your premise above, we are already experiencing the upward rise of the singularity. For the next question (3), the peak (or the crash) of the singularity could take a near-infinite amount of time (i.e. in the meantime, we will likely see continuing acceleration of progress). For the time when a self-aware, greater-than-human intelligence drives progress, in question (3), I will vote 2025-2050.
- The effect of the singularity on humankind is inherently unknowable.
- I can't even guess how it will go. Isn't that the whole point of a Singularity? That said, I expect a mixed bag and unequal effects in different places.
- The singularity happens, then we realize that, even after all that, we've barely scratched the surface of our understanding of the universe.
- Mostly a Slow Roll (over 40-50 years), with a majority experiencing a Silent effect and a significant minority opting for the Missed Flight variant.
- It's a split between Happily Ever After and Rise of the Machines, depending on whether or not we solve the Friendly AI problem.
- Hard takeoff, but they destroy us due to *our* own goals, not theirs. The weakest-link, everyone can end the world scenario.
- Your definitions of soft and hard takeoff are simply wrong. These have to do SOLELY with the speed of the change, not with goodness or badness outcomes.
- "When you pray for rain, be prepared for mud". Life will be improved, but there will be new unforseen problems.

Question 3: When will the Singularity take place?
A) Before 2025
B) 2025 - 2050
C) 2050 - 2075
D) 2075 - 2100
E) 2100 - 2125
F) After 2125

Question 4: Where will the Singularity begin?
A) USA
B) Europe
C) China
D) Japan
E) India
F) Widely distributed -- no one location
G) Other
"Other" responses:
- Widely distributed but not evenly distributed. The USA will be at the center.
- S. Korea
- Hopefully USA, currently where I believe most worried rationalists are located
- Widely distributed, this follows the idea that it has started already. (i.e. the initial conditions are in place).
- This question is a red herring. The singularity will take place on planet Earth and will emerge from a global civilization. In other words, it will happen everywhere at once.
- The Anglosphere
- As an internet site, collaborating developments worldwide.

Question 5: How will the Singularity come about?
A) Deliberately, primarily through AI research
B) Deliberately, primarily through nanotechnology research
C) Deliberately, primarily through biotech research
D) Unintended, with the Internet or some other system "waking up"
E) Unintended, via self-replicating robotics
F) Unintended, via biotechnological developments
G) Other
"Other" responses:
- Silent Start implies there will be no observable "moment." It will be like TV in US households, at some point it becomes so completely ubiquitous nobody will think about it. Consider the number of people still using dial-up to access the internet. Now stop considering that continuously decreasing population. When the last dial-up user on the planet upgrades to broadband, will anyone notice?
- Unintended but predictable augmentation, ubiquitous computing/communication plus AI. Throw in MNT for faster takeoff.
- A hybrid: deliberate in that someone takes some explicit action in that direction, unintended in that the process gets "out of hand"
- Most likely singularity is human interconnection leading to massively parallel thought. AI is hard but biological intelligence is already here.
- Actually, it will be both the first two you list...
- Unintended, through a combination of all three of the above listed realms of study.
- unintended, through incremental biotech and nanotech human enhancement
- Combination of all "deliberately" choices.
- Sorry to keep going with "other." I think we will approach the Singularity deliberately primarily through AI research. But, I think that there will be significant surprises along the way. Surprises certainly for people that aren't aware of how fast technology is developing. I also think there will be some surprises for those who are working toward strong AI. It may "wake up" at a time that even the experts aren't expecting.
- Deliberately, through a combo of AI and nanotech (need the nanotech for faster computers).
- All of the above
- Unintended, with embedded systems not "waking up" so much as becoming autonomous through self-repair and upgrade systems that are better than anticipated. They may not "wake up" at all.
- Likely it has already started. Mathematical functions start with initial conditions; in this case, they occurred at t=0, a long time ago.
- Time to fix your spelling. It will be unintended via the GRIN confluence
- It has already started, a singularity has no definite beginning. The infinite peak of a mathematical singularity has a location in time.
- I think that it will be deliberate, but I don't think that one specific field will be at the forefront - I think that it will require all of them.
- The process leading to the singularity began with the big bang. The singularity is a natural phenomenon that results from the technological nature and evolution of the human species. All technologies are interconnected in feedback loops and all the above technologies will likely be involved.
- all of the above
- Synthesis of deliberate efforts on many fronts
- Synthesis of the products of deliberate effort on many fronts
- Combination of all of the above
- combination of the 3 deliberates
- Probably an unintended expression of a deliberate effort; AI research achieving independent consciousness in a biotech application or some similar variant.
- Deliberately, though likely unexpectedly, from a combination of AI, biotech, and nanotech research.
- Deliberately, through a combination of bionanoAI.
- As the accumulation of many specific techniques developed for commercial application, or as open-source projects. The tipping point will happen before anyone realizes how close we are.

Question 6: If you favor a negative scenario for the Singularity, what will be the worst outcome?
A) Squandered Singularity opportunity
B) Assumption of power by a group wielding post-singularity technology
C) Enslavement of the human race to machines
D) Environmental cataclysm
E) End of civilization
F) Destruction of the human race
G) Other
"Other" responses:
- The "evolution" of mankind into something unrecognizable by the most imaginative sci-fi writers. Today's philosophers call this a bad thing. To the post-Singularity 'every man' there will be no memory or context of life ever being different. (ex: "the Matrix" with no chance of 'waking up' to some other reality good or bad)
- Economic starvation of the human race
- Total oppression by some government or equivalent using high tech to attempt to keep its privileges. This leads to stagnation at best and is the one way we do not reach a full Singularity.
- People will look back on the present era as the last days of the pure human species, and a bit of an age of innocence.
- We have to guard against these scenarios, but I think we can circumvent them all.
- Rapid change will destabilize everything. Super-empowered individuals will cause civilization to go through some major pain. Countries will increasingly have the power to destroy each other, and fear could cause (more) preemptive wars. Probably local civilization collapses, more failed states, maybe a partial world-wide collapse. It's going to get worse before it gets better.
- Obsolescence
- Our humanity will disappear as we slowly turn ourselves into machines... through implants etc.
- Some of the above things will happen. The machines will break things and kill people literally without a thought.
- Deletion of physical universe
- Removal of human drive. We will view our attempts at understanding the world as second-rate, not worth doing.
- I don't favor a negative scenario.
- Again (as in my previous response), these are the words of naysayers, ever since the beginning of records. They will likely continue to be wrong.
- These are the words of naysayers since the beginning of records. The naysayers will continue to be wrong.
- Accelerated technological power will grant fringe movements even greater destructive capability.
- n/a
- possibly
- Once again, I can't begin to guess. The worst thing that could happen is destruction of the human race. The best possible result is probably an end to history as we know it.
- I don't favor negativity.(!) :-)
- Social chaos related to emerging technology, changes of same to economy, use of same in terrorism/warfare, partial collapse due to all of the above. Some sort of independent intelligence or modified human intelligence emerges in the process.

Question 7: If you favor a positive scenario for the Singularity, what will be the greatest boon to humanity that will derive from it?
A) The elimination of disease, aging, poverty
B) The end of war, violence, exploitation
C) Preservation of the environment
D) Achieving a new level of understanding the world
E) Achieving the next stage of evolution
F) Finding God
G) Development of viable flying car/ jet pack/ time travel technology
H) Other
"Other" responses:
- All of the above.
- Sorry if answers to 5,6,7 don't flow into easily managed statistics. Not sorry if my answers made someone think of something they had not previously considered.
- Potentially all of the above and much more.
- All of the above
- Existing problems AND existing utopian solutions will both look ridiculously short-sighted, blinkered, small in scope. Like a mediaeval asking, "Does advanced materials tech mean better swords, advanced biotech mean better breeds of horse?" Well, yes it does, but that isn't really the point anymore.
- All of the above
- Most of the above...
- We could eliminate disease, aging, poverty and resource base wars.
- All of the above. The elimination of disease, aging, and poverty will be relatively early developments after The Singularity - for that part of the world that is post-Singularity.
There will be war, no doubt, with those parts of the world that resist The Singularity. It will be much like the war being fought now, only more so.
- All of the above
- The purpose of thought is thought. The boon of the singularity is the singularity.
- The greatest boon cannot be easily predicted in advance.
- Most (but not all) of everything listed above, and likely much more (things that at present, we do not know that we either want or need). Nothing is certain, not even death or taxes. Best regards, -Mike Cooper
- By definition, we can't see it from here. Otherwise, it's not a singularity!
- Most, but not all the above. Nothing is certain (absolute).
- Colonizing other worlds, eliminating the 'all our eggs in one basket' problem
- Breakthroughs in technology will allow us to approach problems and develop technologies in directions which are now completely unexpected.
- Jeez. Listen guys, I know what I want to have happen. But asking what I think will happen violates the whole idea of this thing. That said, I personally want to upload myself into a couple of million interstellar probes and explore the galaxy. The idea is to have a 'Jack' reunion somewhere with a nice view in a hundred million years where I can show vacation slide shows and tell stories to myself.
- Individual independence from externally imposed obligation to another's interests.
- All of the above! Except for God and time travel, they're make-believe.
- All of the above
- "Finding God?" Are you kidding? We're going to create machines that will ultimately destroy the reace that God so lovingly created in His own image. The only way we'll find God is when he takes us by the lapels and asks us what kind of crack we were smoking.
- What, no "all of the above" :-)
Comments
Amazing survey, first of its kind. The overwhelming number of people who voted in the 2025-2050 range versus the before-2025 range is quite telling. A few people seem to be mixing up the Singularity with a mathematical function of some sort. The idea that the Singularity is inevitable and predestined from the Big Bang is popular.
The defeat of aging, death and poverty "won't be a big deal"? According to the majority that voted for both Silent Singularity and the defeat of aging as a boon, that is.
These survey results are especially accurate because the Singularity is defined as "smarter-than-human intelligence driving progress" rather than "technological acceleration in general".
To the person who wrote, "I don't favor a negative scenario": I'm sure everyone agrees with you - no one "favors" a negative scenario. It's just that negative scenarios are the darndest things; they happen even if you disfavor them.
Posted by: Michael Anissimov | April 13, 2006 05:26 AM
My bad on the "favor" language; probably should have used the word "expect."
Posted by: Phil Bowermaster | April 13, 2006 07:40 AM
One poetic participant wrote: "The purpose of thought is thought. The boon of the singularity is the singularity."
Thank you, that was me. I don't seem to remember using the word "boon", though.
Posted by: Samuel Kleiner | April 13, 2006 03:10 PM
Samuel --
You were probably just responding to the specific language of the question:
" If you favor a positive scenario for the Singularity, what will be the greatest boon to humanity that will derive from it?"
Posted by: Phil Bowermaster | April 14, 2006 08:19 AM
Michael, you wrote:
A few people seem to be mixing up the Singularity with a mathematical function of some sort.
I think the confusion is understandable. After all, the Singularity is named after a mathematical object, the essential singularity, which is a property of some functions.
Personally, I think a better analogue is to consider the Singularity as a phase change. But I don't think this is the place for that argument.
Posted by: Karl Hallowell | April 20, 2006 10:29 AM
Karl,
Personally, I think a better analogue is to consider the Singularity as a phase change. But I don't think this is the place for that argument.
If not here, where? This site isn't just for blonde jokes and Ernest Borgnine admiration, you know. :-)
Tell us more about the Singularity as a phase change. I'm intrigued.
Posted by: Phil Bowermaster | April 20, 2006 04:50 PM
Hello Samuel, if you find this then send a mail ASAP to oscar.haglind@mil.se
Posted by: Samskolan | September 13, 2006 11:24 AM