The Speculist: Navigating the Future


Live to see it.




Navigating the Future

Economists have a Latin phrase they use, in part, to acknowledge the difficulty of making predictions in the real world - ceteris paribus. It can be translated “all other things being equal.” An economist would say, for example, that if the supply of oil goes down or its demand goes up (or both), basic economics dictates that its price will go up... ceteris paribus.

But in the real world "other things" usually do change. Alternative energy solutions could be pursued. In my recent “Algae Economy” post I argued that this is already happening and that several alternatives are closer to being realized than most people know.

There’s a scene in The Empire Strikes Back on the planet Dagobah:

Luke: I saw - I saw a city in the clouds.
Yoda: [nods] Friends you have there.
Luke: They were in pain...
Yoda: It is the future you see.
Luke: The future? [pause]
Luke: Will they die?
Yoda: [closes his eyes for a moment] Difficult to see. Always in motion is the future.

There’s always a chance - perhaps even the probability - that something unforeseen will disrupt even the most carefully constructed prediction. Luke Skywalker (or alternative fuels) could charge to the rescue.

Even if something unforeseen doesn't happen, the things that we do foresee can interact in unexpected ways. There are just too many variables. In my "Spock's Chessboard" analogy, multiple technologies develop exponentially on the separate platforms of a 3-D chessboard. At critical points in their development, a technology leaps from its own platform and revolutionizes a different platform, or even spawns a new one. For example, quantum research could produce a quantum computer, thereby revolutionizing the computation platform. Or solar cells could reach grid parity and revolutionize energy production.

Phil has said:

[Kurzweil makes] extremely specific predictions and seems to get it right a lot of the time. But he is predicting specific technological levels of achievement falling along what he sees as a predictable curve. I think people -- and what they end up doing with the new technologies -- are harder to predict.

Predicting the actions of an individual is particularly tough. The price of gas goes up but for some reason my friend Paul increases his driving. Fortunately for the futurist, the strange actions of a few people have little weight compared to the more predictable actions of large groups of people.

In our latest FastForward Radio show Michael explained why Adam Smith’s Wealth of Nations has influenced him. The thing that made an impression on Michael was the concept of the “invisible hand.” The actions of many self-interested and often irrational people create an emergent meta-intelligence that is, somehow, logical.

But individuals have always had trouble trusting emergence. A reviewer wrote the following about Kevin Kelly's book Out of Control:

Why are the three most powerful forces in our world--evolution, democracy and capitalism--so controversial? Hundreds (in the case of democracy, thousands) of years after they were first understood, we still can't quite believe these three phenomena work. Socialist Europe resists capitalism, the religious right in America questions evolution and the Middle East makes a mockery of democracy. When you think about it, it's easy to understand why: all three are radically counterintuitive. "One person, one vote?" What if they vote wrong?

But that's the problem--we're thinking about it. Our brains aren't wired to understand the wisdom of the crowd. Evolution, democracy and capitalism don't work at the anecdotal level of personal experience, the level at which our story-driven synapses are built to engage. Instead, they're statistical, operating in the realm of collective probability. They're not right--they're "righter". They're not predictable and controllable--they're inherently out of control. That's scary and unsettling, but also hugely important to understand in a world of increasing complexity and diminishing institutional power.

These three emergent systems - evolution, democracy, and capitalism - have been the great engines of improvement. And now we have a fourth emergent system - the Internet.

These four systems advance "the human imperative." We should be careful not to cripple them with regulation. Regulators might be motivated by the best intentions, but the end result could be a dangerous slowing of progress. PayPal founder Peter Thiel had this to say regarding Bill Joy's prediction that the future doesn't need us:

There are obviously dangers in these technologies. But I think Joy’s approach [relinquishment] would actually lead to the future he fears. If the virtuous people relinquish these things, it means that they will be developed by the evil people, and that seems to me to be a recipe for these technologies going wrong.

The only way for something like Joy’s approach to work would be basically a totalitarian world-state in which we control the technologies worldwide. It is incredibly arrogant to say that the only smart people in the world exist in the United States and that if you can stop it in the U.S., you’ll stop it everywhere. Maybe it’s going to be developed by the Chinese military. Maybe it’ll be developed by people working for Islamic terrorist groups.

The anti-Joy view that I would articulate is that what we need to be doing is to be pushing the accelerator further and harder. What I fear is that people working in free countries, where I think these technologies are likely to be developed in a more benign way, are being blocked by bureaucratic regulation and by cultural ideas that we shouldn’t be doing this. We are on this technological arc. We don’t know where it’s going to go, but I think the best trajectory is for us to just hit the accelerator really hard.


This post is a sequel to my 2006 post, "The Game Board."

Comments

Regulators might be motivated by the best intentions, but the end result could be a dangerous slowing of progress.

I think one of the keys going forward is to have regulation and technological development more intertwined than they are now. Regulation as currently practiced tends either to allow something or to prohibit it. We need technologists and regulators working together toward win-win scenarios. For example, current nuclear fission technology addresses all (or at least most) of the safety and environmental concerns that led to the widespread elimination of nuclear power as a going-forward strategy. Smart regulation would not have been an outright ban on nuclear power; it would have set requirements for implementation and operation that eliminated those problems. Regulation should raise the bar to make technologies safer (and cleaner), not prohibit their development. Likewise, why not set some parameters for what would constitute truly low-impact and environmentally recoverable oil drilling? Regulation shouldn't be a matter of an absolute prohibition of oil drilling in the Gulf, or ANWR, or the Bakken -- it should outline what advances in oil drilling, refinement, and transport MUST be made and strictly adhered to in order to allow drilling in these areas.

Then let the X-prizes for clean oil drilling begin.

We need constructive engagement and win-win scenarios. Above all, we need a non-luddite (or I should say pro-technology, pro-economic-growth) regulatory environment. We want some very well-defined parameters in place around what can and cannot, should and should not, be done with GNR. Take my suggestion in the comment on the previous thread about adding an evolutionary algorithm to a self-assembling system. That might be a really bad idea that should be banned outright. Or it might be a really good idea, one that should be pursued only in a very tightly controlled environment, with very well-defined parameters around how far that algorithm can go.

The notion of the net as the fourth fundamental emergent technology of culture is terrific. We have no way of knowing where it will lead us, and that range of outcomes includes doom.

I disagree that regulators and innovators should work together. That is called "regulatory capture", i.e., that the regulated lead the regulators around by the nose, because they know more and care more.

Think of the FDA and the SEC, not to mention the NRC. We did not ban nukes; we regulated them out of existence by making them too costly, in part by making the process too lengthy and fraught.
