
Kathy vs. Cory

On Tuesday KurzweilAI republished an article by Cory Doctorow entitled "Thought Experiments: When the Singularity Is More than a Literary Device."

It got our friend Kathy Hanson's attention in part because it covers some of the same ground as Phil's recent "God and the Singularity" series (here and here).

Quotes from Doctorow are indented, Kathy's thoughts are in bold, my thoughts are in italics.


"Turing had the right insight: base the test for intelligence on written language. Turing Tests really work. A novel is based on language: with language you can conjure up any reality, much more so than with images. Turing almost lived to see computers doing a good job of performing in fields like math, medical diagnosis and so on, but those tasks were easier for a machine than demonstrating even a child’s mastery of language. Language is the true embodiment of human intelligence."

This is my favorite part of the article--if one can have a favorite part of something one is fisking!

Sure you can! And this is friendly fisking anyway.

We take for granted the miracle of language--that it informs and evokes emotion--and it's so beautiful. Perhaps mastery of language's nuances presupposes emotional intelligence--and if AI reaches that threshold, it will acquire emotional intelligence as well. Emotional intelligence is a very important component of human intelligence.

But it’s a cheat. Evolutionary algorithms depend on the same mechanisms as real-world evolution: heritable variation of candidates and a system that culls the least-suitable candidates. This latter—the fitness-factor that determines which individuals in a cohort breed and which vanish—is the key to a successful evolutionary system. Without it, there’s no pressure for the system to achieve the desired goal: merely mutation and more mutation.

I always find these evolutionary algorithm arguments a little ironic--when we're talking about AI, there's human intelligence involved, not merely blind survival of the fittest. If humans set up the fitness factor and tweak the process from time to time, that's not the same as real-world evolutionary systems--unless we believe that some intelligence originally set up the fitness factor in nature and tweaks it from time to time as well.
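Doctorow's point about the fitness factor is easy to see in a toy example. The sketch below is mine, not from the article--a minimal, hypothetical evolutionary algorithm that runs the same mutation step with and without a fitness-based cull, to show that without selection pressure a population just drifts ("merely mutation and more mutation"):

```python
import random

random.seed(42)

TARGET = "intelligence"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    # The "fitness factor": how many characters match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Heritable variation: each character has a small chance of changing.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(generations=500, pop_size=50, select=True):
    # Start from a random population of strings the same length as TARGET.
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        if select:
            # Cull the least fit: keep the top half, refill with
            # mutated copies of survivors.
            pop.sort(key=fitness, reverse=True)
            survivors = pop[:pop_size // 2]
            pop = survivors + [mutate(random.choice(survivors))
                               for _ in range(pop_size - len(survivors))]
        else:
            # No culling at all: mutation with no pressure toward the goal.
            pop = [mutate(c) for c in pop]
    return max(pop, key=fitness)

with_selection = evolve(select=True)
without_selection = evolve(select=False)
print(fitness(with_selection), fitness(without_selection))
```

With the cull in place, the population climbs steadily toward the target; without it, even the best candidate after hundreds of generations is barely better than random. The irony Kathy notes is visible right in the code: a human chose the target, the fitness function, and the mutation rate.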

"Biology would be a lot more stable if we moved away from regulation—which is extremely irrational and onerous and doesn’t appropriately balance risks. Many medications are not available today even though they should be. The FDA always wants to know what happens if we approve this and will it turn into a thalidomide situation that embarrasses us on CNN?

And I always get miffed when very intelligent and informed people suddenly lapse into ascribing such cartoonish motives to other intelligent people who are dealing with complex issues. Yes, we need to balance risks and benefits, and yes, our regulations are heavy on the risk-avoidance side of the equation. I'm reacting to Doctorow's style--thalidomide was a tragic mistake not to be trivialized. The real issue is not fear of embarrassment on CNN, but fear of catastrophic financial loss, including protracted legal battles, and of incurring the wrath of stockholders.

I blame the lawyers. Just kidding. I'm with Doctorow on this. At some point we are going to have to deregulate or, at least, completely overhaul the FDA to make it faster and more responsive. This could come in response to continual bio-terrorist attacks.

I'm reacting to his trivialization of it--sigh

After all, this is a system of belief that dictates a means by which we can care for our bodies virtuously and live long enough to transcend them. It is a system of belief that concerns itself with the meddling of non-believers, who work to undermine its goals through irrational systems predicated on their disbelief. It is a system of belief that asks and answers the question of what it means to be human.

Can we assume from this paragraph that:

Singularitarians have a monopoly on virtue?

This belief is Truth and disbelief is irrational?

Non-believers don't ask or answer the question of what it means to be human?

I happen to be a Singularitarian, but I believe the Singularity is one possibility for the future. It is not where I place my faith; it is a scenario in which faith will be acted out.

I don't think Doctorow is being prescriptive as much as descriptive. "A path" not "THE path" to virtue.

I take issue a little with his title: "Thought Experiments: When the Singularity Is More than a Literary Device." From the time Vernor Vinge described the concept, the Singularity has always been, at least, an important theory about the direction of our civilization.

And really, the Singularity is not a very good literary device. It's sort of a sci-fi killer. It's the black hole in the future. By definition there's not much we can say intelligently about what follows. A post-Singularity civilization will be as alien to our present civilization as any pre-Singularity civilization we can imagine 10,000 light years away.

Tobias handled this problem in Crystal Rain by having a pre-Singularity civilization spawned by a mysterious post-Singularity civilization.

Throughout history, humans have used literary devices to bridge the gap between what they can understand and express and what they're trying to grasp. This goes along with what we said earlier about the nuances of language that make it difficult for AI to grasp, in fact. However, I don't think Doctorow was giving literary devices their due credit--I think he was using the term to try to diminish the Singularity in the science fiction genre, as if it were an unsubstantiated figment of the imagination.

I think science fiction as a genre is an important literary device for predicting--or, if that isn't possible, imagining--the potential of the human race and civilization. The Singularity is only a sci-fi killer if we stop trying to imagine the future for fear of being wrong.

Comments

The motives of the drug companies and the FDA may not be quite cartoonish, but they aren't quite realistic either. They exist in a world where drugs are products, sold mostly to make people's lives comfortable -- oh yeah, and to cure diseases (which doesn't make as much money, but it's good publicity or something). They have a monopoly over the production of medicine worth many billions of dollars, which they are going to fight to maintain control of, and there is not much profit in boring stuff like vaccines or malaria or bioterror. Bioterror!? The customers don't even exist yet! Submit me a business plan once half the world is infected. It would be a cartoonish perspective, if it weren't so tragic.


Of course, all of the people involved are real caring compassionate human beings, but that makes very little difference. They can't come into work in the morning and decide to research malaria for the good of humanity. Malaria only gets researched if someone pays.


Besides the corruption of profit, though, there's another fundamentally unrealistic aspect to the way our medicine system is functioning now. As Kathy says, "our regulations are heavy on the risk avoidance side." The absurdity of that is the implicit (anti-singularitarian) view that the world going forward is going to stay basically the same. That we'd better not mess with how things are, because they're pretty good this way & we don't want to rock the boat. Conservatism.


The world has always been changing, and so extreme amounts of conservatism have always been a bit cartoonish, a bit absurd, but here at the brink of the Singularity it's completely absurd to expect anything to stay still even momentarily.


We must immediately begin to erect walls of technological defense around any aspect of this pretty little biosphere that we want to preserve for more than a decade or two. The only realistic perspective is the complete transformation of everything.


<3
