Is the brain too strange to emulate?
Our friend Will Brown points to this article from Chris Chatham:
10 Important Differences Between Brains and Computers
Chris argues that the metaphor that "our brains are computers" has been valuable. But, like most metaphors, it is eventually checked by reality. He points out how vastly different our brains are from digital computers. Why, it's almost as if one evolved biologically and the other were artificial!
I suspect that the brain/computer comparison is more than a metaphor. The brain is a strange organic system far different from what any human computer scientist would design. That said, I suspect that it can be emulated by a sufficiently powerful Turing Machine.
Informally, the Church–Turing thesis states that if an algorithm (a procedure that terminates) exists, then there is an equivalent Turing Machine (equivalently: a recursively-definable function, or a λ-definable function in the lambda calculus) for that algorithm. One conclusion to be drawn is that if a computer can effectively calculate an algorithm, then so can an equivalent Turing Machine.
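To make that concrete: a Turing machine is nothing more than a finite table of rules driving a read/write head back and forth over a tape. Here is a minimal simulator, a sketch of my own for illustration only (the `run_tm` function and the bit-inverting example machine are mine, not from Church, Turing, or the article):

```python
# Minimal one-tape Turing machine simulator (illustrative sketch).

def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    """transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), 0 (stay), or +1 (right).
    Runs until the machine enters the "halt" state; returns the tape contents."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: invert every bit, halting at the first blank cell.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_tm(invert, "10110"))  # prints 01001
```

The point of the thesis is that anything we'd call an effective procedure can, in principle, be encoded as such a rule table, however enormous, which is why "Turing-complete" is the standard yardstick for computational power.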
So even if the brain is not a Turing machine, it could be emulated by a sufficiently powerful Turing machine. In theory.
-Linkathon.

Comments
So even if the brain is not a Turing machine, it could be emulated by a sufficiently powerful Turing machine. In theory.
Are you certain about that? How many lifetimes of the universe would it take? What are your sources?
It's not correct to assume that brain function is describable by what most computer scientists think of as an "algorithm." Not unless you consider the universe itself to be an algorithm.
You mentioned Will Brown, but failed to provide a link. Please observe proper link etiquette, as people are watching.
Posted by: legion | September 8, 2007 07:09 AM
As I mentioned in a previous comment, there are limits to abstraction in the real world. No one has succeeded in creating an algorithm that describes how the brain does what it does.
Posted by: legion | September 8, 2007 07:11 AM
I'm Will Brown; mind your manners, [well-phrased personal attack removed.]
As to your emulation idea, Stephen, I think the added levels of complexity inherent to that model work against it. Any model that says, "First, build a Turing machine ..." sort of misses the point, don't you think? Isn't a Turing machine simply an archaic term for AGI?
Learning to emulate specific brain functions will almost certainly be useful to the development process of integrating manufactured enhancement into human brain/mind capability. I think that process is the more likely (if somewhat serendipitous) route to independent AGI than the purely mechanical emulation one.
Posted by: Will Brown | September 8, 2007 10:10 AM
Legion:
Follow the "Linkathon" link and you will learn that this series is where people email us with links. I can't provide a link to the email.
Am I certain that biological brains can be simulated in silicon? Nope. Did the words "I suspect" not clue you in on that? This is my opinion.
How many lifetimes of the universe? Well, let's give it 100 years. If it doesn't happen within that time "I'd suspect" that it probably can't happen for one reason or another.
See Arthur C. Clarke on the danger of saying something is "impossible".
Posted by: Stephen Gordon | September 8, 2007 11:35 AM
Legion --
It's not correct to assume that brain function is describable by what most computer scientists think of as an "algorithm."
Allow me to make a response that is just as logical and evidence-backed as your position:
"Yes it is."
Not unless you consider the universe itself to be an algorithm.
Any reason why we shouldn't? Other than your say-so?
Posted by: Phil Bowermaster | September 9, 2007 10:04 PM
Will:
Isn't a Turing machine simply an archaic term for AGI?
No. Read here about the related concept of "Turing completeness."
Posted by: Stephen Gordon | September 10, 2007 05:36 AM
Will:
Also, check out this Speculist post.
Posted by: Stephen Gordon | September 10, 2007 05:40 AM
Ahh, I think I see the problem. A Turing machine is programmable for independent operation, but not necessarily able to pass the Turing test for independent intelligence.
Discovering whether or not I can pass the test can be left for another occasion.
Posted by: Will Brown | September 11, 2007 02:06 AM