The Speculist: Longer Living through Plastics



Longer Living through Plastics

Wednesday evening I had a chat with a fellow futurist who told me about some exciting work that's being done in cryonics. A new approach to the problem of preserving the human brain is being developed that does not rely on cold storage -- which up to now has been the standard approach and is, as far as I know, the only form of suspension that anyone is currently using. The new approach relies on encasing the brain in plastic. And by that I don't mean putting a plastic shell around the brain, but rather infusing all the brain tissue with a resin which will harden and perfectly preserve the brain's cellular and neuron structure.

My friend explained that this approach will be a major game-changer because it won't require anything like the infrastructure and investment involved in cryonic freezing. Plastination, he said, will cost less than a casket burial, which takes the economic objection off the table. The "yuck" factor may persist as an objection, but the plastic approach has the advantage of being the second major model proposed: it will never be as shocking as the original "corpsicle" (and subsequent "headsicle") ideas.

My friend also told me that he and a colleague are working on promoting this approach via multiple channels, including raising funds to provide financial incentives for researchers who achieve defined goals toward full brain preservation. (I don't recall that he actually used the phrase "push prize," but it sounded an awful lot like that.) He pointed out that the practice might be widely adopted even by those who, for religious or other reasons, aren't interested in being revived. For example, memories retrieved from a preserved brain in "offline mode" -- meaning that no attempt is made to restore conscious brain function -- might be of great value to family members, future historians, etc.

I'm not identifying this futurist because he told me that he's not yet ready to go public with this effort, although I'm looking forward to having him on FastForward radio as soon as he is ready. Anyhow, I found it pretty interesting on Thursday, having had this conversation the previous night, to see this same idea being kicked around on Fight Aging!, Accelerating Future, and InstaPundit.

The discussion ultimately centers on this site, which argues for both the plastination method and a push prize. The site is owned and run by Kenneth Hayworth. I can't say whether there is any connection between Hayworth and my friend. The best case would be no connection -- meaning that there are several different groups and individuals working on this goal simultaneously.

Michael quotes a key piece from Hayworth's site, which I will repeat here:

From a medical and technical standpoint all that is needed is the development of a surgical procedure for perfusing a patient's circulatory system with a series of fixatives and plastic resins capable of perfectly preserving their brain's neural circuitry in a plasticized block for long-term storage. Such a procedure would, in effect, put the patient into a long dreamless sleep where they can wait out the decades or centuries necessary for the development of the more advanced technology required to revive them.

How could a patient ever be awoken from such an unconventional sleep? The necessary technology exists in primitive form today -- the plasticized brain block will be automatically sliced into thin sections and these scanned in an electron microscope at nanometer resolution. Such scanning can map out the exact synaptic connectivity among neurons while simultaneously providing information on a host of molecular-level constituents. This map of brain connectivity will then be uploaded into a computer emulation controlling a robotic body -- the patient awakes to a new dawn of unlimited potential.
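
Just to make the shape of that slice-scan-map-emulate pipeline a bit more concrete, here is a toy sketch of it in Python. To be clear, this is purely illustrative and assumes nothing beyond what the quoted passage describes: every name in it (Synapse, Connectome, scan_section and so on) is something I made up for the example, not anything from Hayworth's site or from real reconstruction software, and the "scanning" step just fabricates data.

from dataclasses import dataclass, field


@dataclass
class Synapse:
    pre: int       # id of the presynaptic neuron
    post: int      # id of the postsynaptic neuron
    weight: float  # stand-in for the molecular-level detail mentioned above


@dataclass
class Connectome:
    """The map of synaptic connectivity recovered from the scanned sections."""
    neurons: set[int] = field(default_factory=set)
    synapses: list[Synapse] = field(default_factory=list)

    def add(self, syn: Synapse) -> None:
        self.neurons.update((syn.pre, syn.post))
        self.synapses.append(syn)


def scan_section(section_id: int) -> list[Synapse]:
    """Placeholder for electron-microscope imaging plus segmentation of one
    thin section of the plasticized block. Here it just invents a synapse."""
    return [Synapse(pre=section_id, post=section_id + 1, weight=1.0)]


def reconstruct(num_sections: int) -> Connectome:
    """Assemble the per-section scans into a whole-brain connectivity map."""
    connectome = Connectome()
    for section_id in range(num_sections):
        for syn in scan_section(section_id):
            connectome.add(syn)
    return connectome


def emulate(connectome: Connectome, steps: int) -> None:
    """Stand-in for the 'computer emulation controlling a robotic body.'"""
    for _ in range(steps):
        pass  # a real emulation would propagate activity over the synapses


if __name__ == "__main__":
    brain_map = reconstruct(num_sections=10)
    print(f"{len(brain_map.neurons)} neurons, {len(brain_map.synapses)} synapses mapped")
    emulate(brain_map, steps=100)

The do-nothing emulate loop is, of course, where all of the hard -- and currently impossible -- work would live; the point of the sketch is only that the long-term storage is the easy part.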

I think this approach, once perfected, could well be the technology that pushes cryonics more or less into the mainstream. Hayworth foresees a future in which uploading human personality from a carefully preserved brain is viewed roughly the way laser eye surgery is today. I think that's about right, although the stakes are clearly higher with uploading.

Hayworth makes a passionate case that we need to overcome backward philosophical ideas in order to enable such technology in the near future. Michael reiterates that case. Both take a dim view of religion, seeing it as a primary culprit in blocking progress in this kind of research. I'll deal with that issue separately somewhere down the road, but for now I'll just state that I don't think there is any real conflict between religious belief and brain preservation, any more than there's a conflict between religious belief and this technology.

However, there is a philosophical discussion in the comments on Michael's post which I think is quite interesting.

The debate comes down to this question: if you store my brain in plastic for a couple of centuries, then slice it up to create an uploaded virtual replica, then fire up the virtual replica...have you brought me back to life? My answer to that, assuming that everything works, is a qualified "yes." (It's a major qualification, though.) The replica will have my personality, my memories, and -- from his standpoint -- a continuous experience of being Phil Bowermaster, with this one interruption -- which may be no more significant to him than a single night's sleep. From his standpoint, and from the standpoint of the outside world, I have been brought back to life.

I can even go so far as to say that from MY standpoint, as the replica, I have been brought back to life.

In fact, there is only one standpoint from which anything looks amiss. And that, of course, would be my other standpoint, the standpoint of the original Phil Bowermaster. That Phil Bowermaster, it would seem to me, gets left behind in those discarded slices of plastinated brain. So even though the replica is me as far as he is concerned and as far as the world is concerned, in an important sense -- from the point of view of the original -- I am not there.

Hayworth argues quite eloquently that this sense of something being amiss is based on an illusion. I find the argument compelling but less than completely convincing or satisfying. Michael points out that consciousness is not continuous, anyway, that it is interrupted daily by sleep and can be more severely messed with by things like head trauma and coma. However, I'm not concerned with continuity of consciousness. My concern is continuity of substrate.

I prefer a digitization scheme in which the old substrate functions concurrently with, and is slowly replaced by, the new one. That is to say, I need to consciously experience moving from my brain to the computer in order to accept that I have in fact made the move. Michael and Hayworth would argue that this is illusory thinking and bad philosophy. I would counter that this is merely being careful.

I say rather than slicing up my dead brain and reading it straight into digital form, I'd like to hang in until nanotechnology actually enables deplastinating and reviving my brain in a nice new cloned or robotic body. From there, I'd be happy living in non-uploaded form for a brief time until a conscious, gradual upload can be arranged. In IT terms, we're talking about warm standby rather than cold standby. It might be more difficult and more expensive, but having waited decades or centuries, I'm okay with taking a few extra steps to make sure that my survival is actually my survival.
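
For the non-IT readers, here is a toy sketch in Python of the distinction I have in mind. The class names (Primary, WarmStandby, ColdStandby) are invented for the illustration; the point is that a warm standby runs alongside the original and tracks every change as it happens, while a cold standby is only reconstructed later from a stored snapshot.

class Primary:
    """The original, live system -- in my analogy, the biological brain."""

    def __init__(self):
        self.state = {}

    def write(self, key, value):
        self.state[key] = value


class WarmStandby:
    """Runs alongside the primary and mirrors every change as it happens."""

    def __init__(self):
        self.state = {}

    def mirror(self, key, value):
        self.state[key] = value


class ColdStandby:
    """Brought up only later, from a snapshot taken of the primary."""

    def restore(self, snapshot):
        return dict(snapshot)  # a copy created after the fact


if __name__ == "__main__":
    primary = Primary()
    warm = WarmStandby()

    for i in range(3):
        primary.write(f"memory_{i}", i)  # the original keeps on living...
        warm.mirror(f"memory_{i}", i)    # ...while the warm standby tracks each step

    snapshot = dict(primary.state)       # the cold path only ever sees a snapshot
    cold_state = ColdStandby().restore(snapshot)

    print("warm standby:", warm.state)
    print("cold standby:", cold_state)

The end state can look identical from the outside either way; the difference I care about is that the warm path never has a gap during which the original is simply gone.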

Hayworth presents a mind-uploading bill of rights which reads in part:

Revival rights -- The revival wishes of the individual undergoing brain preservation should be respected. This includes the right to refuse revival under a list of circumstances provided by the individual before preservation.

Bingo. My circumstances would include, among other things, the requirement that my suspended brain first be revived and that I be uploaded via a warm standby approach. Call me old-fashioned, but when I get brought back to life, I want to be there to see it.

Comments

"Mr. McGuire: I want to say one word to you. Just 1 word.
Benjamin: Yes, sir.
Mr. McGuire: Are you listening?
Benjamin: Yes, I am.
Mr. McGuire: Plastics."

--'The Graduate', 1967

From a psychiatric point of view (and a long career dealing with issues of identity and self-consciousness), I can assure you that you are correct to desire continuity of substrate. That's the only way I would agree to the process described, and I would oppose any kind of destructive uploading that does not allow for continuity of consciousness. There is no form of destructive uploading in which the resulting person does not know they are not the original, and I think that knowledge would introduce identity discontinuities that would lead to disastrous personality defects. Our personalities are flawed enough without adding such a burden to our mind-cloned offspring. For safety and sanity, continuity of consciousness would be advisable.

Not to argue over much, but I think you both may be making more of this than it really deserves.

When you take a night's sleep now, you awake having shed an indeterminate quantity of "yourself" in the hair and skin trapped in the bedsheets or washed down the drain during that morning's shower.

When you take the Plastic Nap, you awaken having accomplished the identical result, if admittedly more thoroughly.

If the former activity doesn't excessively bend the brain, why should the latter necessarily do so?

I concede this is in large part a matter of individual taste and preference, and on those grounds I join your position, but I submit that perception of the process by the recipient is much influenced by the type and degree of intellectual preparation said worthy undergoes beforehand. If you expect to "wake up" inhabiting a mechanised structure (and why that model and not some ethically satisfactory derivative of a clone instead?), why would you experience unexpected psychological "discontinuities" from an anticipated condition?

In the calculus of these sorts of problems, the degree of uncertainty of outcome increases along with the period of time required for completion. Where in your tentative decision matrix does the cost of potential, personally destructive external events get weighed against those having the potential to affect personal comfort (will you concede the reasonableness of a contractual stipulation that mind-cloning procedures can be performed as many times as required to achieve acceptable-to-the-recipient success?) (assuming the ability to finance such, of course)? What might the difference in revival time be as a result of the added stipulation of personal consciousness during the procedure? The added uncertainty needs to be honestly accounted for, don't you both agree?

In addition, I submit that being awake during mental discorporation seems an at-least-as-likely source of "disastrous personality defects" as anything mentioned so far. Wouldn't this whole issue be better dealt with during the preparation period prior to the plastination process being initiated? Such would seem more consistent with the existing cryonics model at least. I concede that resorting to plastination as a treatment option post-cataclysmic-trauma event might be problematic. Still, the exception shouldn't be the basis for deciding the rule, should it?

One aspect I don't see mentioned: what is the anticipated likelihood of this type of procedure achieving the degree of advancement such that mind-cloning isn't actually necessary? If plastic can be infused within the brain structure (how might that work with the -- apparently critical -- neurons surrounding the intestinal tract, I wonder?), then is it unreasonable to speculate on a technique/technology to un-infuse the plastic once a corrective procedure/treatment has been developed? Something equally "nano," no doubt.

'Hanging in there' until the technology makes it possible to revive the original --

A variation on this would be to let some sort of replication take place, as long as the process was non-destructive.

Let the uber-Phil be born, but leave him instructions (not demands, because uber-Phil will not have signed a contract) to figure out for himself whether he is the original Phil or not. Since by that time, the uber-Phil will have access to better AI and brain enhancements, he will be better able to decide whether Phil and uber-Phil are the same person.

If uber-Phil decides he really IS a different individual, then in the instructions ask him to revive the real Phil, using the advanced technology now available, but in a way that DOES preserve the original identity.

The argument leans, of course, on the assumption that the uber-Phil will be compassionate and honorable enough to respect the wishes of the Phil in the original substrate. It also assumes that our uber-selves will be much smarter than we are, and better able to resolve philosophical questions of identity.

Will --

I'm not particularly concerned about psychological trauma (or at least I wasn't until Shrinkwrapped brought it up!). My concern is a little more fundamental than that.

Here's an analogy.

Tonight, while you're sleeping, some highly advanced aliens perform a thoroughly non-intrusive scan of your brain and use it to power a more or less perfect replica of your body that they have constructed. When you wake up on their ship, they don't mention the technique they used to bring you there, so you assume that you've been abducted in your sleep only to wake up on an alien spacecraft.

From the standpoint of Replica-Will, you are (he is) without question Will Brown. But is that you -- the guy reading this message -- up there in the spaceship? No, THAT guy wakes up in his bed and starts thinking about breakfast.

The copy is discontinuous from the original. We know this because the original has no awareness of what's happening to the copy and goes on to have his own experiences. A non-destructive scan is never going to get you, the guy left behind, into the spaceship. And yet, the argument is that if a destructive scan is done (or if you were to die in your sleep before waking the next morning), the copy waking up on the ship is pretty much the same thing as you waking up in your bed the next morning (assuming the aliens never came along and no copy was ever made).

How does destruction of the original change anything? The copy brought back from the digitized and destroyed plastic brain has every reason to believe he's me -- but in an important sense, I am still in those scraps of discarded plastic, just as the original Will Brown is still in his bed the following morning.

Long before anyone gets resurrected, dead men WILL tell tales.

Yes, they WILL tell tales -- without conscious consent -- and they WILL be JUDGED.

Are you worthy of revival?

I'd love to see how Philip K. Dick would interpret this. Am I a man? Am I human enough? Or am I just a plastic android with a plastic brain that remembers being a man? That takes alienation to a whole new level.

If you don't know, he was a writer who had some experience with paranoid schizophrenia and some trouble distinguishing the real from the imaginary.

What if I am about to nondestructively make a copy of you? You say the copy is equal to the original. OK, after I make the copy, I am going to shoot either you or your copy. Wouldn't you prefer that I shoot the copy? Doesn't that mean that the copy is not the same person as you? Why does making the copy destructively (shooting you during the process, as it were) change this?

What if you make (or can make) two copies? They are no longer each other (each would prefer that the other be shot), but are they now both you?

Or ... future generations may dig up your mummified head and put it on display ...

Thank you, Phil, for bringing up this aspect of uploading. I have sat and thought about it on multiple occasions, and I always come away with an existential angst. I want to continuously experience my upgrade. I agree that whatever person wakes up after a (non-)destructive upload feels a continuation of consciousness, but I cannot convince myself that "I", whatever that is, would wake up in the new body/substrate.

Phil/RationalAnarchist

I'm curious, do you carry this same attitude over to other activities that involve deliberate loss of consciousness? Would you equally insist upon remaining consciously aware during major invasive surgery, say?

The reason I ask is that there exists a measurable degree of trust involved in either example (surgery or plastination upload). Trust that the procedure you expect to undergo will in fact occur as planned, and trust that the individual you contract to perform the process will comply with your stated objectives (with the usual caveats). Whether or not you consciously experience the event, you aren't likely to retain any capability to interrupt proceedings in either circumstance. Since mid-procedure interruption isn't really the objective, what's the point?

Are you the "same" person, the same "I" as it were, after undergoing liposuction and a facelift? Are you still the same person if the fat cells removed during that procedure are used in a follow-up stem cell treatment to destroy cancer cells in your blood? If the post-plastination "body" you awake in is as you anticipated it would be (robotic, clone, etc.) prior to the procedure, are you not still you, however many of you there might be at any given moment?

Anyway, given the well-documented phenomenon of implanted memory, how would you "really" know that the procedure you "remember" was the one that actually took place?

DC --

Ah, but the thing to remember is that the copy is me! If done correctly, the copy is as valid a future state of Phil Bowermaster as the original. From his standpoint, and I'm speaking as a fairly credible estimator of how he will view things, any contract signed by me is binding on him. :-)

Will --

It is not about consciousness or the loss thereof per se; it is about the locus of consciousness. It is about the "where" of the subjective experience of being me. While I'm unconscious, they can do anything they want to me -- change my hair color, remove body fat, completely replace my body -- it's all good if I wake up with the same brain. But moving the data that defines the subjective experience of being me does not actually move that subjective experience -- it simply creates the potential for a duplicate subjective experience. Moving my consciousness is the one and only change that I have to be awake for. It's a consciousness transplant, the one surgery they can't put you under for! ;-)

I'll put the question to you this way -- how do you move the locus of subjective experience by any means other than via subjective experience?

Moving my consciousness is the one and only change that I have to be awake for. It's a consciousness transplant, the one surgery they can't put you under for! ;-)

I'm not sure I'm quite ready to stipulate the latter (though it is a nifty turn of phrase :)) and I think we're back to personal preference regarding the former (in which I concur with your general position as it happens).

My reply to your question -- and not at all tongue in cheek -- is that your subjective experience is to a large degree a matter of your individual belief, and I can influence that via means both direct and indirect.

In part we're back to implanted memory again, but also into the contextual nature of human experience generally. I'm nobody's idea of an expert, but my understanding is that actual experience isn't required to achieve belief in the "reality" of an artificial experience. I'm not talking about faith in the ordinary religious context, but the full-on physiological response that might be expected as a result of some occurrence that did not physically happen to the responding individual (which I think goes some way toward explaining the rareness of stigmata displays among even the deeply religious faithful, for example -- belief/faith isn't enough absent some individuating trigger event). If I can make you believe you experienced self-aware consciousness during the physical transplant of your mind to another substrate, then you did, irrespective of the actual circumstance. And no argument to the contrary will convince you otherwise, because you remember!

I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant procedure? Either there will be a transitory -- but detectable, perhaps disruptive to the point of identity discontinuity -- interruption in personal awareness, or there will need to be some also-transitory period of multiple consciousness loci (with all the personality disturbance you've imputed to that along with it). I don't think the proposition allows for anything else.

What think you?

I had the privilege of watching the funeral of a dear departed who was about to be cryogenically preserved. Unlike any ordinary, tearful funeral, I was amazed at how upbeat and fun the ceremony was.

Everyone gathered around a piano and sang,

Freeze a jolly good fellow!
Freeze a jolly good fellow!

Will --

I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant procedure?

Objective measurement is beside the point. Subjective experience will suffice.

Either there will be a transitory -- but detectable, perhaps disruptive to the point of identity discontinuity -- interruption in personal awareness...

...in which case, in my view, the procedure has not worked.

...or there will need to be some also-transitory period of multiple consciousness loci...

...which is okay with me. Actually we're talking about a single subjective experience of consciousness centered in two places simultaneously. John Smart and others have described how this could happen.

...(with all the personality disturbance you've imputed to that along with it).

You're confusing me with Shrinkwrapped. For me continuity of substrate can include a continuous process from one substrate to another. In fact, I think by my definition it has to include this.

"Thus conscience does make cowards of us all." (Hamlet)

Identity is not an either/or. I am much less the baby I once was than the teenager I once was. Commonality is a matter of degree. I will not be exactly the same as the uploaded creature, but I will be more the same than were you to revert me to a baby, by aging me back through perfected formulae that touch substrate not (for physics oft forgets time's arrow and there be no means to say the substrate is fairer kept forward than backward). I would also prefer that a somewhat common upload persist than that, by becoming something wholly uncommon, I desist. Corpses are all remarkably alike and closer in substrate to living men than babies are to any man, but I consider that substrate change of corpse, though but a minor jump, to be far greater than an upload hop.

Put in more coherent language, how do substrate-theorists deal with reverse ageing, which preserves a substrate? When they regress (their mind too) into babyism, are they really more preserved than had they been uploaded?

Thus asks Panda.

>> "If done correctly, the copy is ... valid..."

Yes, of course. But that begs the question.

My point is that if we have doubts, scientific or philosophical, then we should perform any augmentations on the copy, and not the original. Then, when we have augmented the intelligence of the copies of ourselves, we ask THEM if they believe they are the same person. If they believe they ARE the same person, then fine. But I don't think we can assume that we now know everything they will know.

If they decide they are different from us, then leave a request for them to revive and augment the originals, with whatever technology is available at that point, but in a way which DOES preserve identity.

However, if any of us are absolutely certain that the copies WILL be US, then we don't need to leave any instructions for our future selves; just let them continue our existence, stronger and smarter and better than we were before.

Frankly, I think we all need to consider the possibility that our transformations won't necessarily be that of a human individual, to another more advanced human individual. It might well be more akin to butterfly larvae (caterpillars) becoming butterflies. In other words (just a possibility), our advanced selves might not really be our selves anymore.

As long as they are smart and peaceful and compassionate, that isn't so bad.
