<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Longer Living through Plastics</title>
	<atom:link href="https://blog.speculist.com/life_extension/longer-living-t.html/feed" rel="self" type="application/rss+xml" />
	<link>https://blog.speculist.com/life_extension/longer-living-t.html</link>
	<description>Live to see it.</description>
	<lastBuildDate>Thu, 16 Dec 2021 08:21:00 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.6.1</generator>
	<item>
		<title>By: DCWhatthe</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5171</link>
		<dc:creator>DCWhatthe</dc:creator>
		<pubDate>Thu, 01 Apr 2010 21:58:45 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5171</guid>
		<description><![CDATA[&gt;&gt;If done correctly, the copy is ... valid...&quot;

Yes, of course.  But that begs the question.

My point is that if we have doubts, scientific or philosophical, then we should perform any types of augmentations off the copy, and not the original.  Then, when we have augmented the intelligence of the copies of ourselves, we ask THEM if they believe they are the same person.  If they believe they ARE the same person, then fine.  But I don&#039;t think we can assume that we now know everything they will know.

If they decide they are different from us, then leave a request for them to revive and augment the originals, with whatever technology is available at that point, but in a way which DOES preserve identity.

However, if any of us are absolutely certain that the copies WILL be US, then we don&#039;t need to leave any instructions for our future selves; just let them continue our existence, stronger and smarter and better than we were before.

Frankly, I think we all need to consider the possibility that our transformations won&#039;t necessarily be that of a human individual, to another more advanced human individual.  It might well be more akin to butterfly larvae (caterpillars) becoming butterflies.  In other words (just a possibility), our advanced selves might not really be our selves anymore.

As long as they are smart and peaceful and compassionate, that isn&#039;t so bad.]]></description>
		<content:encoded><![CDATA[<p>>>If done correctly, the copy is &#8230; valid&#8230;&#8221;</p>
<p>Yes, of course.  But that begs the question.</p>
<p>My point is that if we have doubts, scientific or philosophical, then we should perform any types of augmentations off the copy, and not the original.  Then, when we have augmented the intelligence of the copies of ourselves, we ask THEM if they believe they are the same person.  If they believe they ARE the same person, then fine.  But I don&#8217;t think we can assume that we now know everything they will know.</p>
<p>If they decide they are different from us, then leave a request for them to revive and augment the originals, with whatever technology is available at that point, but in a way which DOES preserve identity.</p>
<p>However, if any of us are absolutely certain that the copies WILL be US, then we don&#8217;t need to leave any instructions for our future selves; just let them continue our existence, stronger and smarter and better than we were before.</p>
<p>Frankly, I think we all need to consider the possibility that our transformations won&#8217;t necessarily be that of a human individual, to another more advanced human individual.  It might well be more akin to butterfly larvae (caterpillars) becoming butterflies.  In other words (just a possibility), our advanced selves might not really be our selves anymore.</p>
<p>As long as they are smart and peaceful and compassionate, that isn&#8217;t so bad.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Panda</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5170</link>
		<dc:creator>Panda</dc:creator>
		<pubDate>Thu, 01 Apr 2010 11:05:32 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5170</guid>
		<description><![CDATA[&quot;Thus conscience does make cowards of us all.&quot; (Hamlet)

Identity is not an either/or.  I am much less the baby I once was than the teenager I once was.  Commonality is a matter of degree.  I will not be exactly the same as the uploaded creature, but I will be more the same than were you to revert me to a baby, by aging me back through perfected formulae that touches substrate not (for physics oft forgets time&#039;s arrow and there be no means to say the substrate is fairer kept forward than backward).  I would also prefer that a somewhat common upload persist than that, by becoming something wholly uncommon, I desist.  Corpses are all remarkably alike and closer in substrate to living men than babies are to any man, but I consider that substrate change of corpse, though but a minor jump, to be far greater than an upload hop.

Put in more coherent language, how do substrate-theorists deal with reverse ageing, which preserves a substrate?  When they digress (their mind too) into babyism, then are they really more preserved than had they been uploaded?

Thus asks Panda.]]></description>
		<content:encoded><![CDATA[<p>&#8220;Thus conscience does make cowards of us all.&#8221; (Hamlet)</p>
<p>Identity is not an either/or.  I am much less the baby I once was than the teenager I once was.  Commonality is a matter of degree.  I will not be exactly the same as the uploaded creature, but I will be more the same than were you to revert me to a baby, by aging me back through perfected formulae that touches substrate not (for physics oft forgets time&#8217;s arrow and there be no means to say the substrate is fairer kept forward than backward).  I would also prefer that a somewhat common upload persist than that, by becoming something wholly uncommon, I desist.  Corpses are all remarkably alike and closer in substrate to living men than babies are to any man, but I consider that substrate change of corpse, though but a minor jump, to be far greater than an upload hop.</p>
<p>Put in more coherent language, how do substrate-theorists deal with reverse ageing, which preserves a substrate?  When they digress (their mind too) into babyism, then are they really more preserved than had they been uploaded?</p>
<p>Thus asks Panda.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Phil Bowermaster</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5169</link>
		<dc:creator>Phil Bowermaster</dc:creator>
		<pubDate>Mon, 29 Mar 2010 21:50:34 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5169</guid>
		<description><![CDATA[Will --

&lt;em&gt;I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant proceedure?&lt;/em&gt;

Objective measurement is beside the point. Subjective experience will suffice.

&lt;em&gt;Either there will be a transitory - but detectable, perhaps disruptive to the point of identity discontinuity - interruption in personal awareness...&lt;/em&gt;

...in which case, in my view, the procedure has not worked.

&lt;em&gt;...or there will need be some also transitory period of multiple consciousness locus...&lt;/em&gt; 

...which is okay with me. Actually we&#039;re talking about a single subjective experience of consciousness centered in two places simultaneously. John Smart and others have described how this could happen. 

&lt;em&gt;...(with all the personality disturbance you&#039;ve imputed to that along with it).&lt;/em&gt;

You&#039;re confusing me with Shrinkwrapped. For me continuity of substrate can include a continuous process from one substrate to another. In fact, I think by my definition it has to include this.]]></description>
		<content:encoded><![CDATA[<p>Will &#8211;</p>
<p><em>I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant proceedure?</em></p>
<p>Objective measurement is beside the point. Subjective experience will suffice.</p>
<p><em>Either there will be a transitory &#8211; but detectable, perhaps disruptive to the point of identity discontinuity &#8211; interruption in personal awareness&#8230;</em> </p>
<p>&#8230;in which case, in my view, the procedure has not worked.</p>
<p><em>&#8230;or there will need be some also transitory period of multiple consciousness locus&#8230;</em> </p>
<p>&#8230;which is okay with me. Actually we&#8217;re talking about a single subjective experience of consciousness centered in two places simultaneously. John Smart and others have described how this could happen. </p>
<p><em>&#8230;(with all the personality disturbance you&#8217;ve imputed to that along with it).</em></p>
<p>You&#8217;re confusing me with Shrinkwrapped. For me continuity of substrate can include a continuous process from one substrate to another. In fact, I think by my definition it has to include this.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: not C.J. Burch</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5168</link>
		<dc:creator>not C.J. Burch</dc:creator>
		<pubDate>Mon, 29 Mar 2010 19:35:38 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5168</guid>
		<description><![CDATA[I had the privilege of watching the funeral of a dear departed who was about to be cryogenically preserved. Unlike any ordinary, tearful funeral, I was amazed at how upbeat and fun the ceremony was. 

Everyone gathered around a piano and sang,

Freeze a jolly good fellow!
Freeze a jolly good fellow!]]></description>
		<content:encoded><![CDATA[<p>I had the privilege of watching the funeral of a dear departed who was about to be cryogenically preserved. Unlike any ordinary, tearful funeral, I was amazed at how upbeat and fun the ceremony was. </p>
<p>Everyone gathered around a piano and sang,</p>
<p>Freeze a jolly good fellow!<br />
Freeze a jolly good fellow!</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Will Brown</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5167</link>
		<dc:creator>Will Brown</dc:creator>
		<pubDate>Mon, 29 Mar 2010 17:48:13 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5167</guid>
		<description><![CDATA[&lt;i&gt;Moving my consciousness is the one and only change that I have to be awake for. It&#039;s a consciousness transplant, the one surgery they can&#039;t put you under for! ;-)&lt;/i&gt;

I&#039;m not sure I&#039;m quite ready to stipulate the latter (though it &lt;i&gt;is&lt;/i&gt; a nifty turn of phrase :)) and I think we&#039;re back to personal preference regarding the former (in which I concur with your general position as it happens).

My reply to your question - and not at all tongue in cheek - is that your subjective experience is to large degree a matter of your individual belief, and I &lt;i&gt;can&lt;/i&gt; influence that via means both direct and indirect.  

In part we&#039;re back to implanted memory again, but also into the contextual nature of human experience generally.  I&#039;m nobody&#039;s idea of an expert, but my understanding is that actual experience isn&#039;t required to achieve belief in the &quot;reality&quot; of an artificial experience.  I&#039;m not talking faith in the ordinary religious context, but the full-on physiological response that might be expected as a result of some occurrence &lt;i&gt;that did not physically occur to the responding individual&lt;/i&gt; (which I think goes some way toward explaining the rareness of stigmata displays among even the deeply religious faithful for example - belief/faith isn&#039;t enough absent some individuating trigger event).  If I can make you believe you experienced self-aware consciousness during physical transplant of your mind to another substrate, then you did irrespective of the actual circumstance.  And no argument to the contrary will convince you otherwise &lt;i&gt;because you remember&lt;/i&gt;!

I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant procedure?  Either there will be a transitory - but detectable, perhaps disruptive to the point of identity discontinuity - interruption in personal awareness or there will need be some also transitory period of multiple consciousness locus (with all the personality disturbance you&#039;ve imputed to that along with it).  I don&#039;t think the proposition allows for anything else.

What think you?]]></description>
		<content:encoded><![CDATA[<p><i>Moving my consciousness is the one and only change that I have to be awake for. It&#8217;s a consciousness transplant, the one surgery they can&#8217;t put you under for! <img src='https://blog.speculist.com/wp-includes/images/smilies/icon_wink.gif' alt=';-)' class='wp-smiley' /> </i></p>
<p>I&#8217;m not sure I&#8217;m quite ready to stipulate the latter (though it <i>is</i> a nifty turn of phrase <img src='https://blog.speculist.com/wp-includes/images/smilies/icon_smile.gif' alt=':)' class='wp-smiley' /> ) and I think we&#8217;re back to personal preference regarding the former (in which I concur with your general position as it happens).</p>
<p>My reply to your question &#8211; and not at all tongue in cheek &#8211; is that your subjective experience is to large degree a matter of your individual belief, and I <i>can</i> influence that via means both direct and indirect.  </p>
<p>In part we&#8217;re back to implanted memory again, but also into the contextual nature of human experience generally.  I&#8217;m nobody&#8217;s idea of an expert, but my understanding is that actual experience isn&#8217;t required to achieve belief in the &#8220;reality&#8221; of an artificial experience.  I&#8217;m not talking faith in the ordinary religious context, but the full-on physiological response that might be expected as a result of some occurrence <i>that did not physically occur to the responding individual</i> (which I think goes some way toward explaining the rareness of stigmata displays among even the deeply religious faithful for example &#8211; belief/faith isn&#8217;t enough absent some individuating trigger event).  If I can make you believe you experienced self-aware consciousness during physical transplant of your mind to another substrate, then you did irrespective of the actual circumstance.  And no argument to the contrary will convince you otherwise <i>because you remember</i>!</p>
<p>I would counter your question by asking how does one objectively measure consciousness during a consciousness transplant procedure?  Either there will be a transitory &#8211; but detectable, perhaps disruptive to the point of identity discontinuity &#8211; interruption in personal awareness or there will need be some also transitory period of multiple consciousness locus (with all the personality disturbance you&#8217;ve imputed to that along with it).  I don&#8217;t think the proposition allows for anything else.</p>
<p>What think you?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Phil Bowermaster</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5166</link>
		<dc:creator>Phil Bowermaster</dc:creator>
		<pubDate>Mon, 29 Mar 2010 17:04:33 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5166</guid>
		<description><![CDATA[DC --

Ah, but the thing to remember is that the copy is me! If done correctly, the copy is as valid a future state of Phil Bowermaster as the original. From his standpoint, and I&#039;m speaking as a fairly credible estimator of how he will view things, any contract signed by &lt;b&gt;me&lt;/b&gt; is binding on &lt;b&gt;him.&lt;/b&gt; :-) 

Will --

It is not about consciousness or the loss thereof per se; it is about the locus of consciousness. It is about the &quot;where&quot; of the subjective experience of being me. While I&#039;m unconscious, they can do anything they want to me -- change my hair color, remove body fat, completely replace my body -- it&#039;s all good if I wake up with the same brain. But moving the data that defines the subjective experience of being me does not actually move that subjective experience -- it simply creates the potential for a duplicate subjective experience. Moving my consciousness is the one and only change that I have to be awake for. It&#039;s a consciousness transplant, the one surgery they can&#039;t put you under for! ;-)

 I&#039;ll put the question to you this way -- how do you move the locus of subjective experience by any means &lt;b&gt;other&lt;/b&gt; than via subjective experience?]]></description>
		<content:encoded><![CDATA[<p>DC &#8211;</p>
<p>Ah, but the thing to remember is that the copy is me! If done correctly, the copy is as valid a future state of Phil Bowermaster as the original. From his standpoint, and I&#8217;m speaking as a fairly credible estimator of how he will view things, any contract signed by <b>me</b> is binding on <b>him.</b> <img src='https://blog.speculist.com/wp-includes/images/smilies/icon_smile.gif' alt=':-)' class='wp-smiley' />  </p>
<p>Will &#8211;</p>
<p>It is not about consciousness or the loss thereof per se; it is about the locus of consciousness. It is about the &#8220;where&#8221; of the subjective experience of being me. While I&#8217;m unconscious, they can do anything they want to me &#8212; change my hair color, remove body fat, completely replace my body &#8212; it&#8217;s all good if I wake up with the same brain. But moving the data that defines the subjective experience of being me does not actually move that subjective experience &#8212; it simply creates the potential for a duplicate subjective experience. Moving my consciousness is the one and only change that I have to be awake for. It&#8217;s a consciousness transplant, the one surgery they can&#8217;t put you under for! <img src='https://blog.speculist.com/wp-includes/images/smilies/icon_wink.gif' alt=';-)' class='wp-smiley' /> </p>
<p> I&#8217;ll put the question to you this way &#8212; how do you move the locus of subjective experience by any means <b>other</b> than via subjective experience?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Will Brown</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5165</link>
		<dc:creator>Will Brown</dc:creator>
		<pubDate>Mon, 29 Mar 2010 16:27:05 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5165</guid>
		<description><![CDATA[Phil/RationalAnarchist

I&#039;m curious, do you carry this same attitude over to other activities that involve deliberate loss of consciousness?  Would you equally insist upon retaining conscious awareness during major invasive surgery, say?

The reason I ask is that there exists a measurable degree of trust involved in either example (surgery or plastination upload).  Trust that the procedure you expect to undergo will in fact occur as planned and trust that the individual you contract to perform the process will comply with your stated objectives (with the usual caveats).  Whether or not you consciously experience the event, you aren&#039;t likely to retain any capability to interrupt proceedings in either circumstance.  Since mid-procedure interruption isn&#039;t really the objective, what&#039;s the point?

Are you the &quot;same&quot; person, the same &quot;I&quot; as it were, after undergoing liposuction and a facelift?  Are you still the same person if the fat cells removed during that procedure are used in a follow-up stem cell treatment to destroy cancer cells in your blood?  If the post-plastination &quot;body&quot; you awake in is as you anticipated it would be (robotic, clone, etc) prior to the procedure, are you not still you however many of you there might be at any given moment?

Anyway, given the well documented phenomenon of implanted memory, how would you &quot;really&quot; know that the procedure you &quot;remember&quot; was the one that actually took place?]]></description>
		<content:encoded><![CDATA[<p>Phil/RationalAnarchist</p>
<p>I&#8217;m curious, do you carry this same attitude over to other activities that involve deliberate loss of consciousness?  Would you equally insist upon retaining conscious awareness during major invasive surgery, say?</p>
<p>The reason I ask is that there exists a measurable degree of trust involved in either example (surgery or plastination upload).  Trust that the procedure you expect to undergo will in fact occur as planned and trust that the individual you contract to perform the process will comply with your stated objectives (with the usual caveats).  Whether or not you consciously experience the event, you aren&#8217;t likely to retain any capability to interrupt proceedings in either circumstance.  Since mid-procedure interruption isn&#8217;t really the objective, what&#8217;s the point?</p>
<p>Are you the &#8220;same&#8221; person, the same &#8220;I&#8221; as it were, after undergoing liposuction and a facelift?  Are you still the same person if the fat cells removed during that procedure are used in a follow-up stem cell treatment to destroy cancer cells in your blood?  If the post-plastination &#8220;body&#8221; you awake in is as you anticipated it would be (robotic, clone, etc) prior to the procedure, are you not still you however many of you there might be at any given moment?</p>
<p>Anyway, given the well documented phenomenon of implanted memory, how would you &#8220;really&#8221; know that the procedure you &#8220;remember&#8221; was the one that actually took place?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RationalAnarchist</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5164</link>
		<dc:creator>RationalAnarchist</dc:creator>
		<pubDate>Mon, 29 Mar 2010 15:11:08 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5164</guid>
		<description><![CDATA[Thank you Phil for bringing up this aspect of uploading. I have sat and thought about it on multiple occasions and I always come away with existential angst. I want to continuously experience my upgrade. I agree that whatever person wakes up after a (non-)destructive upload feels a continuation of consciousness, but I cannot convince myself that &quot;I&quot;, whatever that is, would wake up in the new body/substrate.]]></description>
		<content:encoded><![CDATA[<p>Thank you Phil for bringing up this aspect of uploading. I have sat and thought about it on multiple occasions and I always come away with existential angst. I want to continuously experience my upgrade. I agree that whatever person wakes up after a (non-)destructive upload feels a continuation of consciousness, but I cannot convince myself that &#8220;I&#8221;, whatever that is, would wake up in the new body/substrate.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: LoboSolo</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5163</link>
		<dc:creator>LoboSolo</dc:creator>
		<pubDate>Mon, 29 Mar 2010 14:29:43 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5163</guid>
		<description><![CDATA[Or ... future generations may dig up your mummified head and put it on display ...]]></description>
		<content:encoded><![CDATA[<p>Or &#8230; future generations may dig up your mummified head and put it on display &#8230;</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: DWPittelli</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5162</link>
		<dc:creator>DWPittelli</dc:creator>
		<pubDate>Mon, 29 Mar 2010 13:32:50 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5162</guid>
		<description><![CDATA[What if I am about to nondestructively make a copy of you. You say the copy is equal to the original of you. OK, after I make a copy of you, I am going to shoot either you or your copy. Wouldn&#039;t you prefer that I shoot the copy? Doesn&#039;t that mean that the copy is not the same person as yourself? Why does making the copy destructively (shooting you during the process, as it were) change this?

What if you make (or can make) two copies? They are no longer each other (each would prefer that the other be shot) but they are now both you?]]></description>
		<content:encoded><![CDATA[<p>What if I am about to nondestructively make a copy of you. You say the copy is equal to the original of you. OK, after I make a copy of you, I am going to shoot either you or your copy. Wouldn&#8217;t you prefer that I shoot the copy? Doesn&#8217;t that mean that the copy is not the same person as yourself? Why does making the copy destructively (shooting you during the process, as it were) change this?</p>
<p>What if you make (or can make) two copies? They are no longer each other (each would prefer that the other be shot) but they are now both you?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Number Six</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5161</link>
		<dc:creator>Number Six</dc:creator>
		<pubDate>Mon, 29 Mar 2010 12:38:38 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5161</guid>
		<description><![CDATA[I&#039;d love to see how Philip K. Dick would interpret this.  Am I a man?  Am I human enough? Or am I just a plastic android with a plastic brain that remembers being a man?  That takes alienation to a whole new level. 

If you don&#039;t know, he was a writer who had some experience with paranoid schizophrenia and some trouble distinguishing the real from the imaginary.]]></description>
		<content:encoded><![CDATA[<p>I&#8217;d love to see how Philip K. Dick would interpret this.  Am I a man?  Am I human enough? Or am I just a plastic android with a plastic brain that remembers being a man?  That takes alienation to a whole new level. </p>
<p>If you don&#8217;t know, he was a writer who had some experience with paranoid schizophrenia and some trouble distinguishing the real from the imaginary.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: apetra</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5160</link>
		<dc:creator>apetra</dc:creator>
		<pubDate>Mon, 29 Mar 2010 12:12:51 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5160</guid>
		<description><![CDATA[Yes, they WILL tell tales -- without conscious consent -- and they WILL be JUDGED.

Are you worthy of revival?]]></description>
		<content:encoded><![CDATA[<p>Yes, they WILL tell tales &#8212; without conscious consent &#8212; and they WILL be JUDGED.</p>
<p>Are you worthy of revival?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Mike Anderson</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5159</link>
		<dc:creator>Mike Anderson</dc:creator>
		<pubDate>Mon, 29 Mar 2010 11:17:47 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5159</guid>
		<description><![CDATA[Long before anyone gets resurrected, dead men WILL tell tales.]]></description>
		<content:encoded><![CDATA[<p>Long before anyone gets resurrected, dead men WILL tell tales.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Phil Bowermaster</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5158</link>
		<dc:creator>Phil Bowermaster</dc:creator>
		<pubDate>Mon, 29 Mar 2010 08:26:47 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5158</guid>
		<description><![CDATA[Will --

I&#039;m not particularly concerned about psychological trauma (or at least wasn&#039;t until Shrinkwrapped brought it up!) My concern is a little more fundamental than that.

Here&#039;s an analogy. 

Tonight, while you&#039;re sleeping, some highly advanced aliens perform a thoroughly non-intrusive scan of your brain and use it to power a more or less perfect replica of your body that they have constructed. When you wake up on their ship, they don&#039;t mention the technique they used to bring you there, so you assume that you&#039;ve been abducted in your sleep only to wake up on an alien spacecraft.

From the standpoint of Replica-Will, you are (he is) without question Will Brown. But is that you - the guy reading this message -- up there in the spaceship? No, THAT guy wakes up in his bed and starts thinking about breakfast.

The copy is discontinuous from the original. We know this because the original has no awareness of what&#039;s happening to the copy and goes on and has his own experiences. A non-destructive scan is never going to get you, the guy left behind, into the spaceship. And yet, the argument is that if a destructive scan is done (or if you were to die in your sleep before waking the next morning) the copy waking up on the ship is pretty much the same thing as you waking up in your bed the next morning (assuming the aliens never came along and no copy was ever made).

How does destruction of the original change anything? The copy brought back from the digitized and destroyed plastic brain has every reason to believe he&#039;s me -- but in an important sense, I am still in those scraps of discarded plastic, just as the original Will Brown is still in his bed the following morning.]]></description>
		<content:encoded><![CDATA[<p>Will &#8211;</p>
<p>I&#8217;m not particularly concerned about psychological trauma (or at least wasn&#8217;t until Shrinkwrapped brought it up!) My concern is a little more fundamental than that.</p>
<p>Here&#8217;s an analogy. </p>
<p>Tonight, while you&#8217;re sleeping, some highly advanced aliens perform a thoroughly non-intrusive scan of your brain and use it to power a more or less perfect replica of your body that they have constructed. When you wake up on their ship, they don&#8217;t mention the technique they used to bring you there, so you assume that you&#8217;ve been abducted in your sleep only to wake up on an alien spacecraft. </p>
<p>From the standpoint of Replica-Will, you are (he is) without question Will Brown. But is that you &#8211; the guy reading this message &#8212; up there in the spaceship? No, THAT guy wakes up in his bed and starts thinking about breakfast.</p>
<p>The copy is discontinuous from the original. We know this because the original has no awareness of what&#8217;s happening to the copy and goes on and has his own experiences. A non-destructive scan is never going to get you, the guy left behind, into the spaceship. And yet, the argument is that if a destructive scan is done (or if you were to die in your sleep before waking the next morning) the copy waking up on the ship is pretty much the same thing as you waking up in your bed the next morning (assuming the aliens never came along and no copy was ever made).</p>
<p>How does destruction of the original change anything? The copy brought back from the digitized and destroyed plastic brain has every reason to believe he&#8217;s me &#8212; but in an important sense, I am still in those scraps of discarded plastic, just as the original Will Brown is still in his bed the following morning.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: DCWhatthe</title>
		<link>https://blog.speculist.com/life_extension/longer-living-t.html#comment-5157</link>
		<dc:creator>DCWhatthe</dc:creator>
		<pubDate>Mon, 29 Mar 2010 07:32:54 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=2072#comment-5157</guid>
		<description><![CDATA[&#039;Hanging in there&#039;, until the technology makes it possible to revive the original - 

A variation on this, would be to let some sort of replication take place, as long as the process was non-destructive.  

Let the uber-Phil be born, but leave him instructions (not demands, because uber-Phil will not have signed a contract) to figure out for himself whether he is the original Phil or not.  Since by that time, the uber-Phil will have access to better AI and brain enhancements, he will be better able to decide whether Phil and uber-Phil are the same person.

If uber-Phil decides he really IS a different individual, then in the instructions ask him to revive the real Phil, using the advanced technology now available, but in a way that  DOES preserve the original identity.

The argument leans, of course, on the assumption that the uber-Phil will be compassionate and honorable enough to respect the wishes of the Phil in the original substrate.  It also assumes that our uber-selves will be much smarter than we are, and better able to resolve philosophical questions of identity.]]></description>
		<content:encoded><![CDATA[<p>&#8216;Hanging in there&#8217;, until the technology makes it possible to revive the original &#8211; </p>
<p>A variation on this, would be to let some sort of replication take place, as long as the process was non-destructive.  </p>
<p>Let the uber-Phil be born, but leave him instructions (not demands, because uber-Phil will not have signed a contract) to figure out for himself whether he is the original Phil or not.  Since by that time, the uber-Phil will have access to better AI and brain enhancements, he will be better able to decide whether Phil and uber-Phil are the same person.</p>
<p>If uber-Phil decides he really IS a different individual, then in the instructions ask him to revive the real Phil, using the advanced technology now available, but in a way that  DOES preserve the original identity.</p>
<p>The argument leans, of course, on the assumption that the uber-Phil will be compassionate and honorable enough to respect the wishes of the Phil in the original substrate.  It also assumes that our uber-selves will be much smarter than we are, and better able to resolve philosophical questions of identity.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
