<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: This seems like a big deal&#8230;</title>
	<atom:link href="https://blog.speculist.com/artificial_intelligence/this-seems-like.html/feed" rel="self" type="application/rss+xml" />
	<link>https://blog.speculist.com/artificial_intelligence/this-seems-like.html</link>
	<description>Live to see it.</description>
	<lastBuildDate>Thu, 16 Dec 2021 08:21:00 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.6.1</generator>
	<item>
		<title>By: Phil Bowermaster</title>
		<link>https://blog.speculist.com/artificial_intelligence/this-seems-like.html#comment-9709</link>
		<dc:creator>Phil Bowermaster</dc:creator>
		<pubDate>Mon, 22 Sep 2008 12:44:49 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=1682#comment-9709</guid>
		<description><![CDATA[Consider the deaf-blind:

&lt;a href=&quot;http://www.actionfund.org/ohsay/saysee18.htm&quot; rel=&quot;nofollow&quot;&gt;http://www.actionfund.org/ohsay/saysee18.htm&lt;/a&gt;

People who learn to communicate using language without any auditory or visual experience on which to base their understanding of words probably operate at a much higher level of abstraction than the rest of us. If such an individual has never touched or smelled a dog, all he or she will know about a dog is what somebody else says. Such a person&#039;s understanding of the idea of &quot;dog&quot; will be very different from mine, but perhaps not that different from a computer&#039;s -- which also must rely on abstract information to grasp the concept of &quot;dog.&quot; The major difference is that this individual&#039;s existing sensory experience will still inform some aspect of their concept of what a dog is, even if he or she has never touched or smelled one. The computer&#039;s &quot;idea&quot; will be entirely abstract, at least until we have computers that process their own sensory information.

In any case, our understanding of concepts has a lot to do with what we sense and a lot to do with how we reason abstractly. If you take most of the sensory input away, I think we would agree that we still have real use of and understanding of language. The question is how much meaning can exist if all sensory input is removed from the equation?

I think there can still be meaning and understanding even at a purely abstract level. It would be a different kind of thinking than we do, but that doesn&#039;t mean that it isn&#039;t thinking.]]></description>
		<content:encoded><![CDATA[<p>Consider the deaf-blind:</p>
<p><a href="http://www.actionfund.org/ohsay/saysee18.htm" rel="nofollow">http://www.actionfund.org/ohsay/saysee18.htm</a></p>
<p>People who learn to communicate using language without any auditory or visual experience on which to base their understanding of words probably operate at a much higher level of abstraction than the rest of us. If such an individual has never touched or smelled a dog, all he or she will know about a dog is what somebody else says. Such a person&#8217;s understanding of the idea of &#8220;dog&#8221; will be very different from mine, but perhaps not that different from a computer&#8217;s &#8212; which also must rely on abstract information to grasp the concept of &#8220;dog.&#8221; The major difference is that this individual&#8217;s existing sensory experience will still inform some aspect of their concept of what a dog is, even if he or she has never touched or smelled one. The computer&#8217;s &#8220;idea&#8221; will be entirely abstract, at least until we have computers that process their own sensory information.</p>
<p>In any case, our understanding of concepts has a lot to do with what we sense and a lot to do with how we reason abstractly. If you take most of the sensory input away, I think we would agree that we still have real use of and understanding of language. The question is how much meaning can exist if all sensory input is removed from the equation?</p>
<p>I think there can still be meaning and understanding even at a purely abstract level. It would be a different kind of thinking than we do, but that doesn&#8217;t mean that it isn&#8217;t thinking.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Aaron</title>
		<link>https://blog.speculist.com/artificial_intelligence/this-seems-like.html#comment-9708</link>
		<dc:creator>Aaron</dc:creator>
		<pubDate>Sun, 21 Sep 2008 11:22:14 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=1682#comment-9708</guid>
		<description><![CDATA[I agree. The association of words with the actual fundamental concepts behind them requires visualizing those concepts and their behavior, which in turn requires grounding in experience of the physical world.

Without that, it&#039;s hard to see how anything like what we think of as intelligence or consciousness could emerge.]]></description>
		<content:encoded><![CDATA[<p>I agree. The association of words with the actual fundamental concepts behind them requires visualizing those concepts and their behavior, which in turn requires grounding in experience of the physical world.</p>
<p>Without that, it&#8217;s hard to see how anything like what we think of as intelligence or consciousness could emerge.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Derek James</title>
		<link>https://blog.speculist.com/artificial_intelligence/this-seems-like.html#comment-9707</link>
		<dc:creator>Derek James</dc:creator>
		<pubDate>Thu, 18 Sep 2008 08:14:20 +0000</pubDate>
		<guid isPermaLink="false">http://localhost/specblog/?p=1682#comment-9707</guid>
		<description><![CDATA[If you think semantics is all about the statistical correlations between words, then yeah, we&#039;re close to AI.

Only semantics is a lot more than that. Our concept of &quot;dog&quot; doesn&#039;t rely merely on the correlation of that word with other words. We&#039;ve got mountains of sensory experience associated with the concept. Without that link, I don&#039;t think I&#039;d be alone in arguing that a semantic map such as this doesn&#039;t understand what a dog is...not even close. It knows a great deal about how the token &quot;dog&quot; interrelates to other tokens, but that is far, far, far from understanding or intelligence.]]></description>
		<content:encoded><![CDATA[<p>If you think semantics is all about the statistical correlations between words, then yeah, we&#8217;re close to AI.</p>
<p>Only semantics is a lot more than that. Our concept of &#8220;dog&#8221; doesn&#8217;t rely merely on the correlation of that word with other words. We&#8217;ve got mountains of sensory experience associated with the concept. Without that link, I don&#8217;t think I&#8217;d be alone in arguing that a semantic map such as this doesn&#8217;t understand what a dog is&#8230;not even close. It knows a great deal about how the token &#8220;dog&#8221; interrelates to other tokens, but that is far, far, far from understanding or intelligence.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
