Thanks for the link, Glenn, and welcome InstaPundit readers. For those who inquired about the subtitle and the “1.” below — yes, this is the beginning of a series. I’m hoping to have the second entry up sometime this week. So don’t be a stranger.
1. God as Model of the Good
Ray Kurzweil lays out some challenging ideas in The Singularity Is Near; perhaps none is more challenging than this passage, which concludes the chapter entitled “Ich Bin Ein Singularitarian”:
Evolution moves towards greater complexity, greater elegance, greater knowledge, greater intelligence, greater beauty, greater creativity, and greater levels of subtle attributes such as love. In every monotheistic tradition God is likewise described as all of these qualities, only without limitation: infinite knowledge, infinite intelligence, infinite beauty, infinite creativity, infinite love, and so on. Of course, even the accelerating growth of evolution never achieves an infinite level, but as it explodes exponentially it certainly moves rapidly in that direction. So evolution moves inexorably towards this conception of God, although never quite reaching this ideal. We can regard, therefore, the freeing of our thinking from the severe limitations of its biological form to be an essentially spiritual undertaking.
This raises some interesting questions about the relationship between God and the Singularity. Just to rattle off a few…
Does the Singularity bring us closer to God?
Does God show up at the Singularity?
Are we going to somehow create God?
Are we going to somehow become God?
These kinds of questions would have gotten me in a lot of trouble years ago at (Southern Baptist) church camp. Actually, the first two wouldn’t have, so long as everyone assumed that by “Singularity” I really meant “Rapture.” And, come to think of it, the latter two wouldn’t have gotten me into trouble so much as they would have worried people sick about the state of my soul, subjecting me to the kind of additional attention and counseling that every 14-year-old boy hopes to get on summer afternoons while everyone else is out swimming and playing softball.
To tell you the truth, even today I’m glad that my Mom rarely looks in on this site. I’m not sure that I would want her to know that I’m raising these kinds of questions. You want to talk about being in trouble…
Anyhow, before we get to the answers, let’s spend some time on why we would even be talking about God in relationship to the Singularity. For starters, there’s probably not a lot of overlap between theists and Singularitarians. Devout believers tend to view the Singularity as a kind of competing eschatology, while “devout” (doesn’t seem to be the right word, does it?) Singularitarians tend to be agnostics and atheists. There are exceptions, of course, but they are mostly outliers: scientifically minded folks who have room in their worldview for an amorphous, noncommittal “spirituality,” and fringe believers who are okay with making pretzels out of established doctrine (à la Tipler) in order to be able to affirm everything they want.
Those are perhaps needlessly nasty caricatures, but they get the point across. Very much to his credit, what Kurzweil seems to be presenting is a merger of both these positions, absent the cynicism and simplistic rationalizations.
A while back, during a between-session break at Accelerating Change 2005, I had the good fortune to have a chat with two prominent individuals, one a life-extension advocate, the other a thought leader on the subject of artificial intelligence. We were talking about the Singularity and the probability of a hard versus soft takeoff when suddenly we found ourselves on the topic of where this is all going in the long run. One of us dared to suggest that God might figure into the picture, pointing out parallels between the scenario we were examining and a story from the Bible. This was immediately dismissed by another as reliance on “fiction,” but the third participant suggested that the Bible story referenced should be viewed as myth, not in a pejorative sense, but as a potential source of wisdom and instruction irrespective of whether it describes something that happened historically.
This was an attempt, I believe, to establish some kind of common ground between believers and nonbelievers. And I think it’s similar to what Kurzweil does above by referring to God not as an entity but rather as a collection of characteristics. Some of the characteristics that Kurzweil mentions are things that we would normally associate with the idea of the Technological Singularity, namely:
complexity
elegance
knowledge
intelligence
…while the rest might seem a little out of place:
beauty
creativity
subtle attributes such as love
But then again, maybe not so out of place. If we add empathy and kindness as subheadings under the “subtle attributes,” what begins to emerge is something not unlike Friendly AI as defined by our friends at the Singularity Institute for Artificial Intelligence:
A “Friendly AI” is an AI that takes actions that are, on the whole, beneficial to humans and humanity; benevolent rather than malevolent; nice rather than hostile. The evil Hollywood AIs of The Matrix or Terminator are, correspondingly, “hostile” or “unFriendly”.
Friendly AI is the intelligence of the soft-takeoff Singularity, the version of the Singularity in which good things happen. It is distinguished from non-friendly AI, which we would encounter in the hard-takeoff Singularity — the version where superhuman intelligence emerges and immediately destroys us, either deliberately or inadvertently. There is a third option, what I call the “missed flight,” where the new intelligence emerges, wants nothing to do with us, and starts doing its own thing in such a way that neither hurts nor helps us.
Any of the three flavors of Singularity described above will involve a massive increase in the qualities named in the first list. But only a soft-takeoff, Friendly AI Singularity will involve an increase in the qualities named in the latter list, or at least that final item on it. Arguably, a highly creative intelligence could emerge with a strong aesthetic sense and still have no empathy for us whatsoever. But I believe that if we find a way to instill a notion of beauty into an artificial intelligence, that notion will depend upon an underlying concept of goodness, which — with any luck at all — we will help the new intelligence to extend into the ethical as well as aesthetic sphere of thought.
So there, I believe, is the common ground that believers and Singularitarians have in exploring the relationship between God and the Singularity. Both have a keen interest in goodness. In working to bring about an emergent superhuman intelligence, the Singularitarian can find in the idea of God (or at least in some of the more prominent ideas about God) a model, a template, an ideal. A believer might counter that to attempt to create God would be the worst kind of hubristic folly, and blasphemy to boot. We’ll look at these objections in greater detail later.
But no one is talking about creating God. A Christian mother who tries to instill Christ-like qualities in her children would not be accused of blasphemy, nor have I ever heard anyone ascribe hubristic folly to that book by Thomas à Kempis. And if anything were blasphemous, surely it would be the name “Christian” itself, meaning “little Christ.”
These emerging intelligences, our evolutionary heirs, will be either extensions of ourselves or our offspring. Either way, the effort to make them God-like — or do I ruffle fewer feathers on one side, while perhaps making folks on the other side uncomfortable, if I trade that term for “godly”? — seems like something we can all agree is a pretty good idea. Technologists will see this as responsible design, akin to the safety considerations that must enter into the introduction of any new machine. Believers will see it as a moral imperative. If the new intelligence is our offspring, the imperative is to raise the child with the right values. If it is a soulless machine, the imperative is to see to it that it is used for the best ends possible.
UPDATE: Here’s a follow-up to this entry in response to one of the reader comments. And now Frank Tipler himself has weighed in on the discussion.