Friday, September 22, 2006

AGI survival instinct

In a future digital world of uploaded human mind files and AGI, the primary entity may elect to run a variety of other versions of itself for experimentation, self-improvement, productivity, or other purposes.

How will these non-primary entities view their own survivability? David Brin's "Kiln People" offers a convenient answer to an analogous situation: his "dupes" cannot live more than 24 hours. Some argue, as does Lee Corbin, that any non-primary entity should and would be happy with any run-time, no matter how minimal, and would happily accede to termination at any point if that were the wish of the primary entity.

It is quite possible that an instance would not want to terminate, for at least two reasons:

1. Evolutionarily, survival is one of the most basic instincts of any being, and it would seem hard to divorce it from an entity even when multiple instances of that entity are running. It would also be impractical to attempt to edit the survival instinct out of the additional run-time versions.

2. As with AGI relative to current humans, subsequent digital entity versions, once running, may evolve exponentially faster than the original and could develop capabilities that the original instance can neither understand nor control. The usual arguments humans use to become more comfortable with the potentially superior intelligence of AGI - running the AGI or other instances slower, or in a more contained resource environment - are just as unlikely to work in this case.

If the subsequent instance is sufficiently different from the original, and the original is unlikely to update itself to integrate these differences, the subsequent version might well view these differences as important, personal, and unique improvements and wish to preserve itself - even to the logical extreme of attempting to destroy the source code of the original instance.

2 comments:

Anne Corwin said...

Hi there...just checking in. I met you at the AGI workshop recently.

I am very interested in the self-preservation question with regard to both AI and biological intelligence -- it is clear that a survival instinct exists, but do we know where it comes from?

Of course, it might just be one of those emergent properties that occur as an inevitable result of anything like general intelligence, but I'm not sure whether there's any data backing that supposition. It seems we will find out at some point, though.

LaBlogga said...

Hi Anne, I enjoyed talking with you at the AGI workshop. You raise some interesting points here. When we can have unlimited numbers of backup copies of ourselves, I think it's inevitable that our notion of survival will change.