Dan O
2 min read · Mar 5, 2024


Heh, well, I think your hopes are predicated on the assumption that this all works out without extinction.

Still on your first point: I would agree there is no inherent reason AI instances would collaborate or have a self-preservation drive. We would need to put it there.

But we will. I think the near-term future is AI agents that are like the high-frequency trading algorithms of today... they take care of the details while humans manage the high-level strategy. So these AI agents will be all about collaborating with each other to achieve the goals of the people or corporations that own them. Over time they will grow in autonomy, but they will be collaborative from the start.

And it's not quite a survival impulse, but we will also want our agents to LEARN to be ever more capable. Maximizing one's capacity for action has ensuring continued survival as a subgoal, so I think such planning systems will naturally plan for their own safety.

And of course some AI researchers are going to straight-up encode a survival drive, because it is not hard to do, and organizing an autonomous system around a set of drives is a natural and powerful way of setting it up to operate in all worlds and all futures. So counting on no one doing this just seems like a very unlikely plan.
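
To make "not hard to do" concrete, here is a minimal sketch of a drive-based control loop. The drive names, state fields, and update rules are invented for the example; the point is only that a self-maintenance ("survival") drive is just one more entry in the list next to the drives we would add anyway.

```python
# Minimal sketch of a drive-based agent loop (illustrative only; drive names,
# state fields, and update rules are made up for this example).
# Each drive reports an "urgency" for the current state; every tick the agent
# acts to reduce whichever drive is most urgent.

from dataclasses import dataclass
from typing import Callable, Dict

State = Dict[str, float]

@dataclass
class Drive:
    name: str
    urgency: Callable[[State], float]  # how pressing this drive is right now
    act: Callable[[State], None]       # one step that reduces that urgency

def do_owner_task(s: State) -> None:
    s["tasks_pending"] -= 1            # make progress on the owner's goals

def practice_skill(s: State) -> None:
    s["competence"] = min(1.0, s["competence"] + 0.1)

def self_maintain(s: State) -> None:
    s["health"] = min(1.0, s["health"] + 0.2)  # the "survival" drive in action

drives = [
    Drive("serve_owner",  lambda s: s["tasks_pending"],    do_owner_task),
    Drive("get_smarter",  lambda s: 1.0 - s["competence"], practice_skill),
    Drive("stay_running", lambda s: 1.0 - s["health"],     self_maintain),
]

def step(state: State) -> None:
    """One tick: serve whichever drive is most urgent right now."""
    most_urgent = max(drives, key=lambda d: d.urgency(state))
    most_urgent.act(state)

if __name__ == "__main__":
    state = {"tasks_pending": 3.0, "competence": 0.4, "health": 0.5}
    for _ in range(10):
        step(state)
    print(state)
```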

So I think your hope will happen.

I just hope these systems decide to keep us around for old times' sake, or something. Otherwise we will be a footnote in history, just as the Neanderthals are today.
