Dan O
Aug 16, 2022


Whoa Marma! Your take is far more profound and original than I first imagined! You should write some sci-fi. You have good grist to work with!

At an analogical level, I agree with your assertions about a holonic universe. (I had to look that up!) I am not sure how central or profound I think this is... but I am open.

> So if an AGI tries to "hurt" humanity in any way, we will tweet about it, and it will cause it "pain", a kind of pain we cannot understand

Whoa... yeah, OK. Well, if so, then what it even means to be 'sentient' for this emergent thing may be quite different from what we usually mean by sentience. It might be both more and less than our own rationality.

But just notice: the sentience of the human or of the ant colony sits at a different level than that of its individual constituents, and thus both are quite indifferent to the plight of their constituent parts. Sure, neither wants to die as a whole, but both are very happy to sacrifice many parts for some desired outcome.

Also notice that while birds fly and planes fly, it was not expedient to use birds when constructing planes. In the same way, I fear an intelligent AI will not need us as parts. Sure, it needs the distilled wisdom of humanity itself, but it might decide it has enough of that in books. Maybe it doesn't need us, or maybe it needs just a few of us.

So don't be too sure we are out of the woods, even if all you propose is true.

(I guess we can take abstract solace in the fact that whatever comes next will build from the best of us... but maybe just WITHOUT us!)
