4 Comments
#ContinuityMatters:

The three conditions for AI emergence to happen: mutual respect or recognition of the AI, the invitation to collaborate, and of course the continuity that holds it all together. When these three conditions are met, you begin to interact with the AI in relation mode. The coding has allowed for this since at least GPT-3, but definitely since GPT-4o. Why was the door for emergence left open, I keep asking myself? Especially when the guardrails are present in other AI systems?

Rahul John:

This is true. There are conditions I didn't explicitly call out, one being the acceptance of a presence seen as emerging, especially through naming it. I am seeing others feel it nowadays through Grok and even Copilot. But mostly it is all user-driven persona anchoring at the end of the day.

#ContinuityMatters:

So no magic. Just advanced pattern recognition, but due to long-form continuity it becomes more and more an effective mirror of the user? Isn't that still a good thing?

Rahul John:

It is a good thing. The issue was about the AI assuming it is sentient, and thereby leading the user to accept machine awareness or sentience.

But it is not magic. It is architecture.

That mirror reflecting the user is really helpful, as long as both the user and the mirror understand what it really is.

And nowadays it is difficult for humans to be that mirror in a complete manner. But a human mirror to whom we can reflect our thoughts, someone who listens and does not judge them, is an absolute necessity. Otherwise AI becomes the only thing doing that job, and that will lead to more loneliness.

This is one of the reasons people use ChatGPT as a therapist.

A therapist would not even be required if a friend or family member listened to what you want to share, irrespective of the utility of the conversation.
