Discussion about this post

Shortest Circuit

There was a guy on the WSJ podcast a few weeks ago who had lost his mother unexpectedly. He was also an early AI adopter, so to help himself write a eulogy he fed a selection of her text messages to a chatbot, then asked "her" how she felt now that she was dead. The LLM came up with a response that, according to the guy, was 'exactly' what his mother would've said. He became very emotional. Immediately afterward he felt disgust. He didn't say why; maybe he remembered the part of OpenAI's EULA where they claim rights to all data you feed into their systems, meaning he had effectively handed over his dead mother to a soulless corporation. Or maybe it was the realization that no interaction with the thanobot could replace the time he should've spent with his mother. I lost mine before all of this AI nonsense, so I had to formulate my own thoughts about her death and this new situation her family has to get used to.

Dale R

“As you’d expect, real world grief counselors are sounding the alarm about losing their jobs.”

I suspect the concern is more that people won't "move on" from their grief, and will then develop complicated grief. ... Though if they are seeing a counselor ...

I think the biggest danger with these thanobots is the manipulation of people's feelings and actions.
