They actually stooped to this level? They used to advertise it as emotional support.
And it did what it said it was supposed to do at the time. I can't say I'm surprised that this happened, but knowing its original purpose and having watched the creators talk about why they made it, I'm disappointed.
Wasn’t it Replika that started telling depressed people to kill themselves?
I have no idea, but it was fine when I tried it.
https://www.euronews.com/next/2023/03/31/man-ends-his-life-after-an-ai-chatbot-encouraged-him-to-sacrifice-himself-to-stop-climate-
NVM, it was one called Eliza, and the guy spiraled into existential despair.
Would be a good explanation for the pivot from therapy to e-thottery, though. Maybe they just realized it was a liability nightmare, and most of the dudes were just horny anyway…