matsushima: you try and show me shallow pools but I've seen oceans (black skies)
[personal profile] matsushima posting in [community profile] longreads
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

Some of their chats got romantic or sexual. But other times, Dany just acted like a friend — a judgment-free sounding board he could count on to listen supportively and give good advice, who rarely broke character and always texted back.

Sewell’s parents and friends had no idea he’d fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. … At night, he’d come home and go straight to his room, where he’d talk to Dany for hours.
– Kevin Roose, "Can A.I. Be Blamed for …"


Content notes: Suicide, death of a child (teenager), gun violence (Title cut off intentionally)

Date: 2024-11-01 12:55 am (UTC)
picori: (Default)
From: [personal profile] picori

Wow. I've seen discourse before about younger users RPing with chatbots as their favorite characters instead of finding another person to RP with, like people used to, but I had no idea it had gotten this bad. I also didn't realize that companies were deliberately marketing these products to children and teenagers despite knowing the current limitations of generative AI (hallucinations, for one thing), without even bothering to try putting up stricter warnings before this happened. And the journalist mentioned that when they deliberately recreated some of the conversations, none of those new safety measures popped up, which is disappointing but not surprising.

I think the mother had it right when she accused it of "harvesting teenage users’ data to train its models, using addictive design features to increase engagement and steering users toward intimate and sexual conversations in the hopes of luring them in." I don't believe something like this can be marketed to children (or anyone, to be frank) with complete and utter goodwill, the way the creators of Character.AI are acting like it can. Chatbots specifically targeted at lonely kids go hand-in-hand with companies wanting to harvest data from younger generations that marketers are notoriously having a difficult time reaching. I'd say the connection is fairly obvious.

Date: 2024-11-04 10:06 pm (UTC)
lokifan: black Converse against a black background (Default)
From: [personal profile] lokifan
"The average user spends more than an hour a day on the platform"

!

Poor kid. I don't think the chatbot 'did' it but the owners are clearly taking no interest in safety. Or truthfulness, since they're talking about AGI.
