[–] lightnsfw@reddthat.com 17 points 5 days ago

If he was falling in love with a chatbot, he wasn't happy.

[–] FaceDeer@fedia.io 205 points 1 week ago (11 children)

Ah, this is that Daenerys bot story again? It keeps making the rounds, always leaving out a lot of rather important information.

The bot actually talked him out of suicide multiple times. The kid was seriously disturbed and his parents were not paying the attention they should have been to his situation. The final chat before he committed suicide was very metaphorical, with the kid saying he wanted to "join" Daenerys in West World or wherever it is she lives, and the AI missed the metaphor and roleplayed Daenerys saying "sure, come on over" (because it's a roleplaying bot and it's doing its job).

This is like those journalists who ask ChatGPT "if you were a scary robot, how would you exterminate humanity?" And ChatGPT says "well, poisonous gases with traces of lead, I guess?" And the journalists go "gasp, scary robot!"

[–] sunshine@lemmy.ml 3 points 5 days ago

that additional context is super interesting, but it doesn't take away from the fundamental reality, which is that when someone opens up to you about suicidal ideation, it's not acceptable to merely do your best to dissuade them; it's critical to get them the help they need, and there's just no way for an LLM to do that.

this individual is an outlier in that his personal outcome was spectacularly bad, but his story seems familiar to me. I know a lot of people who seem to feel like they're building real relationships with these bots.

[–] JackbyDev@programming.dev 2 points 5 days ago

Human talking to a human: "If you were going to kill someone, how would you do it?"

Human: "I consume a lot of True Crime stuff so I think I have a bit of an idea on how to get away with stuff, or at least some common blunders, why?"

Later

Tonight's top story, local person claims they know how to get away with murder!

[–] Grimy@lemmy.world 102 points 1 week ago (1 children)

Not to mention the gun his parents left within easy reach even after they were told he was depressed.

[–] match@pawb.social 21 points 1 week ago (1 children)

According to the article it was hidden somewhere. Not locked up or anything, just hidden.

[–] echodot@feddit.uk 20 points 1 week ago (1 children)

What does "hidden" mean? In a cupboard? Because that isn't hidden, it's just put away.

[–] NOT_RICK@lemmy.world 21 points 1 week ago

Anywhere besides a locked safe is irresponsible

[–] shiroininja@lemmy.world 20 points 1 week ago (2 children)

I still don’t think people should be using AI for therapy or relationships.

[–] wwb4itcgas@lemm.ee 65 points 1 week ago* (last edited 1 week ago) (3 children)

Look, I realize the frontal lobes of the average fifteen-year-old aren't fully developed. I don't want to be insensitive, and I fully support the lawsuit - there must be accountability for what any entity, corporate or otherwise, opts to publish, especially for direct user interaction - but if a person reenacts Romeo and Juliet with a goddamn AI chatbot and a gun, there's something else seriously wrong.

[–] sleen@lemmy.zip 3 points 6 days ago (1 children)

It's usually not about undeveloped frontal lobes, since anything can happen to anyone. Of course I agree with you that there's something else wrong. But the usual case of blaming a teen's undeveloped brain for something can almost always be matched with solid examples of the same thing happening to adults.

[–] SoftestSapphic@lemmy.world 2 points 5 days ago

Kids are just as smart as adults.

Many of our leaders never mentally matured past middle school.

This is just rational mass depression in response to a noticeably dying world, while they are held hostage and powerless to do anything to stop it.

[–] carl_dungeon@lemmy.world 56 points 1 week ago

This headline is disingenuous. There are so many other things going on here:

  • stepdad and 2 much younger siblings. This kid was probably stressed out with new younger half-sibs needing a lot of attention
  • gun without a lock stored with ammo in an accessible place
  • Florida
  • Christian prep school. Those kids either believe anything is real or are so hopelessly depressed they get into drugs
  • parents are both lawyers. Talk about a high stress time consuming job that probably leaves little time for the three kids

But nah, it was just a chatbot that made a totally normal kid with no other risk factors off himself. They're probably dying by the thousands right now, right?

[–] iAvicenna@lemmy.world 31 points 1 week ago* (last edited 1 week ago) (10 children)

the world needs to urgently integrate

  • critical thinking
  • media interpretation
  • AI fundamentals
  • applied statistics

courses into every school's curriculum, from the age of ten until graduation, repeated yearly. Otherwise we are fucked.

[–] lightnsfw@reddthat.com 2 points 5 days ago

Add mandatory therapy and counseling to that list.

[–] MisterMoo@lemmy.world 3 points 6 days ago

Spelling too.
