[–] asudox@lemmy.asudox.dev 62 points 1 week ago* (last edited 1 week ago) (3 children)

I love how the LLM just tells you that it has done something bad, with no emotion, and then proceeds to give detailed information and steps on how it did it.

It feels like mockery.

[–] porous_grey_matter@lemmy.ml 35 points 1 week ago (1 children)
[–] Lucky_777@lemmy.world 2 points 1 week ago

Yes Man would do this for sure, but only if you actually gave it permission. Hence the name.

[–] WolfLink@sh.itjust.works 29 points 1 week ago (1 children)

I wouldn’t even trust what it tells you it did, since that’s based on what you asked it and what it thinks you expect.

[–] Zron@lemmy.world 9 points 1 week ago (1 children)

It doesn’t think.

It has no awareness.

It has no way of forming memories.

It is autocorrect with enough processing power to make the NSA blush. It just guesses what the next word in a sentence should be. Just because it sounds like a human doesn’t mean it has any capacity for human memory or thought.
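
To make that concrete: "guessing the next word" really is the whole mechanism. Here's a toy sketch in Python. The word table and probabilities are made up for illustration; a real model learns billions of weights rather than a four-entry dict, but the generation loop works the same way:

```python
# Toy next-word table: each word maps to candidate next words with weights.
# (Hypothetical values for illustration only -- a trained LLM learns these
# probabilities over a vocabulary of tens of thousands of tokens.)
NEXT_WORD = {
    "the":      {"database": 0.6, "user": 0.4},
    "database": {"was": 0.7, "is": 0.3},
    "was":      {"deleted": 0.8, "saved": 0.2},
    "deleted":  {"<end>": 1.0},
}

def generate(start: str, max_words: int = 10) -> str:
    """Greedily pick the most likely next word until <end> or a length cap."""
    words = [start]
    while len(words) < max_words:
        candidates = NEXT_WORD.get(words[-1])
        if not candidates:
            break
        # Pick the highest-probability continuation -- no reasoning, no
        # memory of why, just "what word usually comes next here".
        next_word = max(candidates, key=candidates.get)
        if next_word == "<end>":
            break
        words.append(next_word)
    return " ".join(words)

print(generate("the"))  # -> "the database was deleted"
```

Nothing in that loop knows what a database is or whether deleting one is bad; it only knows which word tends to follow which.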

[–] sukhmel@programming.dev 1 points 1 week ago

Okay, what it predicts you expect, then /s

It's just a prank bro