this post was submitted on 11 May 2026
45 points (84.6% liked)

Ask Lemmy


There are plenty of headlines about AI-induced psychosis, and they all tend to follow a similar pattern:

• An individual with a pre-existing vulnerability begins using AI, usually as a conversational partner.

• Gradually, they replace human connection with AI and lose the ability to hold conversations with humans, who aren't programmed to stroke their ego.

• Eventually, they spiral and completely lose touch with reality. During this time they make terrible decisions that destroy their lives. Then at some point, they are forced to confront the reality of their decisions and behavior, similar to coming out of an extended splitting episode in Dissociative Identity Disorder or waking up sober from an alcohol- or drug-fueled binge.

Given everything we know about plasticity and human behavior, it would be silly to believe frequent use of AI isn't changing our brains. Even if the majority of users don't develop full-blown psychosis, if your day is suddenly spent talking to a self-affirming mirror, that's going to change your brain and behavior. It's less a question of "if" it's changing people than of "what" and "how."

So, what are some of the more subtle changes (as compared to psychosis) you've noticed in people who frequently use AI? Have you noticed a difference even in those who don't use it as a conversational partner?

[–] SpikesOtherDog@ani.social 13 points 4 days ago (2 children)

I feel like I'm the last grounding point for a peer who is getting in too deep. He is running all kinds of agents and says that he is afraid of getting left behind. He tells me about openclaw, which I looked into, but I'm not interested in automation that doesn't produce specific, repeatable results.

On his behalf I have dug into ollama, but I find that I am just as fast, if not faster, at OCR text cleanup using a spell checker than arguing with the bot and fixing its mistakes.
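For what it's worth, the deterministic approach can be sketched in a few lines. This is only an illustration (the tiny word list and the `clean_ocr_word`/`clean_ocr_text` names are made up for the example, and a real run would load a proper dictionary file), but it shows why it's competitive with an LLM pass: same input, same output, every time.

```python
import difflib

# Illustrative word list; a real pipeline would load a full dictionary file.
DICTIONARY = ["the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"]

def clean_ocr_word(word: str) -> str:
    """Correct a single OCR'd word against the dictionary.

    Deterministic and repeatable, unlike an LLM-based cleanup pass.
    Typical OCR confusions ('1' for 'i', 'vv' for 'w') land close
    enough to the real word for fuzzy matching to catch them.
    """
    lower = word.lower()
    if lower in DICTIONARY:
        return word
    # cutoff=0.7 is a tunable guess: too low over-corrects, too high misses.
    candidates = difflib.get_close_matches(lower, DICTIONARY, n=1, cutoff=0.7)
    return candidates[0] if candidates else word

def clean_ocr_text(text: str) -> str:
    """Apply word-level correction across a whitespace-tokenized string."""
    return " ".join(clean_ocr_word(w) for w in text.split())

print(clean_ocr_text("the qu1ck brovvn fox"))  # -> the quick brown fox
```

Words with no close dictionary match pass through unchanged, so the worst case is a no-op rather than a plausible-looking fabrication.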

He seems to understand my frustrations very well, and my counterpoints seem to be accepted.

I think it is important to try the tools at least a few times and attempt to integrate them into your workflow, but once you finally feel like you have a flow, you need to take a step back and compare it to your work without them. Sure, you are briefly contributing to the usage numbers, but unless you can articulate your grievances from firsthand experience, your words won't carry as much weight.

[–] HubertManne@piefed.social 4 points 4 days ago (1 children)

My feeling is to have it help you with something you know very well. If you're great at a video game, play one and ask it what to do at each point. This is what taught me how it can fail. It works very often, but when it fails, it's great at producing a plausible answer that will lead you down a bad path.

[–] SpikesOtherDog@ani.social 3 points 4 days ago

Basically, that's what I have seen. It gives the average answer, and sometimes conflates information from similar topics or offers solutions that don't exist.

If your task is to take creative solutions and work them into a framework, it might help jump-start ideas, but it cannot keep a logical thread.

I feel like it's fine(ish) for work, and I agree: as long as you can show some evidence it's easing your workflow rather than causing you more issues, it's serving its purpose.

My concern is people who seem to get hooked on it like a drug, and refuse to acknowledge any evidence it's causing more issues than actually helping them. Like they get really anxious/can't function without it, and start trusting AI more than they trust their own ability to reason through a problem.

It's especially concerning to me when people use it like this outside of work, like a life guide. It's almost like the AI starts doing the living for them.

For example, when it comes to navigating relationships, AI can give some really bad advice because it lacks human connection, feeling, and intuition. Those are pretty essential ingredients for decision making. If you always default to AI to help you make decisions or solve problems, you're forgoing the entire experience of having a human relationship.

That connection and the way you feel are kind of the whole point. Human relationships aren't easy, sometimes they hurt, and people usually don't respond well to only being acknowledged when the other person feels like interacting with them. But feelings, and being able to understand the other person's perspective even when you don't agree with them, are kind of the entire experience of being human. Without that experience you might as well not have human relationships, and some people seem to be okay making that sacrifice.