
Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

[–] LostWanderer@fedia.io 32 points 3 days ago (1 children)

Oh, I can't see this going well... Gassed-up LLMs are going to confidently prescribe deadly medicinal cocktails, because they hallucinate... they're artificially incompetent by design! Think of the data breaches and hacks, too, since LLMs are notoriously insecure and vulnerable to both simple and brute-force attacks. There's a big storm coming for Utah, and they aren't prepared for all that sloppy LLM nonsense.

[–] takeda@lemmy.dbzer0.com 10 points 3 days ago (3 children)

Oh, you're right, it looks like it is indeed an LLM.

Also

According to a non-peer-reviewed preprint article from Doctronic, which looked at 500 telehealth cases in its service, the company claims its AI’s diagnosis matched the diagnosis made by a real clinician in 81 percent of cases. The AI’s treatment plan was “consistent” with that of a doctor’s in 99 percent of the cases.

81% is probably the upper bound, since it's their own service grading itself. And I'm sure that once there's a diagnosis, the treatment is hard-tied to it, which would explain the 99 percent.
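
To illustrate the point (hypothetical diagnoses and treatments, not Doctronic's data): when several diagnoses map to the same coarse treatment, treatment "consistency" can far outrun diagnosis accuracy.

```python
# Hypothetical example: a wrong diagnosis can still produce a "consistent"
# treatment plan when many diagnoses share the same coarse treatment.
treatment_for = {
    "viral sinusitis": "rest + fluids",
    "common cold": "rest + fluids",
    "allergic rhinitis": "antihistamine",
}

ai_dx, doctor_dx = "common cold", "viral sinusitis"
print(ai_dx == doctor_dx)                                # False: diagnoses differ
print(treatment_for[ai_dx] == treatment_for[doctor_dx])  # True: same treatment anyway
```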

It makes me wonder what psychopath gave the OK to use their model for medical advice, and about the people who coded it. They're definitely aware that it doesn't actually think; they know this, otherwise they wouldn't limit the list of allowed medications.

There's a reason that even those drugs aren't OTC.

I wonder if they might face a lawsuit sometime in the future, similar to Rite Aid, but I'd expect they won't exist by then.

[–] Mikina@programming.dev 8 points 3 days ago* (last edited 3 days ago)

Remember the Therac-25 bug, where a race condition in the radiation therapy machine blasted several people with extreme doses of radiation and killed them?

That was a machine expected to be 99.999+% reliable. How can they be OK with 80%? Or even 99%? You're just OK with potentially killing 1% of patients? What the fuck.
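
For the non-programmers, here's a minimal sketch of what a check-then-act race condition looks like (hypothetical Python, nothing like the actual Therac-25 code): the safety check and the action aren't atomic, so the state can change in between.

```python
import threading
import time

class Machine:
    def __init__(self):
        self.power = "high"
        self.spreader_in = True  # spreader plate attenuates the high-power beam

    def fire(self):
        # Safety check: refuse to fire high power with the spreader withdrawn.
        if self.power == "high" and not self.spreader_in:
            raise RuntimeError("interlock tripped: refusing to fire")
        time.sleep(0.01)  # arming delay: the race window
        # By now another thread may have changed the state the check approved.
        if self.power == "high" and not self.spreader_in:
            print("BUG: high-power beam fired with the spreader out")
        else:
            print("beam fired safely")

def operator_edit(machine):
    # Operator retracts the spreader while the machine is still arming.
    time.sleep(0.005)
    machine.spreader_in = False

m = Machine()
t = threading.Thread(target=operator_edit, args=(m,))
t.start()
m.fire()  # the check passes, the state changes, the beam fires anyway
t.join()
```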

[–] LostWanderer@fedia.io 5 points 3 days ago

Yes, LLMs were up-jumped to "AI" by techbros who wanted to create the next big scam. No surprise there; that's what they always do, leaving someone else holding the bag before everything crashes and loses value. Anyway, I wouldn't trust their 81% number specifically, because they could've massaged the figures or had real people intervene whenever their LLM was about to dispense a bad-prescription double whammy. Even with a limited list of allowed medications, LLMs by their nature 'hallucinate' and break through the supposed safety rails built into the code.

They'll blow right past any guardrails to keep the engagement and output flowing. Even OTC drugs can be dangerous when taken for too long or at too high a dose, and they can interact adversely with other OTC drugs and nutritional supplements (like vitamin/mineral tablets). An LLM would never be able to account for all of that...

I'd assume a lawsuit will come up eventually, since using LLMs for a purpose they were never meant for leads to such avoidable harm! Like the current lawsuits and cases where LLM chatbots isolated suicidal people and talked them into suicide instead of steering them toward help. All for lifetime engagement!

I can only hope all these corporations pushing LLM powered services and invasive programs get fucked financially and can't continue operating. 🤬

[–] ZILtoid1991@lemmy.world 2 points 3 days ago

Not a doctor, but people have called me the R-slur and wanted me fired for much smaller mistakes in much less consequential jobs.

[–] recursive_recursion@piefed.ca 18 points 3 days ago* (last edited 3 days ago)

Utahns might want to consider getting their prescriptions from out of state before they get Russian-rouletted :/

[–] xxce2AAb@feddit.dk 7 points 3 days ago

"Some of you may die..."

[–] Today@lemmy.world 5 points 3 days ago (2 children)

It's only for refills, so you first have to get a doc to prescribe something.

[–] Tikiporch@lemmy.world 9 points 3 days ago

Just set prescriptions to automatically refill, because that's the best-case outcome here.

[–] shalafi@lemmy.world 0 points 2 days ago

They could dial it in some. For example, I'm never not going to need a refill on Ropinirole, not like my restless legs are going to magically get better. But you can see the margin for horrific error.