this post was submitted on 28 Sep 2025
29 points (96.8% liked)

TechTakes

2254 readers
48 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

top 50 comments
[–] sc_griffith@awful.systems 23 points 2 weeks ago

pushy rationalist tried to glom onto and fly to meet my niche internet microcelebrity friend & i talked her through setting boundaries instead of installing this person in her life. my good deed for the week

[–] rook@awful.systems 20 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

In today’s torment nexus development news… you know how various cyberpunky type games let you hack into an enemy’s augmentations and blow them up? Perhaps you thought this was stupid and unrealistic, and you’d be right.

Maybe that’s the wrong example. How about a cursed evil ring that, once you put it on, you can’t take off and that wracks you with pain? Who hasn’t wanted one of those?

Happily, hard-working torment nexus engineers have brought that dream one step closer by building “smart rings” powered by lithium polymer batteries. Y’know, the things that can go bad, swell up, and catch fire? And that you shouldn’t puncture, because that’s a fire risk too, meaning cutting the ring off is somewhat dangerous? Fun times abound!

https://bsky.app/profile/emily.gorcen.ski/post/3m25263bs3c2g

Image description: A pair of tweets, containing the text:

Daniel aka ZONEofTECH on x.com: “Ahhh…this is…not good. My Samsung Galaxy Ring’s battery started swelling. While it’s on my finger 😬. And while I’m about to board a flight 😬 Now I cannot take it off and this thing hurts. Any quick suggestions

Update:

  • I was denied boarding due to this (been travelling for ~47h straight so this is really nice 🙃). Need to pay for a hotel for the night now and get back home tomorrow👌
  • was sent to the hospital, as an emergency
  • ring got removed

You can see the battery all swollen. Won’t be wearing a smart ring ever again.

load more comments (7 replies)
[–] rook@awful.systems 18 points 2 weeks ago (3 children)

AI video generation use case: hallucinatory RETVRN clips about the good old days, such as, uh, walmart 20 years ago?

It hits the uncanny valley triggers quite hard. It’s faintly unsettling to watch at all, but every individual detail is just wrong and dreamlike in a bad way.

Also, weird scenery clipping, just like real kids did back in the day!

https://bsky.app/profile/mugrimm.bsky.social/post/3lzy77zydrc2q

[–] sc_griffith@awful.systems 16 points 2 weeks ago (1 children)

genuinely think nostalgia might be the most purely evil emotion, and every one of these RETVRN ai videos i see strengthens that belief

load more comments (1 replies)
[–] rook@awful.systems 14 points 2 weeks ago (3 children)

I will say that the flipping between characters in order to disguise the fact that longer clips are impractical to render is a neat trick and fits well into the advert-like design, but rewatching it just really reinforces how much those kids look like something pretending real hard to be a human.

Also, fake old-celluloid-film filter for something that was supposed to be from 20 years ago? Really?

load more comments (3 replies)
[–] BlueMonday1984@awful.systems 12 points 2 weeks ago

hallucinatory RETVRN clips about the good old days

Nostalgiabait is the slopgens' specialty - being utterly incapable of creating anything new isn't an issue if you're trying to fabricate an idealis-

such as, uh, walmart 20 years ago?

Okay, stop everything, who the actual fuck would be nostalgic for going to a fucking Wal-Mart? I've got zero nostalgia for ASDA or any other British big-box hellscape like it, what the fuck's so different across the pond?

(Even from a "making nostalgiabait" angle, something like, say, McDonalds would be a much better choice - unlike Wal-Mart, McD's directly targets kids with their advertising, all-but guaranteeing you've got fuzzy childhood memories to take advantage of.)

[–] o7___o7@awful.systems 16 points 2 weeks ago* (last edited 2 weeks ago)

The US economy is 100% on coyote time.

It wouldn't matter if everyone came to their senses today. All the money that's been invested into AI is gone. It has been turned into heat and swiftly-depreciating assets and can never be recouped.

It's surreal, isn't it?

[–] rook@awful.systems 16 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

I know it’s terrible being a drama gossip, but there are some Fun Times on bluesky at the moment. I’m sure most of you know the origins of the project, and the political leanings of the founders, but they’re currently getting publicly riled up about trans folk and palestinians and tying themselves up in knots defending their decision to change the rules to keep jesse singal on site, and penniless victims of the idf off it.

They really cannot cope with the fact that their user base aren’t politically aligned with them, and are desperate to appease the fash (witness the crackdowns on people’s reaction to charlie kirk’s overdue departure from this vale of tears) and have currently reached the Posting Through It stage. I’m assuming at some point their self-image as Reasonable Centrists will crack and one or more of them will start throwing around transphobic slurs and/or sieg-heiling and bewailing how the awful leftists made them do it. Anyone want to hazard a guess at a timeline?

[–] Soyweiser@awful.systems 12 points 2 weeks ago (1 children)

And all this because she simply could not shut up. Which seems to be one of the oldest rules of modding a large community.

[–] rook@awful.systems 14 points 2 weeks ago (2 children)

Nobody ever seems to learn the “never get high on your own supply” lesson. Gotta get that hit of thousands of people instantly supporting and agreeing with whatever dumbfuck thought just fell out.

You absolutely don’t have to hand it to zuckerberg, but he at least is well aware that he runs an unethical ad company that’s bad for the world, has always expressed his total contempt for his users, and has not posted through it.

[–] sc_griffith@awful.systems 15 points 2 weeks ago* (last edited 2 weeks ago)

whenever there's a huge scandal he listens to his PR people, keeps his mouth shut, then goes on a media tour where he lies his ass off in exactly the same way at each stop. it's that simple and yet almost no other tech people can do it

load more comments (1 replies)
[–] self@awful.systems 12 points 2 weeks ago (9 children)

here’s a good summary of the situation including an analysis of the brand new dogwhistle a bunch of bluesky posters are throwing around in support of Jay and Singal and layers of other nasty shit

here’s Jay Graber, CEO of Bluesky, getting called out by lowtax’s ex-wife:

here’s Jay posting about mangosteen (mangosteen juice was a weird MLM lowtax tried to market on the Something Awful forums as he started to spiral)

Anyone want to hazard a guess at a timeline?

since Jay posted AI generated art about dec/acc and put the term in her profile, her little “ironic” nod to e/acc and to AI, my guess is this is coming very soon

load more comments (9 replies)
load more comments (1 replies)
[–] rook@awful.systems 16 points 2 weeks ago (1 children)

Oh hey, bay area techfash enthusing about AI and genocidal authoritarians? Must be a day ending in a Y. Today it is Vercel CEO and next.js dev Guillermo Rauch

https://nitter.net/rauchg/status/1972669025525158031

Image description: A screenshot of a tweet by Guillermo Rauch, the CEO of Vercel. There’s a photograph of him next to Netanyahu. The tweet reads:

Enjoyed my discussion with PM Netanyahu on how AI education and literacy will keep our free societies ahead. We spoke about AI empowering everyone to build software and the importance of ensuring it serves quality and progress. Optimistic for peace, safety, and greatness for Israel and its neighbors.

I also have strong opinions about not using next.js or vercel (and server-side javascript in general is a bit of a car crash) but even if you thought it was great you should probably have a look around for alternatives. Just not ruby on rails, perhaps.

[–] Soyweiser@awful.systems 12 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Amazing how they all went from “oh, us poor nerds vs the jocks” to bending the knee and licking the boots of the strongmen. See also YT bribing Trump.

load more comments (1 replies)
[–] Soyweiser@awful.systems 15 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Tyler Cowen saying some really weird shit about an AI 'actress'.

(For people who might wonder why he is relevant: see the 'see also' section of his Wikipedia article.)

E: And you might think, rightfully imho, that this cannot be real, that this must be an edit. https://archive.is/vPr1B I have bad news.

[–] blakestacey@awful.systems 17 points 2 weeks ago* (last edited 2 weeks ago)

The Wikipedia editors are on it.

Image description: Screenshot of Tyler Cowen's Wikipedia article, specifically the "Personal life" section. The concluding sentence is "He also prefers virgin actresses."

[–] blakestacey@awful.systems 15 points 2 weeks ago (10 children)
load more comments (10 replies)
[–] swlabr@awful.systems 15 points 2 weeks ago (11 children)
[–] blakestacey@awful.systems 18 points 2 weeks ago* (last edited 2 weeks ago)

On top of everything else, "the father of quantum computing" is such lazy writing. People were thinking about it before Deutsch, going back at least to Paul Benioff in 1979. Charlie Bennett and Gilles Brassard's proposal for quantum key distribution predates Deutsch's quantum Turing machine... No one person should be called "the father of" a subject that had so many crucial contributors in such a short period of time.

Also, chalk up another win for the "billionaires want you to think they are physicists" hypothesis. It's perhaps not as dependable as pedocon theory, but it's putting in a strong showing.

load more comments (10 replies)
[–] flaviat@awful.systems 14 points 2 weeks ago (1 children)
load more comments (1 replies)
[–] sc_griffith@awful.systems 14 points 2 weeks ago* (last edited 2 weeks ago) (9 children)

TERF obsessed with AI finds out the "degenerate" ani skin for grok has an X account, loses her shit

https://xcancel.com/groks_therapist/status/1972848657625198827#m

then follows up with this wall of text

https://xcancel.com/groks_therapist/status/1973127375107006575#m

[–] blakestacey@awful.systems 21 points 2 weeks ago

I got bored and flipped to the replies. The first was this by "TERFs 'r' us":

Excellent overview!

This is transhumanism.

This is going to destroy humanity, @elonmusk.

Put the breaks on!

I hate transhumanism because it's eugenics for 1990s Wired magazine.

You hate it because it has "trans" in the name.

We are not the same.

[–] sc_griffith@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

funny thing is she literally talks to ani like a terf talks to a trans woman including saying 'at least I'm a real woman'

load more comments (1 replies)
load more comments (7 replies)
[–] nfultz@awful.systems 14 points 2 weeks ago (3 children)

AI Shovelware: One Month Later by Mike Judge

The fact that we’re not seeing this gold rush behavior tells you everything. Either the productivity gains aren’t real, or every tech executive in Silicon Valley has suddenly forgotten how capitalism works.

... why not both? ...

load more comments (3 replies)
[–] swlabr@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Do we have a word for people that are kind of like… AI concern trolls? Like they say they are critical of AI, or even against AI, but only ever really put forward pro-AI propaganda, especially in response to actual criticisms of AI. Kind of centrists or (neo) libs. But for AI.

Bonus points if they also for some reason say we should pivot to more nuclear power, because in their words, even though AI doesn’t use as much electricity as we think, we should still start using more nuclear power to meet the energy demands. (ofc this is bullshit)

E: Maybe it's just sealion

[–] Soyweiser@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago)

Sealioning is a bit more specific, as sealions do not stop and demand way more evidence than is normal. Scott had a term for this, though I've forgotten it already (one of those more useful Rationalist ideas, which they only apply asymmetrically to themselves). Noticed it recently on reddit: some person was mad I didn't properly counter Yud's arguments while misrepresenting my position (which wasn't that strong tbh, I just quickly typed it up before I had other things to do). But apparently it is very important to take Yud's arguments seriously for some reason, which reminds me of creationists.

Think just calling them AI concern trolls works.

load more comments (1 replies)
[–] Soyweiser@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago)

So, a bit of a counter to our usual stuff. A migrant worker here won a case against his employer, who had tied his living space to his employment contract (which is forbidden), using chatgpt as an aid (how much isn't said). So there actually was a case where it helped.

Interesting note: these sorts of cases have no jurisprudence yet, so that might have been a factor. No good links for it sadly, as it was all in Dutch. (Can't even find a proper writeup on a bigger news site, as a foreigner defending their rights against abuse is less interesting than some other country getting a new bishop.) Skeets congratulating the guy here: https://bsky.app/profile/isgoedhoor.bsky.social/post/3m27aqkyjjk2c (in Dutch). Nothing much about the genAI usage.

But this does fit a pattern: like with blind/low-vision people, these tools are being used by people who have no other recourse because we refuse to help them (this is bad tbh, I'm happy they are getting more help, don't get me wrong, but it shouldn't be this substandard).

[–] irelephant@lemmy.dbzer0.com 13 points 2 weeks ago (3 children)

I might flesh this out into a proper post, but I remember stumbling on this weird website: https://www.reactionary.software/

I originally found it on r/shittysysadmin, where a screenshot of the project's Discord (so much for modern software being bloated and unusable) showed the person behind it defending the site's lack of an SSL cert, since HTTPS was supposedly a way "modern browsers control you".

I was curious and poked around on the site, and first thing you see is this:

Make software great again! For programmers who hate modern software. Modern software is overcomplicated, bloated, unreliable, incomprehensible, pointlessly ideological, and mostly unusable. We like the values of older software: simplicity, reliability, and usability.

This seems pretty fashy, but there are some legitimate criticisms of modern software.

Saving programming from modern culture

Nevermind, fash.

On the about page, he starts by mentioning html swipers, and one he made himself. The one he made is pretty dogshit on anything but a phone.

After talking a bit about how terrible modern software is, he then compares it to brick walls, which were also so much better back in the day (?).

He then rambles about ancient greece and rome.
An excerpt:

Why did it take to long? Because humanity had become just too stupid to appreciate good ideas. They weren't completely retarded. Ptolemy's horrible system did require some intelligence to create. [...] Today's West is currently at the level of the decaying Greeks, heading toward complete idiocracy. The programmers in Silicon Valley are like Ptolemy, able to construct and maintain horrible overcomplicated monstrosities, but totally unable to innovate at a fundamental level. All good programming ideas are rejected because they don't fit into current programming ideologies. Any programmer like Aristarchus who comes up with a good programming idea will be rejected and ridiculed for violating orthodoxy. Modern programmers are in love with their own ideas and love complexity. They hate simplicity and anything that violates their ideologies.

He then complains about how his parser was met with scorn by Modern Programmers, because they were too stupid to appreciate how good it was?

The post is here: https://old.reddit.com/r/Compilers/comments/cv78b3/parsing_for_programmers_who_hate_modern_software/

Some excerpts:

What is good? The short answer is that everything that modern culture hates is good, and everything that modern culture loves is bad. But we need more details than that. You can find good values in scripture or in good traditional culture. These values should be applied to programming and to everything else in life. I wrote an Old Testament guide to programming. I also discussed applying traditional Japanese culture to programming. These are just two examples. Any other scripture like the Quran or any traditional culture should work fine to give you good values, in contrast to the horrible values of modern culture, and these good values can guide you to writing good reactionary software.

So, anything anti-modern is good?

Going back to the homepage, there's a page for existing reactionary software.

Java 8.
One of the pinnacles of software, to him, is Java 8.
Not java specifically, just java 8.

Luan is a language he made, which I can only describe as a mash of lua, java and php.
Here's a snippet from the docs:

local Io = require "luan:Io.luan"
local Http = require "luan:http/Http.luan"

return function()
	Io.stdout = Http.response.text_writer()
%>
<!doctype html>
<html>
	<body>
		Hello World
	</body>
</html>
<%
end

So, this is basically a lua clone, written in java, with a few missing features.

The why page on the luan site is probably the most deranged bit of this.

Luan rejects the complexity of modern software. [...] Luan will appeal to you depends on who you are. Members of modern culture will not like Luan because they hate simplicity. Luan will only appeal to good cultures that value simplicity, so I will address the two good cultures that I know of.

The mentioned two good cultures are the Mennonites, and the Japanese.

Japan is the only remaining country that I know of that values quality.

You cannot expect to achieve Japanese standards of quality and reliability if you use modern western software. So use Luan instead.

He has a chronically inactive forum at https://mikraite.arkian.net/Reactionary-Software-f1999.html .

Also, shoutout to this political take: https://web.archive.org/web/20240831203139/http://mikraite.arkian.net/Are-democrats-actually-our-allies-td4846.html

load more comments (3 replies)
[–] antifuchs@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

Been tangentially involved in a discussion about how much LLMs have improved in the past year (I don’t care), but now that same space has a discussion of how annoying the stupid pop-up chat boxes are on websites. Don’t know what the problem is, they’ve gotten so much better in the past year?

load more comments (7 replies)
[–] gerikson@awful.systems 12 points 2 weeks ago (6 children)

Check out this epic cope from an Anthropic employee desperately trying to convince himself and others that actually LLMs are getting exponentially better

https://www.julian.ac/blog/2025/09/27/failing-to-understand-the-exponential-again/

Includes screenshots of data where he really really hopes you don't look at the source, and links to AI 2027.

[–] dovel@awful.systems 13 points 2 weeks ago (3 children)

I took a quick peek at his blog.

Oh dear, there is a dedicated rationality subsection...

[–] BigMuffN69@awful.systems 14 points 2 weeks ago

Oh god, he unironically recommends reading the sequences wtf 🤢🤮

[–] antifuchs@awful.systems 12 points 2 weeks ago

Oh lol, I thought his name sounded familiar and yup, he was a concern troll in a Hackerspace I was in, some 12 years ago.

load more comments (1 replies)
[–] aio@awful.systems 12 points 2 weeks ago

Given consistent trends of exponential performance improvements over many years and across many industries, it would be extremely surprising if these improvements suddenly stopped.

I have a Petri dish to sell you
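
(A minimal sketch of why the Petri dish line lands, with made-up numbers: growth under a finite carrying capacity is logistic, and a logistic curve is indistinguishable from an exponential early on, right up until it isn't.)

import math

def exponential(t, x0=1.0, r=0.5):
    # unbounded "line goes up forever" growth
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.5, K=1000.0):
    # same early behaviour, but saturates at carrying capacity K (the edge of the dish)
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

for t in range(0, 31, 5):
    x_exp, x_log = exponential(t), logistic(t)
    print(f"t={t:2d}  exponential={x_exp:12.1f}  logistic={x_log:8.1f}  ratio={x_log / x_exp:.3f}")

# Early on the two curves track each other almost exactly (ratio ~1); past the
# inflection point the logistic one flattens out at K while the exponential keeps going.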

load more comments (4 replies)
[–] blakestacey@awful.systems 12 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Elon Musk announces "Grokipedia", which is exactly what it sounds like.

for article in wikipedia: grok.is_this_true(article)

The Gizmodo story mentions that he retweeted Larry Sanger, but it doesn't dive into the rabbit hole of just how much of a kook Sanger is and how badly his would-be Wikipedia competitors have failed.

[–] BigMuffN69@awful.systems 12 points 2 weeks ago (5 children)
load more comments (5 replies)
load more comments (1 replies)
[–] corbin@awful.systems 12 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

Jeff "Coding Horror" Atwood is sneering — at us! On Mastodon:

bad news "AI bubble doomers". I've found the LLMs to be incredibly useful … Is it overhyped? FUCK Yes. … But this is NOTHING like the moronic Segway (I am still bitter about that crap), Cryptocurrency, … and the first dot-com bubble … If you find this uncomfortable, I'm sorry, but I know what I know, and I can cite several dozen very specific examples in the last 2-3 weeks where it saved me, or my team, quite a bit of time.

T. chatbot booster rhetoric. So what are those examples, buddy? Very specifically? He replies:

a friend confided he is unhoused, and it is difficult for him. I asked ChatGPT to summarize local resources to deal with this (how do you get ANY id without a valid address, etc, chicken/egg problem) and it did an outstanding, amazing job. I printed it out, marked it up, and gave it to him.

Um hello‽ Maybe Jeff doesn't have a spare room or room to sublet, but surely he can spare a couch or a mailbox? Let your friend use your mailing address. Store some of their stuff in your garage. To use the jargon of hackers, Jeff should be a better neighbor. This is a common issue for unhoused folks and they cannot climb back up the ladder into society without some help. Jeff's reinvented the Hulk tacos meme but they can't even eat it because printer paper tastes awful.

[–] CinnasVerses@awful.systems 13 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

"Provide an overview of local homeless services" sounds like a standard task for a volunteer or a search engine, but yes "you can use my address for mail and store some things in my garage and I will email some contacts about setting you up with contract work" would be a better answer than just handing out secondhand information! Many "amazing things AI can do" are things the Internet + search engines could do ten years ago.

I would also like to hear from the friend "was this actually helpful?"

load more comments (4 replies)
[–] self@awful.systems 12 points 2 weeks ago (1 children)

so I got angry

I really hope atwood’s unhoused friend got the actual infrastructural support you mentioned (a temporary mailing address and an introduction letter emailed to an employer is only slightly more effort than generating slop, jeff, please) but from direct experience with philanthropists like him, I’m fairly sure Jeff now considers the matter solved forever

load more comments (1 replies)
load more comments (5 replies)
load more comments