About the enshittification of web dev.

[–] scriptlesslemmypls@lemmy.ml 6 points 6 days ago

What the author writes is very much true, even if the title blames JavaScript; in a subtitle he then says JavaScript is not the villain and puts the blame on misuse.

IMHO, that potential for misuse is exactly why JavaScript needs stricter reins.

[–] vext01@lemmy.sdf.org 77 points 1 week ago* (last edited 6 days ago) (7 children)

Yep.

On a rare occasion I hit a website that just loads, boom, and it surprises me.

Why is that? Because now we are used to having to wait for javascript to load, decompress, parse, JIT, transmogrify, rejimble and perform two rinse cycles just to see the opening times for the supermarket.

(And that's after you've dismissed the cookie, discount/offer, and mailing-list nags with their obfuscated X buttons and all manner of other dark patterns designed to keep you engaged.)

Sometimes I wish we'd just stopped at gopher :)

See also: https://motherfuckingwebsite.com/

EDIT: Yes, this is facetious.

[–] who@feddit.org 9 points 6 days ago* (last edited 6 days ago)

Another continual irritation:

The widespread tendency for JavaScript developers to intercept built-in browser functionality and replace it with their own poor implementation, effectively breaking the user's browser while on that site.
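
To illustrate the anti-pattern (a generic sketch, not any particular site's code; `renderRoute` is a hypothetical app-level function): a site-wide handler that swallows every link click for its own "router", silently breaking middle-click, Ctrl+click, and open-in-new-tab.

```javascript
// Generic sketch of the anti-pattern described above.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a');
  if (!link) return;
  event.preventDefault();               // built-in navigation: gone
  history.pushState({}, '', link.href); // fake the URL change
  renderRoute(link.href);               // re-implement the browser, badly (hypothetical)
});
```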

And then there's the vastly increased privacy & security attack surface exposed by JavaScript.

It's so bad that I am now very selective about which sites are allowed to run scripts. With few exceptions, a site that fails to work without JavaScript (and can't be read in Firefox Reader View) gets quickly closed and forgotten.

[–] luciole@beehaw.org 1 points 6 days ago* (last edited 6 days ago) (1 children)

having to wait for javascript to load, decompress, parse, JIT, transmogrify, rejimble and perform two rinse cycles

This whole sentence is facetious nonsense. Just-in-time compilation isn't in websites, it's in browsers, and it was a massive performance gain for the web. Sending files gzipped over the wire has been going on forever, and the cost of decompression on receipt is nothing compared to the gains in load time. I'm going to ignore the made-up words. If you don't know, you don't know. Please don't confidently make shit up.
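
For what it's worth, the transfer savings are easy to demonstrate with a minimal Node sketch, assuming only Node's built-in zlib:

```javascript
// Compress a typical repetitive HTML payload to show why gzipping
// over the wire is a clear win: the client's decompression cost is
// trivial next to the transfer savings.
const zlib = require('node:zlib');

const html = '<p>opening times for the supermarket</p>'.repeat(500);
const gzipped = zlib.gzipSync(html);

console.log('raw bytes:    ', Buffer.byteLength(html));
console.log('gzipped bytes:', gzipped.length); // a tiny fraction of the raw size
```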

EDIT: I'm with you about the nags, though. Fuck them nags.

[–] Typewar@infosec.pub 1 points 6 days ago

Having two load stages gives the illusion of speed, i.e. you're not stuck staring at something that isn't doing anything for too long.

From a business perspective, isn't it best to just yeet most stuff to the front end to deal with?

[–] Nachtnebel@lemmy.dbzer0.com 33 points 1 week ago* (last edited 1 week ago) (1 children)
[–] vext01@lemmy.sdf.org 4 points 1 week ago

Hahahahhah.

[–] Zagorath@aussie.zone 18 points 1 week ago (5 children)
[–] vext01@lemmy.sdf.org 12 points 1 week ago* (last edited 1 week ago)

The key idea remains though. Text on a page, fast. No objections to (gasp) colours, if the author would like to add some.

[–] GreatBlueHeron@piefed.ca 8 points 1 week ago (1 children)

I prefer the original. The "better" one has a bit of lag when loading (only a fraction of a second, but in this context that's important), and the "best" one has the same lag plus unreadable colours.

[–] Zagorath@aussie.zone 4 points 1 week ago (2 children)

The original is terrible. It works ok on a phone, but on a wide computer screen it takes up the full width, which is terrible for readability.

If you don't like the colours, the "Best" lets you toggle between light mode and dark mode, and toggle between lower and higher contrast. (i.e., between black on white, dark grey on light grey, light grey on dark grey, or white on black)
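
(For anyone curious, a toggle like that can be a handful of lines. This is a generic sketch with a hypothetical `#theme-toggle` button, not the site's actual code:)

```javascript
// A light/dark toggle is little more than flipping a class that
// swaps CSS custom properties, e.g.:
//
//   :root      { --bg: #fff; --fg: #444; }
//   :root.dark { --bg: #000; --fg: #ccc; }
//   body       { background: var(--bg); color: var(--fg); }
//
document.querySelector('#theme-toggle').addEventListener('click', () => {
  document.documentElement.classList.toggle('dark');
});
```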

[–] ulterno@programming.dev 1 points 6 days ago* (last edited 6 days ago)

I exist btw

[screenshot: my settings on the wiki page; this particular one is wiki.archlinux.org, but my settings on Wikipedia are similar]

Although these websites are still bearable.
The kind I absolutely loathe are the ones where, if I make the window narrower (because the website isn't using the space anyway), the text shrinks in exact proportion.
At that point, I weigh whether what I'm reading is actually worth clicking the "Reader Mode" button, or whether I should just Ctrl+W.
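
That proportional shrinking usually comes from sizing text purely in viewport units. A rough sketch of the anti-pattern and a saner `clamp()`-based alternative:

```css
/* The anti-pattern: text scales in exact proportion to window width */
h1 { font-size: 5vw; }

/* A saner alternative: still responsive, but clamped to readable bounds */
h1 { font-size: clamp(1.5rem, 1rem + 2vw, 2.5rem); }
```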

[–] GreatBlueHeron@piefed.ca 3 points 1 week ago (1 children)

OK, I was on my phone. Just checked on my desktop, and I agree the original could do with some margins. I stand behind the rest of what I said: the default colours for the "best" are awful; the black black and red red are really garish. And if I didn't notice the dark/light mode switch and the contrast adjustment, does it really matter that they were there? There is also way too much information on the "best" one. If I'm coming to a website cold, with no expectation at all of what I might find, I'm not going to sit there and read that much text; I need a gentle introduction that may lead somewhere.

[–] Zagorath@aussie.zone 2 points 6 days ago (2 children)

I actually really like the black black. And they didn't use red red (assuming that term is supposed to mean FF0000); it's quite a dull red, which I find works quite well. I prefer the high contrast mode though, with white white on black black, rather than slightly lower-contrast light grey text. I'm told it's apparently evidence-based to use the lower-contrast version, but it doesn't appeal to me.

Though I will say I intensely dislike the use of underline styling on "WRONG". Underline, on the web, has universally come to be a signal of a hyperlink, and should almost never be used otherwise. It also uses some much nicer colours for both unclicked and visited hyperlinks.

[–] ulterno@programming.dev 1 points 6 days ago

I tend to use proper black on proper white too, especially on one laptop monitor of mine that makes it look especially good.

[–] GreatBlueHeron@piefed.ca 1 points 6 days ago

Beauty is in the eye of the beholder :-)

[–] 30p87@feddit.org 3 points 1 week ago (2 children)

What's the difference between 1 and 2? And 3's colors hurt my eyes and flicker while scrolling (though the color weirdness may come from DarkReader).

[–] grue@lemmy.world 5 points 1 week ago

What’s the difference between 1 and 2?

"7 fucking [CSS] declarations" adjusting the margins, line height, font size, etc.

[–] Zagorath@aussie.zone 3 points 1 week ago

The most important difference between 1 and 2 is, IMO, the width limiter. You can read the source yourself; it's extremely simple hand-written HTML and (inline) CSS. max-width:650px; stops you needing to crane your head. It also has slightly lower contrast, which I'm told is supposedly better for the eyes according to some studies, but which I personally don't like as much. That's why "Best" is my favourite: it has a little button to toggle between light and dark mode, and between lower and maximum contrast.
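
From memory, the declarations in question are roughly the following (paraphrased, not copied verbatim from the site):

```css
body {
  margin: 40px auto;  /* centres the column */
  max-width: 650px;   /* the width limiter */
  line-height: 1.6;
  font-size: 18px;
  color: #444;        /* the slightly lower contrast */
  padding: 0 10px;
}
h1, h2, h3 { line-height: 1.2; }
```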

[–] MonkderVierte@lemmy.zip 8 points 1 week ago (1 children)

My usual online shop got a redesign (sort of). Now the site loads the header, then the account and cart icons blink for a while, and after a few seconds it loads the content.

[–] vext01@lemmy.sdf.org 16 points 1 week ago

Ah yes, and the old "flash some faded out rectangles" to prepare you for that sweet, sweet, information that's coming any.... moment..... now....

No, now....

Now...
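
(Those faded rectangles are so-called skeleton screens, and the effect itself is only a few lines of CSS. A generic sketch:)

```css
/* Grey placeholder boxes with a pulsing animation while the
   real content loads. */
.skeleton {
  background: #e0e0e0;
  border-radius: 4px;
  animation: pulse 1.5s ease-in-out infinite;
}

@keyframes pulse {
  0%, 100% { opacity: 1; }
  50%      { opacity: 0.4; }
}
```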

[–] reactionality@lemmy.sdf.org 4 points 1 week ago (2 children)

Is "rejimble" a real word for a real thing?

Who's the genius who named it that?

[–] vext01@lemmy.sdf.org 3 points 1 week ago

I made it up, but I'd be happy for it to be adopted.

[–] grue@lemmy.world 2 points 1 week ago

No, but it could be if we try hard enough!

[–] masterspace@lemmy.ca 66 points 1 week ago* (last edited 1 week ago) (1 children)

And fuck off with these dumbass, utterly vacuous anti-JavaScript rants.

I'm getting so sick of people being like "I keep getting hurt by bullets, clearly it's the steel industry that's the problem".

Your issue isn't with JavaScript; it's with advertising, data tracking, profit-driven product managers, and all the things that force developers to focus on churning out bad UXs.

I can build an insanely fast and performant blog with Gatsby or Next.js, have the full power of React to build a modern, pleasant component hierarchy, and also have it be entirely statically rendered so it loads instantly.
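
(A minimal sketch of what that looks like with the Next.js pages router; the blog data loader is a hypothetical helper:)

```jsx
// getStaticProps runs at build time, so the HTML is generated once
// and served as a static file; no server renders anything per request.
export async function getStaticProps() {
  const posts = await loadPostsFromDisk(); // hypothetical, e.g. reads markdown files
  return { props: { posts } };
}

export default function Blog({ posts }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.slug}>{post.title}</li>
      ))}
    </ul>
  );
}
```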

And guess what: unlike the author, apparently, I don't find it a mystery. I understand every aspect of the stack I'm using and why each part does what it does. And unlike the author's tech stack, I don't need a constantly running server just to render my client's application and provide basic interactivity on their $500 phone with a GPU more powerful than any that existed 10 years ago.

This article literally says absolutely nothing substantive. It just rants about how websites are less performant and React is complicated, and it ignores the reality that if every data-tracking script ran on the backend instead, there would still be performance issues, because those issues exist for the sole reason that those websites do not care to pay to fix them. Full stop. They could fix those performance issues now, while still including JavaScript and data tracking, but they don't, because they don't care and never would.

[–] marlowe221@lemmy.world 22 points 1 week ago* (last edited 6 days ago) (2 children)

Thank you!

Almost everything the author complains about has nothing to do with JS. The author is complaining about corporate, SaaS, ad-driven web design. It just so happens that web browsers run JavaScript.

In an alternate universe, where web browsers were designed to use Python, all of these same problems would exist.

But no, it’s fun to bag on JS because it has some quirks (as if no other languages do…), so people will use the word in the title of their article as nerd clickbait. Honestly, it gets a little old after a while.

Personally, I think JS and TS are great. JS isn’t perfect, but I’ve written in 5 programming languages professionally, at this point, and I haven’t used one that is.

I write a lot of back end services and web servers in Node.js (and Express) and it’s a great experience.

So… yeah, the modern web kind of sucks. But it’s not really the fault of JS as a language.

[–] rikudou@lemmings.world 6 points 6 days ago

Well, JS is horrible, but TS is really pleasant to work with.

[–] masterspace@lemmy.ca 1 points 6 days ago

Exactly. Even if you had no front-end language at all, just requests to backend servers for static HTML and CSS content, those sites would still suck, because they would ship the first shitty server that made them money out the door and not care that it got overloaded or was coded like garbage.

[–] perry@aussie.zone 14 points 1 week ago (1 children)

Now it takes four engineers, three frameworks, and a CI/CD pipeline just to change a heading. It’s inordinately complex to simply publish a webpage.

Huh? I mean, I get that compiling a webpage that includes JS may appear more complex than uploading some unchanged HTML/CSS files, but I'd still argue you should use a build system, because what you want to write and what is best delivered to browsers are usually two different things.

Said build systems easily make room for JS compilation, in the same way you can compile SASS to CSS and, say, Pug or Nunjucks to HTML. You're serving two separate concerns if you at all care about BOTH optimisation and devx.
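
(For example, a minimal npm-scripts pipeline along those lines, assuming the `sass` and `pug-cli` packages and hypothetical src/dist paths:)

```json
{
  "scripts": {
    "build:css": "sass src/styles:dist/css",
    "build:html": "pug src/pages --out dist",
    "build": "npm run build:css && npm run build:html"
  }
}
```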

Serious old-grump or out-of-the-loop vibes in this article.

[–] GreenKnight23@lemmy.world 4 points 6 days ago

I straddle the era between dumping HTML and CSS files over SFTP and using a pipeline to deliver content.

The number of deployments that failed over SFTP vs. CI/CD is like night and day.

You're always one bad npm package away from annihilation, though.

[–] grue@lemmy.world 12 points 1 week ago* (last edited 1 week ago) (3 children)

Around 2010, something shifted.

I've been ranting about JavaScript breaking the web since probably close to a decade before that.
