The modern web is an insult to the idea of efficiency at practically every level.
You cannot convince me that isolation and sandboxing require a fat 4 GB slice of RAM for a measly 4 tabs.
It is crazy that a Core 2 Duo with 8 gigs of RAM struggles just to load web pages.
My PC is 15 times faster than the one I had 10 years ago. It's the same old PC but I got rid of Windows.
The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.
Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”
What really matters isn't meetings, it's profits.
Everything bad people said about web apps 20+ years ago has proved true.
It's like, great, now we have consistent cross-platform software. But it's all bloated, slow, and only "consistent" with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.
It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.
But at least we're not stuck with Windows-only admin consoles anymore, so that's nice.
All the advances in hardware performance have been used to make it faster (more to the point, "cheaper") to develop software, not faster to run it.
I'm dreading when poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.
Already happening with Windows. Also supposedly with Nvidia GPU drivers, with some AMD execs pushing for the same now
And that us poors still on limited bandwidth plans get charged for going over our monthly quotas because everything has to be streamed or loaded from the cloud instead of installed (or at least cached) locally.
The same? Try worse. Most devices have seen input latency go up, and most applications have higher post-input latency as well.
Switching from an old system with old UI to a new system sometimes feels like molasses.
I work in support for a SaaS product and every single click on the platform takes a noticeable amount of time. I don't understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it's far more responsive.
Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that if your software has any bit of efficiency or quality in it, that's a clear indicator of a missed opportunity to sacrifice it on the altar of code churn.
The result is not "amazing". I'd be more amazed had it turned out differently.
Fucking "features". Can't software just be finished? I bought App. App does exactly what I need it to do. Leave. It. Alone.
No, never! Tech corps (both devs and app stores) brainwashed people into thinking "no updates = bad".
Recently, I have seen people complain about lack of updates for: the OS for a handheld emulation device (not the emulator, the OS, which does not have any glaring issues), and a Gemini protocol browser (the Gemini protocol is simple and has not changed since 2019 or so).
Maybe these people don't use the calculator app because arithmetic was not updated in a few thousand years.
A big part of this issue is mobile OS APIs. You can't just finish an Android app and be done. It bit-rots so fast. You get maybe 1-2 years with no updates before "this app was built for an older version of Android", then "this app is not compatible with your device".
It's kind of funny how eagerly we programmers criticize "premature optimization", when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people's machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).
Windows 11 is the slowest Windows I've ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it's literally unusable.
Even Windows 10 is literally unusable for me. When pressing the windows key it literally takes about 4 seconds until the search pops up, just for it to be literally garbage.
PCs aren't faster, they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also the form factor keeps getting smaller, more people use laptops now and you can't cheat thermal efficiency.
My first PC ran at 16MHz on turbo.
PCs today are orders of magnitude faster. Way less fun, but faster.
What's even more orders of magnitude slower and infinitely more bloated is software. Which is the point of the post.
It's almost impossible to find any piece of actually optimised software these days (with some exceptions like sqlite), to the point that 99% of the software currently in use can be considered unintentional (or intentional) malware.
Particularly egregious are web browsers, which seem designed to waste the maximum possible amount of resources and run as inefficiently as possible.
And the fact that most supposedly-desktop software these days runs on top of one of those pieces of intentional malware (it's impossible to achieve such levels of inefficiency and bloat unintentionally; it requires active effort) obviously doesn't help.
What do you mean PCs aren't faster? Yes, they have more cores, but they also clock higher (mostly) and execute more instructions per clock. Computers now perform way better than ever before in every single metric; most tasks, even linear ones, could be way faster.
I came from C and C++ and had learned that parallelism is hard. Then I tried parallelism on Rust in a project of mine and it was so insanely easy.
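For what it's worth, safe parallelism in Rust really can be a few lines even without external crates like rayon. A minimal sketch of the kind of thing the commenter means, using only `std::thread::scope` (the workload and function name are invented for illustration; the borrow checker guarantees the threads can't outlive the data they borrow):

```rust
use std::thread;

// Hypothetical workload: sum the squares of a slice, split across
// roughly four scoped threads. Scoped threads may borrow `data`
// directly because the compiler proves they join before it goes away.
fn parallel_sum_of_squares(data: &[u64]) -> u64 {
    thread::scope(|s| {
        let chunk_len = ((data.len() + 3) / 4).max(1); // ~4 chunks
        let handles: Vec<_> = data
            .chunks(chunk_len)
            .map(|chunk| s.spawn(move || chunk.iter().map(|x| x * x).sum::<u64>()))
            .collect();
        // Join every thread and combine the partial sums.
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1000).collect();
    println!("{}", parallel_sum_of_squares(&data)); // 333833500
}
```

No locks, no `Arc`, no manual lifetime bookkeeping: the read-only borrows are checked at compile time, which is exactly the class of bugs that makes this painful in C and C++.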
I hate that our expectations have been lowered.
2016: "oh, that app crashed?? Pick a different one!"
2026: "oh, that app crashed again? They all crash, just start it again and cross your toes."
You do really feel this when you're using old hardware.
I have an iPad that's maybe a decade old at this point. I'm using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don't know if it's the browser or the pages or both, but most web sites are unbearably slow, and some simply don't work, javascript hangs and some elements simply never load. The device is too old to get OS updates, which means I can't update some of the apps. But, that's a good thing because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.
It's the pages. It's all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn't exactly a computationally modest language.
Of the 200 kB loaded on a typical Wikipedia page, about 85 kB is JS and CSS.
Another 45 kB is a single SVG, which in complex cases is a computationally nontrivial image format.
They often are worse, because everything needed to be an electron app, so they could hire the cheaper web developers for it, and also can boast about "instant cross platform support" even if they don't release Linux versions.
Qt and GTK could do cross platform support, but not data collection, for big data purposes.
The program expands so as to fill the resources available for its execution
-- C.N. Parkinson (if he were alive today)
"Let them eat ram"
I paid for the whole amount of RAM, I'm gonna use the whole amount of RAM.
/s
Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I'm a liiiittle better off now.
More like:
You paid for more RAM, so I'll use the whole amount of RAM.
-- All software developers
Had to install (an old mind you, 2019) visual studio on windows...
...
...
First it's like 30GB, what the hell?? It's an advanced text editor with a compiler and some ..
Crashed a little less than what I remember 🥴😁
Unreal Engine is one of biggest offenders in gaming.
Yeah, my Xperia 1 (Android 3.1) still runs fluidly with its 100 MB of RAM and storage. While my Leaf2 e-reader lags on an up-to-date LineageOS, despite having 10x more CPU and 20x more RAM.
For my home PC, sure. Running some windows apps on my Linux machine in wine is a little weird and sluggish. Discord is very oddly sluggish for known reasons. Proton is fine tho.
But for my work? Nah. My M3 MacBook Pro is a beast compared to even the last Intel MacBook. Battery is way better unless you’re like me and constantly running a front end UI for a single local service. But without that, it can last hours. My old one could only last 2 meetings before it started dying.
Apple put inadequate coolers in the later Intel Macbooks to make Apple Silicon feel faster by contrast. When I wake mine, loading the clock takes 1.5 seconds, and it flips back and forth between recognizing and not recognizing key presses in the password field for 12 seconds. Meanwhile, the Thinkpad T400 (running Arch, btw) that I had back in 2010 could boot in 8.5 seconds, and not have a blinking cursor that would ignore key presses.
Apple has done pretty well, but they aren't immune from the performance massacre happening across the industry.
The battery life is really good, though. I get 10-14 hours without trying to save battery life, which is easily enough to not worry about whether I have a way to charge for a day.