this post was submitted on 23 Jun 2025
Technology
It's not evolving backwards. It's being carefully crafted into exactly what corporations wanted from the beginning but couldn't build because of technical and legal limitations.
Add societal limitations as well. In the early days, we relegated software to the dustbin when it sucked. Nowadays, people seem mostly fine with being practically forced to use ever-shittier products and services.
It's also devolving: fewer features, slower, less optimised, and so on. Cramming "AI" into it isn't devolving, though, it's enshittification.
You are mistaking the direction of evolution. Software started out with as much freedom as the hardware could afford.
In the 80s you ran your program in real mode (or whatever the equivalent mode was on your hardware). No kernel, no OS, nothing in the way. The software ran on bare metal with the ability to do literally anything the computer could.
In the 90s and early 2000s, safety features were introduced, but customizability was still king. Remember how you could accidentally remove some toolbar from Eclipse and never find a way to get it back? That kind of UI was considered normal back then.
You had stuff like the BlackBox system that let users customize the UI like a developer would. They could not only move buttons and other UI elements wherever they wanted, but also create their own and use scripting to make them do whatever they wanted.
Then came the iPhone and Windows 8, and from then on the target became simplification. The downside of the customizability of yesteryear was that things could get complicated, and most users didn't use or even want these systems. Going back to the Eclipse example: accidentally closing part of the UI and never finding a way to get it back was incredibly common. So that's when the minimalism and "less is more" mentality came in. Everything that wasn't used all the time got moved into submenus, and to a certain extent it kinda worked.
But of course, MBAs being MBAs, stuff like adding AI buttons to push people toward the next big monetizable thing became more and more prevalent.
That sounds like devolving to me.