this post was submitted on 01 Sep 2024
70 points (96.1% liked)
Programming
The answer to questions like this is often that there was no need for such safety features when the underlying technology was introduced (more examples here) and adding it later required consensus from many people and organizations who wouldn't accept something that broke their already-running systems. It's easy to criticize something when you don't understand the needs and constraints that led to it.
(The good news is that gradual changes, over the course of years, can still improve things without being so disruptive that they don't survive.)
He's not wrong in principle, though: building safe websites is far more complicated than it should be, and relies far too much on the site behaving in the user's best interests. Especially when client-side scripts are used.
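To make that concrete, here's a rough sketch in Go (standard library only) of the kind of response headers a site has to set on its own; none of these protections apply unless the server explicitly opts in. The header values are illustrative, not a complete or recommended policy.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		h := w.Header()
		// None of these protections are on by default; the site has to ask for them.
		h.Set("Content-Security-Policy", "default-src 'self'") // limit where scripts and other resources may load from
		h.Set("Strict-Transport-Security", "max-age=31536000") // tell the browser to insist on HTTPS from now on
		h.Set("X-Content-Type-Options", "nosniff")             // stop the browser from second-guessing content types
		h.Set("Referrer-Policy", "no-referrer")                // don't leak the current URL to other sites
		fmt.Fprintln(w, "hello")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Leave any of those headers out and the browser quietly falls back to the older, less safe behavior.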
And that assumption is exactly what led us to the current situation.
It doesn't matter why the present is garbage; it's garbage, and we should address that. Statements like this are the engineering equivalent of "it is what it is shrug emoji".
Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web. And it's not only the browser, but also the backend stack.
Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.
The problem is fixing it without inadvertently breaking it for someone else. Changing the default behavior isn't easy.
There are probably critical systems that rely on old, outdated practices because that's the way things worked when they were written 20 years ago. Why should anyone go back and fix that code when it has worked perfectly fine for the past two decades?
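One concrete case of how painful changing a default is: when browsers tightened the default cookie policy to treat cookies as SameSite=Lax, sites that depended on the old cross-site behavior broke until they explicitly opted back in. A rough Go sketch of what that opt-in looks like on the server side; the cookie name and value are made up for illustration:

```go
package main

import "net/http"

// setLegacyCrossSiteCookie shows the explicit opt-in a site needs now that
// browsers no longer send cookies cross-site by default. Before the default
// changed, simply omitting SameSite was enough for cross-site requests to
// carry the cookie.
func setLegacyCrossSiteCookie(w http.ResponseWriter) {
	http.SetCookie(w, &http.Cookie{
		Name:     "session",              // hypothetical cookie name
		Value:    "opaque-token",         // hypothetical value
		Secure:   true,                   // SameSite=None is only honored over HTTPS
		SameSite: http.SameSiteNoneMode,  // explicitly allow cross-site sends again
	})
}
```

That rollout took years of warnings and staged browser releases precisely because so many working systems relied on the old default.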
If you think anything in software has worked "perfectly fine for the past two decades", you're probably not looking closely enough.
I exaggerate, but honestly, not much.
Billions of programs worked perfectly fine today.
Cynicism is easy, but not helpful.
Yes, popular programs behave correctly most of the time.
But "perfectly fine for the last two decades" would imply a far lower rate of CVEs and general reliability than we actually have in modern software.