this post was submitted on 12 Jun 2023
274 points (100.0% liked)
you are viewing a single comment's thread
How in the world does setting a bunch of subs to private crash the website?
High-scale software is complex; sometimes there are edge cases where weird, unexpected stuff happens. This isn't a situation they would normally run into.
It absolutely is something they would normally run into. I work on maintaining a massive application; think 60+ teams of 6, each extremely specialized, with minimal overlap. Almost 75% of my job is predicting issues and avoiding them. Peer testing draws on this a ton as well. They just continue to plainly show that they don't care. Time and time again, year after year, they continue to have the exact same issues and do fuck all about it.
Why would they normally run into 6000+ subs going private? I'm sure they tested that their code can generally handle some (usually smaller) subs going private, but the number and size of the subs going dark isn't a normal scenario, and I doubt anyone would have assumed such a successful and coordinated protest involving some of the biggest subs would even be possible a few months ago.
Someone on Tildes posted that they used to work for Reddit, and that the front-page code that pulls from your subscribed subreddits, plus the ones it thinks you might like to read, is spaghetti code and very brittle.
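Just to illustrate what "brittle" can look like in practice (this is a made-up sketch with a placeholder endpoint and JSON shape, not Reddit's actual code): a feed builder that assumes every subscribed subreddit is readable falls over the moment a big chunk of them start returning 403s, while a defensive version just skips them.

```python
# Purely hypothetical sketch of a brittle vs. defensive front-page builder.
# The endpoint, JSON shape, and function names are invented for illustration.
import requests

FEED_URL = "https://example.invalid/r/{sub}/hot.json"  # placeholder endpoint

def build_front_page_brittle(subscriptions: list[str]) -> list[dict]:
    """One private/unreadable sub aborts the whole front page."""
    posts: list[dict] = []
    for sub in subscriptions:
        r = requests.get(FEED_URL.format(sub=sub), timeout=5)
        r.raise_for_status()  # a 403 from a newly private sub raises here
        posts.extend(r.json()["posts"])
    return sorted(posts, key=lambda p: p["score"], reverse=True)[:50]

def build_front_page_defensive(subscriptions: list[str]) -> list[dict]:
    """Unreadable subs are skipped instead of taking the page down."""
    posts: list[dict] = []
    for sub in subscriptions:
        try:
            r = requests.get(FEED_URL.format(sub=sub), timeout=5)
            r.raise_for_status()
            posts.extend(r.json()["posts"])
        except requests.RequestException:
            continue  # degrade gracefully: just leave this sub out of the feed
    return sorted(posts, key=lambda p: p["score"], reverse=True)[:50]
```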
Reddit has had extremely spotty reliability forever. It got better in recent years, but the site still went down every few weeks, or would just randomly say "you broke reddit!". Circa 2015, every evening it would randomly return 5xx errors a good chunk of the time because it was always overloaded.
Backend reliability must not be very high on their priority list. Well, neither is UX (old OR new reddit), and let's not pretend they've been masterminds when it comes to ad placement either, so the real question is: what do the higher-ups want, and why can't they achieve it?
They want money, and they've tried nothing and are all out of ideas.
Honestly even this year it hasn't been super reliable, even before any of this. Stability has never really been a top priority for them.
It's the users'/moderators' fault. If they would just keep working without pay, there wouldn't be any problem. /s
Honestly, I figured it'd be the result of all those people running deletion scripts on their accounts.
This is probably it. Also, ArchiveTeam is archiving Reddit as a high-priority target, so lots of bots are scraping it.
God bless archive.org. We'd be so screwed without their efforts.
Maybe their caching doesn't work for private subreddits
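If that's the case, one plausible (purely speculative) failure mode: pages from public subreddits are the same for everyone, so they can be served from a shared cache, but a private sub's response depends on who is asking, so every request falls through to the backend. A toy sketch of that idea, with invented names:

```python
# Toy sketch of the "private pages bypass the cache" idea -- invented
# names, not Reddit's actual caching layer.
import time

_cache: dict[str, tuple[float, str]] = {}
CACHE_TTL_SECONDS = 60.0

def render_listing(sub: str, user: str) -> str:
    """Stand-in for the expensive DB/template work behind a listing page."""
    time.sleep(0.05)
    return f"listing for r/{sub} as seen by {user}"

def get_listing(sub: str, user: str, is_private: bool) -> str:
    # Public pages are identical for everyone, so they can be rendered once
    # and served cheaply to all users from the cache.
    if not is_private:
        hit = _cache.get(sub)
        if hit is not None and time.time() - hit[0] < CACHE_TTL_SECONDS:
            return hit[1]
    # Private pages vary per user (403 vs. member view), so every request does
    # the expensive work -- multiply that by thousands of subs going private
    # at once and backend load jumps dramatically.
    body = render_listing(sub, user)
    if not is_private:
        _cache[sub] = (time.time(), body)
    return body
```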
It's not crashing. It's just that 70-80% of requests are getting errors because most of the subreddits went private. It sure does look like a crash to a completely uninformed user, but it's not a crash in the usual sense.
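Right, a wall of 403s from subreddits that went dark is a different situation from the servers actually falling over. A rough, hypothetical way to tell the two apart from the outside (made-up URL list, standard HTTP status-code semantics):

```python
# Hypothetical external check: count how responses fail, since a burst of
# 403s (subs gone private) is not the same as a burst of 5xxs (site down).
from collections import Counter
import requests

def classify_responses(urls: list[str]) -> Counter:
    counts: Counter = Counter()
    for url in urls:
        try:
            status = requests.get(url, timeout=5).status_code
        except requests.RequestException:
            counts["unreachable"] += 1        # site really is down/overloaded
            continue
        if status >= 500:
            counts["server_error"] += 1       # the site itself is struggling
        elif status in (403, 404):
            counts["private_or_gone"] += 1    # content went dark, site is up
        else:
            counts["ok"] += 1
    return counts

# e.g. classify_responses(["https://example.invalid/r/pics", ...])
```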
They haven't worked in a world where the site isn't constantly slammed in years. They built the thing to be pre-loaded, I bet.