Most of the articles writing about it seem to reference the following reddit post: https://old.reddit.com/r/firefox/comments/17ywbjj/whenever_i_open_a_youtube_video_in_a_new_tab_its/k9w3ei4/
The following code is pointed out:
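(Reconstruction of the widely quoted minified snippet; the surrounding context is omitted and identifiers may differ slightly from the live file:)

```js
// The 5-second delay everyone is pointing at: 5E3 is minified notation for 5000 ms.
setTimeout(function() {
    c();
    a.resolve(1);
}, 5E3);
```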
While this is a 5s timeout, the code itself does not check the user agent, so wherever the code runs, the 5s timeout will occur. The code also does not seem to be injected server-side: I spoofed my user agent and, for good measure, installed a fresh Google Chrome, and both times the code was present. So this code cannot be used to make any one browser slower without making the other browsers slow too.
There is a response to the reddit post, which most articles seem to take their intel from. IMO this response does a good job of exploring what the code could be used for and points out that it is more than likely not for slowing down Firefox users: https://old.reddit.com/r/firefox/comments/17ywbjj/whenever_i_open_a_youtube_video_in_a_new_tab_its/ka08uqj/
I am amused by the thought that many journalists seem to have taken this story from a post on reddit without even reading the direct responses - or just copied it from another article.
The user agent is in the request header, so it's known before any response is sent from YouTube.
I don't know if that's what they're doing, since it's not possible to see what their server code is doing, which makes the server a far better place to hide sleazy code.
But the server outputs code for the browser to run. It doesn't matter what the server does as long as the browser gets the same output.
Server-side they could stall based on the user agent, but that's not the case here. Whatever is happening seems to be client-side.
The client code can be modified depending on the request headers before being returned by the server.
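As a purely hypothetical sketch of that idea (not anything observed in YouTube's actual responses, and every name below is made up for illustration):

```js
// Hypothetical Node.js handler: the server sees the User-Agent request header
// before it decides which client script to send back.
const http = require("http");

http.createServer((req, res) => {
    const ua = req.headers["user-agent"] || "";
    // Illustration only: stitch a different artificial delay into the script
    // depending on the reported browser.
    const script = ua.includes("Firefox")
        ? buildClientScript({ artificialDelayMs: 5000 })
        : buildClientScript({ artificialDelayMs: 0 });
    res.setHeader("Content-Type", "text/javascript");
    res.end(script);
}).listen(8080);

// Hypothetical helper that generates the served client code as a string.
function buildClientScript({ artificialDelayMs }) {
    return `setTimeout(function(){ startPlayback(); }, ${artificialDelayMs});`;
}
```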
They're looking at the code returned by the server with a Chrome user-agent, and it's the same.
They only said that snippet was present, not that all of the client code was the same
That's out of context. That snippet of code existing is not sufficient to understand when that part of the code actually gets executed, right?
For all we know, that might have been taken from a piece of logic like this that adds the delay only for specific cases:
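Something along these lines (a hypothetical shape only, just to illustrate the point; only the setTimeout echoes the quoted snippet):

```js
// Hypothetical - not YouTube's actual code. The delay only runs if some
// opaque check decides this user should get it.
if (complex_obfuscated_logic_to_discriminate_users()) {
    setTimeout(function() {
        c();
        a.resolve(1);
    }, 5E3);
} else {
    c();
    a.resolve(1);
}
```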
It's possible that `complex_obfuscated_logic_to_discriminate_users` has some logic that changes based on the user agent. And I expect it's likely more complex than just one if-else. I haven't had the time to check it myself, but there's probably a mess of extremely hard to read obfuscated code, the result of some compilation steps purposefully designed to make it very hard to properly understand when some paths are actually being executed, as a way to make tampering more difficult.
The code is not obfuscated. The person I linked to even formatted it nicely. I do not have the time or energy to go through all of YouTube's JS. But the 5s everyone is talking about does target every browser the same. Server-side, the code isn't altered based on browser detection.
It can be formatted "nicely" with no issue. But that doesn't necessarily make it easy to understand.
What that person posted was in a function named `smb()` that only gets called by `rmb()` under certain conditions, and `rmb()` gets called by `AdB()` under other conditions after being called from `eeB()` used in `BaP()`.... it's a long list of hard to read minified functions and variables in a mess of chained calls, declared in an order that doesn't necessarily match up with what you'd expect the flow to be.

In the same file you can also easily find references to the user agent being read at multiple points, sometimes storing it in variables with equally esoteric short names that might sneak past the reader if they aren't pedantic enough.
Like, for example, there's this function:
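(The exact minified body isn't reproduced in this thread; roughly, it's a tiny user-agent getter, something like this sketch:)

```js
// Hypothetical sketch of the kind of helper being described; the real
// minified body differs. Yba presumably holds a reference to navigator.
function vc() {
    return Yba.userAgent;
}
```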
Searching for `vc()` gives you 56 instances in that file, often compared to some strings to check what browser the user is using. And that's just one of the methods where the userAgent is obtained; there's also a `yc=Yba?Yba.userAgentData||null:null;` later on too... and several direct uses of both `userAgent` and `userAgentData`.

And I'm not saying that the particular instance that was pointed out was the cause of the problem.. it's entirely possible that the issue is somewhere else... but my point is that you cannot point to a snippet of "nicely formatted" messed-up transpiler output without fully understanding when it gets called and expect to draw accurate conclusions from it.
Alternatively, it's funny that people write comments arguing that it wasn't targeted at Firefox users, on a post that already says that it wasn't targeted at Firefox users :P
It doesn't really matter whether it was "targeted" at Firefox specifically or not; what matters is whether the website has logic that discriminates against Firefox users. Those are 2 different things. "Ends" vs "means".
I wouldn't be surprised if the logic was written by some AI, without specifically targeting any browser, and from the training data the AI concluded that there's a high enough chance of adblocking to deserve handicapping the UX when the browser happens to be Firefox. Given that all it's doing is slowing the website down (instead of outright blocking it), it might be that this is just a lower level of protection they added for cases where there are some indicators, even if there isn't 100% confidence that an adblocker is used.
I said that I found different articles blindly copying. But I did not say 404 did so ;)
Alright, alright, you win :)