this post was submitted on 13 Dec 2023
591 points (98.8% liked)

Technology

top 49 comments
[–] NeoNachtwaechter@lemmy.world 124 points 11 months ago (2 children)

What the heck...? My CPU is none of their business.

[–] kbotc@lemmy.world 77 points 11 months ago (2 children)

Google chooses codecs based on what it guesses your hardware will decode (iPhones get HEVC, Android gets VP9, etc.). They just didn’t put much thought into ARM-based home devices outside of a specific few like the Shield.
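
For illustration, that kind of UA-based guessing might look something like this hypothetical sketch (the UA patterns and codec picks are assumptions for the example, not YouTube's actual logic):

```typescript
// Hypothetical sketch of user-agent-based codec selection (illustrative only).
function pickCodecFromUserAgent(userAgent: string): string {
  if (/iPhone|iPad|Macintosh/.test(userAgent)) {
    return 'hvc1'; // guess: Apple hardware usually has an HEVC decoder
  }
  if (/Android/.test(userAgent)) {
    return 'vp09'; // guess: most Android SoCs decode VP9 in hardware
  }
  if (/aarch64|arm/.test(userAgent)) {
    return 'avc1'; // guess: unknown ARM device, fall back to H.264 or cap the resolution
  }
  return 'vp09'; // default for desktop x86
}
```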

[–] brochard@lemmy.world 14 points 11 months ago (1 children)

Why wouldn't it be my browser asking for the codecs it prefers, instead of the website trying to guess my computer's hardware?

[–] custard_swollower@lemmy.world 6 points 11 months ago (1 children)

Lots of hardware lies about its useful capabilities.

Can you run 4k? Of course. But can you run more than 4 frames a second?

[–] brochard@lemmy.world 4 points 11 months ago

The browser can lie all it wants; at the end of the day the user has the final word if they want to change things.

[–] barsoap@lemm.ee 10 points 11 months ago

My by-now rather ancient RK3399 board can hardware-decode both at 4K 60 Hz. That has nothing to do with the fact that it's aarch64, but with Rockchip including a beast of a VPU (the chip was originally designed for set-top boxes).

How about, dunno, asking the browser what kind of media it would prefer?
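
The browser can in fact be asked directly; a minimal sketch of feature detection with `MediaSource.isTypeSupported()` (the codec strings are just examples):

```typescript
// Ask the browser what it can play instead of sniffing the user agent.
const candidates = [
  'video/mp4; codecs="av01.0.08M.08"',    // AV1
  'video/webm; codecs="vp09.00.40.08"',   // VP9
  'video/mp4; codecs="hvc1.1.6.L120.90"', // HEVC
  'video/mp4; codecs="avc1.640028"',      // H.264 fallback
];

const playable = candidates.filter((type) => MediaSource.isTypeSupported(type));
console.log('Browser says it can play:', playable);
```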

[–] w2tpmf@lemmy.world 34 points 11 months ago* (last edited 11 months ago) (1 children)

If you use any Google service, everything of yours is their business. You are their product, voluntarily.

[–] originalucifer@moist.catsweat.com 102 points 11 months ago* (last edited 11 months ago) (2 children)

this prolly wasn't a bad decision early on... why push something to a population that can't utilize it... but shit changes fast, google.

[–] ozymandias117@lemmy.world 50 points 11 months ago (1 children)

It seems somewhat damning that Google’s own browser had a workaround for this, though

[–] originalucifer@moist.catsweat.com 14 points 11 months ago (1 children)

was it ignorance or malicious intent?

if it was a person, i would try to assume ignorance... i'm not sure google the company deserves such respect

[–] villainy@lemmy.world 28 points 11 months ago (1 children)

Or it's a company so fuckoff huge that one department (Chrome on Android) couldn't get a bug report escalated in another department (YouTube). Eventually they just put in a UA workaround while the bug rots in a backlog somewhere. Common enterprise bullshit.

Or the Chrome on Android team didn't even bother reporting the issue to YouTube and just threw in a cheap workaround. Also common enterprise bullshit.

[–] lolcatnip@reddthat.com 8 points 11 months ago* (last edited 11 months ago)

Bingo. When I was a Chrome developer working on video stuff, we mostly treated YouTube like a separate company. Getting our stuff to work with theirs was a priority, but no more than, say, Netflix. We pretty much treated them as a black box that consumed the same API we provided for everyone.

[–] Flaky@iusearchlinux.fyi 37 points 11 months ago (1 children)

The weirder thing is Firefox on ARM being detected as a HiSense TV. I did a cursory search to see if HiSense ever used Firefox OS on their TVs, and it doesn't seem like it. Panasonic seems to have been the only manufacturer using it.

[–] ericswpark@lemmy.ml 10 points 11 months ago* (last edited 11 months ago)

Could be that the developers for the HiSense TV just copy-pasted whatever UA into their browser codebase and called it a day.
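
A hypothetical illustration of how a copy-pasted UA plus lazy substring matching could misfire (both the UA string and the matching rule are made up for the example, not YouTube's actual detection code):

```typescript
// If a TV vendor shipped a browser whose UA overlaps with a desktop ARM
// Firefox UA, a substring/regex check like this can't tell them apart.
function looksLikeHisenseTv(userAgent: string): boolean {
  return /Linux.*aarch64.*Firefox/.test(userAgent);
}

console.log(looksLikeHisenseTv(
  'Mozilla/5.0 (X11; Linux aarch64; rv:120.0) Gecko/20100101 Firefox/120.0'
)); // true, even though this is a desktop ARM Linux machine
```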

[–] crit@links.hackliberty.org 39 points 11 months ago

YouTube has been having a lot of totally-not-anticompetitive "bugs" these past couple of weeks.

[–] MonkderZweite@feddit.ch 28 points 11 months ago* (last edited 11 months ago)

UA sniffing again? Whatever happened to feature detection and whatnot?

[–] catastrophicblues@lemmy.ca 18 points 11 months ago (2 children)

Does this include Apple Silicon Macs? That would be a bold move.

[–] labsin@sh.itjust.works 7 points 11 months ago

This issue was detected when running Firefox on Linux on Apple Silicon. Firefox on Mac just identifies as x64.

It's probably not on purpose on YouTube's part. It's stupid that they base restrictions on heuristics like this to begin with, but maybe it's because otherwise people would think YouTube isn't loading properly, when it's really software decoding on an underpowered ARM PC that can't handle the resolution.
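
For what it's worth, the web platform has an API aimed at exactly that concern; a rough sketch using `navigator.mediaCapabilities.decodingInfo()`, which reports whether playback is expected to be smooth, not just supported (the codec string, resolution and bitrate are assumed example values):

```typescript
// Checks whether 4K60 VP9 playback is expected to be smooth on this device,
// covering the "software decoding chokes at 4K" case without UA sniffing.
async function canPlay4kVp9Smoothly(): Promise<boolean> {
  const result = await navigator.mediaCapabilities.decodingInfo({
    type: 'media-source',
    video: {
      contentType: 'video/webm; codecs="vp09.00.50.08"',
      width: 3840,
      height: 2160,
      bitrate: 20_000_000, // ~20 Mbps, an assumed figure
      framerate: 60,
    },
  });
  return result.supported && result.smooth;
}
```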

[–] CriticalMiss@lemmy.world 3 points 11 months ago

Nope, my work Mac handles 1080p/4K playback no problem.

[–] Wes_Dev@lemmy.ml 14 points 11 months ago

Repeat after me, kids. It's not an "oversight", or "mistake", or "bug", or "misunderstanding"...

IF

IT

KEEPS

HAPPENING

[–] biscuitswalrus@aussie.zone 14 points 11 months ago* (last edited 11 months ago) (1 children)

Seems like my Samsung TV app is being hit by this stuff too: I had 5 unskippable ads and can't seem to get stable 1080p at 60 fps any more, despite gigabit fibre and Cat6. Meanwhile I'm getting 4K on the YouTube app on Android over WiFi.

Go figure.

YouTube is so desperate to fight this war that they're harming legitimate watchers, while my Rock Pi running Android TV seems to keep running sTube just fine.

[–] Avg@lemm.ee 5 points 11 months ago (1 children)

The NICs on TVs tend to be awful. I can barely break 100 Mbps on my LG, wired or wireless.

[–] c10l@lemmy.world 2 points 11 months ago (1 children)

100 Mbps should be enough for a few 4K streams, and I imagine you're not streaming more than one thing to your TV at any given time.

[–] Avg@lemm.ee 1 points 11 months ago (1 children)

4K yes, but 4K HDR is where it becomes limiting... from what I've read.

[–] c10l@lemmy.world 1 points 11 months ago

Perhaps, and I’ll readily admit my ignorance on this.

That said, I doubt HDR adds much overhead compared to the equivalent baseline SDR content.

If my intuition is right, depending on other factors like compression you could still fit at least 2 streams on that bandwidth.
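
As a rough sanity check: if a 4K HDR stream runs somewhere around 25 Mbps (an assumed ballpark; real bitrates vary with codec and compression), then 100 Mbps ÷ 25 Mbps ≈ 4 concurrent streams, so two is comfortably within that budget.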

[–] Malfeasant@lemm.ee 11 points 11 months ago

Enshittification intensifies!

[–] atocci@kbin.social 6 points 11 months ago (1 children)

Does this apply to Windows on ARM as well, or is it just Linux specifically for some reason?

[–] Kbobabob@lemmy.world 10 points 11 months ago (1 children)

It's down to the processor, not the OS.

[–] atocci@kbin.social 2 points 11 months ago

That's what I figured, but every article I've seen on this calls out Linux specifically. I'll have to give it a try on my Surface Pro X when I get home.

[–] pbsds@lemmy.ml -2 points 11 months ago

Sounds like a Raspberry Pi thing.