this post was submitted on 13 Dec 2023
591 points (98.8% liked)
[–] kbotc@lemmy.world 77 points 11 months ago (2 children)

Google chooses codecs based on what it guesses your hardware can decode (iPhones get HEVC, Android gets VP9, etc.). They just didn't put much thought into ARM-based home devices outside of a specific few like the Shield.
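A minimal sketch of what that guess-by-device approach might look like; the mapping and `pickCodec` function here are hypothetical illustrations, not Google's actual logic:

```typescript
// Hypothetical server-side codec selection by device guess.
// The table below is an illustrative assumption, not YouTube's real mapping.
type Codec = "hevc" | "vp9" | "av1" | "h264";

const deviceCodecGuess: Record<string, Codec> = {
  iphone: "hevc", // Apple ships hardware HEVC decoders
  android: "vp9", // most recent Android SoCs decode VP9
  shield: "vp9",  // one of the few ARM set-top boxes on the allowlist
};

function pickCodec(userAgent: string): Codec {
  const ua = userAgent.toLowerCase();
  for (const [device, codec] of Object.entries(deviceCodecGuess)) {
    if (ua.includes(device)) return codec;
  }
  // Unknown hardware falls back to the lowest common denominator,
  // which is how less common ARM boards end up with a poor pick.
  return "h264";
}
```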

[–] brochard@lemmy.world 14 points 11 months ago (1 children)

Why wouldn't it be my browser asking for the codecs it prefers, instead of the website trying to guess my computer's hardware?
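Browsers do expose exactly that. A minimal browser-side sketch using the standard `HTMLMediaElement.canPlayType` and `MediaSource.isTypeSupported` APIs; the codec strings are example values:

```typescript
// Ask the browser itself which codecs it can handle.
// canPlayType() answers "", "maybe", or "probably"; isTypeSupported() is boolean.
const video = document.createElement("video");

console.log(video.canPlayType('video/mp4; codecs="hvc1.1.6.L93.B0"')); // HEVC
console.log(video.canPlayType('video/webm; codecs="vp9"'));            // VP9

// For Media Source Extensions playback, the path streaming sites use:
console.log(MediaSource.isTypeSupported('video/webm; codecs="vp09.00.51.08"'));
```

The catch is that these calls only say whether a decoder exists, not whether it can keep up, which is the objection raised in the next comment.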

[–] custard_swollower@lemmy.world 6 points 11 months ago (1 children)

Lots of hardware lies about its useful capabilities.

Can you run 4K? Of course. But can you run more than 4 frames a second?
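The newer Media Capabilities API targets exactly this gap: it reports not just whether a codec is supported but whether playback should be smooth and power-efficient. A minimal sketch; the 4K VP9 parameters are example values:

```typescript
// Ask not just "can you decode this?" but "can you decode it smoothly?"
async function check4kVp9(): Promise<void> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/webm; codecs="vp09.00.51.08"',
      width: 3840,
      height: 2160,
      bitrate: 20_000_000, // ~20 Mbit/s, a plausible 4K bitrate
      framerate: 60,
    },
  });
  // supported: a decoder exists; smooth: it should keep up with 60 fps;
  // powerEfficient: decoding is hardware-accelerated rather than on the CPU.
  console.log(info.supported, info.smooth, info.powerEfficient);
}
```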

[–] brochard@lemmy.world 4 points 11 months ago

The browser can lie all it wants; at the end of the day the user has the final word if they want to change things.

[–] barsoap@lemm.ee 10 points 11 months ago

My by-now rather ancient RK3399 board can hardware-decode both at 4K 60 Hz. That has nothing to do with it being aarch64; it's because Rockchip included a beast of a VPU (the chip was originally designed for set-top boxes).

How about, dunno, asking the browser what kind of media it would prefer?