this post was submitted on 24 Jun 2024
108 points (86.0% liked)
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
Great wizard of the bitrates, grant me your wisdom...
I can't wrap my head around bitrate. If I have a full HD monitor and the media is in full HD, then how is it that the rate of bits can make so much of a difference?
If each frame in the media contains exactly 1920 × 1080 pixels beamed into their respective positions on the display, then how can there be a difference? Does it have something to do with compression?
Exactly, this is about compression. Just imagine a full HD image, 1920×1080, with 8 bits of color for each of the 3 RGB channels. That comes to 1920 × 1080 × 8 × 3 = 49,766,400 bits, or roughly 50 Mb (about 6 MB). That is uncompressed, for a single frame. Now imagine a video at 24 frames per second (typical for movies): that's almost 1200 Mb/second. For a 1h30 movie, that would be an immense amount of storage, just compute it :)
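To actually compute it, here's a quick back-of-the-envelope sketch in Python (nothing codec-specific, just the raw numbers from above):

```python
# Raw (uncompressed) size of a 1h30 full HD movie.
width, height = 1920, 1080
bits_per_pixel = 8 * 3        # 8 bits per RGB channel
fps = 24
duration_s = 90 * 60          # 1h30 in seconds

bits_per_frame = width * height * bits_per_pixel    # ~49.8 Mb per frame
bits_per_second = bits_per_frame * fps              # ~1194 Mb/s
total_bytes = bits_per_second * duration_s / 8

print(f"{bits_per_second / 1e6:.0f} Mb/s raw")       # 1194 Mb/s
print(f"{total_bytes / 1e9:.0f} GB for the movie")   # ~806 GB
```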
To solve this, movies are compressed (encoded). There are two types of compression: lossless (where the original information is preserved exactly, with no quality loss) and lossy (where quality is degraded). Lossy compression is the norm for video because it yields by far the biggest storage savings. For a given compression algorithm, the less bandwidth you allow it, the more video quality it has to sacrifice to meet your requirements. That is what bitrate refers to.
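For a sense of scale: a decent 1080p encode might target around 8 Mb/s (a made-up but typical figure), so the encoder has to shrink the raw stream by a factor of roughly 150.

```python
raw_mbps = 1920 * 1080 * 24 * 24 / 1e6   # 24 bits/pixel x 24 fps ~= 1194 Mb/s
target_mbps = 8                          # hypothetical target bitrate
print(f"compression ratio ~ {raw_mbps / target_mbps:.0f}:1")   # ~149:1
```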
Of note: different compression algorithms are more or less efficient at packing the same picture quality into the same file size. AV1, for instance, allows for significantly higher video quality than H.264 at the same file size (or bitrate).
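A rough way to see this for yourself, sketched with ffmpeg driven from Python (this assumes an ffmpeg build that includes the libx264 and libsvtav1 encoders, and a local file named input.mp4, both placeholders):

```python
import subprocess

# Encode the same source at the same target bitrate with two codecs,
# then eyeball the results side by side.
for codec, out in (("libx264", "out_h264.mp4"), ("libsvtav1", "out_av1.mp4")):
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, "-b:v", "2M", out],
        check=True,
    )
```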
This image has the same number of pixels in the top and bottom halves, but you can probably see the bottom half looks worse. That's what a lower bitrate does. It's like turning up the compression on a JPEG (quick demo below): you are not getting the exact same pixels, just the same image dimensions.
https://i.imgur.com/CFriCXf.png
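Here's a minimal sketch of that JPEG analogy in Python, assuming Pillow is installed; it saves the same synthetic image at two quality settings and prints the resulting file sizes:

```python
import os
from PIL import Image

# A smooth gradient shows banding clearly when compressed hard.
img = Image.linear_gradient("L").resize((1920, 1080)).convert("RGB")

for quality in (95, 10):
    path = f"demo_q{quality}.jpg"
    img.save(path, "JPEG", quality=quality)
    print(f"quality={quality}: {os.path.getsize(path):,} bytes")
```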
Simple explanation: the higher the bitrate, the more data is dedicated to each frame, so the higher the quality of each frame at the same resolution. That means fewer artifacts, less blocking, less color banding, etc.
Lower bitrate is the opposite: the video is more heavily compressed, and in the process the encoder throws out as much information as it can while trying to maintain acceptable quality. The lower the bitrate, the more information gets thrown out for the sake of a smaller file size.
Resolution is the biggest factor affecting picture quality at a given bitrate. A 1080p video has a quarter of the pixels of a 2160p video, so it takes much less data to maintain a high-quality picture.
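Since file size is just bitrate × duration, you can estimate any download's size without knowing the resolution at all; a quick sketch with made-up example bitrates:

```python
def stream_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    """Video stream size in GB: bits = bitrate x time, 8 bits per byte."""
    return bitrate_mbps * 1e6 * duration_min * 60 / 8 / 1e9

# A hypothetical 90-minute movie at a few common bitrates:
for mbps in (2, 8, 25):
    print(f"{mbps:>2} Mb/s -> {stream_size_gb(mbps, 90):.1f} GB")
```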
Yes, every video you download or stream is compressed quite a lot; the bitrate just determines how much. A higher bitrate means the file is bigger and less compression is applied, while a low bitrate means the video has far fewer bits to store the same content and so has to compress harder.