this post was submitted on 02 Nov 2023
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
Well, to be fair, compression/decompression is EXTREMELY CPU intensive, even on the newest hardware. It has always been that way and probably always will be. The more you compress, the longer and more CPU intensive it is to decompress, whether your PC is 10 years old or a rack server with the newest hardware.
I believe this is becoming less and less true with modern algorithms. Take ZSTD, for example: while compression speed differs by several orders of magnitude between the fastest and slowest modes, the decompression difference is only about 20%. The same holds for FLAC, where decompression speed is pretty uniform across all compression levels.
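If you want to check this on your own machine, here is a minimal benchmark sketch using the zstandard Python bindings (my choice of tool, not something mentioned in the thread): it compresses the same data at a few levels and times both directions. Typically the compression time climbs steeply with the level while decompression stays roughly flat.

```python
# Minimal benchmark sketch (my own illustration, not from the thread):
# compress the same data at a few zstd levels and time both directions.
# Assumes the `zstandard` bindings are installed (pip install zstandard).
import os
import time

import zstandard as zstd

# ~16 MiB of somewhat compressible sample data (a repeated random block);
# real-world numbers depend heavily on the input and the hardware.
block = os.urandom(64 * 1024)
data = block * 256

for level in (1, 9, 19):
    compressor = zstd.ZstdCompressor(level=level)
    t0 = time.perf_counter()
    blob = compressor.compress(data)
    t_compress = time.perf_counter() - t0

    decompressor = zstd.ZstdDecompressor()
    t0 = time.perf_counter()
    decompressor.decompress(blob)
    t_decompress = time.perf_counter() - t0

    print(f"level {level:>2}: compress {t_compress:.3f}s, "
          f"decompress {t_decompress:.3f}s, "
          f"ratio {len(data) / len(blob):.1f}x")
```

On most hardware the decompression column barely moves between levels, which is the point being made above.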
These algorithms probably aren't used by repackers like fitgirl (so your answer is generally correct in the context of repacks), but I still think it's interesting to see these new developments in compression techniques.
Definitely! I can't even imagine what compression algorithms we'll have in a couple of years. They'll probably be much better and less CPU intensive, while also bringing benefits we haven't even thought of. But as always, that's for the future ;D