this post was submitted on 06 Aug 2023
1392 points (98.8% liked)
Announcements
This is interesting; I've never considered torrents for this exact use case before. Has anyone done any groundwork to figure out what this would look like at a systems level? I'm having a hard time wrapping my head around the big picture: where the seeders come from, what the incentives are to keep certain kinds of data resilient, how to keep complexity away from the clients, etc.
This issue isn't specific to Lemmy but to the entire web, which has a serious problem with duplicated static data. WebTorrent started to tackle this, but didn't get far, IMO because the vast majority of torrent seeders use native clients.
Essentially, people need to normalize seeding and posting torrent magnet links, and build platforms around them. Web and app UIs also need a seamless way to view that data.
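For example, something along these lines is already possible in the browser today. This is just a rough sketch using the WebTorrent library; the magnet URI and element ID are placeholders, and exact method names differ a bit between WebTorrent versions:

```typescript
// Sketch: render a magnet link directly in the page with WebTorrent,
// so viewers also seed the file to other browser peers over WebRTC.
import WebTorrent from 'webtorrent';

const client = new WebTorrent();

// A magnet link a post might carry instead of a hosted media URL (placeholder).
const magnetURI = 'magnet:?xt=urn:btih:PLACEHOLDER';

client.add(magnetURI, (torrent) => {
  // Pick the first video file in the torrent and stream it into the page.
  const file = torrent.files.find((f) => f.name.endsWith('.mp4'));
  if (file) {
    // appendTo creates a media element under the given selector and streams
    // the file into it while the download is still in progress.
    file.appendTo('#media-container');
  }
});
```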
If I had a lot more time I'd wanna work on it, but my backlog is years long.
There may be an opportunity here for Lemmy to help solve part of the distributed-blob problem, namely: what incentive do people have to contribute bandwidth and storage? Instead of the dodgy crypto-reward schemes that keep coming up, it could simply be an extension of the motivations that already drive people to run Lemmy instances or put in hours moderating communities.
Some brain-droppings:
The thing that came to my mind is to keep all the torrent links in some kind of blockchain, where new links can be added at any time. Anyone who wants to help the community can download the links from the blockchain and seed the torrents; these could be images, videos, audio, or anything else. The problem is that it can get really big really fast. That could be mitigated by adding size limits for specific file types and by removing torrents from the blockchain that haven't been used in a long time. Effectively, we'd track a visit counter over the last six months and a last-viewed date for each torrent, so old, infrequently accessed data gets deleted and the index stays a manageable size.
This general approach would prevent data loss and also help decentralize data. Of course this is just a concept that came to mind, but I think it should be possible to implement.
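For illustration, the bookkeeping described above could look roughly like this. Names, types, and limits are made up for the sketch, and a plain replicated table would do the same job as a blockchain:

```typescript
// Rough sketch of the shared torrent index with size limits and
// usage-based pruning (all names here are hypothetical).

interface TorrentEntry {
  magnet: string;                                   // magnet link shared across instances
  fileType: 'image' | 'video' | 'audio' | 'other';
  sizeBytes: number;
  visitsLast6Months: number;
  lastViewedAt: Date;
}

// Hypothetical per-type size limits to keep the index from growing unbounded.
const SIZE_LIMITS: Record<TorrentEntry['fileType'], number> = {
  image: 20 * 1024 * 1024,
  video: 500 * 1024 * 1024,
  audio: 50 * 1024 * 1024,
  other: 20 * 1024 * 1024,
};

const SIX_MONTHS_MS = 1000 * 60 * 60 * 24 * 182;

export function canAdd(entry: TorrentEntry): boolean {
  // Reject blobs over the size limit for their file type.
  return entry.sizeBytes <= SIZE_LIMITS[entry.fileType];
}

export function recordVisit(entry: TorrentEntry, now = new Date()): void {
  // Each view bumps the counter and refreshes the last-viewed date.
  entry.visitsLast6Months += 1;
  entry.lastViewedAt = now;
}

export function prune(index: TorrentEntry[], now = new Date()): TorrentEntry[] {
  // Drop torrents nobody has viewed in the last six months, so the index
  // (and what seeders are asked to hold) stays bounded.
  return index.filter(
    (e) => now.getTime() - e.lastViewedAt.getTime() < SIX_MONTHS_MS
  );
}
```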
Ah, I posted above before I read your reply, which basically says the same thing. I think this is a really cool idea (though it probably doesn't need a blockchain to work).
How about making hosting a chunk of the distributed cache a requirement for participating in the network?
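One way to sketch that (hypothetical names, not an existing Lemmy API): deterministically map each blob to a few participants with rendezvous hashing, so any client can work out who should be caching a given chunk without a central index:

```typescript
// Sketch: assign each blob to k participants via rendezvous hashing.

function fnv1a(str: string): number {
  // Small non-cryptographic hash; good enough for sketching assignment.
  let hash = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    hash ^= str.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

export function assignHolders(
  blobId: string,         // e.g. the torrent infohash
  participants: string[], // user or instance IDs that opted into caching
  replicas = 3
): string[] {
  // Score every participant against this blob and keep the top `replicas`.
  // Everyone computes the same answer from the same inputs.
  return participants
    .map((p) => ({ p, score: fnv1a(`${blobId}:${p}`) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, replicas)
    .map((x) => x.p);
}

// Usage: assignHolders(infohash, listOfParticipants, 3) returns the three
// participants expected to hold that chunk.
```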