This post was submitted to the Technology community on Beehaw on 11 Jun 2023.

ArchiveTeam runs the ArchiveTeam Warrior project: a small virtual machine or Docker container that helps index and archive various sites as the need arises. A big focus right now is Reddit, before API access gets pulled or too many subreddits go dark permanently. The tool spreads the work across spare bandwidth from everyone running it, so no single IP hits the rate limits or blocks that come with trying to crawl as much of a site as possible.
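
For anyone who wants to try it, the Docker flavour is basically a one-liner. This is a rough sketch based on the ArchiveTeam wiki instructions as I remember them (image name and port are worth double-checking there):

# start the Warrior; the web UI on port 8001 is where you pick a project and set a nickname
sudo docker run -d --name archiveteam-warrior --restart=unless-stopped -p 8001:8001 atdr.meo.ws/archiveteam/warrior-dockerfile
# then open http://localhost:8001 to choose a project (e.g. Reddit) and start contributing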

ArchiveTeam is not affiliated with the Internet Archive (Archive.org), although Archive.org hosts the content, which is then made available via the Wayback Machine.

top 10 comments
[–] ram@lemmy.ca 8 points 1 year ago* (last edited 1 year ago) (1 children)

I've been contributing as much as I can since I learned about this on the 9th.

[Screenshot: ArchiveTeam Reddit leaderboard entry for "rammyramram", listing 40.06 GiB / 425.96k items uploaded]

Is there a reason I'm limited to only 6 items running concurrently though? I'd like to actually use some of my resources to help with this.

[–] PoorlyShaveApe@beehaw.org 6 points 1 year ago (1 children)

I believe it is to prevent getting your IP blocked. At least that is what I got from the FAQ.

[–] rektifier@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago)

This is true. If you run the reddit-grab project directly without using the warrior, e.g.:

sudo docker run -d --name reddit --label=com.centurylinklabs.watchtower.enable=true --restart=unless-stopped atdr.meo.ws/archiveteam/reddit-grab --concurrent 6 yourname

you can set --concurrent as high as 20, and some projects do work well with higher concurrency, but not reddit. 6 is already pushing the limit.
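
The com.centurylinklabs.watchtower.enable label only does something if you also run a Watchtower container to auto-update the grab image when ArchiveTeam pushes fixes. A minimal sketch, with hourly checks and flags as I recall them from the Watchtower docs:

# Watchtower watches labelled containers and pulls/redeploys new images as they are published
sudo docker run -d --name watchtower --restart=unless-stopped -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --label-enable --cleanup --interval 3600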

I'm running reddit-grab on 25 VMs on Azure (trying to burn my $200 free credit that expires in 10 days), and I can only run --concurrent 4 safely on most of them. The only VMs that can handle --concurrent 6 are the ones in India, which seem to be soft-rate-limited by their higher latency anyway.

[–] Subito@beehaw.org 4 points 1 year ago (1 children)

This is a cool idea! Is there any way to point it at a specific community? /r/nosleep was my favorite.

[–] PoorlyShaveApe@beehaw.org 5 points 1 year ago

I don't think so. I've been trying to find a tool to export as much of a site as possible but haven't found one yet. Bulk Downloader For Reddit (BDFR) might do the trick, but I don't have an instance running yet to check.
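
If BDFR works the way its README suggests, grabbing a single subreddit would look roughly like this (flags from memory, so verify with bdfr --help):

# pip install bdfr
# download media from one subreddit (assumed flags)
bdfr download ./nosleep-archive --subreddit nosleep --limit 1000
# 'bdfr archive' saves post/comment text as JSON/YAML instead, which suits a text sub like /r/nosleep
bdfr archive ./nosleep-archive --subreddit nosleep --limit 1000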

[–] QuarterSwede@lemmy.world 3 points 1 year ago (1 children)

I’m on no-cap fiber here. Might up my plan for the month to 1 Gbps from 500/500. Either way, I’ll definitely run this on my server. Thanks for the idea!

[–] PoorlyShaveApe@beehaw.org 1 points 1 year ago

Each Warrior can only work on one project at a time. You can spin up multiples and have them process different projects. Just remember to use "bridged" instead of "NAT" on the virtual adapter so they each get a unique local IP address.
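
If you go the Docker route instead of the VirtualBox appliance, the same idea applies: one container per project, each mapped to its own host port so the web UIs don't collide. A rough sketch, assuming the warrior image name from the wiki:

# second warrior alongside the first; pick a different project in its web UI at http://localhost:8002
sudo docker run -d --name warrior-2 --restart=unless-stopped -p 8002:8001 atdr.meo.ws/archiveteam/warrior-dockerfile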

[–] NotAnArdvark@lemmy.ca 1 points 1 year ago (1 children)

I'm wondering how the subreddit blackout is affecting things. I've noticed my VM has really dropped down in bandwidth used the past hour or so.

[–] PoorlyShaveApe@beehaw.org 2 points 1 year ago

Not sure. I'm leaving mine running for as long as possible to get what can still be archived. When some subreddits come back on the 15th, the process will pick up. Then it's a dash until the API dies.

[–] Malin@omg.qa 1 points 1 year ago

I have been using RipMe on all the subreddits on my subscription list. It has been downloading for two weeks now and is still going strong.
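
For anyone curious, RipMe is a Java jar; pointing it at a subreddit from the command line looks roughly like this (flag from memory, check --help first):

# rip one subreddit's media with RipMe (assumed flag)
java -jar ripme.jar -u https://www.reddit.com/r/nosleep/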
