this post was submitted on 21 Jul 2024
208 points (94.4% liked)

Selfhosted


Sorry but I can't think of another word for it right now. This is mostly just venting but also if anyone has a better way to do it I wouldn't hate to hear it.

I'm trying to set up a home server for all of our family photos. We're on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way it lets you do so is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to "prepare" the download. Then you have one week before the takeout "expires" - one week to the minute from the time of the initial request.

I don't have some kind of fancy California internet, just normal home internet, and there is just no way to download a 50 GB (or even 2 GB) file in one go - there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google throws another error and you have to start over and request a new takeout. Google doesn't let you download the entire archive at once either; you have to select each file part individually.
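For anyone fighting the same interruptions: a minimal resumable-download sketch in Python, using an HTTP `Range` header to pick up where a partial file on disk left off. This assumes the Takeout part has a plain direct URL your session can fetch (in practice you would also need to carry your browser's auth cookies, so treat this as a sketch of the resume technique, not a drop-in Takeout tool):

```python
import os
import urllib.request

def range_header(path):
    """Build a Range header that resumes after whatever is already on disk."""
    start = os.path.getsize(path) if os.path.exists(path) else 0
    return {"Range": f"bytes={start}-"} if start else {}

def resume_download(url, dest, chunk=1 << 20):
    """Append the remainder of url to dest, resuming a partial download."""
    req = urllib.request.Request(url, headers=range_header(dest))
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as f:
        while block := resp.read(chunk):
            f.write(block)
```

Wrapped in a retry loop, this survives the kind of mid-transfer drops described above, as long as the server honors range requests.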

I can't tell you how many weeks I've spent trying to download all of the files before they expire or Google gives me another error.

[–] stepan@lemmy.cafe 5 points 1 month ago (1 children)

There was an option to split the download into archives of a customizable size, IIRC.

[–] gedaliyah@lemmy.world 1 points 1 month ago (1 children)

Yeah, that introduces an issue of queuing and monitoring dozens of downloads rather than just a few. I had similar results.

As my family keeps adding photos over the week, I see no way to verify that previously downloaded parts are identical to the same parts in a newer takeout. If that makes sense.
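One way to check that, assuming the two takeouts happen to contain byte-identical part files: hash everything and compare the manifests. A sketch (the folder layout and names here are hypothetical):

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 so large archives don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def manifest(folder):
    """Map each file's relative path to its SHA-256 digest."""
    root = Path(folder)
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def already_have(old_dir, new_dir):
    """Names present in both manifests with matching hashes are safe to skip."""
    old, new = manifest(old_dir), manifest(new_dir)
    return {name for name in old if new.get(name) == old[name]}
```

Caveat: Google rebuilds the archives for each takeout, so identically named parts may not be byte-identical even when the underlying photos haven't changed - in that case the hashes will simply disagree and you'd have to re-download those parts anyway.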

[–] Willdrick@lemmy.world 2 points 1 month ago* (last edited 1 month ago) (2 children)

You could try a download manager like DownThemAll on Firefox, set a queue with all the links and a depth of 1 download at a time.

DtA was a godsend when I had shitty ADSL. It splits downloads into multiple parts and manages to survive micro-interruptions in the service.

[–] gedaliyah@lemmy.world 1 points 1 month ago

I couldn't get it working, but I didn't try too hard. I may give it another shot. I'm trying a different approach right now.

[–] gedaliyah@lemmy.world 1 points 1 month ago

DownThemAll seems to be helping. I'll update the original post with the details once I have success. In this case, I was able to first start the downloads in the browser, then copy each download link and add it to DtA. Someone smarter than me will be able to explain why the extra step was necessary, or how to avoid it.