this post was submitted on 04 Jul 2023
155 points (98.7% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

Resources:

Any issues with the community? Report them using the report flag.

Questions? DM the mods!

founded 1 year ago

I have a home server that I'm using to host files. I'm worried about it breaking and losing access to the files. So what method do you use to back up everything?

(page 2) 50 comments
[–] GammaScorpii@lemmy.world 2 points 1 year ago (1 children)

TrueNAS ZFS snapshots, and then a weekly cron rsync to a Servarica VPS with unlimited expanding storage.

[–] ThetaDev@lemmy.fmhy.ml 2 points 1 year ago (2 children)

If you use a VPS as a backup target, you can also format it with ZFS and use replication. Sending snapshots is faster than using a file-level backup tool, especially with a lot of small files.
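As a rough sketch, that replication flow looks like this (pool and dataset names are invented, and these commands need real ZFS pools on both ends, so treat it as a template rather than something runnable as-is):

```shell
# Take a snapshot of the local dataset (hypothetical pool/dataset names).
zfs snapshot tank/data@2023-07-04

# Initial full send to the VPS (receiving into a hypothetical pool "vault").
zfs send tank/data@2023-07-04 | ssh backup@vps.example.com zfs recv vault/data

# Subsequent runs send only the incremental delta between two snapshots,
# which is why this beats re-walking millions of small files.
zfs send -i tank/data@2023-07-04 tank/data@2023-07-11 \
  | ssh backup@vps.example.com zfs recv vault/data
```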

[–] MisterB@feddit.uk 2 points 1 year ago

I've recently begun using Duplicati to back up the data from my Docker containers, and VMware snapshots for the guest VM itself. I'm still working out how to automate the snapshots, so I do them manually for now.

[–] bladewdr@infosec.pub 2 points 1 year ago

I have an rsync script that pulls a backup every night from my truenas server to my Synology.

I've been thinking about setting up something with rsync.net so I have a cloud copy of my most important files.

[–] balthazar@vlemmy.net 2 points 1 year ago

Almost all the services I host run in docker container (or userland systemd services). What I back up are sqlite databases containing the config or plain data. Every day, my NAS rsyncs the db from my server onto its local storage, and I have Hyper Backup backup the backups into an encrypted S3 bucket. HB keeps the last n versions, and manages their lifecycle. It's all pretty handy!
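One detail worth noting with setups like this: copying a SQLite file while a container is writing to it can produce a torn copy, so it's safer to have the backup job go through SQLite's own `.backup` command first and rsync the resulting copy. A small sketch (the database and file names are invented):

```shell
set -eu
workdir=$(mktemp -d)

# Create a stand-in "app" database with one config row.
sqlite3 "$workdir/app.db" \
  "CREATE TABLE cfg(k TEXT, v TEXT); INSERT INTO cfg VALUES('retention','30d');"

# .backup takes a consistent snapshot even if the db is in use,
# unlike a plain cp/rsync of the live file.
sqlite3 "$workdir/app.db" ".backup '$workdir/app-backup.db'"

# The backup copy is a complete, queryable database of its own.
result=$(sqlite3 "$workdir/app-backup.db" "SELECT v FROM cfg WHERE k='retention';")
```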

[–] Riktastic@laguna.chat 2 points 1 year ago* (last edited 1 year ago) (1 children)

I use Duplicacy to back up to my local NAS and to Storj.io, so in case of a fire I'm still able to restore my files. Storj.io is cheap, easy to access from any location, and your files are duplicated across multiple different locations.

I have used Duplicity before, but restoring from a fresh installation takes a while, as Duplicity has to reanalyze the storage.

[–] f1g4@feddit.it 2 points 1 year ago (1 children)

A simple script using duplicity to FTP data to my private website with infinite storage. I can't say if it's good or not; it's my first time doing it.

[–] stormcynk@lemmy.world 1 points 1 year ago (1 children)

How do you have infinite storage? Gsuite?

[–] f1g4@feddit.it 1 points 1 year ago

I can confirm that the terms and conditions discourage use as a private cloud backup; it's only meant for hosting stuff related to the website. So far I've had no complaints, as I've been paying and kept the traffic to a minimum. I guess I'll have to switch to something more cloud-oriented if I keep expanding, but it's worked for now!

[–] NSA_Server_04@lemmy.world 2 points 1 year ago

Using ESXi as a hypervisor, so I rely on Veeam. I have copy jobs to take it from local to an external drive, plus a copy up to the cloud.

[–] mr47@kbin.social 2 points 1 year ago

Proxmox backs up the VMs -> backups are uploaded to the cloud.

[–] haych@lemmy.one 2 points 1 year ago

I run everything in containers, so I rsync my entire docker directory to my NAS, which in turn backs it up to the cloud.

[–] Pika@lemmy.world 2 points 1 year ago

I use Bacula to an external drive. It was a pain in the ass to configure, but once it's running it's super reliable and easily extended to other drives or folders.

[–] kamin@lemmy.kghorvath.com 2 points 1 year ago

btrfs send/receive to my NAS.

[–] Difficult_Bit_1339@sh.itjust.works 2 points 1 year ago* (last edited 1 year ago)

ZFS array using striping and parity. Daily snapshots get backed up to another machine on the network. 2 external hard drives with mirrors of the backup rotate between my home and office weekly-ish.

I can lose 2 hard drives from the array at the same time without suffering data loss. Any accidentally deleted files can be restored from a snapshot, and if my house is hit by a meteor I lose at most 3-4 days of snapshots.

[–] ptman@sopuli.xyz 2 points 1 year ago

rsync + borg, but looking at bupstash
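For anyone curious what the borg half of a setup like this typically looks like, a common sequence is roughly the following (the repository path and retention numbers are placeholders, and these commands need borg installed, so this is a template rather than a runnable demo):

```shell
# One-time: create an encrypted repository at the backup destination.
borg init --encryption=repokey /backups/repo

# Each run: create a deduplicated, compressed archive named after host+time.
borg create --stats --compression lz4 /backups/repo::'{hostname}-{now}' /data

# Thin out old archives according to a retention policy.
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6 /backups/repo
```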

[–] whoami@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

Kopia to Backblaze B2 is what I generally use for off-site backups of my devices. Borg is another good option to look at, but not as frictionless in my experience. From what I recall, Kopia has a couple of additional nice-to-have features that Borg lacks (i.e. error correction, file de-duplication). edit: Borg does do de-duplication

[–] zzmori@lemm.ee 2 points 1 year ago

I’m backing up my stuff over to Storj DCS (basically S3 but distributed over several regions) and it’s been working like a charm for the better part of a year. Quite cheap as well, similar to Backblaze.

For me the upside was I could prepay with crypto and not use any credit card.

[–] netburnr@lemmy.world 1 points 1 year ago

Veeam Backup and Recovery; the not-for-retail license covers up to 10 workloads. I then push offsite to Backblaze over S3.

[–] Sarazil@kbin.social 1 points 1 year ago

For my webserver: mysqldump to a secured folder, then restic backs up the whole /svr folder, then rsync copies the restic backup to another server. I also have a system that emails me if these things don't happen daily: the log files are uploaded to a URL, each log file is checked for simple errors, and if no file is uploaded in time, I get an email.

Of course, in my case, the URL the files are uploaded to and the email server are the same server I'm backing up... but at least if that becomes a problem, I'll probably only need the backups I've already made to my second server.
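The "email me if it didn't happen" part can be as simple as a staleness check on a marker file that the backup job touches when it finishes. A minimal sketch (the mail step is stubbed out as a comment, and the file names are invented):

```shell
set -eu
# Succeeds only if the file was modified within the last N minutes.
check_fresh() {
  [ -n "$(find "$1" -mmin -"$2" 2>/dev/null)" ]
}

marker=$(mktemp)            # pretend the backup job just touched this
if check_fresh "$marker" 1440; then
  status=fresh              # backup ran within the last 24h, no mail
else
  status=stale              # here you would send the alert email
fi
```

Running the check from a different machine than the one being backed up avoids the chicken-and-egg problem the comment above describes.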

[–] DataDreadnought@lemmy.one 1 points 1 year ago

Bash scripting and rclone personally. Here is a video that helps: https://youtu.be/wUXSLmGAtgQ

[–] Lasthiin@sh.itjust.works 1 points 1 year ago

Not what you mean, but I use BDR ShadowProtect and Datto, depending on the customer's budget.

[–] bp99@lemmy.bp99.eu 1 points 1 year ago

It’s kind of broken at the moment, but I have set up duplicity to create encrypted backups to Backblaze B2 buckets.

Of course the proper way would be to back up to at least 2 more locations, perhaps a local NAS for starters. That could also be configured in duplicity.

[–] krdo@lmmy.net 1 points 1 year ago

I backup using a simple rsync script to a Hetzner storage box.

[–] originalucifer@kbin.social 1 points 1 year ago

Don't overthink it... servers/workstations rsync to a NAS, then sync that NAS to another NAS offsite.

[–] shrugal@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

My server is a DiskStation, so I use HyperBackup to do an encrypted backup of the important data to their Synology C2 service every night.
