I have a confession to make.

I’ve been working in IT for about 6 or 7 years now and I’ve been self-hosting for about 5. And in all this time, in my work environment or at home, I’ve never bothered with backups. I know they are essential for every IT network, but I never cared to learn how to do them properly. A few copies of some hard disks here and there is honestly all I know. I’ve tried a few times, but I often found the learning curve too steep, or the command line gave me errors I didn’t want to troubleshoot.

It is time to make a change. I’m looking for an easy-to-learn backup solution for my home network. I’m running a Proxmox server with about 8 VMs on it, including a NAS full of photos and a media server with lots of movies and shows. It has 2x 8TB disks in a RAID1 set. Besides that I’ve got two Windows laptops and a Linux desktop.

What could be a good backup solution that is also easy to learn?

I’ve tried Borg, but I couldn’t figure out all the command-line options. I’m leaning towards Proxmox Backup Server, but I don’t know how well it works with anything other than my Proxmox server. I’ve also thought about Veeam, since I run into it sometimes at work, but the free version only supports up to 10 devices.

My plan now is to create two backup servers: one onsite, running on something like a Raspberry Pi or an HP EliteDesk, and one offsite, an HP MicroServer N40L.

What could be the perfect backup solution for me?

EDIT:

After a few replies I feel the need to mention that I’m looking for a free and centrally managed option. Thanks!

  • withtheband@lemmy.world · 3 months ago

    You can use syncthing to get files from all of your devices to your central server and then use something like FreeFileSync to backup the entire folder structure to another drive.

  • Decronym@lemmy.decronym.xyz [bot] · 3 months ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    DNS: Domain Name Service/System
    ESXi: VMware virtual machine hypervisor
    LXC: Linux Containers
    NAS: Network-Attached Storage
    RAID: Redundant Array of Independent Disks for mass storage
    Unifi: Ubiquiti WiFi hardware brand

    6 acronyms in this thread; the most compressed thread commented on today has 7 acronyms.

    [Thread #152 for this sub, first seen 20th Sep 2023, 15:35]

  • Mio@feddit.nu · 1 year ago

    Use Veeam. If you hit the device limit, just configure it to send to an SMB share and you don’t need a license.

  • rentar42@kbin.social · 1 year ago

    Good on you to finally get into it, I switched to something systematic only very recently myself (previously it was “copy important stuff to an external HDD whenever I think of it”).

    The one thing I learned (luckily the easy-ish way) is: test your backups. Yes, it’s annoying, but since you rarely (ideally never!) need to restore a backup, it’s incredibly easy to believe everything is working when in fact it either never worked properly or quietly started failing at some point.

    A backup solution that has never been tested via a full restore of at least something has to be assumed to be broken.

    Which reminds me: I have to set up the cron job to periodically test a percentage of all backed up data.

    I decided to use Kopia, btw, but can’t really say if that’s well-suited for your goals.

  • doeknius_gloek@feddit.de · 1 year ago

    I’ve been working in IT for about 6 or 7 years now and I’ve been self-hosting for about 5. And in all this time, in my work environment or at home, I’ve never bothered with backups.

    That really is quite a confession to make, especially in a professional context. But good for you to finally come around!

    I can’t really recommend a solution with a GUI but I can tell you a bit about how I backup my homelab. Like you I have a Proxmox cluster with several VMs and a NAS. I’ve mounted some storage from my NAS into Proxmox via NFS. This is where I let Proxmox store backups of all VMs.

    On my NAS I use restic to backup to two targets: An offsite NAS which contains full backups and additionally Wasabi S3 for the stuff I really don’t want to lose. I like restic a lot and found it rather easy to use (also coming from borg/borgmatic). It supports many different storage backends and multithreading (looking at you, borg).

    I run TrueNAS, so I make use of ZFS Snapshots too. This way I have multiple layers of defense against data loss with varying restore times for different scenarios.
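    For anyone curious, the restic side of a setup like this can be sketched roughly as follows; the repository URLs, bucket name and paths are invented, and S3 credentials are assumed to already be in the environment:

```shell
# Repository password (restic reads it from RESTIC_PASSWORD_FILE).
export RESTIC_PASSWORD_FILE=/root/.restic-pass

# One-time: initialise both repositories.
restic -r sftp:backup@offsite-nas:/backups/home init
restic -r s3:s3.wasabisys.com/example-backup-bucket init

# Nightly: full backup to the offsite NAS...
restic -r sftp:backup@offsite-nas:/backups/home backup /srv/data
# ...and only the irreplaceable stuff to S3.
restic -r s3:s3.wasabisys.com/example-backup-bucket backup /srv/data/photos

# Keep a bounded history and verify repository integrity now and then.
restic -r sftp:backup@offsite-nas:/backups/home \
    forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune
restic -r sftp:backup@offsite-nas:/backups/home check
```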

  • Rootiest@lemm.ee · 1 year ago

    Kopia is my favorite by far!

    It’s super fast and has tons of great features including cutting-edge encryption and several compression options.

    It has a GUI and is cross-platform.

    It can do both cloud and local/network backups.

    That includes locally mounted disks, SFTP, rsync, or any network share/etc accessible from your machine as well as many cloud options.

    The de-duplication stuff is also killer. If you upload the same file (or chunk of data) in different folders or even from different systems it will map them to the same backup storage potentially saving you a ton of storage space.

    It also uses a rolling hash system so if you modify just a handful of megabytes from a 25GB file many times, only the megabytes of changes will need to be backed up to store the version history. You do not need to store 25GB every time you modify that file.

    There’s a ton of other goodies as well!

    And it’s all FOSS!

    I use it to back up to an external hard drive, a NAS, and Amazon S3. You can configure multiple repositories like that and have them all run at the same time (subject to their individual scheduling policies, of course).
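    A minimal sketch of a multi-destination Kopia setup like this, with invented paths (the NAS and S3 repositories are set up the same way with their own create/connect commands):

```shell
# One-time: create a repository on the external drive and take a snapshot.
kopia repository create filesystem --path /mnt/external/kopia-repo
kopia snapshot create /home/user

# Later sessions: reconnect, then snapshot again; thanks to content-based
# deduplication only new or changed chunks are written to the repository.
kopia repository connect filesystem --path /mnt/external/kopia-repo
kopia snapshot create /home/user

# List what's stored, and restore when needed (snapshot ID is an example).
kopia snapshot list /home/user
kopia snapshot restore k1234567890abcdef /tmp/restore-test
```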

  • Monkey With A Shell@lemmy.socdojo.com · 1 year ago

    It has been a while since I used Proxmox, but I seem to recall it has a built-in option to export the VMs to an external host on a periodic cadence? That would take care of backing up the configured systems, if it still exists. More directly, my preferred method is keeping the payload objects (photos/files) on a separate dedicated storage NAS with RAID and automatic ZFS dataset snapshots, to cover both a disk failing and the ‘oh shit, I meant to delete the file, not the whole folder!’ type of loss. For a NAS in my case I use XigmaNAS, the predecessor of CoreNAS, fka FreeNAS, largely because it doesn’t try to be too fancy: it just serves the drives and provides some ancillary services around that job.

    So long version short, what particularly are you trying to back up? The pictures or the service hosting them?

    • atek@lemm.ee (OP) · 1 year ago

      I guess I just want to make sure the pictures are safe. I’ll also back up my /home/user folder, but beyond that it’s not that hard to rebuild my VMs.

      • Monkey With A Shell@lemmy.socdojo.com · 1 year ago

        Simplest way there is to keep them on a dedicated storage system that you don’t even need to access directly for the most part. If there’s one thing I’ve learned over many years of playing with servers, it’s that the end user/admin is more of a hazard to your data than the system failing ever could be. A RAID1 will automatically protect you if one of the hard drives happens to die, without you thinking about it, but it will just as quickly delete everything on both drives if you run the wrong command.

        My nightmare example from personal experience, installing a new pair of drives with the intent to migrate to them.

        1. Install drive ‘b’; rsync -a drive ‘a’ to ‘b’.
        2. Wipe ‘a’ for storage/disposal.
        3. Install new drive ‘a’ into the original slot of ‘a’.
        4. Start a second rsync, intended to be ‘b’ to ‘a’, but forget to change the drives and instead sync the new blank ‘a’ to ‘b’, over the only copy of your data…

        Fortunately I managed to get most everything back with some data recovery tools, but that second after pressing enter and watching it all go away was wrenching. Since then I’ve become a lot more aware of having a certain level of protection against human error.

  • Appoxo@lemmy.dbzer0.com · 1 year ago

    If you are not afraid of Windows: Veeam B/R (Community Edition)

    It has a nice GUI and works very well.
    The GUI is well explained, and there are knowledge bases for Hyper-V, VMware and some others.
    The agent can be deployed manually, and Linux agents can write to a repository.
    I don’t think Proxmox is a supported hypervisor.

    The Community Edition is free for, I think, up to 10 workloads.

    Maybe take a look.

    You could try to get your hands on an NFR license, which has the premium features with a 1-year runtime.

    Edit: I use the Windows agent for my personal rig and back up via SMB.
    We use it at work, so I’m partially biased towards that solution.

    • jubilationtcornpone@sh.itjust.works · 1 year ago

      I’ll second Veeam. It only runs on Windows, but as far as backup and recovery software goes, it’s the gold standard, and the competition isn’t even close.