Lots of people have mentioned rsync, restic, borgbackup, and others, but which would be best for backing up Nextcloud, Immich, and Radicale? Do all of them have a way to automatically back up every X days/weeks? Why use one over the other, and what are the differences?

  • doeknius_gloek@discuss.tchncs.de
    3 months ago

    The question you’re asking is too broad. Every tool somehow differs from the others, but listing all differences requires in-depth knowledge of each tool and a lot of time.

    At the end of the day, every tool somehow backs up your data. CLI interfaces, encryption algorithms, deduplication logic, supported backends, underlying programming languages and a lot more may differ. Identify what’s most important to you, test different solutions and then use the tool that works best for your use-case.

  • JoeKrogan@lemmy.world
    3 months ago

    For my containers, I do monthly backups with cron and tar, plus Syncthing.

    I do quarterly backups of my server (14TB) to external USB HDDs. This is done via a script that mounts the drives, runs rsync to copy, then unmounts the drives again and emails me when it is done. I don't bother encrypting them as it is mainly just media.
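
    A minimal sketch of what such a script could look like (the mount point, source path, and mail address are placeholders, and the `mail` command from mailutils is assumed for the notification):

    ```bash
    #!/usr/bin/env bash
    # Hypothetical quarterly backup: mount the USB drive, rsync, unmount, notify.
    set -euo pipefail

    MOUNTPOINT="/mnt/backup-usb"    # placeholder mount point from /etc/fstab
    SOURCE="/srv/media/"            # placeholder data to back up
    ADMIN="admin@example.com"       # placeholder notification address

    mount "$MOUNTPOINT"

    # -a preserves permissions and timestamps, --delete mirrors deletions
    rsync -a --delete "$SOURCE" "$MOUNTPOINT/media/"

    umount "$MOUNTPOINT"

    echo "Quarterly backup finished on $(date)" | mail -s "Backup done" "$ADMIN"
    ```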

  • Fermiverse@gehirneimer.de
    3 months ago

    Syncthing on mobile syncs directly to a ZFS mirror on my home server as soon as it's on the home WLAN. Same with PC files.

    rclone sync runs weekly via cron to a cloud server, encrypted.

    Security cams sync directly, encrypted, to ZFS and the cloud as soon as something is recorded.
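
    For reference, a rough sketch of what the weekly rclone cron entry could look like; the remote name and paths are assumptions, with `encrypted:` standing in for an rclone crypt remote:

    ```bash
    # m h dom mon dow   command (runs Sundays at 03:00; paths are placeholders)
    0 3 * * 0   rclone sync /srv/data encrypted:backup --log-file /var/log/rclone-backup.log
    ```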

  • branch@lemmy.world
    3 months ago

    I am currently looking into Borg because it can take incremental backups. I just need to figure out how I should handle a running system: whether I need to shut down all my Docker containers, or whether there is some kind of snapshot function I can use.

    From what I read in their FAQ, Borg cannot verify integrity, so I would need to turn everything off during the backup process. A filesystem like ZFS could have solved that problem (can't find the link, something about shadow copies I think?), but since I don't have a backup yet nor physical access, I need to work with what I have.

    I think I will set it to take a backup every night.

    EDIT: Maybe it can verify integrity? Still trying to find information on my use case. https://borgbackup.readthedocs.io/en/stable/usage/check.html
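
    For illustration, a minimal sketch of what such a nightly job could look like if the containers are stopped around the backup; the compose project path, repository location, and passphrase handling are all placeholders:

    ```bash
    #!/usr/bin/env bash
    # Hypothetical nightly Borg backup: stop containers, back up, restart, prune.
    set -euo pipefail

    export BORG_REPO="/mnt/backup/borg"                 # placeholder repository
    export BORG_PASSPHRASE="$(cat /root/.borg-pass)"    # placeholder passphrase file

    cd /srv/docker && docker compose stop               # quiesce services for a consistent copy

    borg create --stats --compression zstd \
        ::'{hostname}-{now:%Y-%m-%d}' /srv/docker

    docker compose start

    # keep 7 daily, 4 weekly, 6 monthly archives
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
    ```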

  • starshipwinepineapple@programming.dev
    3 months ago

    Haven’t used all of those, but my recommendation would be to just start trying them. Start small, get a feel for it, and expand usage or try a different backup solution. You should be able to do automatic backups with any of them, either directly or by setting up your own timer/cron jobs (which is how I do it with rsync).
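
    As a rough example of the cron approach (the schedule and paths here are made up), a nightly rsync job can be as small as one crontab line:

    ```bash
    # run an rsync backup every night at 02:30 (paths are placeholders)
    30 2 * * *   rsync -a --delete /srv/important/ /mnt/backup/important/
    ```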

  • nfreak@lemmy.ml
    3 months ago

    Not sure about other options but Backrest has worked wonderfully for me since day 1. Basically just a GUI for Restic. My only complaints are that jobs can’t be assigned to multiple repos and you can’t edit a job’s name or repo once created. Aside from those quirks, it works fine - I have daily, weekly, monthly, and manual jobs set up across both servers and my desktop, basically just set it and forget it.

  • drkt@scribe.disroot.org
    3 months ago

    There’s a balance to strike between ease of use, ease of recovery, and security. You have to define exactly what you want, and then look at what solutions are available to do that.

    I wrote my own bash script for rsync that simply pulls copies of the vital folders and files over SSH from the machines I want backups from. Then it pushes a copy of all of that to an offsite location (a friend's house in town). There is no encryption at rest, because I chose ease of recovery over security. I also trust my friend, and there really isn't anything that would compromise me totally if that hard drive became available on the internet. There also aren't multiple versions of old files, and if a file is deleted, then it is gone, because I don't need that feature from my backup system.
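
    As a rough illustration of that approach (hostnames and paths here are invented), the pull-then-push script could look something like this:

    ```bash
    #!/usr/bin/env bash
    # Hypothetical version of the pull-over-SSH / push-offsite flow described above.
    set -euo pipefail

    DEST="/srv/backups"                          # local backup store (placeholder)
    OFFSITE="friend-host:/srv/backups-mirror"    # offsite target (placeholder)

    # Pull the vital folders from each machine over SSH
    for host in web1 nas1; do                    # placeholder hostnames
        rsync -a "$host:/etc/" "$DEST/$host/etc/"
        rsync -a "$host:/srv/data/" "$DEST/$host/data/"
    done

    # Push a copy of everything offsite (unencrypted at rest, as noted above)
    rsync -a --delete "$DEST/" "$OFFSITE/"
    ```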

    Define your needs, then shop around. No one solution does everything easily.

    • PlutoniumAcid@lemmy.world
      3 months ago

      Simple file copying is easy and smart.

      What do you do about databases? I’m guessing you are running some containers that have a database, like paperless and many others.

      • drkt@scribe.disroot.org
        3 months ago

        I’m not backing up any databases that are so intensively used that I can’t live-copy them. Most of my databases (SQLite) sit idle until I explicitly do something to them. SQLite doesn’t really mind being copied unless it’s actively writing to the database.
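
        For illustration (paths are placeholders), a live copy of an idle SQLite file is just an ordinary file copy; sqlite3's built-in .backup command is a more careful variant if the database might be written to during the copy:

        ```bash
        # Plain copy is fine for an idle SQLite database (paths are placeholders)
        rsync -a server:/srv/app/data.sqlite /srv/backups/app/

        # More careful variant: let sqlite3 take a consistent online copy first
        ssh server 'sqlite3 /srv/app/data.sqlite ".backup /tmp/data.sqlite.bak"'
        ```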

        • WhatAmLemmy@lemmy.world
          3 months ago

          This is why I switched everything to single-disk ZFS. The ability to snapshot everything with zero downtime, including application data in case I ever misconfigure something, as well as replicate all of it to other ZFS drives offsite in the most efficient way possible, including encrypted data without transferring the keys, was a no-brainer.

          It isn’t a full backup strategy, but it has features that no other backup software can do anywhere near as easily or efficiently.
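
          A rough sketch of that snapshot-and-replicate flow with plain zfs commands (pool and dataset names are placeholders); `zfs send --raw` is what allows replicating an encrypted dataset without handing over the keys:

          ```bash
          # Take a zero-downtime snapshot of the dataset (names are placeholders)
          zfs snapshot tank/apps@2024-06-01

          # Replicate it offsite; --raw sends encrypted data as-is,
          # so the receiving side never needs the encryption keys
          zfs send --raw tank/apps@2024-06-01 | ssh offsite zfs recv backup/apps
          ```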