I have a Proxmox + Debian + Docker server, and I’m looking to set up my backups so that the data gets backed up (DUH) on my Linux PC whenever it comes online on the local network.

I’m not sure what’s best: backing up locally and having something else handle the copying; how to make those backups run only if they haven’t run in a while, regardless of the PC’s availability; or whether the PC should run the logic or the server should keep control of it.

Mostly I don’t want to waste space on my server because it’s limited…

I don’t know the what and I don’t know the how at the moment, so any input is appreciated.
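
To make the question concrete, the sort of thing I’m imagining is a small script the server runs from cron: it only pushes a backup when the PC answers a ping and the last successful run is older than some threshold, so the server keeps the control logic and doesn’t have to store extra copies. A minimal Python sketch, where the hostname, paths, and rsync destination are all made-up placeholders, and passwordless SSH from the server to the PC is assumed:

```python
#!/usr/bin/env python3
"""Sketch: run from cron on the server every ~15 minutes.

Pushes a backup to the PC only when (a) the PC answers a ping and
(b) the last successful run is older than MAX_AGE.
"""
import subprocess
import time
from pathlib import Path

PC_HOST = "my-linux-pc.lan"          # hypothetical hostname
SOURCE = "/srv/backup-staging/"      # hypothetical source directory
DEST = f"backupuser@{PC_HOST}:/data/server-backups/"
STAMP = Path("/var/lib/backup/last_success")
MAX_AGE = 24 * 3600                  # one day, in seconds


def pc_is_online() -> bool:
    """One ping with a short timeout; exit code 0 means the PC is up."""
    return subprocess.run(
        ["ping", "-c", "1", "-W", "2", PC_HOST],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    ).returncode == 0


def backup_is_stale() -> bool:
    """Stale if the stamp file is missing or older than MAX_AGE."""
    if not STAMP.exists():
        return True
    return time.time() - STAMP.stat().st_mtime > MAX_AGE


def main() -> None:
    if not (backup_is_stale() and pc_is_online()):
        return
    # -a preserves permissions/times; --delete mirrors the staging dir,
    # so drop it if the PC should keep files removed on the server.
    result = subprocess.run(["rsync", "-a", "--delete", SOURCE, DEST])
    if result.returncode == 0:
        STAMP.parent.mkdir(parents=True, exist_ok=True)
        STAMP.touch()  # record the successful run


if __name__ == "__main__":
    main()
```

A crontab entry like `*/15 * * * * /usr/local/bin/push_backup.py` would drive it; the staleness check means it actually transfers at most once a day no matter how often cron fires.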

  • schizo@forum.uncomfortable.business · 5 months ago
    I see Syncthing being recommended, and like, it’s fine.

    But keep in mind it’s NOT a backup tool; it’s a syncing tool.

    If something happens to the data on your client, for example, it will happily sync that junk over and overwrite your Linux box’s copy; and if you delete something, it’ll vanish in both places.

    It’s not a replacement for recoverable-in-a-disaster backups, and you should make sure you’ve got a copy somewhere that isn’t subject to the client nuking it if something goes wrong.

    • daddy32@lemmy.world · 5 months ago

      This is a very important distinction to make. Sync is not a backup.

      However, you can get 90% of the way there with Syncthing if you enable file versioning, or at least the trash can option, for the files.
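
      For reference, that’s a per-folder setting: in the web GUI it lives under the folder’s Edit → File Versioning tab, and it ends up in config.xml roughly like this (folder id, label, and path here are made-up examples; cleanoutDays controls how long trashed versions are kept, with 0 meaning forever):

      ```xml
      <folder id="abcd-1234" label="server-backups" path="/data/server-backups">
          <!-- "trashcan" keeps deleted/replaced files under .stversions -->
          <versioning type="trashcan">
              <param key="cleanoutDays" val="14"/>
          </versioning>
      </folder>
      ```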

    • dwindling7373@feddit.it (OP) · 5 months ago (edited)

      Thanks for the heads-up. Yeah, I’m well aware of that; I use it to, well… sync my phone pictures with my PC.