My use-case: streaming video to a Linux virtual mount and want compression of said video files on the fly.

Rclone has an experimental compression remote, but this data is important to me, so that’s no good. I know rsync can compress, but will that work for video files? And how do I get rsync to watch the virtual mount point and automatically compress and move each individual file over to rclone for upload to the Cloud? This is mostly to save on upload bandwidth and storage costs.

Thanks!

Edit: I’m stupid for not mentioning this, but the problem I’m facing is that I don’t have much local storage, which is why I wanted a transparent compression layer that pushes everything directly to the Cloud. This might not be worth it though, since video files are already compressed. I will take a look at HandBrake though, thanks!

  • tal@lemmy.today
    7 months ago

    https://github.com/bcopeland/gstfs

    If you want to do it at the filesystem level, which is what it sounds like you’re asking for, this could do it. I have not used it myself, though.

    If you just want to watch a local directory or directory tree for a file being closed (i.e., the stream is complete) and then run a command on it (to compress and upload it), you could use inotifywait with the close_write event.
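    A minimal sketch of that approach (untested; `remote:videos` is a placeholder for your actual rclone remote, and gzip is just a stand-in for whatever compressor you pick):

```shell
#!/bin/sh
# Sketch only: watch WATCH_DIR for files whose writer has closed them,
# compress each one, and hand the result to rclone for upload.
# Assumes inotify-tools and rclone are installed; "remote:videos" is a
# placeholder remote name.

WATCH_DIR="${WATCH_DIR:-/mnt/videos}"

process_file() {
    f="$1"
    gzip -9 -- "$f"                    # produces "$f.gz" and removes "$f"
    if command -v rclone >/dev/null 2>&1; then
        # Upload, then free the local copy to keep local usage bounded.
        rclone copy "$f.gz" remote:videos && rm -- "$f.gz"
    fi
}

# close_write fires when a writer closes a file it had open for writing,
# i.e. the stream into that file is complete.
if [ "${1:-}" = "watch" ]; then
    inotifywait -m -e close_write --format '%w%f' "$WATCH_DIR" |
    while IFS= read -r path; do
        process_file "$path"
    done
fi
```

    Run it as `./watch.sh watch` to start the loop. One caveat: gzip buys very little on video, since the codec has already compressed the stream, so a re-encode step (ffmpeg/HandBrake) may make more sense than a general-purpose compressor.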

    • MigratingtoLemmy@lemmy.worldOP
      7 months ago

      Thanks, but the second problem I’m working with (and what I forgot to mention) is that I have almost no local storage; I would like to write semi-directly to cloud storage. I can probably manage a few GB for caching, and that’s it.
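      Something along these lines is roughly what I have in mind: rclone’s mount command can bound its local write-back cache, so only a few GB ever sit on disk. A sketch only (`remote:videos` and the paths are placeholders):

```shell
# Sketch: mount cloud storage so writes land in a bounded local cache and
# are uploaded to the remote in the background.
# --vfs-cache-mode writes  : buffer written files locally, then flush upstream
# --vfs-cache-max-size 5G  : cap the local cache at a few GB
rclone mount remote:videos /mnt/videos \
    --vfs-cache-mode writes \
    --vfs-cache-max-size 5G \
    --daemon
```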