
I have several hundred gigs of data that I store on a NAS for my own usage. The NAS storage solution manages local encryption, however I am looking into cloud backup solutions for disaster recovery. To that end, I am looking for a way to create a bulk file container (like a zip or tarball) and a piece of software that will let me encrypt said container using a keypair.

I am looking for a solution that meets the following requirements:

  • Cryptographically secure (obviously)
  • Able to handle large input files; the data in question is many hundreds of gigabytes with individual files as large as 50GB
  • Can output a single, portable, file container for easy upload
  • Has the option to use a key file for encryption

Note: something like Vera/TrueCrypt's encrypted volume containers already occurred to me. The problems are twofold:

  • A portable volume file cannot grow/shrink to fit my data
  • I am looking to encrypt existing data so I don't want my backup utility to have to first decrypt a portable volume, copy data to it, then re-encrypt it for upload

Note 2: This question and this question, while similar, do not provide an option for using a key file rather than a passphrase.

EDIT: As an alternative, maybe I am off-base and using a key file doesn't get me any more security than a good implementation and strong passphrase. If that's the case, I'd love to learn why.

enpaul
  • You could take a look at [dar](http://dar.linux.free.fr). Haven't tried it yet myself, but I've been meaning to. – AndrolGenhald Mar 03 '18 at 23:01
  • You're asking for a product recommendation, which is off-topic here. This question is purely opinion-based: you've given us requirements for a solution and we're supposed to tell you what programs would satisfy those requirements, which is explicitly a product recommendation. – Jun 02 '18 at 07:00
  • Get good old TrueCrypt and make a file container as large as you want. It can be any size, even over 4TB. – Overmind Jun 07 '18 at 10:30

2 Answers


If a zip/tarball is an option for you, you can try using PGP to encrypt/decrypt your backup.

PGP uses key pairs for encryption/decryption.

gpg -er <email> -o <encrypted_archive> <raw_archive>

The above solution does require re-encrypting the whole archive on each backup run.
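A fuller round trip might look like the sketch below; the NAS path, archive names, and recipient address are placeholders, and a GPG keypair for that recipient is assumed to exist already:

```shell
# Create the archive from the NAS mount (path is an example)
tar -cf backup.tar /mnt/nas/data

# Encrypt with the recipient's public key; only the matching
# private key can decrypt the result
gpg --encrypt --recipient user@example.com \
    --output backup.tar.gpg backup.tar

# Later, to restore:
gpg --decrypt --output backup.tar backup.tar.gpg
tar -xf backup.tar
```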

If you want filesystem-like behavior that lets you add or modify files without re-encrypting the whole output, you may want to try eCryptfs together with tar for incremental backups (tar can then append only the modified files).

tar -uf myarchive.tar ~/.ecryptfs/
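A minimal update cycle might look like this; the lower (encrypted) directory path is illustrative:

```shell
# Initial full archive of the eCryptfs lower directory,
# which holds the data in its encrypted on-disk form
tar -cf myarchive.tar ~/.ecryptfs/

# After files change, -u appends only files newer than
# the copies already in the archive
tar -uf myarchive.tar ~/.ecryptfs/

# Inspect what the archive now contains
tar -tf myarchive.tar
```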

If it were me, I'd use afio with a -P argument pointing to a shell script that implements encryption (and compression) with openssl. Many archiving solutions apply encryption or compression to the archive stream as a whole, making them very intolerant of errors in the data. OTOH, afio applies the encryption/compression one file at a time.

This will work well for the model where you want to create a complete backup on each invocation. However, if you are working with a large dataset, then sending only the delta across the network will save on bandwidth. For such a scenario, I suggest exposing the cleartext from the NAS via a reverse encfs mount and running rsync on this to the remote mirror (then take a local copy on the remote system if you want to maintain snapshots). In reverse mode, encfs exposes an existing filesystem in encrypted form:

 NAS -> NAS decrypt -> cleartext -> reverse encfs -> rsync -> remote host
symcbean