From dec1c65f5a42bac67d87fb90dbf1e2aaea9dbe94 Mon Sep 17 00:00:00 2001
From: "jochen.keil@38b1f86ab65128dab3e62e726403ceee4f5141bf"
Date: Thu, 13 Feb 2020 11:39:02 +0000
Subject: [PATCH]

---
 doc/forum/Transparent_compression_of_files.mdwn | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 doc/forum/Transparent_compression_of_files.mdwn

diff --git a/doc/forum/Transparent_compression_of_files.mdwn b/doc/forum/Transparent_compression_of_files.mdwn
new file mode 100644
index 0000000000..2b3a28a35f
--- /dev/null
+++ b/doc/forum/Transparent_compression_of_files.mdwn
@@ -0,0 +1,13 @@
+Hi,
+
+I have a lot of files that are around 80MB each and compress easily down to ~55MB. I did some tests with brotli, and decompression was reasonably fast, at least fast enough that I would probably not notice it given my current transfer speeds. To save disk space, I would like to be able to compress my files transparently: a file is stored compressed in git-annex's blob store and decompressed when I `get` it.
+
+I understand that gpg does compression, but I don't want to deal with encryption; all my repos are local. I've looked at the code, and from what I could see the hash backends are rather simple. However, that is probably not the right place for this. Is this a planned feature? Would it be hard to implement? Ideally, the compression algorithm should be configurable, e.g. by just calling out to `brotli` or `gzip`.
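+
+For reference, here is a rough sketch of the round-trip I have in mind, using the stock `brotli` command line tool (the file names and quality level are only placeholders):
+
+    # compress to bigfile.br; -q sets the quality, -k keeps the source file
+    brotli -q 5 -k -o bigfile.br bigfile
+
+    # decompress again, e.g. when the content is requested via `get`
+    brotli -d -o bigfile.out bigfile.br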