From 5638ae9688bb62eecd97a9f69c773e61f4e46c2f Mon Sep 17 00:00:00 2001
From: Ilya_Shlyakhter
Date: Tue, 7 May 2019 00:59:31 +0000
Subject: [PATCH] Added a comment

---
 .../comment_2_aaff896ff99366d3b96c523284c1248e._comment  | 8 ++++++++
 1 file changed, 8 insertions(+)
 create mode 100644 doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment

diff --git a/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment b/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment
new file mode 100644
index 0000000000..9a02bff2de
--- /dev/null
+++ b/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="Ilya_Shlyakhter"
+ avatar="http://cdn.libravatar.org/avatar/1647044369aa7747829c38b9dcc84df0"
+ subject="comment 2"
+ date="2019-05-07T00:59:31Z"
+ content="""
+Thanks for the BLAKE2SP224 pointer; it does solve (1). I'm still looking for the best way to solve (2): registering large remote files with git-annex without downloading them. `addurl --fast` does that, but creates a non-checksum key. If I could get an MD5 without downloading, I could use `setpresentkey`. But often I only have the MD5s of the file's fixed-size chunks, not of the whole file. Adding a backend variant computable from the MD5s of the chunks would solve the problem. Maybe there are other solutions?
+"""]]
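
For illustration, a minimal sketch of the workflow the comment describes, assuming the whole-file MD5 and size are known in advance; the key, url, and size values below are made up, while the commands (`fromkey`, `registerurl`, the MD5 backend key format) are standard git-annex:

    # assumed key format of the MD5 backend: MD5-s<size>--<md5hex>
    key=MD5-s1048576--d41d8cd98f00b204e9800998ecf8427e
    # add a file in the repo pointing at that key, without having its content
    git annex fromkey --force "$key" bigfile
    # record a url the content can later be fetched from
    git annex registerurl "$key" https://example.com/bigfile

The open question raised in the comment is the case where only per-chunk MD5s are available, which no existing backend key can be computed from.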