diff --git a/doc/forum/git-annex_parallel_invocation_and_the_git-annex_branch/comment_6_7d6d0da2243967b9660d8222c546112d._comment b/doc/forum/git-annex_parallel_invocation_and_the_git-annex_branch/comment_6_7d6d0da2243967b9660d8222c546112d._comment
new file mode 100644
index 0000000000..7558377b49
--- /dev/null
+++ b/doc/forum/git-annex_parallel_invocation_and_the_git-annex_branch/comment_6_7d6d0da2243967b9660d8222c546112d._comment
@@ -0,0 +1,18 @@
+[[!comment format=mdwn
+ username="Ilya_Shlyakhter"
+ avatar="http://cdn.libravatar.org/avatar/1647044369aa7747829c38b9dcc84df0"
+ subject="comment 6"
+ date="2019-05-06T22:38:43Z"
+ content="""
+When running git-annex commands in parallel, I also get errors like:
+
+    git-annex: .git/annex/othertmp/inge59014-3: getFileStatus: does not exist (No such file or directory)
+    failed
+
+    git-annex: .git/annex/othertmp/ingest-assemble_den59014-8: removeLink: does not exist (No such file or directory)
+    failed
+
+Is this also now fixed by [[todo/only_allow_one_git_queue_to_be_flushed_at_a_time]], or is it a separate issue?
+
+Also, is it correct that, when using git-annex, linked worktrees must be on the same filesystem as the main worktree, because they must be on the same filesystem as .git/annex/othertmp?
+"""]]
diff --git a/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment b/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment
new file mode 100644
index 0000000000..9a02bff2de
--- /dev/null
+++ b/doc/todo/key_checksum_from_chunk_checksums/comment_2_aaff896ff99366d3b96c523284c1248e._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="Ilya_Shlyakhter"
+ avatar="http://cdn.libravatar.org/avatar/1647044369aa7747829c38b9dcc84df0"
+ subject="comment 2"
+ date="2019-05-07T00:59:31Z"
+ content="""
+Thanks for the BLAKE2SP224 pointer; it does solve (1). I'm still looking for the best way to solve (2): registering large remote files with git-annex without downloading them. `addurl --fast` does that, but creates a non-checksum key. If I could get an MD5 without downloading, I could use `setpresentkey`. But often I only have the MD5s of the fixed-size chunks of the file, not of the whole file. Adding a backend variant computable from the MD5s of the chunks would solve the problem. Maybe there are other solutions?
+"""]]
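For context on the combining scheme the second comment proposes: one well-known way to derive a whole-file checksum from per-chunk MD5s is to take the MD5 of the concatenated chunk digests, which is how Amazon S3 computes multipart-upload ETags. A minimal sketch of that idea (the function names and the 8 MiB chunk size are illustrative assumptions, not part of any git-annex API):

```python
import hashlib

CHUNK_SIZE = 8 * 1024 * 1024  # assumed fixed chunk size; must match the remote's chunking


def chunk_md5s(path, chunk_size=CHUNK_SIZE):
    """Yield the raw MD5 digest of each fixed-size chunk of the file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield hashlib.md5(chunk).digest()


def md5_of_chunk_md5s(digests):
    """Combine per-chunk MD5 digests into one checksum by hashing their
    concatenation, analogous to S3's multipart ETag scheme."""
    combined = hashlib.md5()
    for d in digests:
        combined.update(d)
    return combined.hexdigest()
```

Because the combined value depends only on the chunk digests (plus the chunk size), it can be computed for a remote file without downloading it, given the per-chunk MD5s the remote already reports.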