From 2a6f8fa74a466e6300915b6ce922ba2b55cebc9e Mon Sep 17 00:00:00 2001
From: jonas
Date: Sun, 13 Jan 2019 12:34:15 +0000
Subject: [PATCH]

---
 ...f_files_are_not_dropped_even_though_enough_copies_exist.mdwn | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/doc/forum/Old_versions_of_files_are_not_dropped_even_though_enough_copies_exist.mdwn b/doc/forum/Old_versions_of_files_are_not_dropped_even_though_enough_copies_exist.mdwn
index 07df03ad56..dc7ef44eec 100644
--- a/doc/forum/Old_versions_of_files_are_not_dropped_even_though_enough_copies_exist.mdwn
+++ b/doc/forum/Old_versions_of_files_are_not_dropped_even_though_enough_copies_exist.mdwn
@@ -8,6 +8,8 @@
 My use case is a big music library which does not fit onto my laptop. So for updating the metadata in the files I basically run `git annex get && git annex unlock ` then update all files in that folder and then run `git annex add && git commit && git copy --to remote && git annex drop `. This works perfectly but the problem is, that all old versions of changed files are retained in `.git/annex/objects` which in no time filled my drive to 100%.
 
+EDIT: As a clarification, I would expect git-annex to also drop the old versions of the file if enough copies exist…
+
 The script:
 
     #!/usr/bin/env bash
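
The script quoted in the hunk is cut off after the shebang line. For reference, here is a minimal sketch of the get/unlock/edit/add/commit/copy/drop workflow the post describes; it is not the poster's actual script. The folder argument and the remote name `myremote` are assumptions for illustration, and note that the copy step is spelled `git annex copy --to <remote>`:

    #!/usr/bin/env bash
    # Minimal sketch of the workflow described in the post, NOT the poster's
    # truncated script. The folder argument and the remote name "myremote"
    # are assumptions for illustration.
    set -euo pipefail

    dir="$1"            # folder whose metadata is being edited
    remote="myremote"   # hypothetical git-annex remote holding the copies

    # Fetch the annexed content and make the files writable.
    git annex get "$dir"
    git annex unlock "$dir"

    # ... edit the tags/metadata in "$dir" here ...

    # Re-annex the changed files, commit, send the new content to the
    # remote, and drop the local copies to free space.
    git annex add "$dir"
    git commit -m "Update metadata in $dir"
    git annex copy --to "$remote" "$dir"
    git annex drop "$dir"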