Added a comment

Lukey 2021-01-19 16:49:05 +00:00 committed by admin
parent da428b152d
commit 40107f1ef1


@@ -0,0 +1,17 @@
[[!comment format=mdwn
username="Lukey"
avatar="http://cdn.libravatar.org/avatar/c7c08e2efd29c692cc017c4a4ca3406b"
subject="comment 2"
date="2021-01-19T16:49:05Z"
content="""
Direct mode doesn't exist anymore; it has been replaced by the [[annex.thin config|https://git-annex.branchable.com/tips/unlocked_files]] (see \"using less disk space\"). And yes, it works on NTFS.
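A minimal sketch of that setup, assuming an already-initialized repo (the file name is just a placeholder):

    # let unlocked files share storage with the annex instead of being copied
    git config annex.thin true
    # unlock a file so it can be read/edited in place
    git annex unlock somefile.iso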
It definitely isn't going to be fast; the numbers you gave suggest there will be ~1000000 files per repository (for the V*-related dirs). Still, you should try it and see whether it's fast enough for you. Some tips to improve performance: don't use `include=`/`exclude=` in preferred content expressions, and see [[Repositories with large number of files|https://git-annex.branchable.com/tips/Repositories_with_large_number_of_files/]]. My experimental script [[here|https://git-annex.branchable.com/todo/Incremental_git_annex_sync_--content_--all/]] might also be worth a try.
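For example, preferred content is cheaper to evaluate when it sticks to the standard groups instead of `include=`/`exclude=` path matching (the group choice here is just an example):

    # assign this repo to a standard group and use the matching standard expression
    git annex group here backup
    git annex wanted here standard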
Having the root of the repo at the root of the drive and excluding everything that shouldn't be in the repo via `.gitignore` can be a viable approach. But with that many files I'd create one repo per directory; it could also be done with git worktrees.
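If you do put the repo at the root of the drive, a whitelist-style `.gitignore` is one way to do the excluding (directory names are just examples):

    # ignore everything at the top level...
    /*
    # ...except the .gitignore itself and the directories that belong in the repo
    !/.gitignore
    !/photos/
    !/documents/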
Don't use git-annex on top of a network share; in that case, run it directly on the server instead. git-annex is designed to run on local drives/storage. Also, git-annex on Windows is much slower than on Linux.
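Concretely, if the files live on a server anyway, ssh in and run git-annex on the machine that owns the storage rather than against a mounted share (host and path are placeholders):

    # instead of pointing git-annex at //server/share mounted locally:
    ssh user@server
    cd /srv/data
    git annex add .
    git annex sync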
You can donate via [[Patreon|https://www.patreon.com/joeyh]] and [[Liberapay|https://liberapay.com/joeyh/]].
"""]]