Merge branch 'master' of ssh://git-annex.branchable.com

This commit is contained in:
Joey Hess 2014-10-09 17:03:44 -04:00
commit 7dd65d1bc3
4 changed files with 63 additions and 0 deletions


@ -0,0 +1,12 @@
[[!comment format=mdwn
username="http://joeyh.name/"
ip="209.250.56.54"
subject="behaving as intended"
date="2014-10-09T20:30:26Z"
content="""
git-annex add is supposed to add unlocked files. See the documentation for the unlock command on the man page. Typical workflow is to unlock a file, edit it, add the changes, and commit it.
Your example has 2 files with content \"foo\" and 1 file with content \"foobar\", which require 2 objects to be stored by git-annex, so that's what it stores.
I suggest you get a bit more familiar with git-annex before filing bugs on it.
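The two-objects-for-three-files count can be illustrated with a small content-addressing sketch. This uses SHA-256 directly as the key, which is not git-annex's actual key format, just the same idea: objects are keyed by a hash of the file's contents, so identical contents share one object.

```python
# Illustration: git-annex is content-addressed, so two files with
# identical contents need only one stored object. SHA-256 here is a
# stand-in for git-annex's real key scheme.
import hashlib

def object_key(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

files = {"a.txt": b"foo", "b.txt": b"foo", "c.txt": b"foobar"}
objects = {object_key(data) for data in files.values()}
print(len(files), len(objects))  # 3 files, 2 distinct objects
```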
"""]]


@ -0,0 +1,13 @@
[[!comment format=mdwn
username="http://joeyh.name/"
ip="209.250.56.54"
subject="comment 4"
date="2014-10-09T20:43:34Z"
content="""
Ah, ok. git's index has the file listed as not being a symlink, because `git commit $file` stages it in the index that way. Running `git reset --hard` will fix git's index.
This problem is avoided if you `git annex add $file` before committing, which is generally a good idea for other reasons, including avoiding staging a potentially huge file's contents in the git index in the first place.
git-annex's pre-commit hook should probably update the git index for the committed files, replacing the staged full file contents with the git-annex symlink. That would avoid this problem.
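The index mismatch can be reproduced with plain git, no git-annex needed; the hand-made symlink below stands in for the one git-annex would create, and the key name is made up:

```shell
# Reproduce the index mismatch with plain git.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
ln -s .git/annex/objects/SOMEKEY bigfile   # committed as a symlink
git add bigfile
git commit -q -m 'add annexed file'
rm bigfile
echo 'huge contents' > bigfile             # worktree file replaced...
git add bigfile                            # ...and staged as a regular file
git ls-files -s bigfile                    # mode 100644: index disagrees with HEAD
git reset --hard -q                        # index and worktree back to HEAD
git ls-files -s bigfile                    # mode 120000: symlink restored
```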
"""]]


@ -0,0 +1,19 @@
[[!comment format=mdwn
username="http://joeyh.name/"
ip="209.250.56.54"
subject="comment 1"
date="2014-10-09T19:57:54Z"
content="""
git-annex doesn't currently have a way to generate those lists itself, but you could use `git annex metadata --json` to get the metadata of all files, and pipe that json into a parser to get the data you want.
The output can also be parsed in non-json mode. For example, this will list the tags:

    git annex metadata | grep '^ tag=' | cut -d '=' -f 2 | sort | uniq

However, metadata values can contain newlines, so parsing the json is the more reliable approach.
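A minimal json-parsing sketch, assuming each line of `git annex metadata --json` output is a JSON object whose `fields` dict maps field names to lists of values (check your git-annex version's actual output before relying on this):

```python
# Sketch: collect every tag from `git annex metadata --json` output.
# Assumes one JSON object per line, with a "fields" dict that maps
# field names to lists of values, e.g. {"fields": {"tag": ["music"]}}.
import json

def collect_tags(json_lines):
    tags = set()
    for line in json_lines:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        tags.update(record.get("fields", {}).get("tag", []))
    return sorted(tags)

# Feed it the output of: git annex metadata --json
sample = [
    '{"file":"a.flac","fields":{"tag":["music","flac"]}}',
    '{"file":"b.mp3","fields":{"tag":["music"]}}',
    '{"file":"notes.txt","fields":{}}',
]
print(collect_tags(sample))  # ['flac', 'music']
```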
Another nice way to see all the tags is to switch to a view of all tags:
    git annex view 'tag=*'
    ls
"""]]


@ -0,0 +1,19 @@
Imagine the following situation:
You have a directory structure like this:
    ./
    +--dir1
    |+--file1 (local)
    |+--file2 (remote1)
    |+--file3 (remote2)
Now, if these files are quite big and you need them all in one directory temporarily, you have to run `git annex get dir1` to copy them over to the local repository, which can take some time.
I wish we had a command like this:
`git annex getlinks dir1`
where git-annex would link not to the missing local objects but to the remote ones, so there would be no need to copy the data around just to use it for a short time. After you are done, you could use `git annex resetlinks dir1` to reset the links to the local objects.
I know that many special remotes will not support this without much hassle, but it would be cool to at least get links to objects on external drives, and maybe on ssh remotes via sshfs.
To keep the data consistent, there could be a constraint that every action (add, sync, commit, and others) first issues a `resetlinks`.
What do you think of that?
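Purely as a sketch of what the proposed `getlinks`/`resetlinks` might do (both commands are hypothetical, and the object-store paths are assumptions): re-point a file's symlink between the local object store and the same relative path under a mounted remote, such as an external drive or an ssh remote mounted with sshfs.

```python
# Hypothetical sketch of the proposed getlinks/resetlinks: re-point a
# symlink from one object-store root to the same relative path under
# another (e.g. a remote's annex mounted via sshfs). Neither command
# exists in git-annex today; all paths here are illustrative.
import os

def repoint(link_path, old_root, new_root):
    """Replace link_path so a target under old_root points under new_root."""
    target = os.readlink(link_path)
    if not os.path.isabs(target):
        target = os.path.normpath(
            os.path.join(os.path.dirname(os.path.abspath(link_path)), target))
    rel = os.path.relpath(target, old_root)
    os.remove(link_path)  # removes only the symlink, not any object
    os.symlink(os.path.join(new_root, rel), link_path)

def getlinks(link_path, local_objects, remote_objects):
    # point at the (mounted) remote's copy instead of the absent local one
    repoint(link_path, local_objects, remote_objects)

def resetlinks(link_path, local_objects, remote_objects):
    # point back at the local object store
    repoint(link_path, remote_objects, local_objects)
```

After a `getlinks`, reading the file goes through the mount; `resetlinks` restores the (possibly dangling) link into the local store.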