Merge branch 'master' of ssh://git-annex.branchable.com

Joey Hess 2015-06-11 15:20:55 -04:00
commit f7a10dc421
7 changed files with 58 additions and 8 deletions


@@ -0,0 +1,9 @@
[[!comment format=mdwn
username="eigengrau"
subject="comment 3"
date="2015-06-11T15:12:23Z"
content="""
Thanks! FWIW, I didn't have any hard kernel lockups recently. I figure git replaces the index file atomically, and only after all objects have been written to the store? In that case, I guess a userland crash couldn't be the cause either?

What happens when you manually run git gc at an inopportune moment, given that it probably doesn't know about the secondary index? In the logs, I saw mention of locks on individual refs. Is the whole repository also locked while git-annex commits something, or could a manual git gc prune away objects added by git-annex before it has had a chance to write the tree and commit it to a ref?
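(If I'm reading git-config(1) right, `git gc` only prunes loose objects older than `gc.pruneExpire`, two weeks by default, so objects git-annex wrote moments before a concurrent gc should survive; a sketch of checking and pinning that setting:)

<pre>
# show the prune expiry; unset means the default of 2.weeks.ago
git config gc.pruneExpire
# make the default explicit, so recent objects are never pruned
git config gc.pruneExpire 2.weeks.ago
</pre>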
"""]]


@@ -0,0 +1,18 @@
### Please describe the problem.
git annex import leaves empty directories behind.

Maybe that's desired, but in that case, why are the *files* gone and not the directories?
### What steps will reproduce the problem?
<pre>
mkdir -p /tmp/foo/bar
touch /tmp/foo/bar/file
cd /path/to/annex   # hypothetical path: import must run inside an existing git-annex repository
git annex import /tmp/foo
[ -d /tmp/foo/bar ] && echo 'fail: directory still there!'
</pre>
### What version of git-annex are you using? On what operating system?
5.20141125
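As a stopgap, the leftover empty directories can be swept manually after an import; a sketch, assuming GNU find:

<pre>
# delete directories under the import source that are now empty
# (-delete implies depth-first traversal in GNU find)
find /tmp/foo -type d -empty -delete
</pre>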


@@ -1,8 +0,0 @@
### Please describe the problem.
inability to quickly grep locally present files
### What steps will reproduce the problem?
run "git annex grep"


@@ -0,0 +1,15 @@
[[!comment format=mdwn
username="juh"
subject="Thanks"
date="2015-06-11T15:51:53Z"
content="""
Thanks for your comments. I am very interested in things people build on top of git-annex, and I look forward to reading more about these projects.

Interestingly, neither of you uses the webapp or the assistant. It was the first thing I tried, and it was disappointing, so now I am trying out the command line.

I understand that managing a collection of files far too big for one's notebook or desktop is one of the main use cases, and it is one I will definitely cover in the book, if I write it at all.

Still evaluating...
"""]]


@@ -0,0 +1,7 @@
Is there a way to set bandwidth limits for [[special_remotes/s3]]?
From what I can see in the [[todo/credentials-less_access_to_s3]] patch, the `downloadUrl` function is used; does that mean `annex.web-download-command` is used? If so, that's great, because it means we can use wget's `--bwlimit` parameter to limit transfers.
But what about uploads? Are there limits there as well?
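If downloads do go through `annex.web-download-command`, here is a sketch of the throttling that would enable (`%url` and `%file` are the documented substitutions; the upload half is a guess, wrapping the transfer in an external shaper like `trickle`):

<pre>
# cap downloads at 200 KB/s, assuming the S3 special remote honours
# annex.web-download-command (the open question above)
git config annex.web-download-command 'wget --bwlimit=200k -c -O %file %url'
# no equivalent hook for uploads that I know of, so shape the whole
# process instead (100 KB/s up)
trickle -u 100 git annex copy --to s3 .
</pre>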
I'll also abuse this forum to ask if/when we might see a shiny new release shipping that amazing new feature. There seems to be enough piled up in the unreleased changelog to warrant a release, no? :) --[[anarcat]]


@@ -0,0 +1,9 @@
### Please describe the problem.
inability to quickly grep locally present files
### What steps will reproduce the problem?
run "git annex grep"
> I don't understand this request. Just running `grep` will grep all the locally present files; sure, there will be warnings, but you can use `2>/dev/null` to silence those. As for the solution suggested in the comment, that greps the filenames. Please clarify the feature request here, or this is [[invalid|done]]. --[[anarcat]]
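> For what it's worth, a way to grep only files whose content is actually present is to pipe `git annex find` (which lists just those files by default) into grep; a sketch, assuming GNU xargs:
>
>     git annex find --print0 | xargs -0 grep -H 'pattern'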