Merge branch 'master' of ssh://git-annex.branchable.com
commit 2e54564061
3 changed files with 77 additions and 0 deletions
@@ -0,0 +1,59 @@
[[!comment format=mdwn
 username="jasonb@ab4484d9961a46440958fa1a528e0fc435599057"
 nickname="jasonb"
 avatar="http://cdn.libravatar.org/avatar/c7330f4da122c671b935fc1d58bb02b1"
 subject="I have this behavior consistently on the 2 repos I use"
 date="2021-08-11T18:49:41Z"
 content="""
Here's one of them. Now that the dozen changed files are committed, it doesn't happen; as long as they were modified, it happened every time. (On the other repo, with 100,000+ files, this check takes several minutes to complete, which absolutely kills my git-aware shell prompt.)

```
git status
Refresh index: 100% (1453/1453), done.

git annex info
trusted repositories: 0
semitrusted repositories: 4
...
untrusted repositories: 0
transfers in progress: none
available local disk space: 56.15 gigabytes (+1 megabyte reserved)
local annex keys: 2159
local annex size: 16.89 gigabytes
annexed files in working tree: 1437
size of annexed files in working tree: 11.29 gigabytes
bloom filter size: 32 mebibytes (0.4% full)
backend usage:
	SHA256E: 1437


cat .git/config
[core]
	repositoryformatversion = 0
	filemode = true
	bare = false
	logallrefupdates = true
	ignorecase = true
	precomposeunicode = true
[annex]
	uuid = abc
	version = 8
	thin = true
[filter \"annex\"]
	smudge = git-annex smudge %f
	clean = git-annex smudge --clean %f
[merge]
	renameLimit = 1000

git annex version
git-annex version: 8.20210428
build flags: Assistant Webapp Pairing FsEvents TorrentParser MagicMime Feeds Testsuite S3 WebDAV
dependency versions: aws-0.22 bloomfilter-2.0.1.0 cryptonite-0.28 DAV-1.3.4 feed-1.3.2.0 ghc-8.10.4 http-client-0.7.8 persistent-sqlite-2.12.0.0 torrent-10000.1.1 uuid-1.3.14 yesod-1.6.1.1
key/value backends: SHA256E SHA256 SHA512E SHA512 SHA224E SHA224 SHA384E SHA384 SHA3_256E SHA3_256 SHA3_512E SHA3_512 SHA3_224E SHA3_224 SHA3_384E SHA3_384 SKEIN256E SKEIN256 SKEIN512E SKEIN512 BLAKE2B256E BLAKE2B256 BLAKE2B512E BLAKE2B512 BLAKE2B160E BLAKE2B160 BLAKE2B224E BLAKE2B224 BLAKE2B384E BLAKE2B384 BLAKE2BP512E BLAKE2BP512 BLAKE2S256E BLAKE2S256 BLAKE2S160E BLAKE2S160 BLAKE2S224E BLAKE2S224 BLAKE2SP256E BLAKE2SP256 BLAKE2SP224E BLAKE2SP224 SHA1E SHA1 MD5E MD5 WORM URL X*
remote types: git gcrypt p2p S3 bup directory rsync web bittorrent webdav adb tahoe glacier ddar git-lfs httpalso borg hook external
operating system: darwin x86_64
supported repository versions: 8
upgrade supported from repository versions: 0 1 2 3 4 5 6 7
local repository version: 8
```
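
For what it's worth, a rough way to trigger much the same refresh by hand, outside of `git status` (just a diagnostic sketch, not part of the output above), seems to be:

```
# Re-check stat info for every tracked file; for modified annexed files this
# goes back through the annex clean filter, which looks like the slow step.
# It prints 'needs update' for anything still modified and exits non-zero.
time git update-index --refresh
```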
"""]]

doc/forum/Pull_files_through_cloud.mdwn (new file)
@@ -0,0 +1,10 @@
I'm not sure if the feature I want exists in git-annex, so this might be a question for the wishlist.

Can I use a cloud server as a middle-man for pulling files from a host that's unreachable at the moment? My situation is:

- Desktop: behind CG-NAT, can talk to the cloud. Holds multiple TB of files.
- Cloud provider: generally accessible, but only a few GB of space.
- Notebook: at home it can talk to the desktop; on the go it can only talk to the cloud.

I would like to request a file from the notebook via the cloud, and have the cloud pull it from the desktop so the notebook can then retrieve it (or have the cloud broker the double-NAT hole punching, but that's much harder).

My current (crude) solution is a script running on the desktop that every few minutes reads a text file in the cloud listing the paths that should be pushed there. When I need a file on the notebook, I add its path to the text file, and after a few minutes I can sync and pull it from the cloud. But is there a more automated solution?
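
Roughly, the desktop side of that looks like this (a minimal sketch, not my exact script; it assumes the cloud is reachable as a git remote named `cloud` and that the request list is a file `requests.txt` tracked in git, one annexed path per line):

```
#!/bin/sh
# Poll the shared request list and push the listed annexed files to the cloud.
while true; do
    git annex sync cloud                    # pick up the latest requests.txt
    if [ -f requests.txt ]; then
        while IFS= read -r path; do
            [ -n "$path" ] && git annex copy --to cloud "$path"
        done < requests.txt
    fi
    sleep 300                               # check again in five minutes
done
```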

@@ -0,0 +1,8 @@
[[!comment format=mdwn
 username="Lukey"
 avatar="http://cdn.libravatar.org/avatar/c7c08e2efd29c692cc017c4a4ca3406b"
 subject="comment 1"
 date="2021-08-11T18:25:51Z"
 content="""
A better solution is to use [metadata](https://git-annex.branchable.com/git-annex-metadata/). You just tag the files you want in the cloud and set a preferred-content expression like `metadata=tag=cloud` for the cloud remote. Then, I think, you can just run the [[git-annex-assistant]] on the desktop and it will automatically sync a file whenever its metadata changes.
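
Roughly (an untested sketch, with `cloud` standing in for whatever the remote is called):

```
# Tag a file you want available via the cloud.
git annex metadata --set tag=cloud path/to/file

# Tell git-annex that the cloud remote prefers files tagged 'cloud'.
git annex wanted cloud 'metadata=tag=cloud'

# On the desktop, the assistant (or a periodic 'git annex sync --content')
# then uploads newly tagged files to that remote.
git annex assistant
```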
"""]]