When proxying an upload to a special remote, verify the hash.

While uploading to a special remote does not usually verify the content,
in that case the content in the repository is assumed to be valid, and
there is no trust boundary. But with a proxied special remote, there may
be users who are allowed to store objects but are not really trusted.

Another way to look at this is that it's the equivalent of git-annex-shell
checking the hash of received data, which it does (see the StoreContent
implementation).
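
As a rough illustration of the idea (a minimal sketch using the cryptonite
library, not git-annex's actual code): the proxy hashes the received temp
file and compares it against the hash expected for the key before handing
the content to the special remote. The `forward` action below is a
hypothetical placeholder for the store operation.

    import qualified Data.ByteString.Lazy as L
    import Crypto.Hash (Digest, SHA256, hashlazy)

    -- Verify a received file against its expected hash before storing it.
    -- 'forward' stands in for whatever sends the file to the special remote.
    verifyThenStore :: FilePath -> Digest SHA256 -> (FilePath -> IO Bool) -> IO Bool
    verifyThenStore tmpfile expectedhash forward = do
        content <- L.readFile tmpfile
        if hashlazy content == expectedhash
            then forward tmpfile   -- hash matches; pass it on to the special remote
            else return False      -- reject unverified data from an untrusted uploader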
Joey Hess 2024-07-29 13:39:28 -04:00
parent 960daf210b
commit fcc052bed8
GPG key ID: DB12DB0FF05F8F38
4 changed files with 41 additions and 35 deletions

@@ -40,16 +40,6 @@ Planned schedule of work:
When using ssh and not the http server, the node that had the incomplete
copy also doesn't get the file, although no error is displayed.
* When proxying a PUT to a special remote, no verification of the received
content is done, it's just written to a file and that is sent to the
special remote. This violates a usual invariant that any data being
received into a repository gets verified in passing. Although on the
other hand, when sending data to a special remote normally, there is also
no verification. On the third hand, a p2p http proxy (or for that matter
a ssh server) may have users who are allowed to store objects, but are
not really trusted, and if they can upload garbage without verification,
that could be bad.
## items deferred until later for p2p protocol over http
* `git-annex p2phttp` should support serving several repositories at the same