fix slowloris timeout in hashing resume of download of large file
Hash the data that is already present in the file before connecting to the http server.
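A minimal sketch of that approach, assuming the cryptonite package's incremental Crypto.Hash interface and SHA256; `hashExisting` and the chunk size are illustrative choices, not git-annex's actual code:

```haskell
import Crypto.Hash (Context, SHA256, hashInit, hashUpdate)
import qualified Data.ByteString as B
import System.IO (IOMode (ReadMode), withBinaryFile)

-- Incrementally hash whatever is already present in the partially
-- downloaded file, before any HTTP connection is opened. Returns the
-- hash context plus the number of bytes hashed, so the download can be
-- resumed at that offset and hashing continued on the new data.
hashExisting :: FilePath -> IO (Context SHA256, Integer)
hashExisting f = withBinaryFile f ReadMode (go hashInit 0)
  where
    go ctx n h = do
        b <- B.hGetSome h 65536
        if B.null b
            then return (ctx, n)
            else go (hashUpdate ctx b) (n + fromIntegral (B.length b)) h
```

With the returned context and byte count, the client can then connect, request a range starting at that offset, keep feeding received chunks into hashUpdate, and finish with hashFinalize. The slow hashing of the already-present data happens entirely before the connection exists, so no server inactivity timeout can trip while it runs.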
parent 0594338a78
commit 10f2c23fd7
4 changed files with 33 additions and 36 deletions
@@ -28,13 +28,6 @@ Planned schedule of work:
 
 ## work notes
 
-* Test resume of download of large file when large amount of file is
-  already downloaded and verification takes a long time. Will the http
-  connection be dropped due to inactivity? May need to do verification in a
-  separate thread that feeds in the existing file followed by the newly
-  downloaded data. Eg, a version of tailVerify that operates on a handle
-  open for read+write.
-
 * Rest of Remote.Git needs implementing.
 
 * git-annex p2phttp serving .well-known for ACME.
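The work note removed above had suggested a different fix: do the verification in a separate thread that is fed the existing file content followed by the newly downloaded data. A rough sketch of that idea, again assuming cryptonite's incremental hashing; `startVerifier` and `feedExistingFile` are hypothetical helpers, not git-annex's actual tailVerify:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)
import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar)
import Crypto.Hash (Digest, SHA256, hashFinalize, hashInit, hashUpdate)
import qualified Data.ByteString as B
import System.IO (IOMode (ReadMode), withBinaryFile)

-- Start a verifier thread that incrementally hashes chunks sent over a
-- channel. Nothing on the channel signals end of input; the final digest
-- is delivered via the MVar.
startVerifier :: IO (Chan (Maybe B.ByteString), MVar (Digest SHA256))
startVerifier = do
    chan <- newChan
    done <- newEmptyMVar
    _ <- forkIO (loop chan hashInit >>= putMVar done)
    return (chan, done)
  where
    loop chan ctx = do
        mb <- readChan chan
        case mb of
            Nothing -> return (hashFinalize ctx)
            Just b  -> loop chan (hashUpdate ctx b)

-- Feed the already-downloaded part of the file to the verifier. The
-- download can proceed concurrently, sending each newly received chunk
-- to the same channel and writing Nothing once it finishes.
feedExistingFile :: Chan (Maybe B.ByteString) -> FilePath -> IO ()
feedExistingFile chan f = withBinaryFile f ReadMode go
  where
    go h = do
        b <- B.hGetSome h 65536
        if B.null b
            then return ()
            else writeChan chan (Just b) >> go h
```

In that design the hashing never blocks the HTTP connection; the commit instead takes the simpler route of hashing the existing data before connecting at all.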