fix http-client gzip decompression bug

Prevent Haskell http-client from decompressing gzip files, so downloads of
such files work the same as they used to with wget and curl.

Explicitly setting Accept-Encoding to "identity" is probably not needed,
but that's what wget sends (curl does not send the header at all). Since
http-client tries to be excessively smart, we apparently need to set
hAcceptEncoding to something to keep it from inserting its own value,
and "identity" seems better than some hack like "".
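
For reference (not part of this commit), a minimal standalone sketch of
the same override using http-client's public API; fetchRaw, its argument
names, and the use of http-client-tls are illustrative assumptions, not
code from git-annex:

	{-# LANGUAGE OverloadedStrings #-}

	import Network.HTTP.Client
	import Network.HTTP.Client.TLS (tlsManagerSettings)
	import Network.HTTP.Types.Header (hAcceptEncoding)
	import qualified Data.ByteString.Lazy as L

	-- Illustrative sketch (not from git-annex): download url to dest
	-- without http-client transparently decompressing a gzip response;
	-- the bytes are stored exactly as the server sent them.
	fetchRaw :: String -> FilePath -> IO ()
	fetchRaw url dest = do
		mgr <- newManager tlsManagerSettings
		req <- parseRequest url
		let req' = req
			{ requestHeaders = (hAcceptEncoding, "identity") :
				filter ((/= hAcceptEncoding) . fst)
					(requestHeaders req)
			, decompress = const False
			}
		resp <- httpLbs req' mgr
		L.writeFile dest (responseBody resp)

With decompress = const False, a request for a .gz url stores the gzip
stream itself, matching what wget and curl write to disk.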

This commit was sponsored by Ole-Morten Duesund on Patreon.
Joey Hess 2018-05-21 15:10:25 -04:00
parent f9e3bcdeb8
commit caaedb2993
4 changed files with 36 additions and 2 deletions

@@ -284,11 +284,20 @@ download meterupdate url file uo =
 	downloadconduit req = catchMaybeIO (getFileSize file) >>= \case
 		Nothing -> runResourceT $ do
-			resp <- http req (httpManager uo)
+			resp <- http req' (httpManager uo)
 			if responseStatus resp == ok200
 				then store zeroBytesProcessed WriteMode resp
 				else showrespfailure resp
-		Just sz -> resumeconduit req sz
+		Just sz -> resumeconduit req' sz
+	  where
+		-- Override http-client's default decompression of gzip
+		-- compressed files. We want the unmodified file content.
+		req' = req
+			{ requestHeaders = (hAcceptEncoding, "identity") :
+				filter ((/= hAcceptEncoding) . fst)
+					(requestHeaders req)
+			, decompress = const False
+			}
 	alreadydownloaded sz s h = s == requestedRangeNotSatisfiable416
 		&& case lookup hContentRange h of