speed up populating the importfeed database
Avoid conversion from ByteString to String for urls that will just be converted right back to ByteString to go into the database. Also, setTempUrl is not used by importfeed, so avoid checking for temp urls in this code path.

This benchmarks as only a small improvement: from 2.99s to 2.78s when populating a database with 33k urls.

Note that it does not seem worth replacing URLString with URLByteString generally, because the ways urls are used all entail either parseURI, which takes a String, or passing a parameter to eg curl, which is also currently a String.

Sponsored-by: Leon Schuermann on Patreon
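The saving described above can be sketched in isolation. This is a minimal illustration, not git-annex's actual code: the `storeOld`/`storeNew` functions and the type aliases are hypothetical stand-ins for the old and new `recordKnownUrl` code paths, using only the standard `bytestring` package.

```haskell
import qualified Data.ByteString.Char8 as B8

type URLString     = String
type URLByteString = B8.ByteString

-- Old path: the url arrives as a ByteString from the log, is decoded
-- to a String, and then re-encoded at insert time -- two copies.
storeOld :: URLString -> B8.ByteString
storeOld = B8.pack

-- New path: the url stays a ByteString end to end; no round trip.
storeNew :: URLByteString -> B8.ByteString
storeNew = id

main :: IO ()
main = do
	let url = B8.pack "https://example.com/feed"
	-- Both paths store the same bytes; the new one just skips the
	-- intermediate String.
	print (storeOld (B8.unpack url) == storeNew url)
```

Per-url the conversion is cheap, which is why the benchmark shows only a modest win; it adds up because the importfeed database is populated with tens of thousands of urls in one pass.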
This commit is contained in:
parent aaeadc422a
commit c9866d2164
2 changed files with 19 additions and 12 deletions
|
@@ -112,9 +112,9 @@ isKnownItemId (ImportFeedDbHandle h) i =
 		] []
 	return $ not (null l)
 
-recordKnownUrl :: ImportFeedDbHandle -> URLString -> IO ()
+recordKnownUrl :: ImportFeedDbHandle -> URLByteString -> IO ()
 recordKnownUrl h u = queueDb h $
-	void $ insertUniqueFast $ KnownUrls $ SByteString $ encodeBS u
+	void $ insertUniqueFast $ KnownUrls $ SByteString u
 
 recordKnownItemId :: ImportFeedDbHandle -> SByteString -> IO ()
 recordKnownItemId h i = queueDb h $
@@ -177,7 +177,7 @@ updateFromLog db@(ImportFeedDbHandle h) (oldtree, currtree)
 		let f = getTopFilePath (DiffTree.file ti)
 		case extLogFileKey urlLogExt f of
 			Just k -> do
-				knownurls =<< getUrls k
+				knownurls =<< getUrls' k
 			Nothing -> case extLogFileKey metaDataLogExt f of
 				Just k -> do
 					m <- getCurrentMetaData k