batch commit every 5 thousand changes, not 10 thousand
There's a tradeoff between making less frequent commits and needing memory to store all the changes coming in. At 10 thousand, it needs 150 MB of memory; 5 thousand drops that down to 90 MB or so. This also turns out to have a significant impact on total run time: I benchmarked a single 10k batch taking 27 minutes, while two 5k batches took only 21 minutes.
parent bda237f14a
commit cd7055631f
1 changed file with 1 addition and 1 deletion
@@ -140,7 +140,7 @@ humanImperceptibleDelay = threadDelay $
 shouldCommit :: UTCTime -> [Change] -> Bool
 shouldCommit now changes
 	| len == 0 = False
-	| len > 10000 = True -- avoid bloating queue too much
+	| len > 5000 = True -- avoid bloating change pool too much
 	| length recentchanges < 10 = True
 	| otherwise = False -- batch activity
 	where
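To show the changed guard in context, here is a minimal self-contained sketch of the batching policy. The `Change` type is a stand-in (the real one carries event data), and the time-based "batch activity" guard from the diff is omitted, so this only models the size threshold:

```haskell
-- Placeholder for the real Change type, which carries file/event data.
data Change = Change

-- Flush the pending change pool once it exceeds 5000 entries, so memory
-- stays bounded; otherwise keep batching. The real shouldCommit also
-- commits when recent activity dies down, which this sketch omits.
shouldCommit :: [Change] -> Bool
shouldCommit changes
	| len == 0 = False
	| len > 5000 = True  -- avoid bloating change pool too much
	| otherwise = False  -- keep batching
  where
	len = length changes
```

With the threshold at 5000, a run that generates 10k changes commits twice instead of once, which is where the memory savings described in the commit message come from.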