avoid unnecessary transfer scans when syncing a disconnected remote

Found a very cheap way to determine when a disconnected remote has
diverged, and has new content that needs to be transferred: Piggyback on
the git-annex branch update, which already checks for divergence.
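
A minimal sketch of the idea, written outside the real git-annex code
(the names Ref, Remote, syncAnnexBranch, mergeAnnexBranch, and
queueTransferScan are hypothetical, chosen only for illustration):

  import Control.Monad (when)

  -- Hypothetical stand-ins for git-annex's real types.
  newtype Ref = Ref String deriving (Eq, Show)
  newtype Remote = Remote String deriving (Show)

  -- The git-annex branch update already has to compare the remote's
  -- branch ref against what was last merged from it, so divergence
  -- detection comes for free: when the refs differ, the remote changed
  -- while disconnected, and only then is a transfer scan queued.
  syncAnnexBranch :: Remote -> Ref -> Ref -> IO ()
  syncAnnexBranch remote lastMerged fetched =
    when (fetched /= lastMerged) $ do
      mergeAnnexBranch remote fetched
      queueTransferScan remote

  -- Stubs; in the assistant these would be the merger thread and the
  -- transfer scan queue.
  mergeAnnexBranch :: Remote -> Ref -> IO ()
  mergeAnnexBranch (Remote name) (Ref r) =
    putStrLn $ "merging git-annex branch " ++ r ++ " from " ++ name

  queueTransferScan :: Remote -> IO ()
  queueTransferScan (Remote name) =
    putStrLn $ "queueing transfer scan of " ++ name

  main :: IO ()
  main = syncAnnexBranch (Remote "origin") (Ref "abc123") (Ref "def456")

The cheap part is that no extra comparison pass is needed; the ref
comparison the branch update performs anyway doubles as the divergence
check.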

However, this does not detect new content that appeared locally while
disconnected and should be transferred to the remote.

Also, this does not handle cases where the two git repos are in sync,
but their content syncing has not caught up yet.

This code could have its efficiency improved:

* When multiple remotes are synced, if any one has diverged, they're
  all queued for transfer scans.
* The transfer scanner could be told whether the remote has new content,
  the local repo has new content, or both, and could optimise its scan
  accordingly; a rough sketch follows this list.
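
The ScanDirection type and the functions around it here are hypothetical,
not existing git-annex code; this is only what such a hint might look
like:

  -- Hint telling the scanner which side is known to have new content.
  data ScanDirection = RemoteHasNew | LocalHasNew | BothHaveNew
    deriving (Eq, Show)

  -- The scanner could then look only for keys to download, only for
  -- keys to upload, or fall back to scanning in both directions.
  transferScan :: ScanDirection -> IO ()
  transferScan RemoteHasNew = scanForDownloads
  transferScan LocalHasNew  = scanForUploads
  transferScan BothHaveNew  = scanForDownloads >> scanForUploads

  scanForDownloads, scanForUploads :: IO ()
  scanForDownloads = putStrLn "scanning for keys to download from the remote"
  scanForUploads   = putStrLn "scanning for keys to upload to the remote"

  main :: IO ()
  main = transferScan RemoteHasNew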

Author: Joey Hess
Date:   2012-08-22 14:51:11 -04:00
Parent: 5d577c32a9
Commit: 5c3e14649e
4 changed files with 25 additions and 13 deletions


@@ -238,7 +238,8 @@ updateAlertMap dstatus a = notifyAlert dstatus `after` modifyDaemonStatus_ dstat
 	where
 		go s = s { alertMap = a (alertMap s) }
 
-{- Displays an alert while performing an activity.
+{- Displays an alert while performing an activity that returns True on
+ - success.
  -
  - The alert is left visible afterwards, as filler.
  - Old filler is pruned, to prevent the map growing too large. -}
@@ -247,6 +248,7 @@ alertWhile dstatus alert a = alertWhile' dstatus alert $ do
 	r <- a
 	return $ (r, r)
 
+{- Like alertWhile, but allows the activity to return a value too. -}
 alertWhile' :: DaemonStatusHandle -> Alert -> IO (Bool, a) -> IO a
 alertWhile' dstatus alert a = do
 	let alert' = alert { alertClass = Activity }