Merge branch 'master' into sqlite

Joey Hess 2019-11-22 12:49:35 -04:00
commit 7263aafd2b
No known key found for this signature in database
GPG key ID: DB12DB0FF05F8F38
6 changed files with 33 additions and 7 deletions


@@ -31,7 +31,7 @@ git-annex (7.20191115) UNRELEASED; urgency=medium
   * benchmark: Changed --databases to take a parameter specifiying the size
     of the database to benchmark.
   * benchmark --databases: Display size of the populated database.
-  * benchmark --databases: Improve the "addAssociatedFile (new)"
+  * benchmark --databases: Improve the "addAssociatedFile to (new)"
     benchmark to really add new values, not overwriting old values.
   * Windows: Fix handling of changes to time zone. (Used to work but was
     broken in version 7.20181031.)


@@ -48,31 +48,31 @@ benchmarkDbs _ = error "not built with criterion, cannot benchmark"
 #ifdef WITH_BENCHMARK
 
 getAssociatedFilesHitBench :: BenchDb -> Benchmark
-getAssociatedFilesHitBench (BenchDb h num) = bench ("getAssociatedFiles (hit)") $ nfIO $ do
+getAssociatedFilesHitBench (BenchDb h num) = bench ("getAssociatedFiles from " ++ show num ++ " (hit)") $ nfIO $ do
 	n <- getStdRandom (randomR (1,num))
 	SQL.getAssociatedFiles (keyN n) (SQL.ReadHandle h)
 
 getAssociatedFilesMissBench :: BenchDb -> Benchmark
-getAssociatedFilesMissBench (BenchDb h _num) = bench ("getAssociatedFiles (miss)") $ nfIO $
+getAssociatedFilesMissBench (BenchDb h num) = bench ("getAssociatedFiles from " ++ show num ++ " (miss)") $ nfIO $
 	SQL.getAssociatedFiles keyMiss (SQL.ReadHandle h)
 
 getAssociatedKeyHitBench :: BenchDb -> Benchmark
-getAssociatedKeyHitBench (BenchDb h num) = bench ("getAssociatedKey (hit)") $ nfIO $ do
+getAssociatedKeyHitBench (BenchDb h num) = bench ("getAssociatedKey from " ++ show num ++ " (hit)") $ nfIO $ do
 	n <- getStdRandom (randomR (1,num))
 	SQL.getAssociatedKey (fileN n) (SQL.ReadHandle h)
 
 getAssociatedKeyMissBench :: BenchDb -> Benchmark
-getAssociatedKeyMissBench (BenchDb h num) = bench ("getAssociatedKey (miss)") $ nfIO $
+getAssociatedKeyMissBench (BenchDb h num) = bench ("getAssociatedKey from " ++ show num ++ " (miss)") $ nfIO $
 	SQL.getAssociatedKey fileMiss (SQL.ReadHandle h)
 
 addAssociatedFileOldBench :: BenchDb -> Benchmark
-addAssociatedFileOldBench (BenchDb h num) = bench ("addAssociatedFile to (old)") $ nfIO $ do
+addAssociatedFileOldBench (BenchDb h num) = bench ("addAssociatedFile to " ++ show num ++ " (old)") $ nfIO $ do
 	n <- getStdRandom (randomR (1,num))
 	SQL.addAssociatedFile (keyN n) (fileN n) (SQL.WriteHandle h)
 	H.flushDbQueue h
 
 addAssociatedFileNewBench :: BenchDb -> Benchmark
-addAssociatedFileNewBench (BenchDb h num) = bench ("addAssociatedFile to (new)") $ nfIO $ do
+addAssociatedFileNewBench (BenchDb h num) = bench ("addAssociatedFile to " ++ show num ++ " (new)") $ nfIO $ do
 	n <- getStdRandom (randomR (1,num))
 	SQL.addAssociatedFile (keyN n) (fileN (num+n)) (SQL.WriteHandle h)
 	H.flushDbQueue h


@@ -9,6 +9,7 @@ module Utility.LinuxMkLibs (
 	installLib,
 	parseLdd,
 	glibcLibs,
+	inTop,
 ) where
 
 import Utility.PartialPrelude


@@ -1 +1,3 @@
 Right now, non-annexed files get passed through the `annex` clean/smudge filter (see [[forum/Adding_files_to_git__58___Very_long___34__recording_state_in_git__34___phase]]). It would be better if `git-annex` configure the filter only for the annexed unlocked files, in the `.gitattributes` file at the root of the repository.
+
+> not a viable solution, [[done]] --[[Joey]]


@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="joey"
+ subject="""comment 1"""
+ date="2019-11-22T16:01:26Z"
+ content="""
+It immediately occurs to me that the proposal would break this:
+
+	git annex add foo
+	git annex add bar
+	git annex unlock bar
+	git mv bar foo
+	git commit -m add
+
+Since foo was a locked file, gitattributes would prevent from being
+smudged, so the large content that was in bar gets committed directly to git.
+
+The right solution is to improve the smudge/clean filter interface to it's
+not so slow, which there is copious discussion of elsewhere.
+"""]]


@@ -1 +1,5 @@
 If running the clean/smudge filter once per file is a [[bottleneck|forum/Adding_files_to_git__58___Very_long___34__recording_state_in_git__34___phase]], might it speed things up to split them off into something more lightweight than the full git-annex binary?
+
+> This is a duplicate of stuff discussed at
+> [[todo/git_smudge_clean_interface_suboptiomal]], so closing. [[done]]
+> --[[Joey]]
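For context on the fix direction the closing notes point at: git's long-running filter protocol (`filter.<driver>.process`) lets one persistent process handle smudge/clean for every file over a pipe, instead of forking a filter per file. A sketch using the standard git config keys; the `annex` driver values mirror a git-annex-style setup (git-annex later gained a batched `git-annex filter-process` command), so treat the exact command strings as illustrative.

```shell
# One long-running filter process with batched requests, instead of one
# process per file. The config keys are standard git options.
cd "$(mktemp -d)"
git init -q demo && cd demo
git config filter.annex.smudge  'git-annex smudge -- %f'        # per-file fallback
git config filter.annex.clean   'git-annex smudge --clean -- %f'
git config filter.annex.process 'git-annex filter-process'      # batched protocol
git config filter.annex.process
```

When `filter.<driver>.process` is set, git prefers it over the per-file `smudge`/`clean` commands, which is what removes the per-file process startup cost discussed in the linked todo.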