From 60ef5eb1f83f9d61d52463edc01798f7a2cee996 Mon Sep 17 00:00:00 2001
From: "https://me.yahoo.com/a/chZVFogbu55hxJc3FfeXs.RfaA--#cc6f9"
Date: Wed, 29 Apr 2015 21:26:51 +0000
Subject: [PATCH]

---
 doc/forum/How_do_I_backup_my_data_to_blue_ray_disks__63__.mdwn | 1 +
 1 file changed, 1 insertion(+)
 create mode 100644 doc/forum/How_do_I_backup_my_data_to_blue_ray_disks__63__.mdwn

diff --git a/doc/forum/How_do_I_backup_my_data_to_blue_ray_disks__63__.mdwn b/doc/forum/How_do_I_backup_my_data_to_blue_ray_disks__63__.mdwn
new file mode 100644
index 0000000000..b94111e943
--- /dev/null
+++ b/doc/forum/How_do_I_backup_my_data_to_blue_ray_disks__63__.mdwn
@@ -0,0 +1 @@
+I have several TB of media on a Debian ZFS server. If I created a git-annex repo for the data, how hard would it be to get git-annex (using bup, I assume) to back up the files onto a set of Blu-ray discs? I realize that 8TB of data would take about 320 25GB Blu-ray discs, but it seems like git-annex would be great for identifying which disc(s) I need to recover a file. I'm good at bash scripting and use git daily, but I have no experience with git-annex or bup. Any links to information or suggestions are very much appreciated. Thanks!