Well, I just did a test on a backup folder, and the parity data came to just over a quarter of the size of the source files (but that was just my quick test and the values I chose for it).
Here's the breakdown for you:
Source folder of 580 files (115 MB total)
Mixed image files: .gif, .jpg, .psd, .bmp, etc.
The files directly in that folder had approximately the following size ranges:
317 files from 1 KB to 100 KB
214 files from 100 KB to 500 KB
50 files from 500 KB to 7 MB
(I aimed for file coverage of 10 large files and about 80 smaller ones; these are the max/min figures.)
This lets me fully reconstruct the 10 largest files in my source set. I copied the source and parity files to a test location, deleted the 10 largest images from that test location, and was able to reconstruct them.
I also picked about 40 files near the mid-range, deleted them, and was able to reconstruct those too.
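To put numbers on the max/min idea: PAR2-style Reed-Solomon recovery means any R recovery blocks can replace any R missing source blocks, so guaranteeing that the k largest files are recoverable takes at least as many recovery blocks as those files occupy. A rough sketch (the file sizes and the 384,000-byte block size are made-up illustrative values, not my actual set):

```python
import math

def blocks_needed(sizes, block_size):
    """Source blocks each file occupies (a file always uses whole blocks)."""
    return [math.ceil(s / block_size) for s in sizes]

def recovery_blocks_for_largest(sizes, block_size, k):
    """Recovery blocks required to guarantee the k largest files
    can be rebuilt even if all k are lost at once."""
    largest = sorted(sizes, reverse=True)[:k]
    return sum(blocks_needed(largest, block_size))

# Hypothetical folder: a few big files plus many small ones.
sizes = [7_000_000, 5_000_000, 3_000_000] + [300_000] * 50
print(recovery_blocks_for_largest(sizes, 384_000, 3))  # → 41
```

Note the small files each still occupy a whole block, which is where the wastage with tiny files comes from.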
I used the default Usenet block size, and it gave me about 32 MB of parity data. (I made one parity file of 32 MB in this case, instead of multiple smaller parity files, because there is less wastage/overhead that way.)
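On the single-versus-multiple parity files point: as I understand the PAR2 format, the critical packets (file descriptions and checksums) are duplicated in every recovery file, so splitting the same 32 MB of recovery data across many files repeats that overhead. A back-of-envelope sketch (the 200 KB critical-packet figure is invented purely for illustration):

```python
def total_parity_size(recovery_bytes, n_files, critical_packets_bytes):
    """Rough size estimate: the recovery data itself, plus one copy of the
    critical packets (file IDs, checksums) repeated in every parity file.
    critical_packets_bytes is a made-up figure for illustration."""
    return recovery_bytes + n_files * critical_packets_bytes

one_file  = total_parity_size(32_000_000, 1, 200_000)
ten_files = total_parity_size(32_000_000, 10, 200_000)
print(ten_files - one_file)  # extra overhead from splitting → 1800000
```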
However, as most of my files were 500 KB or less, the largest files in my set needed about 5 or 6 times as much parity data each; I would have needed less parity data overall if all the files had been of similar size.
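That "largest files need several times the parity" effect falls out of the block maths: to rebuild a file on its own, you need at least as many recovery blocks as the file itself occupies. A quick sketch (illustrative sizes and block size; the exact multiple depends on the block size you pick):

```python
import math

def parity_ratio(big, small, block_size):
    """How many times more recovery blocks a large file needs than a
    small one, for each to be fully reconstructible on its own."""
    return math.ceil(big / block_size) / math.ceil(small / block_size)

# A 7 MB file versus a 500 KB file, with a made-up 384,000-byte block:
print(parity_ratio(7_000_000, 500_000, 384_000))  # → 9.5
```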
A general rule of thumb is to add about 10% (so for a folder with 50 GB of data, you get about 5 GB of parity data).
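The 10% rule is just a flat percentage of the source data, e.g.:

```python
def parity_estimate(data_bytes, pct=0.10):
    """Rule-of-thumb parity size: a flat percentage of the source data."""
    return data_bytes * pct

gb = 1024**3
print(parity_estimate(50 * gb) / gb)  # → 5.0 (GB of parity for 50 GB of data)
```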
My preferred way is to structure my data into smaller folders instead of huge numbers of files in one folder (which slows things down in Windows anyway), then pick a folder of similar file sizes and start with roughly 10% on the slider. I then adjust things until both the max and min values are above zero, and at least 5, depending on how many files I've got in there and how important I deem the files to be.
But if you analyse how you've got your final files and folders structured, you can probably work out a more efficient process. If all you want is a bit more protection on a folder, though, you could probably stick with the general 10% rule and make a separate parity set for each folder you have.
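A sketch of that per-folder approach: walk the tree, total the files directly inside each folder, and apply the flat 10% rule to size each folder's parity set (the root path here is hypothetical):

```python
import os

def folder_sizes(root):
    """Total size of the files directly inside each folder under root
    (matching the one-parity-set-per-folder approach)."""
    totals = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        totals[dirpath] = sum(
            os.path.getsize(os.path.join(dirpath, f)) for f in filenames)
    return totals

def suggest_parity(root, pct=0.10):
    """Suggested parity size per folder using the flat 10% rule."""
    return {d: int(size * pct) for d, size in folder_sizes(root).items()}

# Example usage with a hypothetical backup root:
# for folder, parity in suggest_parity(r"C:\backup").items():
#     print(folder, parity)
```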
(If you really get stuck, then you can either post back with a directory listing or PM me your email, and I can try to help without clogging up this thread too much.)