On Tue, Jun 15, 2010 at 5:47 PM, Tom Horsley <horsley1953@xxxxxxxxx> wrote:
> On Tue, 15 Jun 2010 17:07:13 -0500
> Robert G. (Doc) Savage wrote:
>
>> If there really is overhead and other data being added, we'd like to
>> know as accurately as possible the size of that added amount.
>
> I think it depends entirely on how many directory entries it
> needs to make, etc. I don't think there is any way to predict
> it. I suspect if you want to minimize unused space on a DVD
> you may have to build an image you think will fit, discover it
> is too big, remove a file and try again till it fits.
>
> Maybe the iso image tools should have an option to do everything
> except actually write the data to the image so you can iterate
> without taking up as much time (maybe they do have such an
> option already - I haven't looked :-).

I did some searching but found no definitive answer. One thing that
could improve the image size estimate is to round the size of each
file up to a multiple of 2 KiB, the ISO 9660 sector size. That still
won't account for file system metadata overhead, but it will get us
closer and should be fairly easy to implement (if it isn't already;
I'll check the code).

Richard

--
users mailing list
users@xxxxxxxxxxxxxxxxxxxxxxx
To unsubscribe or change subscription options:
https://admin.fedoraproject.org/mailman/listinfo/users
Guidelines: http://fedoraproject.org/wiki/Mailing_list_guidelines
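[Editor's note: the 2 KiB rounding idea discussed above can be sketched as below. This is a rough lower-bound estimate only, assuming the standard 2048-byte ISO 9660 logical block size; it deliberately ignores directory records, path tables, and other metadata, and the function names are illustrative.]

```python
import os

SECTOR = 2048  # ISO 9660 logical block (sector) size in bytes

def rounded_size(nbytes):
    """Round a file size up to the next 2 KiB sector boundary."""
    return (nbytes + SECTOR - 1) // SECTOR * SECTOR

def estimate_tree(root):
    """Sum sector-rounded sizes of all files under root.

    A lower bound on the image size: file system metadata
    (directory records, path tables) is not accounted for.
    """
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            total += rounded_size(os.path.getsize(path))
    return total
```

For example, a 1-byte file still consumes a full 2048-byte sector, so `rounded_size(1)` is 2048 and `rounded_size(2049)` is 4096.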