On Tue, Jun 15, 2010 at 6:33 PM, Richard Shaw <hobbes1069@xxxxxxxxx> wrote:
> I did some searching but found no definitive answer. One thing that
> could improve the image size estimation is to round up the size of
> each file to a multiple of 2k. This still will not take into account
> file system metadata overhead, but it will get us closer and should be
> pretty easy to implement (if it's not already, I'll check the code).

Replying to myself here. I'm not sure what's going on, but I forced the
size of each file to round up to the nearest 2K block and got a strange
result. I have it print the cumulative file size of each disc, and for
disc one I got 4679 MB, which would be fine in MiB but not MB, yet the
resulting ISO file was 4476 MB.

Here's part of the code (line numbers added):

<CODE>
1 if not os.path.islink(file):
2     file_size = os.path.getsize(file)
3     iso_size = math.ceil(file_size/(2*1024))*2*1024
4     disc_size += iso_size
</CODE>

You don't have to know Python to interpret it, so please let me know if
you see anything wrong with the logic:

1. Test to make sure it's a real file and not a link to a file.
2. Get the file size in bytes.
3. Round up the file size to the nearest 2K block.
4. Add the iso_size of the file to the disc_size variable.

... wash, rinse, repeat ...

Any ideas?

Richard
-- 
users mailing list
users@xxxxxxxxxxxxxxxxxxxxxxx
To unsubscribe or change subscription options:
https://admin.fedoraproject.org/mailman/listinfo/users
Guidelines: http://fedoraproject.org/wiki/Mailing_list_guidelines
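For reference, the round-up-to-2K step described above can be sketched as a
standalone snippet (a minimal sketch, not the script itself; `round_up_to_block`
is a hypothetical helper name). One subtlety worth checking: under Python 2,
`file_size/(2*1024)` with two ints is integer division, so `math.ceil()` receives
an already-truncated value and the expression rounds down rather than up. Forcing
float division, or using the pure-integer idiom below, avoids that pitfall:

```python
BLOCK = 2 * 1024  # ISO 9660 logical block size: 2048 bytes


def round_up_to_block(file_size):
    # Round a size in bytes up to the next 2K block using integer
    # arithmetic only, so the result is exact for any file size and
    # behaves the same under Python 2 and Python 3.
    return ((file_size + BLOCK - 1) // BLOCK) * BLOCK


print(round_up_to_block(1))      # 2048
print(round_up_to_block(2048))   # 2048 (already block-aligned)
print(round_up_to_block(2049))   # 4096
```

Summing these rounded sizes still ignores directory records and other ISO 9660
metadata, so the estimate will stay a little below the final image size.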