On Thu, 20 Jan 2005 13:14:53 -0500, Dennis Shaw
<dennis@xxxxxxxxxxxxxxxxxxx> wrote:
> wget worked perfectly. I guess the Firefox downloader was killing it after
> a certain amount of time. The -c option is great. I should have used man
> wget beforehand. Thanks for the help Paul and Deron.
>
> Deron Meranda wrote:
>
> >On Thu, 20 Jan 2005 10:30:25 -0500, Dennis Shaw
> ><dennis@xxxxxxxxxxxxxxxxxxx> wrote:
> >
> >>Ya, the file I downloaded is only 1557566248 bytes, obviously not right.
> >>I was using Firefox to download it... I'm trying wget now.
> >
> >Use the -c option to wget and it should attempt to resume your
> >download from where it was truncated rather than starting all
> >over from byte 0 again.

The Mozilla/Firefox downloader is not very robust. It has improved over
time but is still (IMHO) not as good as wget. One of the problems in the
past was that the Download Manager would download a file into the temp
directory and then copy the file to the destination folder. Problems
with this implementation were: 1) slow performance (the data had to be
moved twice) and 2) data loss when the temp directory had insufficient
space for the file. To this day I do not know whether the Download
Manager checks that there is sufficient space to hold a file before it
downloads it.
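
If you want that check yourself, here is a rough shell sketch (the URL
and destination below are just placeholders, not anything from the
thread): it asks the server for the file's Content-Length, compares it
against the free space in the target directory, and only then hands off
to wget -c:

    # Placeholder URL and destination -- substitute your own.
    URL=http://example.com/big.iso
    DEST=.

    # Fetch headers only; wget prints them on stderr. Strip the CR that
    # HTTP headers carry, and take the last response in case of redirects.
    SIZE=$(wget --spider --server-response "$URL" 2>&1 | tr -d '\r' \
           | awk 'tolower($1) == "content-length:" { print $2 }' | tail -1)

    # Free space in the destination, in bytes (df -P reports 1K blocks).
    FREE=$(( $(df -P "$DEST" | awk 'NR == 2 { print $4 }') * 1024 ))

    if [ "$SIZE" -gt "$FREE" ]; then
        echo "not enough room: need $SIZE bytes, have $FREE" >&2
    else
        wget -c -P "$DEST" "$URL"   # -c resumes a truncated download
    fi

The --spider/--server-response combination only retrieves the headers,
so the check is cheap; it does assume the server actually reports a
Content-Length, which not all of them do.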