Erik Hemdal wrote:
> > Can anyone explain, in ordinary language, what possible
> > advantage it would give me over, say, wget?
> > Mike
>
> I'll try to help.
>
> If you use a conventional tool, even wget, you are making one connection
> to a remote server. If that server goes down, or slows down, your transfer
> slows down too. Regardless of the bandwidth you have available, you are
> limited by the bandwidth of the remote server (or of the slowest link
> between you and it).
>
> Again, if the transfer is interrupted, you lose. You must start again.
> More than once, I've lost a complete Red Hat download because, after
> downloading 80% of (say) a CD image, the connection failed somewhere and
> all was lost.

I've never had "wget -c" fail to get me a complete, intact image.
Could you please explain in what way BitTorrent could complete a
download that wget could not?
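
For what it's worth, my understanding is that the resume simply asks the
server for the bytes the local file does not already have, via an HTTP
Range request. A rough Python sketch of the idea (the URL and filename
are invented, and it assumes a server that honours Range requests):

import os
import urllib.request

# Invented URL and filename, purely for illustration.
url = "http://example.com/images/disc1.iso"
local = "disc1.iso"

# Ask only for the bytes we don't already have, which is roughly
# what "wget -c" does: an HTTP Range request.
have = os.path.getsize(local) if os.path.exists(local) else 0
req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % have})

with urllib.request.urlopen(req) as resp:
    # 206 means the server honoured the Range header; a plain 200 means
    # it ignored it and is sending the whole file from the start.
    mode = "ab" if resp.status == 206 else "wb"
    with open(local, mode) as out:
        while True:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            out.write(chunk)

As long as the server keeps answering Range requests, an interrupted
transfer just picks up where it left off.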

> BitTorrent establishes multiple connections between your computer and
> others that have the files you want. The files are transferred in many
> small pieces. If a single connection fails, you only lose the piece that
> was in transit; the previously downloaded pieces are still valid.
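
If I follow, the scheme is roughly: split the file into pieces with known
checksums and fetch each piece from whichever source happens to have it,
so a dead connection only costs the piece that was in flight. Something
like this sketch, where fetch_piece() and the peer list are invented
stand-ins rather than the real protocol:

import hashlib

def fetch_piece(peer, index):
    """Invented stand-in: ask one peer for one piece; may raise IOError."""
    raise NotImplementedError

def download(piece_hashes, peers):
    # One slot per piece.  Losing a peer only loses the piece that was
    # in flight; every piece that has already been verified stays put.
    pieces = [None] * len(piece_hashes)
    while not all(p is not None for p in pieces):
        for index, piece in enumerate(pieces):
            if piece is not None:
                continue
            for peer in peers:
                try:
                    data = fetch_piece(peer, index)
                except IOError:
                    continue  # this peer is down or slow; try another
                if hashlib.sha1(data).hexdigest() == piece_hashes[index]:
                    pieces[index] = data
                    break
            # (a real client would give up or wait if no peer has a piece)
    return b"".join(pieces)

The bookkeeping, as I understand it, is per piece rather than per
connection.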

How is this different from wget (aside from possibly needing some manual
intervention)? You seem to be saying that if a server fails, wget cannot
be used to get the rest from another server.
[snip]
> In exchange for a more efficient download, your system also turns into a
> server for as long as you are running BitTorrent. So others are
> downloading from you at the same time you are downloading from others.

Well, my downloads are already pushing 70% or so of my ADSL capacity,
so I don't think having more than one source is going to make things
much, if at all, faster, since the link is approaching saturation anyway.
Mike
--
p="p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
This message made from 100% recycled bits.
I can explain it for you, but I can't understand it for you.
I speak only for myself, and I am unanimous in that!