mike... the basic strength/elegance of bittorrent is that it essentially uses
the power/bandwidth of average users. if i have a server set up and allow you
to access the file, you can use wget, and you're pretty much going to scream
at the slow speed of my upload cap from my isp... even though your download
can go much faster. if you're getting the file from a mirror/download site
with plenty of bandwidth, then bittorrent really won't do you much good.
however, keep in mind, someone has paid/is paying for the mirror bandwidth
that you're using to download the file (on the mirror end). if these guys
ever started charging to recoup the cost of the bandwidth, you might start
to appreciate the strength of the bittorrent technology.

in a similar manner... if you were to create a file of 500-600M, and you had
1000's of people who wanted to access your file, you're not going to have
all of them get it from your home server using wget!! however, with
bittorrent, it's possible that you could use your home server, with the
torrent approach spreading the file over the users who want to
access/download the file...

-bruce

-----Original Message-----
From: fedora-list-bounces@xxxxxxxxxx
[mailto:fedora-list-bounces@xxxxxxxxxx] On Behalf Of Mike McCarty
Sent: Monday, August 01, 2005 1:39 PM
To: For users of Fedora Core releases
Subject: Re: (OT) Bit Torrent usage

...

Erik Hemdal wrote:
>
>>Can anyone explain, in ordinary language, what possible
>>advantage it would give me over, say, wget?
>>
>>Mike
>
> I'll try to help.
>
> If you use a conventional tool, even wget, you are making one connection
> to a remote server. If that server goes down, or slows down, your
> transfer slows down too. Regardless of the bandwidth you have available,
> you are limited by the bandwidth of the remote server (or of the slowest
> link between you).
>
> Again, if the transfer is interrupted, you lose. You must start again.
> More than once, I've lost a complete Red Hat download because, after
> downloading 80% of (say) a CD image, the connection failed somewhere and
> all was lost.

I've never had "wget -c" fail to get a complete, intact image. Could you
please explain in what way torrent could complete a download that wget
could not?

> BitTorrent establishes multiple connections between your computer and
> others which have the files you want. The files are transferred in
> multiple pieces. If a single connection fails, you only lose a portion
> of the data you are transferring; the previously downloaded parts are
> still valid.

How is this different from wget? (Aside from possibly having to do some
manual intervention?) You seem to be saying that if a server fails, wget
cannot be used to get the rest from another.

[snip]

> In payment for a more-efficient download, your system also turns into a
> server for the length of time you are running BitTorrent. So others are
> downloading from you at the same time you are downloading from others.

Well, my downloads are already pushing 70% or so occupancy of my ADSL, so
I don't think that having more than one source is going to make it much,
if any, faster, since it's approaching saturation anyway.

Mike
--
p="p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}
This message made from 100% recycled bits.
I can explain it for you, but I can't understand it for you.
I speak only for myself, and I am unanimous in that!

--
fedora-list mailing list
fedora-list@xxxxxxxxxx
To unsubscribe: http://www.redhat.com/mailman/listinfo/fedora-list
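The piece-and-hash mechanism Erik describes, which answers Mike's "how is
this different from wget?" question, can be sketched as a toy model in
Python. This is an illustration only, not a real client: the piece size and
the sample data are made up, but the idea is the same one the protocol
uses -- the .torrent metadata carries a SHA-1 hash per piece, so a bad or
interrupted transfer costs only that one piece, which can be re-requested
from any peer:

```python
import hashlib

PIECE_SIZE = 4  # bytes; toy size -- real clients use 256 KB or more per piece

def split_pieces(data, size=PIECE_SIZE):
    """Chop a byte string into fixed-size pieces (last one may be short)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# the .torrent metadata carries only the SHA-1 hash of each piece;
# the actual data can come from any peer, in any order
data = b"the quick brown fox jumps"
pieces = split_pieces(data)
expected = [hashlib.sha1(p).digest() for p in pieces]

# simulate pieces arriving from different peers, one transfer gone bad
received = list(pieces)
received[2] = b"XXXX"  # a corrupted/failed transfer of piece 2

for i, piece in enumerate(received):
    if hashlib.sha1(piece).digest() != expected[i]:
        # the bad piece is detected by its hash and re-requested;
        # every other piece already downloaded is kept as-is
        received[i] = pieces[i]

assert b"".join(received) == data  # the reassembled file is intact
```

The contrast with wget: "wget -c" can also resume after a failure, but only
byte-offset-sequentially and only from a server carrying the same file;
per-piece hashing lets a client mix pieces from many sources and verify each
one independently.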
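Bruce's home-server argument earlier in the thread can also be put in rough
numbers. All figures below are hypothetical (a 1 Mbps home uplink, peers
contributing 0.5 Mbps each), and the swarm model is deliberately idealized --
it ignores piece scheduling, churn, and overhead -- but it shows why a
capped uplink serving 1000 wget clients is hopeless while a seed plus a
swarm is not:

```python
# Rough model: time to deliver a file of file_mb megabytes to n_peers.

def single_server_time(n_peers, file_mb, upload_mbps):
    # The server alone uploads every byte to every peer: n_peers full copies.
    return n_peers * file_mb * 8 / upload_mbps  # seconds

def swarm_time(n_peers, file_mb, upload_mbps, peer_upload_mbps):
    # Idealized swarm: once a peer holds a piece it re-uploads it, so the
    # same n_peers * file_mb of traffic is carried by everyone's uplinks.
    total_upload = upload_mbps + n_peers * peer_upload_mbps
    return n_peers * file_mb * 8 / total_upload

# 1000 downloaders, 500 MB file, 1 Mbps seed uplink, 0.5 Mbps per peer
t_single = single_server_time(1000, 500, 1.0)  # 4,000,000 s -- over six weeks
t_swarm = swarm_time(1000, 500, 1.0, 0.5)      # roughly 8,000 s -- a couple of hours
```

The exact numbers don't matter; the point is that total serving capacity
grows with the number of downloaders instead of staying pinned at the
seed's ISP upload cap.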