On Friday 22 December 2006 12:11, Justin W wrote:
>Dave Ihnat wrote:
>> On Fri, Dec 22, 2006 at 10:32:16AM -0500, Dmitriy Kropivnitskiy wrote:
>>> Basically, AFAIU, you get major version upgrades. For example, FC5
>>> has GNOME 2.14 as the main desktop. FC6 has 2.16. FC5 is not going
>>> to get 2.16. Ever. It will only get updates for minor versions.
>>
>> I think what he's getting at is, why do big-bang releases instead of
>> simply continually releasing updates via an automatic mechanism such
>> as yum?
>>
>> This model has had its proponents over the years. Probably the
>> biggest reasons you have major releases are:
>>
>> o A major release gives someone new to the product line a starting
>>   point that isn't horrendously out of date. Ever have to reinstall
>>   a copy of Windows XP from CD, then live through hours of updates?
>
>In my experience, though, you *do* have to sit through hours of
>updates: the first few months of a release are so hectic with bug
>fixes that you end up downloading nearly an entire CD's worth of
>updates to what you just installed. I can do this because I have DSL
>that can run for hours downloading everything, but it won't work for
>anyone on dialup.
>
>That brings me to a related question I've wondered about for some
>time: why do we have to download entire packages for updates? Why
>can't there be an RPM package similar to a patch? Then you'd only
>have to download the difference in a package (and I don't mean a
>partial file, but just the whole files that have been updated; most
>files don't get too large individually). I can see where this could
>be a problem if we didn't have new Fedora Core releases, but since we
>do, the patch RPMs would only have to be based off the initial
>package of that version's release. If any patch RPMs are needed
>before a particular patch RPM can be installed, I don't really see
>why it would be a problem for the patch RPM's spec file to include a
>list of dependency patches (much like packages already do) and have
>yum automatically download them too.
>
>Justin W

I'd like to second those thoughts. OTOH, I believe the basic rsync
algorithm could be trained to do exactly that. The disadvantage is
that one would have to maintain a local rpm repository of the whole
install to make it work. But picture a scenario where a script pulls
the list of packages that have been updated since the last run, with
both the old version name and the new one, edits the filename in the
local repo to match (possibly just softlinking the new name to the
old file), and then runs through the list, grabbing what it needs.
Since rsync only transfers the blocks that actually differ, I could
foresee the network traffic load being cut by at least 50%, and
possibly by 95+% for a simple patch-one-function upgrade. The user
would then run another script that detected the filename update and
update-installed the updated package from the user's own repository.
This wouldn't take a lot of 'new' code: a python programmer could do
it fancy, or even a bash script could probably manage it, run at some
low-load time by cron. The possibilities seem well worth the effort,
IMO.
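To make that concrete, here's a minimal python sketch of what such a
cron job might look like. The mirror URL and local path are invented
for illustration, and I'm leaning on rsync's --fuzzy option to do the
use-the-old-rpm-as-a-basis trick instead of softlinking by hand:

    #!/usr/bin/env python
    # A rough sketch, not a finished tool: mirror the updates tree
    # with rsync, then update-install whatever arrived.  The mirror
    # URL and local path below are made up for illustration.
    import glob
    import subprocess

    MIRROR = "rsync://mirrors.example.org/fedora/core/updates/6/i386/"
    LOCALREPO = "/var/local/fc6-updates/"

    def mirror_updates():
        # --fuzzy tells rsync to use a similarly-named file already in
        # the destination (say foo-1.0-1.i386.rpm) as the basis when
        # fetching foo-1.0-2.i386.rpm, so only the changed blocks
        # cross the wire.  --delete-after keeps the old rpm around
        # until the transfer finishes, so it can serve as that basis.
        subprocess.run(["rsync", "-av", "--fuzzy", "--delete-after",
                        MIRROR, LOCALREPO], check=True)

    def freshen_from_local_repo():
        # 'rpm -Fvh' (freshen) only upgrades packages that are already
        # installed and older than what the repo holds; everything
        # else in the local repo is quietly skipped.
        rpms = glob.glob(LOCALREPO + "*.rpm")
        if rpms:
            subprocess.run(["rpm", "-Fvh"] + rpms, check=True)

    if __name__ == "__main__":
        mirror_updates()
        freshen_from_local_repo()

Dropped into /etc/cron.daily, that's essentially the whole scheme;
rsync's delta transfer does the bookkeeping the softlink idea was
aiming at.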
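And for Justin's patch-RPM idea, the first piece needed is a way to
decide which whole files differ between two builds of a package. A
sketch of just that piece, assuming rpm's --dump query format (path,
size, mtime, digest, ...); the actual patch-RPM packaging and the
spec-file dependency chaining on top of it are left to the
imagination:

    #!/usr/bin/env python
    # Hypothetical helper: list the payload files that changed between
    # two versions of a package -- i.e. what a "patch RPM" would need
    # to carry.  Relies on rpm's --dump query, whose lines begin with
    #   path size mtime digest ...
    import subprocess
    import sys

    def manifest(rpm_file):
        """Map each payload path in an .rpm to its content digest."""
        out = subprocess.run(["rpm", "-qp", "--dump", rpm_file],
                             capture_output=True, text=True,
                             check=True).stdout
        files = {}
        for line in out.splitlines():
            fields = line.split()
            if len(fields) >= 4:
                files[fields[0]] = fields[3]   # path -> digest
        return files

    def changed_files(old_rpm, new_rpm):
        """Whole files that are new or whose contents differ."""
        old, new = manifest(old_rpm), manifest(new_rpm)
        return sorted(path for path, digest in new.items()
                      if old.get(path) != digest)

    if __name__ == "__main__":
        # e.g. patchdiff.py foo-1.0-1.i386.rpm foo-1.0-2.i386.rpm
        for path in changed_files(sys.argv[1], sys.argv[2]):
            print(path)

A patch RPM built from that list, carrying a Requires: on the
previous patch in the chain, would let yum pull down the whole
sequence automatically, much as Justin describes.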
>> Dave Ihnat
>> President, DMINET Consulting, Inc.
>> dihnat@xxxxxxxxxx

Thanks for triggering the thought, Dave, & Merry Christmas.

--
Cheers, Gene
"There are four boxes to be used in defense of liberty:
soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)

Yahoo.com and AOL/TW attorneys please note, additions to the above
message by Gene Heskett are:
Copyright 2006 by Maurice Eugene Heskett, all rights reserved.