On Fri, 2005-12-02 at 09:44, Tim wrote:
> > The slowness depends on how your cgi executes.  If it is a perl
> > script loading perl on every hit it will be slower.  If you
> > use php, mod_perl, fastcgi, speedycgi, java, etc. where the
> > interpreter is loaded once for many hits you won't really see
> > a speed difference.
>
> As well as how well you write your program...  It does seem,
> however, that nearly every dynamically generated site that I've
> come across behaves like it's on a 16 MHz 486.

Poke around http://www.marketcenter.com or http://www.futuresource.com,
keeping in mind that just about everything there is dynamic and the
web servers have to pull the data from other sources as requested.
Usually the slowest part of the page is loading the ads from a
third-party site.  These sites happen to be mostly java-based, but
php and mod_perl can do just as well (see the first sketch below).

> That actually was my main concern (no local caching).  Time and
> time again I've used incredibly slow HTTPS sites where nothing is
> cacheable.  I can't backtrack (nothing loads, or the server throws
> a wobbly).  I can only navigate via the links on the page.  Tough
> luck if the idiot webmaster made it impossible to go back to where
> you need to go.

Part of that is using sensible URLs.  If you design the site so the
URLs can be bookmarked, you can usually backtrack.  Plus, if you can
keep track of what is sensitive and what isn't, you can put the bulk
of your static content in an unprotected directory to let caching
proxies work, especially for images and frequently used icons (see
the second sketch below).
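To make the interpreter-loading point concrete, here's a rough
httpd.conf sketch, assuming Apache 1.3 with mod_perl 1.x; the paths
are just examples:

  # Plain CGI: apache forks a new process and reloads the perl
  # interpreter on every single hit.
  ScriptAlias /cgi-bin/ /var/www/cgi-bin/

  # mod_perl: the interpreter stays resident in the httpd children,
  # so the same scripts run without the per-hit startup cost.
  Alias /perl/ /var/www/perl/
  <Location /perl>
      SetHandler perl-script
      PerlHandler Apache::Registry
      Options +ExecCGI
      PerlSendHeader On
  </Location>

The scripts themselves don't have to change: Apache::Registry runs
ordinary CGI scripts but compiles each one only once per child
process.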
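And for the caching side, a sketch of the unprotected static area,
assuming mod_expires is loaded; the directory name and lifetimes are
made up:

  # Keep the bulk of the content here, outside any auth-protected
  # location, so browsers and caching proxies can hold onto it.
  Alias /static/ /var/www/static/
  <Directory /var/www/static>
      # No AuthType/Require directives here.
      ExpiresActive On
      ExpiresDefault "access plus 1 hour"
      ExpiresByType image/gif "access plus 1 week"
      ExpiresByType image/png "access plus 1 week"
  </Directory>

Then only the truly sensitive dynamic pages have to come off the
server on every hit; the images and icons around them get served
from the nearest cache.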
-- 
  Les Mikesell
   lesmikesell@xxxxxxxxx