On Sat, 19 Mar 2005 21:03:38 +0800, Ow Mun Heng <Ow.Mun.Heng@xxxxxxx> wrote:
> On Fri, 2005-03-18 at 15:25 -0500, Dave Jones wrote:
> > On Fri, Mar 18, 2005 at 03:18:00PM -0500, Scot L. Harris wrote:
> > >
> > > And yes there are tools available to help mitigate the potential
> > > problem as you pointed out. But why not set a default limit
> > > instead of leaving it open?
> >
> > Because then we get flooded with "I can't run two copies of
> > OpenOffice, wtf?", "concurrent users of ftpd downloading ISOs or
> > other large files go bang", and many other similar bugs.
>
> But shouldn't there be some sane limit already applied? As a normal
> user, I believe having ~300 user processes would be more than enough.
>
> Why not ship it with a reasonably sane limit? 1000 perhaps?

Typically I would say this is not needed: Linux and the other *nixes
do not try to baby the user or admin. They assume the exact opposite
of Windows, namely that you have a clue about what you are doing. But,
considering the bad press, a sane but very high default limit might be
nice. Then again, if you can fire off processes on a box at all, you
can always find some way to bring it to its knees.

For example, a cracker once brought a box to its knees through a flaw
in awstats: he gained (non-root) access and fork-bombed it. I read the
IRC transcript of his talk with the owner; it was actually kind of
interesting. Crackers are such worthless toads, especially the script
kiddies.
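For what it's worth, the admin-facing knob for such a cap already
exists: pam_limits will enforce a line like "* hard nproc 1000" in
/etc/security/limits.conf at login. Below is a minimal C sketch (mine,
not anything from this thread) of the mechanism underneath, setrlimit(2)
with RLIMIT_NPROC; the figure 1000 just mirrors the number floated
above, not a recommendation.

    /* Sketch: read and lower the per-user process limit on Linux.
     * The cap of 1000 is only an illustration. */
    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        /* Read the current soft/hard limits on process count. */
        if (getrlimit(RLIMIT_NPROC, &rl) != 0) {
            perror("getrlimit");
            return 1;
        }
        printf("soft: %llu  hard: %llu\n",
               (unsigned long long)rl.rlim_cur,
               (unsigned long long)rl.rlim_max);

        /* Lower the soft limit to 1000 processes.  An unprivileged
         * process may lower its limits but never raise the hard one. */
        rl.rlim_cur = 1000;
        if (setrlimit(RLIMIT_NPROC, &rl) != 0) {
            perror("setrlimit");
            return 1;
        }
        return 0;
    }

With a cap like that in place a fork bomb doesn't take the whole box
down: once the user's process count hits the limit, fork() just starts
failing with EAGAIN, and everyone else's processes keep running.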