this looks very interesting, but one thing that seems odd to me is the
idea of comparing results across significantly different hardware.
for some of the loads you really are going to be independent of the speed
of the hardware (burn, compile, etc will use whatever you have), but for
others (X, audio, video) saying that they take a specific percentage of
the cpu doesn't seem right.
if I have a 400MHz cpu, each of these will take a much larger percentage
of the cpu to get the job done than if I have a 4GHz cpu, for example.
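(just to put numbers on it, here is a rough sketch -- the per-frame cycle
count and the clock speeds below are made-up values, purely for
illustration:

/* sketch: the same fixed unit of work is a very different percentage
 * of the cpu depending on clock speed.  WORK_CYCLES is an assumed,
 * made-up cost for decoding one frame. */
#include <stdio.h>

#define WORK_CYCLES 10000000ULL	/* assumed cycles per frame */

int main(void)
{
	unsigned long long clocks[] = { 400000000ULL, 4000000000ULL };
	int frames_per_sec = 30;
	int i;

	for (i = 0; i < 2; i++) {
		double pct = 100.0 * WORK_CYCLES * frames_per_sec / clocks[i];
		printf("%4llu MHz cpu: %5.1f%% of the cpu\n",
		       clocks[i] / 1000000ULL, pct);
	}
	return 0;
}

on the 400MHz box the same fixed work eats 75% of the cpu, on the 4GHz
box only 7.5%)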
for audio and video this would seem to be a fairly simple scaling factor
(or just doing a fixed amount of work rather than a fixed percentage of
the cpu's worth of work), however for X it is probably much more
complicated (is the X load really uniformly random in how much work it
does, or is it weighted towards small amounts of work with occasional
large bursts? I would guess that, at least beyond a certain point, the
likelihood of that much work being needed drops off).
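as a concrete (and entirely guessed) example of what I mean by weighted
towards small amounts: you could draw the per-request work from something
like an exponential distribution instead of a uniform one -- the mean and
the clamp below are numbers I made up, not anything measured from X:

/* sketch: load generator weighted towards small requests with the
 * occasional large burst.  the exponential shape and the constants
 * are guesses, not measurements of real X traffic. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <time.h>

#define MEAN_WORK	10.0	/* assumed mean request size (work units) */
#define MAX_WORK	1000.0	/* clamp the rare enormous request */

static int next_request_size(void)
{
	/* uniform (0,1) sample, then invert the exponential CDF:
	 * lots of small values, few large ones */
	double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
	double w = -MEAN_WORK * log(u);

	if (w > MAX_WORK)
		w = MAX_WORK;
	return (int)w + 1;
}

int main(void)
{
	int i;

	srand(time(NULL));
	for (i = 0; i < 20; i++)
		printf("request %2d: %4d units of work\n",
		       i, next_request_size());
	return 0;
}

(build with -lm) compared to a uniform draw, most requests come out
small and only the occasional one is large, which feels closer to what X
actually does (small redraws most of the time, a full-window repaint now
and then)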
David Lang
--
There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies.
-- C.A.R. Hoare