On 10/04/2007 05:11 PM, David Miller wrote:
> From: Chuck Ebbert <[email protected]>
> Date: Thu, 04 Oct 2007 17:02:17 -0400
>
>> How do you simulate reading 100TB of data spread across 3000 disks,
>> selecting 10% of it using some criterion, then sorting and
>> summarizing the result?
>
> You repeatedly read zeros from a smaller disk into the same amount of
> memory, and sort that as if it were real data instead.
You've just replaced 3000 concurrent streams of data with a single
stream. That does a poor job of testing the memory allocator's ability
to serve many concurrent users at once.
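To make the point concrete, here's a minimal sketch (not from the thread) of
how a test could keep many concurrent streams in flight: spawn one reader
thread per simulated disk, each doing its own allocations while it reads
zeros. The thread count, buffer size, and read count below are illustrative
assumptions, not anything proposed upthread.

    /* Build with: gcc -O2 -pthread streams.c -o streams */
    #include <fcntl.h>
    #include <pthread.h>
    #include <stdlib.h>
    #include <unistd.h>

    #define NSTREAMS 64          /* stand-in for "3000 disks" (assumption) */
    #define BUFSZ    (1 << 20)   /* 1 MiB buffer per read (assumption) */
    #define NREADS   256         /* reads per stream (assumption) */

    static void *stream_reader(void *arg)
    {
        int fd = open("/dev/zero", O_RDONLY);
        if (fd < 0)
            return NULL;

        for (int i = 0; i < NREADS; i++) {
            /* Fresh allocation per read so the allocator, not just the
             * "disk", sees many concurrent users. */
            char *buf = malloc(BUFSZ);
            if (!buf)
                break;
            if (read(fd, buf, BUFSZ) < 0) {
                free(buf);
                break;
            }
            free(buf);
        }
        close(fd);
        return NULL;
    }

    int main(void)
    {
        pthread_t tids[NSTREAMS];

        for (int i = 0; i < NSTREAMS; i++)
            pthread_create(&tids[i], NULL, stream_reader, NULL);
        for (int i = 0; i < NSTREAMS; i++)
            pthread_join(tids[i], NULL);

        return 0;
    }

With a single stream the allocator only ever serves one caller at a time;
with NSTREAMS threads it has to satisfy many allocations concurrently, which
is closer to the 3000-disk workload being discussed.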