Jonathan Berry wrote:
> It isn't that someone decided to implement a limitation, but that
> they didn't program around a limitation of 32 bit processors.
Somebody decided on the size of an off_t back then.
> The
> limit is imposed by using the standard GNU libc as compiled by gcc
> on 32 bit processors. Considering that a 1G hard drive was a large
> drive at the time it was implemented, it was not unreasonable to
> accept a 2G size limit.
I think it was unreasonable to force an ugly workaround for the rest of
the life of 32-bit systems in exchange for a year or two of a probably
unmeasurably small performance gain.
Then what would have been reasonable? Making off_t 64 bits long?
That's pretty obvious in retrospect, but I suppose everyone thought we'd
have had 64-bit ints long ago and it would happen naturally. Who could
have guessed that Windows binary backwards compatibility would be a
requirement, or that it would take so long to produce it?
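For what it's worth, here is a minimal sketch of what "make off_t 64
bits" amounts to in practice, assuming a 32-bit glibc system where the
width is a build-time choice rather than anything fixed by the hardware:

  /* sizeof_off_t.c - report how wide off_t is for this particular build.
   * On 32-bit glibc, off_t is 32 bits unless the code is compiled with
   * _FILE_OFFSET_BITS=64 (or uses the explicit off64_t interfaces).
   */
  #include <stdio.h>
  #include <sys/types.h>

  int main(void)
  {
      printf("off_t is %zu bits wide\n", sizeof(off_t) * 8);
      return 0;
  }

Compiled plainly on an old 32-bit box, that prints 32; compiled with
gcc -D_FILE_OFFSET_BITS=64, it prints 64, with no source changes at all.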
Why stop there?
Sure, a 2^63-byte file sounds huge (admittedly it is: 8 exabytes), but
remember that it wasn't that long ago that a 2^31-byte file sounded
enormous. Just some food for thought.
There are lots of quantities on a human scale where 32 bits aren't
enough; not so many where 64 bits fall short. Maybe we'll start counting
in smaller units or something.
There are many instances where people in this industry have been
rather short-sighted. A famous Bill Gates quote comes to mind. And
then there was the "Y2K bug," even though it never amounted to much in
reality. We are not in the habit of programming for growth.
Until Y2K proved otherwise, we were in the habit of assuming that the
shortcuts made in programming would be replaced by something better
before they did any harm. The reason Y2K didn't cause real problems is
that every company where it could have caused them spent an enormous
amount of time and money checking and fixing things ahead of time. I
suspect we'll actually see more problems next month when everyone's
Outlook appointments are off by an hour because of the DST change.
That said, given where we are, efforts should be made to ensure that
all programs can deal with things like > 2 (or 4) GB files. Things
are quickly progressing toward 64-bit, and even under 32-bit we can
easily have files much bigger than that. The longer this "legacy"
code hangs around, the more painful it will be to fix later.
What's done is done, and backwards compatibility is a good and necessary
thing, but it means that every program needs to be rebuilt in a way that
invokes the macros for large file support. And not all of them have been
yet, as this thread demonstrates.
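For anyone following along, a rough sketch of what that rebuild
involves, assuming glibc on a 32-bit machine and some hypothetical
big.iso larger than 2 GB:

  /* bigopen.c - open a file that may be larger than 2 GB.
   * Built without large file support on a 32-bit system, open() fails
   * with EOVERFLOW because the size doesn't fit in a 32-bit off_t;
   * built with _FILE_OFFSET_BITS=64 the same source works unchanged.
   */
  #include <stdio.h>
  #include <string.h>
  #include <errno.h>
  #include <sys/types.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(int argc, char **argv)
  {
      const char *path = (argc > 1) ? argv[1] : "big.iso"; /* hypothetical */
      int fd = open(path, O_RDONLY);

      if (fd < 0) {
          fprintf(stderr, "open(%s): %s\n", path, strerror(errno));
          return 1;
      }

      /* Seeking to the end to report the size also needs a 64-bit off_t. */
      off_t size = lseek(fd, 0, SEEK_END);
      printf("%s is %lld bytes\n", path, (long long)size);
      close(fd);
      return 0;
  }

The "macros" amount to compiler defines rather than source edits:
something like gcc -D_FILE_OFFSET_BITS=64 bigopen.c (getconf LFS_CFLAGS
should print the right defines for the local system) is usually all a
rebuild needs.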
--
Les Mikesell
lesmikesell@xxxxxxxxx