Rahul Sundaram wrote:
> On 07/19/2009 11:31 PM, Tom Horsley wrote:
>> On Sun, 19 Jul 2009 23:11:52 +0530
>> Rahul Sundaram wrote:
>>
>>> It is not so simple. This is not a compiler bug. I suggest you read
>>> through http://lwn.net/Articles/341773/rss to understand why.
>>
>> I did. It is a compiler bug no matter what a bunch of language-lawyer,
>> holier-than-thou compiler developers say :-).
>
> The kernel developers claim it is a kernel bug, the compiler developers
> claim it is not their bug, and everybody else agrees with them -- yet
> you disagree with all of them? Who is being holier than thou here?
>
> The kernel first dereferences the pointer, and only afterwards checks
> whether it is NULL. It is quite common as a compiler optimization to
> compile out code like this.
>
> Rahul
I vaguely remember some of the reasons (mainly the difficulty of
untangling the semantics of pointer tests), but, I must admit, after
using Java long enough, it seems odd to me that the C compiler would
be smart enough to catch the theoretical "don't care" and not smart
enough to distinguish between the reasons for the don't care.
I think I remember, way back when, using C compilers that would warn
about code that isn't reached and is discarded, and even warn about
code that dereferences a pointer before testing it after the pointer
is returned by a function call.
So it definitely feels like both a coding error and a (design class?)
bug in the way the compiler options interact.
It would be nice if certain combinations of options would at least
issue "potential evil compiler switch combination" warnings before a
compile started.
Maybe it does that already, and the warnings are being swallowed
somewhere in the make process?
--
fedora-list mailing list
fedora-list@xxxxxxxxxx
To unsubscribe: https://www.redhat.com/mailman/listinfo/fedora-list
Guidelines: http://fedoraproject.org/wiki/Communicate/MailingListGuidelines