Re: OT: Requesting C advice

On Sat, 2007-05-26 at 08:03 -0500, Chris Schumann wrote:
> Late reply; sorry.
> 
> > Date: Thu, 24 May 2007 08:43:26 -0700
> > From: Les <hlhowell@xxxxxxxxxxx>
> 
> > Embedded applications today
> > are mostly 8 bit, but many, many designers have already begun the
> > transition to 16 bit, and soon 
> > will be moving to 32 bit.  The reasons are much the same as 
> > the reasons
> > that general computing has moved from 8 to 16 to 32 and now 
> > to 64, with
> > the cutting edge already 
> > looking at 128 bit and parallel processing, along with dedicated
> > processors running 32 or 64 bit floating point math.  Also 
> > the length of
> > the integer used in C, which is a virtual 
> > machine is independent of the word length of the processor, 
> > except the C
> > language designers (originally Kernighan and Ritchie) made the language
> > somewhat flexible to simplify migration.  That is why there were some
> > undefined situations in the original specification.  Remember 
> > that C is
> > a virtual machine language, whose processor only has 24 
> > instructions (I
> > think the Ansi committee added a couple, but they have specific uses
> > that were not foreseen in the original usage of the language) 
> >  It can be
> > ported to any machine currently extant by only writing about 1K of
> > machine code, and even that can be done in another available higher
> > level language if you so desire, as long as it is compiled for
> > efficiency.
> 
> Having used C since the original K&R version, I have to ask WHAT?!?
> 
> Since when is C a virtual machine language?

Having been on the ANSI C committee back in the day, Chris is correct.
C was NEVER a "virtual machine language" system.  That was left to the
old UCSD P-System, whose p-code interpreter ran Pascal (Niklaus Wirth's
language) and derivatives such as Modula-2.

Most compilers had multiple passes: a preprocessor (which processed the
"#" statements), a tokenizer, and a machine-specific code generator.
Whitesmiths' C made this REALLY clear, as the preprocessor was called
"cpp", the tokenizer "cp1" and the code generator "cp211" for the
PDP-11, "cp2vax" for the VAX, "cp286" for the Intel x86 (186 and 286 in
those days) and so on.

In fact, the preprocessor architecture is what gave you the first
implementation of C++.  Bjarne Stroustrup replaced the preprocessor
with a new front end called "cfront" that translated C++ into plain C,
and you got C++ (or "incremental C").

> The only CVM I can find is Java's JVM. They have modified gcc (a C
> compiler) to produce byte code for that JVM.
> 
> Every compiler I've used compiles C to native machine code for the
> target platform. There is no intermediate language, and that's what
> gave C its famous speed.
> 
> (and because it is a virtual machine language...)
> >     That is why even the 8 bit implementations of C used a 16 bit
> > integer.
> 
> No it's not. They used 16 bit integers because you can't do much
> of anything useful with only 8 bit integers. The compiler designers
> for those systems (like the Apple II) had to work around the 8 bit
> registers. Looking at the assembly-language source for some of the
> libraries was not pleasant.

Actually, that was a huge bone of contention in the committee meetings.
The 16-bit integer was the "native" size of the registers on the machine
that C was developed on (the PDP-11), so it sorta stuck.  However, the
standard guarantees only minimum ranges, not exact sizes.  Beyond those
minimums, it is completely up to the implementer of the compiler how big
an "int" is (or a "char", or an "unsigned long long"...that's why the
"sizeof" operator exists).

There is also no guarantee as to what a null pointer looks like in
memory, other than that no legitimate object will ever have that
address.  I know of one system where the null pointer is all ones.  The
compiler knows that the constant 0 (and the NULL macro) in a pointer
context should be converted to the all-ones representation on that
machine.

P.J. Plauger (founder of Whitesmiths and secretary of the committee) had
a great way of putting it:  "An int is not a char and it's not a long,
but will be the same as one or the other at various times in your
career."

As far as the libraries are concerned, the initial draft of what was in
the standard C library made the library so damned big that it wouldn't
fit in the standard process memory footprint of a VAX at the time.
That's when it got split up into the standard library, the "math"
library, the "double precision" library and several others.

Once the concept of splitting up libraries came up, lots of splits
were proposed: string handling was going to be in a separate library,
network stuff, file management, you name it.  Some people actually did
implement separate libraries, as the famous Sun network library split
shows.

----------------------------------------------------------------------
- Rick Stevens, Principal Engineer             rstevens@xxxxxxxxxxxx -
- VitalStream, Inc.                       http://www.vitalstream.com -
-                                                                    -
- grasshopotomaus: A creature that can leap to tremendous heights... -
-                                                ...once.            -
----------------------------------------------------------------------

