On Fri, 2010-12-17 at 10:00 +0000, John Haxby wrote:
> On 17 December 2010 09:41, Ralf Corsepius <rc040203@xxxxxxxxxx> wrote:
> > That said, I'd choose "C" to get started. It's a bit of a rough ride
> > in the beginning, but it pays off in the longer term.
>
> Actually, no, C is dead easy to start but it gets really difficult
> really quickly. Consider these for a beginner:
>
> * Write the declaration of signal(3) -- it takes two parameters, an
>   integer and a pointer to a function that takes an integer parameter
>   and returns void. Explain why the parentheses are needed.
>
> * Why does "a + b == 0" work the way you expect but "a & b == 0" does
>   not? Are you sure it doesn't?
>
> * What is the difference between "const char *s" and "char * const s"?
>
> * What is the difference between "char *s" and "char s[]"?
>
> Admittedly the very first of these is not likely to come up for a
> beginner, but the other three will, and they'll bite you good and hard.
>
> C is not a simple language; it has a lot of subtlety and it is
> incredibly expressive, but I would not use it as the beginning
> language for someone who wants to learn to program. I'd start with a
> language that was designed carefully. There aren't any Algol68
> compilers any more :-) but I'd choose Python or Java to learn to
> program. Once you know what you want to do, then you can go for
> something else, something applicable to what you want to do. When you
> know the basics, those questions about C are still difficult, but at
> least you're not trying to understand them at the same time as
> learning what happens to a parameter when you pass it to a function
> or, for that matter, what a function is.
>
> jch

I have taught four different languages professionally: BASIC, Pascal, FORTRAN (actually a language with syntax similar to FORTRAN), and C. I prefer to teach people C. I am not a real C guru, but I can tell you that it has the power either for great abstraction or to work very close to the machine.
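To make John's first, third and fourth questions concrete, here is a small sketch. The helper names (const_demo, array_vs_pointer) are mine, purely for illustration; only the signal(3) declaration itself comes from the standard library:

```c
#include <stddef.h>

/* Q1: signal(3).  Without the parentheses in (*handler), the inner
 * parameter "void *handler(int)" would declare a function returning
 * void *.  The parentheses bind the * to handler, making it a pointer
 * to a function taking int and returning void. */
void (*signal(int signum, void (*handler)(int)))(int);

/* Q3: where the const goes decides what is constant. */
char const_demo(void)
{
    char buf[] = "abc";
    const char *s = buf;  /* pointer to const char: cannot write *s */
    char *const t = buf;  /* const pointer to char: cannot reassign t */
    s = "xyz";            /* legal: s itself is mutable */
    *t = 'Z';             /* legal: what t points at is mutable */
    (void)s;
    return buf[0];        /* now 'Z' */
}

/* Q4: an array is not a pointer.  sizeof sees the whole array for a,
 * but only the pointer for p, and a's contents are a writable copy
 * while p may point at read-only storage. */
size_t array_vs_pointer(void)
{
    char a[] = "hello";   /* 6-byte array, literal copied in */
    char *p = "hello";    /* pointer to a (possibly read-only) literal */
    (void)p;
    return sizeof a;      /* 6, not sizeof(char *) */
}
```

Uncommenting a write through s, or a reassignment of t, turns each "cannot" above into a compile error, which is a quick way to convince yourself of the difference.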
It depends on how the programmer views the problem. I know that there are people who will argue with me on that point, but let's not get sidetracked here. The goal of a programmer is to produce useful, working code that does the jobs that need to be done.

There are anomalies in all languages. Some of them are inherent to the language; some are due to the compilation process, in the conversion from a somewhat human-recognizable syntax to machine code in several steps. So to talk about the things that have bothered one person in one language is not really material at this point. And in some cases those issues can be compiler dependent -- in the "a + b == 0" case, for instance, the potential for error comes from whether a and b are integer, float, double or long double. The defined value of "true" also varies between languages: in C an equality test such as b == 0 always yields the int 1 when it holds, but in other languages true can be -1, i.e. binary 1111 1111 1111 1111 in a 16-bit word rather than 0000 0000 0000 0001. And the C definition makes char and byte synonymous at this time, although new character sets are represented differently. So the details of evaluation are compiler dependent and in most cases somewhat machine dependent as well.

My view is that of a programmer who does mostly embedded-type stuff, so it is different from that of a person who specializes in, say, graphics, or databases, or even text manipulation, or perhaps a medical or biological programmer. Furthermore, some programmers specialize in mathematical areas, such as filtering, high precision, or signal processing. Others work in theoretical physics. Choosing Python to manage long-chain mathematics would probably not be too efficient or productive. Fortran would not be good for text manipulation. Lisp would not make a good report generator (in my opinion, anyway), and APL would not be good for database administration.

I do all kinds of programming in C. I cannot do a full database from scratch in C, and would probably use SQL in some form for that, but for most "quick hack" tools I still use C from the command line.
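As a concrete sketch of the "a & b == 0" puzzle from John's list (the two helper functions are mine, for illustration): in standard C the surprise is one of operator precedence -- == binds more tightly than &, while + binds more tightly than == -- and an equality test always evaluates to the int 0 or 1:

```c
/* "a & b == 0" parses as "a & (b == 0)", because == has higher
 * precedence than &.  "a + b == 0" parses as "(a + b) == 0", which is
 * what a beginner expects, because + has higher precedence than ==. */
int and_eq_unparenthesized(int a, int b)
{
    return a & b == 0;      /* really a & (b == 0) */
}

int and_eq_parenthesized(int a, int b)
{
    return (a & b) == 0;    /* what the programmer usually meant */
}
```

With a = 1 and b = 2 the two versions disagree: (1 & 2) == 0 is true, but 1 & (2 == 0) collapses to 1 & 0, which is 0. Modern compilers will warn about the unparenthesized form with -Wparentheses.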
Most of my programs are repeat-use, but require very little interaction and little in the way of feedback (think yum as an example). When working on microcontrollers I use assembly for most programs. When working in my professional field as a test programmer, I rely on C, BASIC, assembly and machine language (bit-level control of parts is required to test them).

This is all to say that the ultimate language you use will be mostly determined by your career path, but you will most likely use more than one. To learn, I recommend Intel assembler and C, mostly because they are good examples of both power and complexity, and can produce quick, simple programs that do useful things and represent real accomplishment. But remember that my programming is focused on functional programming -- that is, programs that produce hardware operations in real time.

I would also recommend reading a lot of code. There are a lot of oxymorons in code production. Self-documenting code is one: if it were really self-documenting, we probably wouldn't call it code. Logical sequence is another: while many things take place in a logical sequence, when programming for efficiency, what is fast and what is logical will sometimes be at odds. Reading code and reading about code will give you an edge with these types of issues.

Pick a language and work on it. If you get frustrated or need help, see if the language has a forum. Read it. Ask for help. There are dumb questions, but not asking them is even worse. If someone blasts you about a dumb question, tell them you don't know much and are willing to learn -- but then make the real effort to learn.

Programming is fun, challenging, nerve-wracking, frustrating, and amazing all at the same time.

Regards,
Les H

--
users mailing list
users@xxxxxxxxxxxxxxxxxxxxxxx
To unsubscribe or change subscription options:
https://admin.fedoraproject.org/mailman/listinfo/users
Guidelines: http://fedoraproject.org/wiki/Mailing_list_guidelines