[...]

> > In fairness, computers are now so complex that it is almost inevitable that
> > some unforeseen circumstances will arise. Not that this has any relevance to
> > the problem I've just had, but I don't feel that it is possible, today, for
> > any systems designers to be absolutely sure that nothing odd could happen.
>
> Umm, not so. I recall my "software engineers" complaining that
> their software/firmware could not anticipate every possible
> circumstance. I would point out to them that "Your code is going to
> do *something* in every circumstance. What it does can either
> be something you *chose* for it to do, a *considered* decision,
> or it can be something you simply *let* it do, *without*
> any consideration. Which do you prefer to have to support?"
>
> These days, all the "complex interactions" are pretty much due
> to intelligent devices with uCs running firmware. If it does
> something weird, then that's just some firmware writer being
> in a certain sense "lazy".

I would say that depends on the definition of lazy. It's kind of like professors who stand up in the intro to compilers class and say, "A good compiler could completely do away with the need for a program stack." In the ideal case, yes, but ideal processors have infinite memory and run at infinite speed. Sure, there are plenty of lazy engineers, programmers, and techs, plenty of impatient project managers, and plenty of sales personnel only too happy to promise impossible delivery dates, but there are also problems that are genuinely hard to solve. Testing is definitely part of that class, particularly when the machine being built has to deal with humans fumbling around. I don't think anyone has yet found an efficient solution to the general class of NP-complete problems.
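For what it's worth, the quoted "chosen vs. merely allowed" behavior is easy to make concrete. Here is a minimal C sketch (the event names and messages are made up for illustration) of a firmware-style dispatcher where even an input nobody anticipated lands in an explicit, considered default path instead of whatever the toolchain happens to leave you with:

    #include <stdio.h>

    typedef enum { EV_POWER_ON, EV_SENSOR_READY, EV_TIMEOUT } event_t;

    static void handle_event(event_t ev)
    {
        switch (ev) {
        case EV_POWER_ON:
            puts("init hardware");
            break;
        case EV_SENSOR_READY:
            puts("start sampling");
            break;
        case EV_TIMEOUT:
            puts("retry or back off");
            break;
        default:
            /* The *chosen* behavior for anything unforeseen: log it and
             * drop to a safe state, rather than silently doing whatever
             * an unhandled value happens to do. */
            fprintf(stderr, "unexpected event %d -- entering safe state\n", (int)ev);
            break;
        }
    }

    int main(void)
    {
        handle_event(EV_POWER_ON);
        handle_event((event_t)42);  /* out-of-range input still gets a considered response */
        return 0;
    }

That default arm doesn't make the firmware able to anticipate everything; it just makes the unanticipated case something the author decided about, which is the distinction being argued above.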