On Mon, Jan 17, 2005 at 03:15:38PM -0800, Globe Trotter wrote:
> Does anyone know how the computer *actually* does integer
> division? I understand that with powers of 2, it just does bit
> shifting, but what about other powers, do you know?
>
> (for example, 5/2 = 101/10, so shift the 101 by 1 and get 10, but how
> does 5/3 work, let's say?)

You need to look at the assembly code. Some compilers will see a
constant divisor (2, for example) and do a shift or other
optimization. With variables, the compiler might have to use the
div (idiv) instruction.

With constants, interesting optimizations take place. Since div/mul
by 2 is an optimization, the precise code generated can differ
depending on the flags passed to the compiler. Both idiv and a shift
will get the correct answer.

For example, this line in some dumb test code will also get the
correct answer:

    d = 144/2;

becomes

    movl $0x48,0xfffffff0(%ebp)

i.e. the compiler does the computation at compile time and loads the
constant 72 (0x48). No shift, no idiv: (144/2) is a constant.

Or are you asking how the compiler computes the constant it uses
after it discovers that the code is using a constant?

Some compilers will (correctly) go so far as to toss an entire
computation block if the results are never referenced.

--
T o m  M i t c h e l l
spam unwanted email. SPAM, good eats, and a trademark of Hormel Foods.