Re: OT: computer arithmetic question on integer division

On Tue, 18 Jan 2005 15:59:50 -0800 (PST), Globe Trotter
<itsme_410@xxxxxxxxx> wrote:
> Hi, thanks, to you and everybody!
> 
> My question is: how does a computer actually do it? When we are dividing a
> number by a multiple of two, for example 5 = 101 divided by 2 = 10, then all we
> do is shift the bit by the number of digits after the leading 1 in 2. So, 101
> divided by 10 means drop the last 1 in the 101 so we get 10.
> 
> What happens if we divide 5= 101 by 3= 11? Anything similar?

Oh, so you're more interested in the "algorithm", and not the actual
hardware (transistors and stuff).  The hardware implementation (which
is how the computer *really* does it) is covered by the subject known
as "computer architecture".  But you can discuss the algorithms without
the hardware, in which case it's then either computer science or
just plain mathematics.

In general, division by any power of your *base* is extremely
easy.  In everyday base 10 (decimal), everyone knows that to divide
by 10 you just drop the rightmost digit (the dropped digit is the
remainder).  Some computers actually
work in base 10 (mostly older mainframes using what's called
"packed decimal"), and so for them division by 10 is trivial.

Most modern computers primarily work with base 2 (binary).
And as you discovered, division by 2 in binary is likewise just
dropping the rightmost digit (a right bit-shift operation).
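
For example, here's a minimal sketch in C (unsigned values only; note
that a right shift of a *negative* signed number is not the same thing
as truncating division):

    #include <stdio.h>

    int main(void)
    {
        unsigned int n = 5;              /* binary 101 */
        unsigned int q = n >> 1;         /* drop the last bit: binary 10 */
        printf("%u >> 1 = %u\n", n, q);  /* prints 5 >> 1 = 2 */
        return 0;
    }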

It works the same for any base (>=2).

The difficulty is when you're not dividing by the base (or a power
of the base).  Then it's hard, and the most straightforward solutions
usually involve things like carries, iteration, etc.  It's too involved
to cover fully here; but it's really not that much different from how
you go about doing long division on paper (although there are more
complex optimizations out there too).
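
To make that concrete, here's a minimal sketch in C of the
shift-and-subtract ("restoring") method, which is just paper long
division carried out in base 2 (it assumes a 32-bit unsigned int):

    #include <stdio.h>

    /* Unsigned binary long division: bring down one bit of the
     * dividend at a time, and subtract the divisor whenever it
     * "goes into" the partial remainder.  Divisor must be nonzero. */
    void divmod(unsigned int dividend, unsigned int divisor,
                unsigned int *quot, unsigned int *rem)
    {
        unsigned int q = 0, r = 0;
        for (int i = 31; i >= 0; i--) {
            r = (r << 1) | ((dividend >> i) & 1);  /* bring down next bit */
            if (r >= divisor) {
                r -= divisor;
                q |= 1u << i;                      /* quotient gets a 1 here */
            }
        }
        *quot = q;
        *rem  = r;
    }

    int main(void)
    {
        unsigned int q, r;
        divmod(5, 3, &q, &r);
        printf("5 / 3 = %u remainder %u\n", q, r);  /* 1 remainder 2 */
        return 0;
    }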

You may of course realize that division by any number is extremely
easy if you just convert your numbers into the correct base beforehand.
But, not surprisingly, the algorithms to convert from one base to
another in fact depend upon the division operation.
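
For instance, the usual way to get the decimal digits of a binary
integer is repeated division by 10, keeping the remainders, so base
conversion can't save you from dividing in the first place.  A sketch
in C:

    #include <stdio.h>

    /* Print n in decimal by repeatedly dividing by 10; the remainders
     * are the digits, produced least-significant first. */
    void print_decimal(unsigned int n)
    {
        char digits[16];
        int len = 0;
        do {
            digits[len++] = '0' + (n % 10);  /* remainder = next digit */
            n /= 10;                         /* the conversion itself divides */
        } while (n > 0);
        while (len > 0)
            putchar(digits[--len]);          /* emit most-significant first */
        putchar('\n');
    }

    int main(void)
    {
        print_decimal(12345u);  /* prints 12345 */
        return 0;
    }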

If you really want to understand these kinds of things in depth, perhaps
the best book you can get is Volume 2 (Seminumerical Algorithms) of
Donald Knuth's series The Art of Computer Programming.  It's really the
*bible* of these sorts of things.

See http://en.wikipedia.org/wiki/The_Art_of_Computer_Programming

-- 
Deron Meranda

