
Why We Can't Divide By Zero
By: Hadi Shatat
Grade: 10
In the world of mathematics, many strange results are possible
when we change the rules, but there's one rule that most of
us have been warned not to break: don't divide by zero. How
can the simple combination of an everyday number and a
basic operation cause such problems?
Normally, dividing by smaller and smaller numbers gives you
bigger and bigger answers: for example, ten divided by two is
five, by one is ten, and by one-millionth is ten million, and so on.
So it seems that if you divide by numbers that keep shrinking all
the way down to zero, the answer will grow to the largest thing
possible. Isn't ten divided by zero, then, actually infinity? That
may sound plausible, but all we really know is that if we divide
ten by a number that tends towards zero, the answer tends
towards infinity. And that's not the same thing as saying that
ten divided by zero is equal to infinity. Why not?
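The shrinking-divisor pattern above is easy to check numerically. A quick sketch in Python (the divisors are chosen just for illustration):

```python
# Dividing ten by ever-smaller positive numbers: the quotient blows up,
# but this only describes a limit, not a value for 10 / 0 itself.
for divisor in [2, 1, 0.001, 0.000001]:
    print(10 / divisor)

# Python refuses the final step outright, for exactly the reason
# the rest of this essay explains:
try:
    10 / 0
except ZeroDivisionError as error:
    print("undefined:", error)
```

Note that the loop only ever divides by numbers *near* zero; no amount of shrinking ever reaches the case of zero itself.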
Well, let's take a closer look at what division really means. Ten
divided by two could mean: how many times must we add two
together to make ten, or two times what equals ten? Dividing by a
number is essentially the reverse of multiplying by it, in the
following way: if we multiply any number by a given number x, we
can ask whether there's a new number we can multiply by afterwards
to get back to where we started. If there is, the new number is called
the multiplicative inverse of x. For example, if you multiply three by
two and get six, you can then multiply by one-half to get back to
three. So the multiplicative inverse of two is one-half, and the
multiplicative inverse of ten is one-tenth. As you might notice, the
product of any number and its multiplicative inverse is always one.
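That defining property, x times its inverse equals one, can be demonstrated directly. A small sketch using Python's exact fractions (the sample values are arbitrary):

```python
from fractions import Fraction

# The multiplicative inverse of x is the number that undoes
# multiplying by x: x * inverse == 1, exactly.
for x in [Fraction(2), Fraction(10), Fraction(3, 4)]:
    inverse = 1 / x
    print(x, "*", inverse, "=", x * inverse)  # always 1
```

Fractions are used here instead of floats so the products come out as exactly one, with no rounding error.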
If we want to divide by zero, we need to find its multiplicative
inverse, which should be one over zero. This would have to be
a number that, multiplied by zero, gives one. But because
anything multiplied by zero is still zero, no such number
exists, so zero has no multiplicative inverse.
Does that settle things, though? After all, mathematicians
have broken rules before. For example, for a long time, there
was no such thing as taking the square root of a negative
number. But then mathematicians defined the square root of
negative one as a new number called i, opening up a whole new
mathematical world of complex numbers. So if they can do
that, couldn't we just make up a new rule? Say that the
symbol infinity means one over zero, and see what happens?
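That earlier rule-break worked out: complex numbers are so well-behaved that Python ships with them built in, where 1j plays the role of i. A quick sketch:

```python
# The "impossible" number i, defined so that i squared is -1,
# is an ordinary value in Python's complex arithmetic.
i = 1j
print(i ** 2)               # (-1+0j)
print((2 + 3j) * (2 - 3j))  # a purely real product, 13
```

Defining i led to a consistent, useful number system; the question is whether defining one over zero does the same.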
Let's try it, imagining we don't know anything about infinity
already. Based on the definition of a multiplicative inverse,
zero times infinity must be equal to one. That means zero
times infinity plus zero times infinity should equal two. Now,
by the distributive property, the left side of the equation can
be rearranged to zero plus zero, all times infinity. And since
zero plus zero is zero, that reduces to zero times infinity.
Unfortunately, we've already defined this as equal to one,
while the other side of the equation is still telling us it's
equal to two. So one equals two. Oddly enough, that's not
necessarily wrong; it's just not true in our normal world of
numbers. There's still a way it could be mathematically valid:
if one, two, and every other number were equal to zero. But
having infinity equal to zero is ultimately not all that useful
to mathematicians, or to anyone else.
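The contradiction described above can be laid out as a short derivation, taking the symbol ∞ to mean one over zero by definition:

```latex
\begin{align*}
0 \cdot \infty &= 1 && \text{(definition: } \infty = 1/0\text{)} \\
0 \cdot \infty + 0 \cdot \infty &= 1 + 1 = 2 \\
(0 + 0) \cdot \infty &= 2 && \text{(distributive property)} \\
0 \cdot \infty &= 2 && \text{(since } 0 + 0 = 0\text{)} \\
1 &= 2 && \text{(contradicts line one)}
\end{align*}
```

The only way to rescue the system is to let every number equal zero, which is exactly the collapse the paragraph above describes.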
