Why do computers treat division by zero as undefined?
As we all know from mathematics, we cannot divide a value by zero. The statement "something divided by 0 is infinity" only makes sense as a limit: it describes how the quotient grows as the divisor approaches zero, not an actual value. So why can't the computer simply take division by zero as infinity?
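To see this concretely, here is a minimal Python sketch of what actually happens when a program attempts the division (Python raises a ZeroDivisionError instead of returning an infinite value):

```python
# Python refuses division by zero outright rather than returning infinity.
for numerator, denominator in [(10, 2), (10, 0)]:
    try:
        print(f"{numerator} / {denominator} = {numerator / denominator}")
    except ZeroDivisionError as err:
        print(f"{numerator} / {denominator} -> ZeroDivisionError: {err}")
```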
From the computer's point of view, division is repeated subtraction. To divide a number by some value, the computer keeps subtracting the divisor from the dividend until the remainder reaches zero or would go below zero, and the number of subtractions performed is the quotient.
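Here is a minimal Python sketch of that repeated-subtraction idea for non-negative integers (the function name is my own, chosen for illustration; the zero case is handled in the next sketch):

```python
def divide_by_repeated_subtraction(dividend, divisor):
    """Divide by counting how many times the divisor can be
    subtracted before the remainder drops below the divisor.
    Assumes divisor > 0; with divisor == 0 this loop would
    never terminate, which is exactly the problem discussed next."""
    quotient = 0
    remainder = dividend
    while remainder >= divisor:
        remainder -= divisor
        quotient += 1
    return quotient, remainder

print(divide_by_repeated_subtraction(12, 3))  # (4, 0)
print(divide_by_repeated_subtraction(14, 4))  # (3, 2)
```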
As the two worked examples in the diagram above show, this method is correct and division relies on it. But see what happens if you divide a number by zero: subtracting zero never makes the dividend any smaller, so the subtraction loop never reaches zero and never terminates.
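To watch that failure safely, here is the same loop with an arbitrary safety cap added (the cap and function name are mine, purely for illustration):

```python
def count_subtractions(dividend, divisor, max_steps=20):
    """The same subtraction loop, capped so the divisor-zero
    case can be observed without hanging the program."""
    remainder, quotient = dividend, 0
    while remainder > 0 and remainder >= divisor:
        if quotient >= max_steps:
            return None  # gave up: subtracting 0 never shrinks the remainder
        remainder -= divisor
        quotient += 1
    return quotient

print(count_subtractions(12, 3))  # 4
print(count_subtractions(12, 0))  # None -- the loop makes no progress
```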
From the above point of view, any number divided by 0 would be infinity. But division and multiplication are inverses: if we multiply the result (infinity) back by the divisor (zero), the exact dividend should come out. That never happens, because infinity times zero cannot pick out any one number.
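IEEE 754 floating-point arithmetic illustrates the same inverse-check failure: it does define an infinity value, yet multiplying it by zero yields NaN ("not a number") rather than recovering any particular dividend. A minimal Python sketch:

```python
import math

# If 10 / 0 were really infinity, then infinity * 0 ought to give 10 back.
product = math.inf * 0
print(product)              # nan
print(math.isnan(product))  # True -- the product is undefined, not 10
```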
As the diagram of y = 1/x above shows, if we let x approach zero from the positive side, the y value shoots upward without bound, and from the negative side it falls without bound. Since the two sides disagree, there is no single infinite value we can assign. So, in conclusion, we consider division by zero to be undefined.
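A quick numeric check of those two one-sided limits of y = 1/x (the sample points are arbitrary choices for illustration):

```python
# Approach x = 0 from both sides and watch 1/x diverge in opposite directions.
for x in [0.1, 0.01, 0.001, 0.0001]:
    print(f"1/{x} = {1 / x:12.1f}    1/{-x} = {1 / -x:12.1f}")
# The right-hand values head toward +infinity, the left-hand toward -infinity,
# so no single value can be assigned at x = 0.
```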