I'm curious what everyone thinks about dividing by zero. That's the real mindfuck.
I'll try to elaborate on this one. Basically you can see why "you can't divide by zero" when you use sequences. A sequence is just an infinite list of numbers, like 1, 1, 1, 1, .... There are different types of sequences, and one of them is a converging sequence. Look at this image:
A sequence converges if and only if there exists a number L such that, for every number e I choose, eventually the "tail" of the sequence fits inside the interval (L - e, L + e). Try to think about what this means. For the picture, if I pick e = 10000 it's obvious. But it has to work for every e, so I can also pick e = 0.0000000000000000000000000001 etc.
We say that this L is the limit of the sequence.
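If it helps to see the definition in action, here's a tiny Python sketch (just an illustration; the sequence a(n) = 2 + 1/10^n and the candidate limit L = 2 are my own picks, not anything special):

    # sequence a_n = 2 + 1/10**n, which should converge to L = 2
    def a(n):
        return 2 + 1 / 10**n

    L = 2
    for e in [10000, 1, 0.001, 1e-10]:
        # first index whose term lands inside (L - e, L + e); since a_n keeps
        # shrinking toward 2, every later term stays inside that interval too
        N = next(n for n in range(1, 200) if abs(a(n) - L) < e)
        print("e =", e, "-> the tail fits inside (L - e, L + e) from index", N, "onwards")

No matter how small you make e, you can always find a point in the sequence after which everything stays inside that interval. That's all "converges to L" means.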
Anyway, back to "dividing by zero"... But first let's look at the sum 2+3. Obviously it's 5, but we can also see this in the following way.
Take a sequence that converges to 2, take one that converges to 3, and add them term by term. The sum will converge to 5.
This is obvious. Let's take the sequence 2.1, 2.01, 2.001, 2.0001, ... This sequence obviously converges to 2. In the same way let's take 3.1, 3.01, 3.001, ...
The sum is 5.2, 5.02, 5.002, 5.0002, ... and this sequence will converge to 5.
I could also have taken 1.9, 1.99, 1.999, ... and 2.9, 2.99, 2.999, ... The sum now is 4.8, 4.98, 4.998, 4.9998, ... which also converges to 5. No matter which sequences converging to 2 and to 3 I pick, the sum will always converge to 5.
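If you want to watch that happen numerically, here's a throwaway Python snippet (the number of terms is arbitrary):

    # add a sequence converging to 2 to one converging to 3, term by term
    a = [2 + 1 / 10**n for n in range(1, 8)]   # 2.1, 2.01, 2.001, ...
    b = [3 + 1 / 10**n for n in range(1, 8)]   # 3.1, 3.01, 3.001, ...
    s = [x + y for x, y in zip(a, b)]          # 5.2, 5.02, 5.002, ...
    print(s)                                   # the terms crowd in on 5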
The same is true for any operation like 4+7, 3*27 or 0*5.
Now let's look at, for example, 2/0. Take any sequence that converges to 2. For simplicity let's just take the constant sequence 2, 2, 2, 2, ....
Now we can make one that converges to 0 like 1, 0.1, 0.01, 0.001, 0.0001, 0.00001, ...
Now if we divide these we get 2, 20, 200, 2000, 20000, .... As you can see, this sequence does not converge to any number, the elements just get bigger and bigger. We say this sequence diverges to +infinity.
Good. So you could say that 2/0 = +infinity, right? Well, no, because if we take the sequence -1, -0.1, -0.01, -0.001, ... and divide the constant sequence that converges to 2 by it, we get -2, -20, -200, -2000, ... Here the elements just get more and more negative, and we say this sequence diverges to -infinity. So 2/0 = -infinity.
And because +infinity does not equal -infinity, we can't divide by zero: you get different results depending on how you approach it.
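Here's a small Python sketch showing both directions side by side (same sequences as above, just generated in code):

    # divide the constant sequence 2, 2, 2, ... by sequences that approach 0
    # from the positive side and from the negative side
    pos = [10**-n for n in range(0, 6)]   # 1, 0.1, 0.01, ...
    neg = [-x for x in pos]               # -1, -0.1, -0.01, ...
    print([2 / x for x in pos])   # 2, 20, 200, ...    blows up towards +infinity
    print([2 / x for x in neg])   # -2, -20, -200, ... blows up towards -infinity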
PS. If you take 0/0, you can get any real number if you construct the sequences right!
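To see what I mean, a quick Python sketch (the target 7 is arbitrary, any number works):

    # two sequences that both converge to 0 whose quotient converges to 7:
    # a_n = 7/10**n and b_n = 1/10**n
    target = 7
    a = [target / 10**n for n in range(1, 8)]   # converges to 0
    b = [1 / 10**n for n in range(1, 8)]        # converges to 0
    print([x / y for x, y in zip(a, b)])        # 7.0 every time, so the quotients converge to 7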
I hope this helps a little.