What do you mean exactly?
I'm saying that the "..." notation isn't really a proper notation for numbers; it kinda means "reader, I assume you know what I mean, and I'm too lazy to write infinitely many digits here". Using that notation to prove something doesn't really work well.
Then what does 1/9 equal in decimal form?
There's no finite one. There are plenty of numbers that don't have a terminating decimal expansion, including Pi.
1 = 1
1/9 = 1/9
1/9 = 0.11111...
9*(1/9) = 9*(0.11111...)
1 = 0.99999...
QED
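For what it's worth, the exact-arithmetic steps of that proof can be sanity-checked with Python's fractions module (a quick sketch; note this only verifies the rational arithmetic, not the contested step of writing 1/9 as 0.11111...):

```python
from fractions import Fraction

one_ninth = Fraction(1, 9)   # exact rational 1/9, no decimal expansion needed
product = 9 * one_ninth      # 9 * (1/9), computed exactly
print(product)               # 1
assert product == 1
```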
I prefer this: https://upload.wikimedia.org/math/6/f/a/6fa510b44742046a167b4b8515162825.png
This one isn't really that great either. It just relies on a different assumption, namely that the limit of the partial sums *is* that number. So in the end the argument is circular, since the limit notation is defined by exactly the same thing you're trying to prove.
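To spell out the limit-based reading of the notation being discussed (a sketch of the standard definition, not necessarily what the linked image argues):

```latex
0.999\ldots \;:=\; \lim_{n\to\infty} \sum_{k=1}^{n} \frac{9}{10^k}
\;=\; \lim_{n\to\infty} \left(1 - 10^{-n}\right) \;=\; 1
```

Under that definition the equality follows immediately; the whole debate then shifts to whether that is the right definition of the "..." notation in the first place.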
To me, it's mostly just a matter of definition that 0.99999... = 1. The argument goes that the difference is infinitely small and thus you could never define a meaningful difference between the two.
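That "infinitely small difference" claim can be illustrated concretely: the gap between 1 and the n-digit truncation 0.99...9 is exactly 10^-n, which shrinks below any positive number as n grows. A small sketch using exact rationals:

```python
from fractions import Fraction

# Partial sums 0.9, 0.99, 0.999, ... as exact rationals,
# together with their difference from 1.
for n in range(1, 6):
    partial = sum(Fraction(9, 10**k) for k in range(1, n + 1))
    diff = 1 - partial
    print(n, partial, diff)
    # The gap is exactly 1/10**n at every step, so no fixed
    # positive number can separate 0.999... from 1.
    assert diff == Fraction(1, 10**n)
```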
rumborak