Tuesday, September 7, 2010

It can't be true!

Yesterday I had the joy of sharing Cantor's famous diagonal argument, proving that the real numbers form an uncountable set. This semester, instead of teaching my own class, I get to be a TA for an honors Calculus III course, which goes through many advanced topics to prepare students for higher-level mathematics. As a result, I get to share moments like these, in which students go from being calculators to being real mathematicians.

Mathematics finally means something to a student when he encounters a proof of something that he can't believe. I had at least one student express his inability to accept the diagonal argument after I gave it. Of course, the proof is irrefutable--it is logically sound, and despite the large number of attempts made every year by crank mathematicians, it can never be overturned.

That's part of the sheer beauty of mathematics. It shows how pure logic can yield surprising results. Why are we surprised by things that are logically irrefutable? This is not a mathematical question, but a question about the human spirit; yet in some respects it is one which only the mathematician can encounter. No one else can experience quite the same thrill, or frustration, at coming face to face with those results of pure logic that seem to break down every sense of intuition you ever had.

Why does Cantor's proof frustrate students so?

The result itself frustrates people in general, at least in my experience. When you explain that the real numbers can never be counted, even if you counted them for all eternity, even if you go all the way to infinity, they do not believe you. Or they don't understand what you even mean. The young student of mathematics, on the other hand, can see and accept each line of the proof; yet much like any other person, he is intuitively troubled by the result itself. This manifests itself in the form of complaints about the proof; yet what it always seems to come down to is that the result is incomprehensible on any intuitive level.

Why is this? I don't really think that it is because the logical steps are hard to follow. Indeed, I think that even a person relatively untrained in mathematics can understand each step in the proof. What I think people can't really and truly deal with are the starting assumptions themselves. To embrace those assumptions takes imagination.

What is infinity, in the first place? To the extent that it touches upon our everyday experience, infinity can actually be something quite small. The national debt, for instance, is for all practical purposes infinite. No single human being can fathom having control of that much money. If I had $13 trillion in my bank account, I would simply never run out of money. I could spend $5000 every second for the next 80 years and still not run out.
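
If you don't believe me, the arithmetic is easy to check. Here's a quick back-of-the-envelope sketch in Python, using the rough figures from above (not official Treasury numbers):

    # Spending $5000 every second for 80 years.
    seconds_per_year = 60 * 60 * 24 * 365.25      # about 31.6 million seconds
    total_spent = 5000 * seconds_per_year * 80
    print(total_spent)    # about 1.26e13, i.e. $12.6 trillion -- still under $13 trillion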

Infinity, then, is a guarantee: there's always something left. The truth is, though, 13 trillion can be a very small number of some things. Avogadro's number is somewhere on the order of a trillion trillions; yet this ridiculously large number is equal to the number of molecules in a measly 32 grams of oxygen. Numbers in this universe get so large that our minds stop distinguishing between them all. (This has unfortunate political consequences, as people have yet to truly comprehend how astronomical their own governments' spending really is.)
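
To put those two numbers side by side, here's another quick sketch, using the standard value of Avogadro's number rather than the loose estimate above:

    # How many national debts fit inside Avogadro's number?
    national_debt = 13e12            # dollars, roughly
    avogadro = 6.022e23              # molecules in 32 grams of oxygen gas (one mole of O2)
    print(avogadro / national_debt)  # about 4.6e10 -- tens of billions of national debts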

So then, one who is willing to stretch out with the imagination can make this assumption: for every number n, there is always a number n + 1. You can always go, as Christopher Guest's character in This Is Spinal Tap might put it, "one louder." We have formalized the guarantee of infinity. We have assumed the existence of an infinite set of numbers. One, two, three, four, five, ... ten, ... twenty, ... thirty, ... one hundred, ... one thousand, ... one million, ... one trillion, ... why should it ever end?
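
You can even watch a computer act out that guarantee. Here's a minimal sketch of the assumption in Python (the function name is my own):

    from itertools import islice

    # The guarantee, formalized: after every number n there is always n + 1.
    def counting_numbers():
        n = 1
        while True:
            yield n        # here's a number...
            n = n + 1      # ...and there's always one more

    print(list(islice(counting_numbers(), 10)))   # 1 through 10 -- and it never has to end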

But that isn't the only kind of guarantee one seeks in this world of computation. We also seek the guarantee of being really close to something. For instance, there is this ugly business of finding the circumference of a circle. A circle, of all things! A beautiful, simple shape--nothing could be simpler, really. And yet, the ratio between its circumference and its diameter is such a monster of a number that we are forced to give it a Greek name--"pi"--and leave it at that.

Or are we? We all heard in grade school that pi = 3.14... People sometimes wonder if I have all the digits of pi memorized, although some more sophisticated folks know that you can't memorize all of them but still wonder how many I know. Well, I know up to about eight, I think: 3.1415926... And I've heard that schoolchildren sometimes have contests to see how many digits of pi they can memorize. Yet no matter how many digits they memorize, they will never have actually gotten pi. Never. Not in a million billion trillion years--not ever.

But they are getting closer and closer. How much closer is rather easy to quantify. If I guess pi = 3.14, then I am within one-hundredth of pi--that is, pi is between 3.14 and 3.15. If I guess pi = 3.1415926, then I am within one ten-millionth of pi. That's pretty close. Never exactly right, but closer and closer. So that's my guarantee. I can always get closer.
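
Here's that guarantee acted out in a few lines of Python. (The computer's built-in pi is itself only an approximation, but a far finer one than the truncations we're comparing against it.)

    import math

    # Truncating pi to more and more decimal places: never exactly right,
    # but always closer.
    for places in range(2, 8):
        guess = math.floor(math.pi * 10**places) / 10**places
        print(guess, "is within", 10.0**-places, "of pi")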

There are plenty of other common ratios in the real world which have no exact finite decimal representation--in fact, no repeating pattern ever emerges in their decimal expansions. The square root of 2 is one of them--it is simply the length of the diagonal of a square whose sides have length 1. The square roots of most numbers are the same way, and these can all be represented using common, everyday shapes. In each case, we do have a guarantee, as we did with pi. We can get closer and closer by taking more and more decimal places. That is, we can take the ratio between two ordinary natural numbers and be as close as we want to be to a weird number like pi.
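
And each of those truncations really is a ratio of two ordinary natural numbers. A quick sketch, this time with the square root of 2:

    import math

    # Each decimal truncation of sqrt(2) is a ratio of two natural numbers:
    # 14/10, 141/100, 1414/1000, ... -- as close as we want, never exact.
    for places in range(1, 8):
        numerator = math.floor(math.sqrt(2) * 10**places)
        print(numerator, "/", 10**places, "=", numerator / 10**places)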

Now what if I imagine that there is a number at the end of every conceivable decimal expansion? I say to myself, "Look, I can just start typing digits after a decimal point, and there's no stopping me." In fact, just look at me as I cough up digits right now:

.011923985461928384093745601092983562390523984691348612340213901098234908123984...

Try it! It's kind of therapeutic, actually.
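
Or let a computer do the coughing for you. A little sketch:

    import random

    # Cough up forty arbitrary digits after a decimal point.
    digits = "".join(str(random.randint(0, 9)) for _ in range(40))
    print("0." + digits + "...")   # there's no stopping you: raise 40 as high as you like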

Now I stretch out my imagination and put my faith in a new assumption: that this process of picking new digits can continue on forever and ever, and that no matter how the digits are picked at each step, the result can rightfully be called a number. I don't know what that number is, but I know that I'm getting closer and closer to it with every arbitrary choice of a new digit. Just as it took imagination to take on the assumption that there is always one more number, so it also takes imagination to embrace this new assumption. It means formalizing a guarantee.

It is, in a sense, an act of faith. Implicitly it means trusting that guarantee to have some sort of meaning. Otherwise, what would be the point of studying the logical results of that guarantee?

If you've gone with me this far, if you have enough faith to believe that after every number n there's always n+1 and that any decimal that can be continued on forever should be considered a number, then congratulations! You have, more or less, just embraced what mathematicians call the real numbers. If not, don't worry. Chances are your world can do without such big numbers. Even the national debt is probably higher than you'll ever need to imagine.

Young students of mathematics have generally been unwittingly indoctrinated into having faith in the real numbers. Duh, there's always a bigger number, and of course every decimal is a number. Why would we have to "imagine" that? Why, indeed! The ancient Greeks didn't even believe in zero.

Our educational system teaches these principles from a very young age. Enter pi on your calculator. See? A decimal comes up! And those digits can keep going and going... Thus the youth are catechized into the traditions of their elders, unaware that without imagination, none of the structure in their mathematical universe could ever have arisen.

No wonder young mathematicians are so shaken by Cantor's proof! Indoctrinated into the assumptions which the proof begins with, these young minds are totally unprepared to handle the consequences of those assumptions. For as soon as the mind willingly submits to the two assumptions I have fleshed out here, then simple logic reveals a truth that is so staggering that it has drawn downright hostility from philosophers and logicians ever since Cantor first made his argument.

The claim is simple. Take all of those counting numbers: 1, 2, 3, 4, 5, 6, 7, ...
And I mean all of them, all infinitely many of them, because you know there is always one more! And now to each one of those counting numbers, assign some decimal, like this:
1: 0.123846130865861902034109826394384...
2: 0.988349102039136358923403948190234...
3: 0.238463985658654865435864238653428...
4: 0.843565643854843643874398340954893...
5: 0.458430938230483240893291293048239...
6: 0.789234923902309238942394823094872...
7: 0.097138428094217089340873078320683...
...
and so on. Except, don't do it just the way I did it--do it however you want! Be completely arbitrary!

Here's what happens: you'll never get all of the decimals!

Never! Never ever! No matter how cleverly you choose which decimals to match to each of your counting numbers! Even though you always have one more counting number--that is, even though there are infinitely many counting numbers--you still don't have enough. The decimals are to the counting numbers what the national debt is to your savings account.

And the argument is quite simple: just read down the diagonal of that list you just made. Take the first digit of the first number, change it to another digit, and write it down. Take the second digit of the second number, change it, and write it down next to the first one you just wrote. Do the same with the third, and the fourth, and so on. You'll get a new decimal. Mine would start to look like this:
0.2996455...
All I did was add 1 to each of the "diagonal elements" of my list. You can't add 1 to a 9, but you can just change 9 to 8, and the same idea holds.
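
For the skeptical, here is the whole construction carried out in a few lines of Python, run on the (completely arbitrary) sample list from above, with each number stored as its string of digits after the decimal point:

    # Cantor's diagonal construction, run on the sample list above.
    listed = [
        "123846130865861902034109826394384",
        "988349102039136358923403948190234",
        "238463985658654865435864238653428",
        "843565643854843643874398340954893",
        "458430938230483240893291293048239",
        "789234923902309238942394823094872",
        "097138428094217089340873078320683",
    ]

    def change(digit):
        # Add 1 to the digit -- except change a 9 to an 8.
        return "8" if digit == "9" else str(int(digit) + 1)

    new_number = "0." + "".join(change(row[i]) for i, row in enumerate(listed))
    print(new_number + "...")      # 0.2996455..., exactly as computed by hand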

Now, is this new number in your list? No! It can't be. It's not the same as your first number, because the first digit is different. It's not the same as your second number, because the second digit is different. It's not the same as your third number, because the third digit is different. And so on, even for every single counting number.
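
Continuing the sketch above, that whole argument is a one-line check per row:

    # The new number disagrees with the i-th listed number in its i-th digit,
    # so it cannot equal any number on the list.
    diagonal_digits = new_number[2:]           # the digits after "0."
    for i, row in enumerate(listed):
        assert diagonal_digits[i] != row[i]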

That's Cantor's proof. As shocking as the result is, the proof is nothing more than simple logic. Yet logic has to build on certain assumptions, and it's really those assumptions that set up this amazing result. Cantor must have been the first person to fully buy into all of those assumptions. He was a mathematician of true faith, and true imagination.

And also true bravery. When I say that Cantor fully embraced those assumptions about numbers, I mean that he was even willing to embrace the logical consequences of those assumptions. That is faith. And without it, there can be no progress.

I'm allowed to say such things, because it's my blog. But I seriously wonder: how many assumptions do we take for granted while being unwilling to accept their logical consequences? It is often only when someone shows you what the logical consequences are that you're able to see what the assumptions actually mean.

This, to me, is what's so liberating about mathematics. Whatever faith you have in your assumptions will be thrown to the fire to be tested. You must seek out the logical consequences of whatever you start with. And if you are able to overcome your initial fear, you might just find that the universe is a much grander, more majestic, and more mysterious place than you had ever imagined.
