Tuesday, August 26, 2014

Foundations and Other Unnecessary Things


The economist John Maynard Keynes once said, "Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back." Practical men, to paraphrase him, are usually under the spell of popular philosophy. A few weeks back I did a post on Wittgenstein's criticism of the logicist program. I concentrated on a technical aspect: he pointed out that the interpretation of quantification over infinite sets is left open (that is, there are multiple models for a given set of axioms), so the alleged foundations of mathematics don't pin down a specific mathematical language. Modern mathematicians admit this, but don't care. I went into less detail about a stronger, but more philosophical, criticism. Principia Mathematica, Die Grundlagen der Arithmetik, etc. claimed to be the foundations of mathematics, but if we found an error in them (and an error was found in Frege's system), we would dispose of the book and not of mathematics. In other words, in practice there is nothing special about axioms that makes them "below" theorems. Mathematics (and, Wittgenstein argues, science and even life in general) is less like an inverted pyramid where everything leans on the bottom stone and more like a hyperbolic tower where everything leans on everything else. I bring up Keynes because I realize now that there is no way to read Wittgenstein and not be affected. I may or may not be a Wittgensteinian, but he has affected how I see things in a fundamental way. I must keep this in mind when I enter into "foundational" controversies.


After my Jaynes post, I did a bit of re-reading of his big book. What is the value of Cox's Theorem? What makes it superior to the usual Kolmogorov axioms? To the extent that Cox and Kolmogorov disagree, so much the worse for Cox (as far as I can tell). Kolmogorov's axioms are deliberately vague as to interpretation: they serve equally well as models for statements about normalized mass or for subjective valuations of probability. Cox's theorem is no shorter or more intuitive. Nor do I think the functional equations suggest the interpretation "subjective degrees of belief" any more strongly than the Kolmogorov axioms do (that is, not at all). Why? We can just as well interpret f as "the amount of sand in this unit bucket lying outside this set", then recognize that being outside the outside is being inside, and so on. Therefore, Cox's theorem isn't any better a foundation for subjective probability than Kolmogorov's axioms.
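For concreteness, here is a rough side-by-side of the two starting points, in my own notation rather than either author's exact formulation (read pi as a plausibility assignment; F and S are the unknown functions the functional equations constrain):

\[
\textbf{Kolmogorov:}\quad P(A)\ge 0,\qquad P(\Omega)=1,\qquad P\Big(\bigcup_{i=1}^{\infty}A_i\Big)=\sum_{i=1}^{\infty}P(A_i)\ \text{for disjoint }A_i.
\]
\[
\textbf{Cox:}\quad \pi(AB\mid C)=F\big(\pi(A\mid C),\,\pi(B\mid AC)\big),\qquad \pi(\bar{A}\mid C)=S\big(\pi(A\mid C)\big),\qquad S(S(x))=x,
\]

with consistency requirements (associativity of F, compatibility of F with S) forcing any solution, after rescaling, into the familiar product and sum rules. Nothing in either column whispers "degrees of belief" at you.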

Azathoth

Cox's theorem isn't strong enough to constrain countable unions, which means that if it were The Real foundation of probability, it would run into strange problems. As I said in the Wittgenstein post, mathematicians like to deal with the infinite by making it as much like the finite as we can without risking contradiction. Countable additivity is a way of doing this. If you have half a bucket of sand and half a bucket of sand, then you have (half plus half equals) one bucket of sand. That's additivity in a nutshell. But what happens if you add up infinitely many ever-smaller scoops of sand in a limiting procedure? With countable additivity, you get a bucket of sand (lucky you). With only finite additivity, the answer isn't defined. There's no reason to think that you won't add up bits of sand and get Azathoth. In other words, you give up the ability to compute probabilities by taking limits.
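In symbols, and just as a restatement of the bucket analogy, finite additivity only constrains finite unions while countable additivity constrains the limit too:

\[
\text{Finite:}\quad P\Big(\bigcup_{i=1}^{n}A_i\Big)=\sum_{i=1}^{n}P(A_i),\qquad\qquad
\text{Countable:}\quad P\Big(\bigcup_{i=1}^{\infty}A_i\Big)=\sum_{i=1}^{\infty}P(A_i)
\]

(for pairwise disjoint \(A_i\)). Under finite additivity alone, the probability of the infinite union simply isn't determined by the probabilities of the pieces.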


The problems are even worse for the Bayesian, because finite additivity isn't consistent with conditionalization (hat tip: A Fine Theorem). Since finite additivity is all Cox's Theorem gives you, clearly it needs to be made more robust! (Unlike, say, Kolmogorov's axioms.) Obviously, I strongly disagree with that paper's thesis that de Finetti gave "compelling reasons" to abandon countable additivity, and I regard de Finetti's examples of "intuitive priors" as bizarre. (I also find the arguments of Kevin Bryan of A Fine Theorem even weaker; it isn't obvious to me that his hostile description of frequentist consistency is induction in any sense, much less a bad one...) The famous Bayesian Jaynes must have at least sensed this, because he was always combatively pro-countable unions. But is the foundation he built for himself a castle built on sand? The answer is obvious to me: Jaynes just never cared about such things; he thought it was a merely technical problem without deep import for the general theory (he says in the appendix that the only difference between his approach and Kolmogorov's is that Kolmogorov took an infinity-first approach while he took an infinity-last one).
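The standard illustration (a sketch of the kind of prior de Finetti was willing to allow, not his exact construction) is a "uniform" finitely additive probability on the natural numbers:

\[
P(\{n\})=0\ \text{for every }n,\qquad P(\mathbb{N})=1,\qquad\text{yet}\quad \sum_{n=1}^{\infty}P(\{n\})=0\neq 1=P\Big(\bigcup_{n=1}^{\infty}\{n\}\Big).
\]

Such a measure is finitely but not countably additive, and objects like it are exactly where the non-conglomerability and conditionalization troubles arise.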

Dr Fine, Dr Howard and Dr Howard in deep philosophical debate

This issue might be worth maintaining a low-level controversy about, and Kolmogorov put it in the right place: as a questionable but reasonable assumption. An "axiom", as we mathematicians say. Sure, countable additivity is so useful and so clearly correct in so many contexts that giving it up seems like giving up your legs. But science is a multithreaded being, and intellectual controversy often ends in clarification. In the Cox framework, though, the restriction to finite additivity isn't a theorem; it's just a quirk of not constraining the function enough. That just doesn't feel like enough to me; it seems to me that if Kolmogorov, Doob et al. were wrong, they must be wrong in a much deeper way. Anyway, that's enough about countable additivity.

As I said from the outset, it seems obvious to me that axioms are philosophical matters, and arguing about them gets you into nothing but a Wittgensteinian language game. But there are differences between Kolmogorov and Cox about finite additivity (and about whether functional equations are more intuitive than measure theory). So maybe there is some small content there. Therefore, I will now e-beg for answers. Tell me about the wonders of Cox's Theorem, internet! I'm all ears!
