Gottlob Frege
It's interesting, though not surprising, that Frege toiled in such obscurity. All of mathematical logic was suspect until it finally started bearing fruit by taming set theory and giving rise to computers. Mathematical logicians had a reputation as pedants even among mathematicians - and Frege was unusually careful and rigorous even by the standards of mathematical logicians.
Bertrand Russell
Of course, Frege wasn't a complete unknown. Frege influenced Bertrand Russell and Peano to be bolder in their formalism. Obviously, Wittgenstein's early philosophy is entirely an attempt to draw out more philosophical consequences of Frege's methods and insights. His later philosophy is also deeply Fregean, though more critical than his fawning early work. I'll come back to this in a bit. Dedekind and Zermelo were aware of his work and held it in esteem. At the time, the analytic/continental philosophy distinction did not exist, so Frege actually had a good bit of influence on several "continental" philosophers. He was one of very few teachers Gershom Scholem respected. Scholem attempted to communicate Frege's ideas to Walter Benjamin, which seems to have been a bit optimistic on Scholem's part. Frege helped embolden Husserl to completely abandon psychologism in mathematics - which became a major plank in developing transcendental phenomenology. All this adds up to one thing: this is going to be one of those black-and-white-pictures-of-dead-men posts.
Gottlob Frege
Frege's analysis of mathematical language was a shining philosophical gem: it killed the mistaken Millian theory of psychological abstraction as a foundation for mathematics and seriously wounded Kant's related notion that arithmetic was synthetic (further work by Godel showed that it was not synthetic in another sense). His book Begriffsschrift may be the greatest piece of technical philosophical argument ever written. I'd like to spend a little time developing what his analysis would look like in modern terms.
How do you define "a definition"? This is one of the most fundamental tasks one must undertake in logical analysis, but it can be surprisingly difficult to do. One answer would be naive atomism: each word (except the logical connectives) represents one idea, and sentences are fusions of such ideas. This won't do. Some words are relations which gain meaning only when surrounded by other words. Some words are functions that gain meaning only when given an argument. Examples include "My father's mother is gone.", "God's in His Heaven - all's right with the world!", "The cat is on the mat." and "The square of two is four." The word "square" in the sense used in the last example is obviously a function. The word "My" in the first is also a function, as is "father's". The word "on" in the third example is a binary relation.
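The examples above can be rendered directly in code. This is my own toy rendering, not Frege's notation: "square" is a one-place function, "on" is a binary relation evaluated against a set of facts, and "my father's mother" is a composition of two possessive functions.

```python
# Words as functions and relations (an illustrative sketch, not a
# serious semantic theory). The names square, on, father, mother and
# the toy "world" and "parents" data are all my own inventions.

def square(x):          # "the square of ..." takes one argument
    return x * x

def on(a, b, world):    # "... is on ..." is true or false in a world
    return (a, b) in world

def father(person, parents):
    return parents[person]["father"]

def mother(person, parents):
    return parents[person]["mother"]

# "The square of two is four."
print(square(2) == 4)   # True

# "The cat is on the mat." -- evaluated relative to a world of facts
world = {("cat", "mat")}
print(on("cat", "mat", world))  # True

# "My father's mother" -- function composition
parents = {
    "me":  {"father": "dad", "mother": "mom"},
    "dad": {"father": "grandpa", "mother": "grandma"},
}
print(mother(father("me", parents), parents))  # grandma
```

Note that `square(2)` has a value on its own, but only the whole sentence `square(2) == 4` is true or false - which is exactly the Fregean point developed next.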
Frege's solution, which you might call "limited holism", was that each word gains meaning only in the context of the sentences in which it is used. The basic unit of meaning is the sentence in the following sense: only a sentence may be true or false. A "definition" is a rule that tells one how to go about using a word in a way that generates true sentences. When you observe a certain state of the world (in a very coarse-grained way, mediated by perception, culture, etc.), you may convey that state of the world to an English speaker by uttering "The cat is on the mat." When young Pippa observes a certain state of the world (or, more accurately, passes without carefully observing it) she conveys this to the people of Asolo (including herself) by saying "God's in His Heaven - all's right with the world!". There is little syntactical difference between these sentences.
David Hilbert
Frege's vision was somewhat confused because he did not always carefully distinguish syntax and semantics. If symbols are "defined" in the sense above, then they have semantic content. The "definitions" of all the terms of a Fregean language give a model for a syntactical system. The theorems of the syntactical system should then be among the true sentences of any model of that system. This means that, for instance, if a formula can be derived by purely syntactical means, then it must also be true in all models (I think Godel was the first to carefully establish the converse, in his completeness theorem). But a formal system may have multiple interpretations, more than one model. This was first formally recognized by Hilbert, who used it to demonstrate the relative consistency of different formal systems.
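Here is a small sketch of the "one syntax, many models" point. Take the monoid axioms as uninterpreted syntax: an operation and an identity element. Integers under addition and strings under concatenation are both models, but a further sentence - "the operation is commutative" - is true in one model and false in the other, so truth depends on the interpretation, not on the syntax alone. The encoding below is my own toy illustration, not Hilbert's.

```python
# Spot-check the monoid axioms and a commutativity sentence on two
# different models of the same syntax. Checking on a finite sample of
# the domain is of course not a proof, just an illustration.

def is_monoid(op, e, domain_sample):
    """Check associativity and identity on a finite sample."""
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a in domain_sample
                for b in domain_sample
                for c in domain_sample)
    ident = all(op(e, a) == a and op(a, e) == a for a in domain_sample)
    return assoc and ident

def is_commutative(op, domain_sample):
    return all(op(a, b) == op(b, a)
               for a in domain_sample for b in domain_sample)

ints = [0, 1, 2, 5]        # model 1: integers under +
strs = ["", "ab", "c"]     # model 2: strings under concatenation

print(is_monoid(lambda a, b: a + b, 0, ints))    # True
print(is_monoid(lambda a, b: a + b, "", strs))   # True
print(is_commutative(lambda a, b: a + b, ints))  # True
print(is_commutative(lambda a, b: a + b, strs))  # False: "ab"+"c" != "c"+"ab"
```

The same symbol `+` even appears in both models - Python's own little pun on multiple interpretations of one syntax.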
Carl Gauss
However, even before Hilbert, this model theoretic vision was being used to do non-trivial mathematical and philosophical work. I'll try to explain how one can use model theory to prove that the parallel axiom is independent of the others. Start with all of Hilbert's axioms considered purely as a formal system. Obviously, Euclidean geometry is one model of this axiom system. Guided by that model, construct the following objects: great circles on a sphere and pairs of antipodal points on the same sphere. Call the great circles "line2s" and the antipodal pairs "point2s". We now have new sentences that are concatenations of old primitives. If we take these terms as our "primitives", then we find that line2s and point2s satisfy all the axioms ... except the parallel axiom! When we get to that one, we find instead that given a line2 and a point2 not on that line2, any line2 passing through that point2 will intersect the given line2. Put aside the Euclidean model for a moment. The syntax of line2s and point2s is just concatenations of earlier concepts, and we don't have to bring in the Euclidean model to interpret these otherwise meaningless symbols. We can use elliptic geometry by itself as a model. So the axioms minus the parallel axiom have two - really, infinitely many, since there are so many Riemannian geometries - models. This shows that the two geometries are relatively consistent - one is not contradictory unless both are.
Ludwig Wittgenstein with his family (including his sister, a woman! Wooo)
I promised earlier to go over one of Wittgenstein's criticisms of the Fregean vision outlined above. Since we only observe people's behavior, we cannot in general know the rules that they are "really" using. Let's say that a highly educated person, like myself, is working on a programming project that puts them alongside a brilliant self-taught programmer. At one point in the project, presumably as she's explaining something, she writes "1+1=2". We both agree that this sentence is true. At another point, I write on the same board "23,412,341,243+432,141,234=23,844,482,477" and she storms out of our work area and demands that I be fired. You see, she learned to add on a 32-bit machine. The correct version of the above sentence is clearly "23,412,341,243+432,141,234=2,369,645,997". Why should she have to work with an incompetent like me?
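The disagreement is easy to make concrete. Call the two rules `plus` and `plus32` (my names, not Wittgenstein's): ordinary addition, and addition that wraps around at 2^32 the way unsigned 32-bit hardware does. The two rules agree on every sum below 2^32, so no finite stock of small examples like "1+1=2" can tell you which rule a speaker is "really" following.

```python
# Two addition rules that agree on all small cases and diverge on
# large ones -- a toy version of the rule-following problem.

def plus(a, b):
    return a + b

def plus32(a, b):
    return (a + b) % 2**32  # wraps around, like unsigned 32-bit hardware

a, b = 23_412_341_243, 432_141_234
print(plus(a, b))    # 23844482477 -- my answer
print(plus32(a, b))  # 2369645997  -- her answer
print(plus(1, 1) == plus32(1, 1))  # True: both rules endorse "1+1=2"
```

Both of us have behaved impeccably by our own rule; the shared history of agreement simply never probed the cases where the rules come apart.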
Physical behavior (including human behavior) is syntactical ("The world is the totality of facts, not of things."). We want to be able to attribute to this syntax certain semantics. For instance, I might want to interpret your sound making as a meaningful sentence. I have a model for your verbal utterances. But this model is not unique. The meaning of terms is a social process that can break down, as shown above.
This was part of Wittgenstein's general criticism of (his interpretation of) Frege's idea of language. Fregean analysis works fine on many things - for instance, those Hamiltonian systems I'm always talking about. I have a great example in terms of Sinai billiards in particular that I don't have the energy to go through now. Even so, Wittgenstein's example shows that it may not be very good at analyzing language - which is what it promised to be good at!
David Lewis
In order for a rule to be learnable, it must be (at most) recursive. In fact, it must be fairly efficiently learnable to gain any popularity. Bacteria and other primitive organisms signal each other, though these signals sit very low on the Chomsky hierarchy. The theory of signalling in biology is well established. It's a particularly successful application of game theory, first applied there by the philosopher David Lewis in his book Convention. This may be the single most successful theory started by a pure philosopher in the 20th century, though its core insight was already described quite rigorously by Hume in the 18th century: conventions are not essentially linguistic; rather, language is conventional. Successful communication is given by success in some other sense (biological fitness, utility, etc.). We only care about having different models insofar as they are inconsistent, and even then only insofar as the inconsistency affects things. This is not what Frege promised. He promised too much. Wittgenstein was correct to point this out.
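A minimal Lewis signalling game can be enumerated outright. In this sketch (my own toy encoding, not Lewis's notation) there are two states, two signals, and two acts; a sender rule maps states to signals, a receiver rule maps signals to acts, and a pair of rules is a "signalling system" when the receiver's act fits the state every time.

```python
# Enumerate all pure-strategy pairs in a 2-state Lewis signalling game
# and count the signalling systems (conventions that always succeed).

import itertools

STATES = ["rain", "sun"]
SIGNALS = ["A", "B"]
ACTS = {"rain": "umbrella", "sun": "hat"}   # the act that fits each state

def payoff(sender_rule, receiver_rule):
    """Fraction of states on which communication succeeds."""
    wins = sum(receiver_rule[sender_rule[s]] == ACTS[s] for s in STATES)
    return wins / len(STATES)

systems = []
for s_map in itertools.product(SIGNALS, repeat=len(STATES)):
    sender = dict(zip(STATES, s_map))
    for r_map in itertools.product(ACTS.values(), repeat=len(SIGNALS)):
        receiver = dict(zip(SIGNALS, r_map))
        if payoff(sender, receiver) == 1.0:
            systems.append((sender, receiver))

print(len(systems))  # 2: "A means rain" and "A means sun" both work
```

The punchline is the 2: either assignment of signals to states works equally well, so which one a population uses is pure convention, sustained only by the shared interest in coordinating - no prior linguistic meaning required.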
Donald Davidson
The signalling games Lewis explicitly considers are very simple, though he gives a sketch of how to extend the account to human language. If signalling rules are learnable, then they must be at most recursive in complexity. Living humans have an extremely sophisticated immune system that is a "model" in the Hilbert sense for a Fregean formal syntax. The immune system learns and communicates in a very complex way. Humans also have spoken language, which is likewise at most recursive. Learning a language is (partially) gaining enough of a culture that one can apply enough of a model to the verbal syntax of other speakers that behavior can be coordinated. A command is true when it is obeyed, a question is true when its declarative translation is true, etc. The varieties of signalling syntax that can be interpreted this way are called "languages". Other signals - such as the chin flick, the "get bent" gesture, the Bronx cheer, the middle finger, blushing, smiling, etc. - cannot be interpreted this way and are not languages (though they may be culturally determined signals, such as the middle finger, or biologically determined, such as smiling). The signal game is larger than the language game.
Well, we've come a long way from the original, simple Fregean vision. I believe that vision is broadly correct, even with all the adjustments made above. Frege's seeming pedantry and perfectionism made him obscure in his life except to a few people who shared similar obsessions, but those obsessions gave birth to the modern, computational world.