proof-theoretic reflection principles. See REFLECTION PRINCIPLE.

proof theory, a branch of mathematical logic founded by David Hilbert in the 1920s to pursue Hilbert’s Program. The foundational problems underlying that program had been formulated around the turn of the century, e.g., in Hilbert’s famous address to the International Congress of Mathematicians in Paris (1900). They were closely connected with investigations into the foundations of analysis carried out by Cantor and Dedekind; but they were also related to their conflict with Kronecker on the nature of mathematics and to the difficulties of a completely unrestricted notion of set or multiplicity. At that time, the central issue for Hilbert was the consistency of sets in Cantor’s sense. He suggested that the existence of consistent sets (multiplicities), e.g., that of the real numbers, could be secured by proving the consistency of a suitable, characterizing axiomatic system; but there were only the vaguest indications of how to do that. In a radical departure from standard practice and from his earlier hints, Hilbert proposed four years later, in Über die Grundlagen der Logik und der Arithmetik (1904), a novel way of attacking the consistency problem for theories. This approach required, first, a strict formalization of logic together with mathematics; then, treating the finite syntactic configurations constituting the joint formalism as mathematical objects; and, finally, showing by mathematical arguments that contradictory formulas cannot be derived.
Though Hilbert lectured on issues concerning the foundations of mathematics during the subsequent years, the technical development and philosophical clarification of proof theory and its aims began only around 1920. That involved, first of all, a detailed description of logical calculi and the careful development of parts of mathematics in suitable systems. A record of the former is found in Hilbert and Ackermann, Grundzüge der theoretischen Logik (1928); and of the latter in Supplement IV of Hilbert and Bernays, Grundlagen der Mathematik II (1939). This work presupposed the clear distinction between metamathematics and mathematics introduced by Hilbert. For the purposes of the consistency program, metamathematics was now taken to be a very weak part of arithmetic, so-called finitist mathematics, believed to correspond to the part of mathematics that was accepted by constructivists like Kronecker and Brouwer. Additional metamathematical issues concerned the completeness and decidability of theories. The crucial technical tool for the pursuit of the consistency problem was Hilbert’s ε-calculus. The metamathematical problems attracted the collaboration of young and quite brilliant mathematicians (with philosophical interests); among them were Paul Bernays, Wilhelm Ackermann, John von Neumann, Jacques Herbrand, Gerhard Gentzen, and Kurt Schütte. The results obtained in the 1920s were disappointing when measured against the hopes and ambitions: Ackermann, von Neumann, and Herbrand established, in essence, only the consistency of arithmetic with a very restricted principle of induction. That the limits of finitist considerations for consistency proofs had been reached became clear in 1931 through Gödel’s incompleteness theorems. Also, special cases of the decision problem for predicate logic (Hilbert’s Entscheidungsproblem) had been solved; its general solvability was made rather implausible by some of Gödel’s results in his 1931 paper. The actual proof of unsolvability had to wait until 1936 for a conceptual clarification of ‘mechanical procedure’ or ‘algorithm’; that was achieved through the work of Church and Turing. The further development of proof theory is roughly characterized by two complementary tendencies: (1) the extension of the metamathematical frame relative to which ‘constructive’ consistency proofs can be obtained, and (2) the refined formalization of parts of mathematics in theories much weaker than set theory or even full second-order arithmetic. The former tendency started with the work of Gödel and Gentzen in 1933 establishing the consistency of full classical arithmetic relative to intuitionistic arithmetic; it led in the 1970s and 1980s to consistency proofs of strong subsystems of second-order arithmetic relative to intuitionistic theories of constructive ordinals. The latter tendency reaches back to Weyl’s book Das Kontinuum (1918) and culminated in the 1970s in the demonstration that the classical results of mathematical analysis can be obtained formally in conservative extensions of first-order arithmetic. For the metamathematical work, Gentzen’s introduction of sequent calculi and the use of transfinite induction along constructive ordinals turned out to be very important, as did Gödel’s primitive recursive functionals of finite type. The methods and results of proof theory are playing, not surprisingly, a significant role in computer science.
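For illustration, the relative consistency of classical arithmetic with respect to intuitionistic arithmetic mentioned above rests on a double-negation translation. One standard formulation of the Gödel–Gentzen negative translation reads as follows; the particular clause conventions are a common textbook presentation supplied here as a sketch, not taken from the entry itself:

\begin{align*}
P^{N} &= \neg\neg P \quad (P \text{ atomic})\\
(A \land B)^{N} &= A^{N} \land B^{N}\\
(A \lor B)^{N} &= \neg(\neg A^{N} \land \neg B^{N})\\
(A \rightarrow B)^{N} &= A^{N} \rightarrow B^{N}\\
(\neg A)^{N} &= \neg A^{N}\\
(\forall x\, A)^{N} &= \forall x\, A^{N}\\
(\exists x\, A)^{N} &= \neg \forall x\, \neg A^{N}
\end{align*}

If classical arithmetic proves $A$, then intuitionistic arithmetic proves $A^{N}$; since the translation of $0 = 1$ is intuitionistically equivalent to $0 = 1$ itself, a contradiction derivable classically would already be derivable intuitionistically, which yields the relative consistency claim.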
Work in proof theory has been motivated by issues in the foundations of mathematics, with the explicit goal of achieving epistemological reductions of strong theories for mathematical practice (like set theory or second-order arithmetic) to weak, philosophically distinguished theories (like primitive recursive arithmetic). As the formalization of mathematics in strong theories is crucial for the metamathematical approach, and as the programmatic goal can be seen as a way of circumventing the philosophical issues surrounding strong theories, e.g., the nature of infinite sets in the case of set theory, Hilbert’s philosophical position is often equated with formalism – in the sense of Frege in his Über die Grundlagen der Geometrie (1903–06) and also of Brouwer’s inaugural address Intuitionism and Formalism (1912). Though such a view is not completely unsupported by some of Hilbert’s polemical remarks during the 1920s, on balance, his philosophical views developed into a sophisticated instrumentalism, if that label is taken in Ernest Nagel’s judicious sense (The Structure of Science, 1961). Hilbert’s is an instrumentalism emphasizing the contentual motivation of mathematical theories; that is clearly expressed in the first chapter of Hilbert and Bernays’s Grundlagen der Mathematik I (1934). A sustained philosophical analysis of proof-theoretic research in the context of broader issues in the philosophy of mathematics was provided by Bernays; his penetrating essays stretch over five decades and have been collected in Abhandlungen zur Philosophie der Mathematik (1976).
See also CONSISTENCY, FORMALIZATION, GÖDEL’S INCOMPLETENESS THEOREMS, HILBERT’S PROGRAM, METAMATHEMATICS. W.S.

propensity, an irregular or non-necessitating causal disposition of an object or system to produce some result or effect. Propensities are usually conceived as essentially probabilistic in nature. A die may be said to have a propensity of ‘strength’ or magnitude 1/6 to turn up a 3 if thrown from a dice box, of strength 1/3 to turn up, say, a 3 or 4, etc. Strength is often taken to vary from 0 to 1. But propensity talk is arguably appropriate only when determinism fails. Popper regarded the propensity notion as a new physical or metaphysical hypothesis, akin to that of forces. Like Peirce, he deployed it to interpret probability claims about single cases: e.g., the probability of this radium atom’s decaying in 1,600 years is 1/2. On relative frequency interpretations, probability claims are about properties of large classes, such as the relative frequencies of outcomes in them, rather than about single cases. But single-case claims appear to be common in quantum theory. Popper advocated a propensity interpretation of quantum theory. Propensities also feature in theories of indeterministic or probabilistic causation. Competing theories about propensities attribute them variously to complex systems such as chance or experimental set-ups or arrangements (a coin and tossing device), to entities within such set-ups (the coin itself), and to particular trials of such set-ups. Long-run theories construe propensities as dispositions to give rise to certain relative frequencies of, or probability distributions over, outcomes in long runs of trials, which are sometimes said to ‘manifest’ or ‘display’ the propensities. Here a propensity’s strength is identical to some such frequency. By contrast, single-case theories construe propensities as dispositions of singular trials to bring about particular outcomes. Their existence, not their strength, is displayed by such an outcome. Here frequencies provide evidence about propensity strength. But the two can differ in any finite run; they converge, with a limiting probability of 1, in an appropriate long run. See also CAUSATION, DETERMINISM, DISPOSITION, PEIRCE, PROBABILITY, QUANTUM MECHANICS. D.S.
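As a minimal worked example of the propensity strengths in the dice case above (assuming, as is standard, that strengths of mutually exclusive outcomes add and that throws are independent; the notation is supplied for illustration, not taken from the entry):

\[
\Pr(3) = \tfrac{1}{6}, \qquad \Pr(3 \text{ or } 4) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}.
\]

The long-run convergence claim is then an instance of the strong law of large numbers: for independent throws $X_1, X_2, \ldots$,

\[
\frac{\#\{\, i \le n : X_i = 3 \,\}}{n} \;\longrightarrow\; \tfrac{1}{6} \quad \text{with probability } 1 \text{ as } n \to \infty,
\]

so in any finite run the observed frequency may differ from the propensity strength, while agreement is guaranteed only with limiting probability 1.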