
Proof-normalisation in Gentzen’s Nachlass∗

Adrian Rezuş
(Nijmegen, The Netherlands)

September 13, 2017, rev. October 17, 2017

Unlike the names of the pioneers of modern logic and of some of his contemporaries involved in logic research, the name of Gerhard Gentzen (1909–1945) is less well known to the general public interested in mathematical logic or in the philosophy of mathematics. Even the mathematician or the philosopher who is aware of the main trends in contemporary logic would quote his name in connection with facts that are only superficially related to his scientific achievements, as, e.g., that
(1) he produced consistency proofs for classical arithmetic [Peano Arith-
metic, PA for short], intended to solve a famous problem – the second one in
a list of many ‘mathematical problems’ – raised by David Hilbert about thirty
years before, and that, during the process, he invented
(2) a graphical representation of mathematical proofs known as ‘natural deduction’, as well as
(3) a rather weird way of re-arranging the formularian information actually occurring in proofs, in a graphical display (the so-called L-systems, with ‘L’ short for ‘logistic’), with, as a result, a ‘Main Theorem’ (Hauptsatz, in German), inadvertently referred to by his later readers as the ‘[cut]-Elimination Theorem’.[1]
None of the above is characteristic of – or even relevant to – Gentzen’s main (theoretical) concern, however, because
(1a) Hilbert’s original problem (viz. ‘prove the consistency of PA by elementary means’) was a pseudo-mathematical problem, since (1a.1) Hilbert omitted to explain what he meant by ‘elementary’ in the statement of the task and, mainly, (1a.2) in view of Gödel’s results of 1931;[2]
(2a) by the time Gentzen was writing (1932–1933) his PhD Dissertation,
the essentials of the so-called ‘natural deduction’ were seven years old, already
∗ This note is based on a review of Jan von Plato, Saved from the Cellar. Gerhard Gentzen’s Shorthand Notes on Logic and the Foundations of Mathematics, Springer International Publishing [CH] [April] 2017 [Sources and Studies in the History of Mathematics and Physical Sciences] [x + 315 pp.] [ISBN 978-3-319-42119-3 (hbk)], scheduled to appear in Studia Logica. The author is indebted to J. Roger Hindley (Swansea, Wales, UK) for useful remarks on a preliminary draft of the paper.
[1] As a matter of fact, Gentzen’s Hauptsatz says that the global cut [substitution] can be replaced, systematically, by local cuts [substitutions introducing a specific proof-operator].

[2] Gödel’s Incompleteness Theorem (colloquially: we can prove the consistency of a formal system containing PA only by methods exceeding the means of the formal system under focus) was actually confirmed by Gentzen’s consistency proofs.

invented (1926–1927) by a very young – and, by then, still an undergraduate – Polish student of Jan Łukasiewicz in Warsaw (Stanisław Jaśkowski), and because, incidentally,
(3a) the ‘main’ result of Gentzen’s PhD Dissertation (defended in Göttingen
1933) can be construed as a rather trivial observation.[3]
This way of understanding Gentzen’s contributions to logic (and the histor-
ical accident that his L-systems have subsequently generated a veritable indus-
try) is superficial, highly detrimental and unfair to Gentzen, obliterating, in the
end, the fact that he is, actually, the founding father of modern proof theory.
The book under review – a copiously commented edition of Gentzen’s Nach-
lass, translated into English, consisting of MSS dated from the fall of 1931 until the end of 1942, and including also some of Gentzen’s correspondence with Arend Heyting and Paul Bernays – puts his work in perspective, both historically
and epistemologically. The editor / translator of the original texts – otherwise
one of the best connoisseurs of Gentzen’s work alive –, has spent a considerable
amount of time and effort in deciphering, translating, putting in order, and
assessing through judicious comments the laboratory of Gentzen’s pioneering
thinking about the main concept of logic, the concept of proof. In this guise,
Jan von Plato’s monograph is, very likely, one of the most important publica-
tions concerning the history and the epistemology of modern logic that have
been issued in print during the last decades.[4]
The editor has arranged the Nachlass more or less in chronological order,
in 17 distinct sections, each item being also qualified thematically. The edited
material covers a wealth of subjects Gentzen was concerned with during the
period 1931–1942, not all of them equally important in the epistemic order of
things, though.
Worth noting, inter alia, is the fact that Gentzen was deeply familiar with the
contemporary work on logic and foundations, as well as with that of the pioneers.[5] This includes some early work on intuitionism (Heyting) as well. At a later stage, he also studied in some detail the variant of ‘natural deduction’ due to Jaśkowski (First Congress of the Polish Mathematicians, Warsaw 1927).[6]
[3] Gentzen’s Hauptsatz says, ultimately, that if we can prove a given mathematical theorem by using lemmas, then we can also do it without using them, namely, by copying and pasting – textually – the proofs of the lemmas in the body of the proof of the theorem, instead of making backwards references to them. In short, proof-theoretically, Gentzen’s famous cut-rule (Schnitt), borrowed from Paul Hertz, is nothing but a global substitution operator.
[4] Incidentally, the editorial layout and the character of the publication prompted the editor to leave out many technical details, so that the reader should also consult several papers he published previously (2001–2014), mentioned in the bibliography, in order to get a complete picture of the facts.
[5] Frege’s ‘Grundgesetze der Arithmetik’, 1893–1903, excepted, perhaps. In fact, with the exception of the young Russell, none of Gentzen’s predecessors – David Hilbert and Paul Hertz included – actually read Frege’s book. The latter contains, mutatis mutandis, a pretty good approximation of Gentzen’s structural rules. Apparently, Gentzen obtained his structural rules by analysing Hertz’s work on ‘Satzsystemen’, instead.
[6] According to von Plato, Gentzen met Jaśkowski in Münster, in June 1936 [Saved from the Cellar, p. 24]. Otherwise, one of the ‘natural deduction’ systems he considered in 1932 is, practically, identical with the one published by Jaśkowski in 1934.

He was even planning, by the end of 1942, to write a book-length monograph – item [17] in the von Plato collection – on ‘Mathematical Foundational Research’, a project which remained, unfortunately, at the draft stage.
Given this rather variegated thematic cocktail, I shall focus, in what follows, on what I think is the genuine aspect of Gentzen’s contribution to logic / proof-theory, viz. the normalisation theorem for classical and intuitionistic logic (proofs), leaving the remaining topics to other would-be reviewers of the edition – mainly because, in this respect, Gentzen was far ahead of his time. The theme (the ‘hillock theorem’, in his terminology) appears rather early in the Nachlass – September 23, 1932 – and makes up a leitmotif during the whole period covered by the published texts [1932–1942].[7]
In modern terms, a Gentzen ‘hillock’ amounts, more or less, to what a λ-
calculus expert would call, nowadays, a [β-] redex (shorthand for ‘reducible
expression’) and a proof-theorist would identify – in Gentzen’s original terms –
as a proof-détour (Beweisumweg, in German). A proof is said to be in normal
form (or just normal) if it contains no détours (‘hillocks’ or ‘redexes’). A nor-
malisation theorem says, roughly, that any proof can be ‘reduced’ to a normal
form, or else that it can be ‘normalised’. The distinction between a so-called weak versus a strong normalisation result refers to the reduction paths: if we can find a normal form on at least one reduction path, we have weak normalisation [WN]; if every reduction path is normalising, we have strong normalisation [SN].
[7] Historically, due to von Plato’s endeavours, we now have MS-evidence for the fact that Gentzen precedes both H. B. Curry (in print: 1934) and W. Howard (MS 1969, in print
1980) in the current misnomer ‘Curry-Howard Correspondence’ [or ‘Isomorphism’]. Of course,
neither Curry nor Howard was familiar with the early Gentzen considerations of 1932 – so
their findings were, in fact, independent –, yet the ultimate textual archeology is definite:
Gentzen comes first. As pointed out by J. P. Seldin and J. R. Hindley, Curry must have
been aware, at least implicitly, of a crude form of the correspondence around 1930 – while
still busy writing up his Göttingen PhD Dissertation [1930] –, since he was using combinator
labels for intuitionistically valid propositional formulas. Otherwise, Curry’s ‘(generalised)
functionality theory’ [(G)FT] (earliest papers in print: 1934–1936) was not meant to be an
interpretation of proofs – but, rather, a piece of ‘illative logic’, i.e., type-free logic based on
combinators / λ-calculus, in the style of Alonzo Church (1931–1936) –, while Howard (1969)
had many predecessors, at least on the intuitionistic side of the story. Moreover, unlike
Gentzen, Howard – differently motivated – missed the classical counterpart of it, a fact that had plenty of unintended and rather unwelcome epistemic consequences, in the long run. As
regards the minute history of the subject, worth mentioning is the fact that Saunders Mac
Lane defended – also in Göttingen, in July 1933, about two months after Gentzen – a PhD
Dissertation on a closely related subject (Abgekürzte Beweise im Logikkalkul, in print 1934),
anticipating, on a slightly different line of thought though, the work of N. G. de Bruijn on
automath (Eindhoven, 1967–1968), meant to formalise classical mathematical proofs. (This
detail was pointed out by Erwin Engeler, in an unpublished paper of 2015.) As far as we can
tell, de Bruijn’s automath (more or less, a formalised variant of Curry’s GFT) was
yet another way of formulating, independently, the ‘Curry-Howard’ correspondence. Other
contributors to the same topic include Dag Prawitz (Stockholm 1965), Hans Läuchli (ETH
Zürich, circa 1965 and later), Dana S. Scott (Stanford CA, 1968–1969), etc. Notably, on a slightly different plane, Dag Prawitz – who studied the published output of Gentzen in
depth – was the first to produce a (β-only) normalisation proof for classical logic (PhD Diss.
Stockholm, 1965). A clean, extensional variant of Prawitz’s proof-calculus for classical logic –
λγ-calculus – based on the early ideas of A. N. Kolmogorov (1925) and V. I. Glivenko (1928)
– has been obtained by the reviewer around the mid-eighties. (Cf., e.g., the Dirk van Dalen
Festschrift, edited by Henk Barendregt et al., Utrecht 1993.) In retrospect, the latter can be
viewed as a special case of the approach initiated by Gentzen in 1932.

If, moreover, we can also show that the underlying ‘concept of reduction’ is confluent (i.e., ultimately, that the order of the applications of the reduction ‘rules’ is immaterial), then the two concepts of normalisation coincide. Usually, one wants to prove confluence first, so that a specific normalisation ‘strategy’ [WN, thus] should, in the end, suffice in order to get an SN-theorem.
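In standard rewriting-theoretic notation – which is not Gentzen’s, so the following is merely the reviewer’s gloss – these notions read as follows, for proof-terms $t$ and a one-step reduction relation $\to$, with $\twoheadrightarrow$ its reflexive-transitive closure:

\[
\begin{array}{ll}
\mathrm{WN}(t): & t \twoheadrightarrow n \ \text{for \emph{some} normal form } n;\\[2pt]
\mathrm{SN}(t): & \text{\emph{every} reduction sequence } t \to t_1 \to t_2 \to \cdots \ \text{is finite};\\[2pt]
\text{confluence}: & t \twoheadrightarrow u \ \text{and} \ t \twoheadrightarrow v \ \text{imply} \ u \twoheadrightarrow w \ \text{and} \ v \twoheadrightarrow w, \ \text{for some } w.
\end{array}
\]

Confluence guarantees, in particular, that normal forms – when they exist – are unique.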
In particular, a confluence result (for a given formal system) implies also
Post-consistency (non-triviality, for the corresponding notion of equality – proof-
conversion or proof-isomorphism – generated by the reduction ‘rules’ of the
system), while, for logic proof-systems, normalisation yields consistency in a
slightly different sense, viz. that one cannot prove an arbitrary false proposition.
Whence the interest in both confluence and normalisation theorems.
Of course, Gentzen did not use the modern theoretical terminology pertaining to the theory of reduction systems,[8] but his treatment of classical, resp. intuitionistic, proofs amounts pretty much to the same thing.
In fact, Gentzen considered explicitly only what a modern proof-theorist
would qualify as β-redexes (no extensionality conditions thus). Most of the
later proof-theorists inherited from him the β-only restriction. This is, more
or less, a historical accident and can be explained by the fact that the pioneers of modern (mathematical) logic were formularians: they used to think in terms of formulas (expressing propositions and / or propositional schemes) or formula-
configurations and propositional connectives (resp. quantifiers), not in terms of
proofs and / or proof-operators (viz. rules of inference).
The so-called ‘Curry-Howard Correspondence’ simplifies the treatment of
the subject, modulo an appropriate proof-formalism based on λ-calculus: the
basic idea behind the correspondence consists of claiming that (1) a proof is an
object a bearing a specific decoration (or a ‘type’), viz. a formula A, formally
` a : A, reading ‘a is a witness for A’, and that (2) any rule of inference is a
(witness) operator that must be defined by explicit equational conditions, like
in algebra, say, whereby a proof would amount to the construction of a ‘witness
term’ built up from (decorated) proof-variables, representing assumptions, and
(witness) operators, representing rules of inference. Certainly, the witness terms can also be represented graphically – as trees, nets or block structures – but
the concrete representation does not supply more information than the abstract
terms.
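By way of illustration – a reviewer’s sketch, anticipating the notation used below and in footnote 11, not something spelled out in the text under review – the decoration of the two implication rules, and a sample witness term, would look as follows:

\[
\frac{[x:A] \vdash b[x] : B}{\vdash \lambda x{:}A.\,b[x] : A \to B}\ [\lambda]
\qquad\qquad
\frac{\vdash c : A \to B \qquad \vdash a : A}{\vdash c \mathbin{.} a : B}\ [.]
\]

so that, e.g., $\lambda x{:}A.\,\lambda y{:}A{\to}B.\,(y \mathbin{.} x)$ is a (normal) witness term for $A \to ((A \to B) \to B)$, built up from the proof-variables $x$, $y$ and the operators $\lambda$ and $.$ alone.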
For Gentzen, the equational conditions are, implicitly, generated by ‘détour-
elimination rules’. He classified the rules of inference (the proof-operators) of
classical, resp. intuitionistic logic, in formularian terms – using the proposi-
tional connectives as guiding criteria –, into introduction and elimination rules
for a given connective, resp. quantifier (intelim rules, for short), and identified a
proof-détour as being a proof-segment consisting of (the application of) an intro-
duction rule followed immediately by (the application of) an elimination rule. As
noted above, Gentzen’s détours correspond to β-redexes in (an extended) typed
λ-calculus. This kind of division works smoothly for the ‘minimal’ fragment of
[8] He also ignored confluence problems.

intuitionistic logic,[9] but fails for the operators (rules) associated to both classi-
cal and intuitionistic negation. Of course, if one thinks in terms of proofs, resp.
proof-operators (rules of inference), every such rule is an ‘introduction’ rule for a specific proof-operator. The absence of an explicit proof-notation explains
the fact that the (extensional) η-détours (‘eliminations’ followed immediately
by specific ‘introductions’) are not mentioned by Gentzen. The typically formu-
larian way of thinking is also at the root of the wrong choice of proof-primitives
for the classical disjunction and existence (borrowed from intuitionistic logic).
In particular, the intuitionistic falsum does not fit exactly the intelim-scheme,
unless one thinks of the (intuitionistic) ‘law of (non-) contradiction’ as being an
‘introduction’ rule.[10]
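To fix ideas, the paradigmatic détour – an introduction immediately followed by the corresponding elimination, here for implication – and its contraction can be pictured as follows, in Prawitz-style natural-deduction notation (a reviewer’s sketch, not Gentzen’s own tree layout):

\[
\dfrac{\ \dfrac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \to B}\ (\to I) \qquad \begin{array}{c}\vdots\\ A\end{array}\ }{B}\ (\to E)
\quad\leadsto\quad
\begin{array}{c}\vdots\\ A\\ \vdots\\ B\end{array}
\]

In term notation this is exactly the (βλ)-equation recalled below: $(\lambda x{:}A.\,b[x]) \mathbin{.} a = b[x{:=}a]$ – the proof of the premiss is plugged in directly, in place of the discharged assumption.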
On the other hand, Gentzen’s failure to handle the proof-operators asso-
ciated to classical negation in conceptually simple terms closely resembles Jaśkowski’s and was, likely, motivated by reasons of economy. This can be under-
stood as follows.
If one thinks in terms of proofs, resp. proof-operators, the intelim-scheme
associated to implication, for instance (corresponding to a pair consisting of
a monadic abstraction operator [λ, standing for the so-called ‘Deduction The-
orem’] and a binary ‘cut-operator’ [., representing modus ponens][11]) has an
equationally isomorphic counterpart for classical negation, consisting of reduc-
tio ad absurdum, a monadic abstraction operator [∂, say; term form: ∂z.e], and
an associated binary ‘cut’ operator [?, say; term form: c?a], representing the
(classical) law of (non-) contradiction [sic], whereby the resulting proof-system
[based on f, falsum, implication, →, and negation, ¬] amounts to a (2-fold)
replication of the pure λ-calculus (with thus two abstractors and two ‘cuts’ as
primitives). Here, the pair [λ,.] is, as usual, supposed to satisfy the characteristic equational conditions (βλ) ` (λx.b[x]) . a = b[x:=a] and (ηλ) ` λx.(c.x) = c (if x is not free in c), while the pair [∂,?], bearing a decoration [or ‘typing’] given by ` ∂z:¬A.e[z] : A, for [z:¬A] ` e[z] : f, resp. ` c?a : f, for ` c : ¬A and ` a : A, has equational behaviour stipulated by the analogous conditions (β∂) ` c ? ∂z.e[z] = e[z:=c] and (η∂) ` ∂z.(z?a) = a (if z is not free in a), resp.[12]
This yields the simplest ‘Curry-Howard Correspondence’ for classical (pro-
positional) logic, based on the signature [f,→,¬].[13]
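For instance – a reviewer’s remark, merely spelling out the redundancy alluded to in footnote 13 – double-negation elimination is already definable from the pair [∂,?]:

\[
\Delta(c) \;:=\; \partial z{:}\neg A.\,(c \,?\, z) \;:\; A, \qquad \text{for } \vdash c : \neg\neg A,
\]

since, under the assumption $z : \neg A$, the cut $c \,?\, z$ witnesses $\mathsf{f}$, so that the reductio-operator $\partial$ discharges $z$ and returns a witness for $A$.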
[9] Gentzen used to call it ‘constructive’; this corresponds to intuitionistic logic without the ex falso rule or, better, to the ‘Minimalkalkül’ of Ingebrigt Johansson 1937.


[10] This is not a very good idea, since this rule can also be viewed as an ‘elimination’ rule for (intuitionistic) negation. Proof-theoretically, the intuitionistic ex falso rule is just a special case of a genuinely classical rule of inference known as [classical] reductio ad absurdum. With the notation introduced below, the ex falso rule corresponds to witness terms $A(e) := ∂z:¬A.e, where z is not free in e.
[11] Curry’s main observation, in his ‘(basic) functionality theory’, inducing the decoration [‘typing’] ` λx:A.b[x] : A→B, for [x:A] ` b[x] : B, resp. ` c.a : B, for ` c : A→B and ` a : A.
[12] Here, we have written the ?-cuts in reversed order, in order to ease readability.
[13] The resulting λ∂-calculus can be consistently extended with intelim-rules for double negation, by adding – redundantly though – primitive singulary (one-place) operators ∆ (double-negation elimination) and ∇ (double-negation introduction), supposed to make up, equationally, an inversion, i.e., ` ∇(∆(c)) = c, ` ∆(∇(a)) = a, for ` a : A and ` c : ¬¬A. (As an instructive exercise in notational relativity, the reader may try to formulate a similar ‘typed’ calculus, based on the primitive signature [f,∨,¬], with → [material implication] replaced by ∨ [classical disjunction], and prove equational equivalence for the corresponding double-negation extensions.)
Instead of the elementary pair [∂,?], meant to characterise classical negation,
on the primitive signature [f,→,¬], Gentzen chose, on the signature [→,¬] – like
Jaśkowski, before him –, a single, more complex, primitive proof-operator for
classical negation, χz:¬A.(d,c), say,[14] with ` χz:¬A.(d,c) : A, if [z:¬A] ` d :
¬C and [z:¬A] ` c : C, definable by χz:¬A.(d,c) := ∂z:¬A.(d?c). Given the
fact that he meant to have classical arithmetic [PA] in the system, the constant
f was available as a false arithmetic proposition [f := (0 ≐ 1)], with t ≡ ¬f
witnessed by a constant Ω, say, as an instance of the PA-axiom stating that 0
is not a successor of anything else. In this setting, one can define both ∂ and ?
explicitly, by ∂z.e := χz:¬A.(Ω,e), and c?a := χz:t.(c,a), where z is not free in
c and a. If, moreover, Ω is subjected to the equational conditions ` Ω ? e = e, for ` e : f, and ` c = Ω, for ` c : t (i.e., t has a unique witness), we can easily find equational conditions (βχ) and (ηχ), resp., making both systems equationally equivalent (in the presence of the two conditions for Ω).[15]
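As a quick sanity check – the reviewer’s, not Gentzen’s – the composite definitions behave as expected: within the λ∂-calculus extended with Ω and the two conditions just stated (read schematically), re-defining ∂ from χ and Ω gives back the original operator,

\[
\chi z{:}\neg A.(\Omega, e) \;=\; \partial z{:}\neg A.\,(\Omega \,?\, e) \;=\; \partial z{:}\neg A.\,e,
\]

the first step by the definition $\chi z{:}\neg A.(d,c) := \partial z{:}\neg A.(d \,?\, c)$, the second by the condition $\vdash \Omega \,?\, e = e$, for $e : \mathsf{f}$.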
In the end – if viewed as a reduction system, by orienting the equations from
left to right, say –, Gentzen’s classical λχΩ-calculus, based on [λ,.,χ,Ω], is not
more difficult to handle than the λ∂-calculus, based on [λ,.,∂,?] (where, other-
wise, Ω is explicitly definable), although the corresponding proofs of confluence
and normalisation are easier to obtain in the latter case.[16]
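Viewed that way – again in the reviewer’s notation, not Gentzen’s – the basic ‘détour-elimination’ rules of the λ∂-calculus are just the above equations read from left to right:

\[
(\beta\lambda)\ \ (\lambda x.\,b[x]) \mathbin{.} a \;\rhd\; b[x{:=}a], \qquad\quad (\beta\partial)\ \ c \,?\, \partial z.\,e[z] \;\rhd\; e[z{:=}c],
\]
\[
(\eta\lambda)\ \ \lambda x.(c \mathbin{.} x) \;\rhd\; c \ \ (x \text{ not free in } c), \qquad (\eta\partial)\ \ \partial z.(z \,?\, a) \;\rhd\; a \ \ (z \text{ not free in } a).
\]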
The technical remarks above give a good idea as to the way Gentzen used to
think about the proof theory of classical logic around 1932, i.e., before actually
completing his Göttingen PhD Dissertation (May 1933).
There is not very much to add about the extensions to quantifiers, as the matter is relatively well known from the published papers of Gentzen.[17]
For some reason unknown to the reviewer, Gentzen decided to prove (β-) normalisation for intuitionistic logic and to transfer the result (consistency) to classical logic via a so-called ‘double-negation’ translation (which he invented independently of Kolmogorov and Gödel),
[14] Corresponding to his classical rule RA2 [reductio ad absurdum]. Cf. the note on Five different forms of natural calculi of 23 September 1932, item [4] in the von Plato collection, pp. 113–115, and the later references to ‘the hillock theorem’ appearing passim in the text.
[15] Specifically, abbreviating d ?A c := χz:¬A.(d,c), and z:¬C.a[z] := χz:¬C.(z,a[z]), one has (βχ) ` f ?A χy:¬B.(d[y],c[y]) = χz:¬A.(d[x:=f],c[x:=f]), for [x:¬A] ` f : ¬B, and (ηχ) ` z:¬A.a = a, if z is not free in a, where ` a : A. Note also that, with εz:¬A.a[z] := ∂z:¬A.(z?a[z]), and $A(e) := ∂z:¬A.e (z not free in e), as ever, where ∂ is defined in terms of χ and Ω, as above, we have also ` εz:¬A.a[z] = z:¬A.a[z], and ` ∂z:¬A.e[z] = εz:¬A.$A(e[z]), as expected. Gentzen is neatly superior to Jaśkowski in this respect, as he was also concerned explicitly with proof-complexity matters (proof-reduction) and proof-isomorphisms. Without the latter kind of considerations, one has just a bare proof-notation (as in Jaśkowski 1934).
[16] As a bonus, the λ∂-calculi (with or without double-negation proof-primitives) can also be formulated in ‘type-free’ style, in which case consistency is straightforward, since they can both be interpreted in the (‘type-free’) λπ-calculus (the extended λ-calculus with ‘surjective’ pairing), known to be consistent.
[17] One could mention the fact that the ‘elimination’ rule for the (classical) existential quantifier, stated in 1932, is slightly different from the ‘standard’ (intuitionistically valid) rule chosen later on. As regards arithmetic [PA], Gentzen used, explicitly, only the Induction rule [CI, short for ‘complete induction’]. But this is a different subject.

whereas the consistency of classical logic
was handled separately by using a different formalism (an L-system with se-
quents ‘multiple on the right’) and a typically formularian way of reasoning
(‘sub-formula property’, derived from the ‘Hauptsatz’, for the latter system).
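For the reader’s convenience – and in the now-standard Gödel–Gentzen form, not necessarily Gentzen’s own 1933 formulation – the clauses of such a translation run roughly as follows, writing $A^{*}$ for the translation of $A$:

\[
\begin{array}{ll}
P^{*} := \neg\neg P \ \ (P \text{ atomic}), & (\neg A)^{*} := \neg A^{*},\\[2pt]
(A \wedge B)^{*} := A^{*} \wedge B^{*}, & (A \to B)^{*} := A^{*} \to B^{*},\\[2pt]
(A \vee B)^{*} := \neg(\neg A^{*} \wedge \neg B^{*}), & (\exists x\, A)^{*} := \neg \forall x\, \neg A^{*},\\[2pt]
(\forall x\, A)^{*} := \forall x\, A^{*}. &
\end{array}
\]

A formula $A$ is then provable classically iff $A^{*}$ is provable intuitionistically, so that the consistency of the classical system reduces to that of the intuitionistic one.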
This induced, later on, the superstition that an L-system is more ‘suited’
for classical logic than a ‘natural deduction’ presentation of it (allegedly ‘suited’
for intuitionism). In fact, if properly understood, classical logic admits of more
transparent and simpler proof-calculi derived from ‘natural deduction’ presentations than is the case for intuitionistic logic.
Gentzen’s appeal to L-systems (with sequents ‘multiple on the right’), for
the classical case, is equally questionable in view of the fact that the original
task was to obtain a consistency proof for PA. The Hauptsatz is, however, not
forthcoming in the presence of (non-logical) axioms, so that, in retrospect, the
use of L-systems (‘multiple on the right’) looks somewhat pointless, in this
respect. On the other hand, for logic alone, Gentzen’s Hauptsatz is nothing but
a roundabout way of obtaining a (β-) normalisation result (William Tait, circa
1968).
Finally, even though the book discussed here does not explicitly address every (general) reader interested in modern logic and its history, it is, no doubt, relevant both for the contemporary researcher in proof theory, who (as suggested in the technical comments above) may find a source of inspiration in Gentzen’s unpublished notes of the thirties, and for the philosopher concerned with epistemological aspects of modern logic.


© 2017 Adrian Rezuş (Nijmegen, The Netherlands)

© 2017 équivalences (Nijmegen, The Netherlands) [(pdf)LaTeX]

draft: September 13, 2017; last revised: October 17, 2017


printed in the netherlands
