Justification Logic

Classical logic is concerned, loosely, with the behavior of truths. Epistemic logic
similarly is about the behavior of known or believed truths. Justification logic is a
theory of reasoning that enables the tracking of evidence for statements and therefore
provides a logical framework for the reliability of assertions. This book, the first in the
area, is a systematic account of the subject, progressing from modal logic through to
the establishment of an arithmetic interpretation of intuitionistic logic. The
presentation is mathematically rigorous but in a style that will appeal to readers from a
wide variety of areas to which the theory applies. These include mathematical logic,
artificial intelligence, computer science, philosophical logic and epistemology,
linguistics, and game theory.

Sergei Artemov is Distinguished Professor at the City University of New York.
He is a specialist in mathematical logic, logic in computer science, control theory,
epistemology, and game theory. He is credited with solving long-standing problems in
constructive logic that had been left open by Gödel and Kolmogorov since the 1930s.
He has pioneered studies in the logic of proofs and justifications that render a new,
evidence-based theory of knowledge and belief. The most recent focus of his interests
is epistemic foundations of game theory.

Melvin Fitting is Professor Emeritus at the City University of New York. He has
written or edited a dozen books and has worked in intensional logic, semantics for
logic programming, theory of truth, and tableau systems for nonclassical logics. In
2012 he received the Herbrand Award from the Conference on Automated Deduction.
He was on the faculty of the City University of New York from 1969 to his retirement
in 2013, at Lehman College, and at the Graduate Center, where he was in the
Departments of Mathematics, Computer Science, and Philosophy.
CAMBRIDGE TRACTS IN MATHEMATICS

GENERAL EDITORS
B. BOLLOBÁS, W. FULTON, F. KIRWAN,
P. SARNAK, B. SIMON, B. TOTARO
A complete list of books in the series can be found at www.cambridge.org/mathematics.

Recent titles include the following:

181. Totally Positive Matrices. By A. Pinkus
182. Nonlinear Markov Processes and Kinetic Equations. By V. N. Kolokoltsov
183. Period Domains over Finite and p-adic Fields. By J.-F. Dat, S. Orlik, and M. Rapoport
184. Algebraic Theories. By J. Adámek, J. Rosický, and E. M. Vitale
185. Rigidity in Higher Rank Abelian Group Actions I: Introduction and Cocycle Problem.
By A. Katok and V. Niţică
186. Dimensions, Embeddings, and Attractors. By J. C. Robinson
187. Convexity: An Analytic Viewpoint. By B. Simon
188. Modern Approaches to the Invariant Subspace Problem. By I. Chalendar
and J. R. Partington
189. Nonlinear Perron–Frobenius Theory. By B. Lemmens and R. Nussbaum
190. Jordan Structures in Geometry and Analysis. By C.-H. Chu
191. Malliavin Calculus for Lévy Processes and Infinite-Dimensional Brownian Motion.
By H. Osswald
192. Normal Approximations with Malliavin Calculus. By I. Nourdin and G. Peccati
193. Distribution Modulo One and Diophantine Approximation. By Y. Bugeaud
194. Mathematics of Two-Dimensional Turbulence. By S. Kuksin and A. Shirikyan
195. A Universal Construction for Groups Acting Freely on Real Trees. By I. Chiswell
and T. Müller
196. The Theory of Hardy’s Z-Function. By A. Ivić
197. Induced Representations of Locally Compact Groups. By E. Kaniuth and K. F. Taylor
198. Topics in Critical Point Theory. By K. Perera and M. Schechter
199. Combinatorics of Minuscule Representations. By R. M. Green
200. Singularities of the Minimal Model Program. By J. Kollár
201. Coherence in Three-Dimensional Category Theory. By N. Gurski
202. Canonical Ramsey Theory on Polish Spaces. By V. Kanovei, M. Sabok, and J. Zapletal
203. A Primer on the Dirichlet Space. By O. El-Fallah, K. Kellay, J. Mashreghi,
and T. Ransford
204. Group Cohomology and Algebraic Cycles. By B. Totaro
205. Ridge Functions. By A. Pinkus
206. Probability on Real Lie Algebras. By U. Franz and N. Privault
207. Auxiliary Polynomials in Number Theory. By D. Masser
208. Representations of Elementary Abelian p-Groups and Vector Bundles. By D. J. Benson
209. Non-homogeneous Random Walks. By M. Menshikov, S. Popov, and A. Wade
210. Fourier Integrals in Classical Analysis (Second Edition). By C. D. Sogge
211. Eigenvalues, Multiplicities and Graphs. By C. R. Johnson and C. M. Saiago
212. Applications of Diophantine Approximation to Integral Points and Transcendence.
By P. Corvaja and U. Zannier
213. Variations on a Theme of Borel. By S. Weinberger
214. The Mathieu Groups. By A. A. Ivanov
215. Slender Modules and Rings I: Foundations. By R. Dimitric
Justification Logic
Reasoning with Reasons

SERGEI ARTEMOV
Graduate Center, City University of New York

MELVIN FITTING
Graduate Center, City University of New York
University Printing House, Cambridge CB2 8BS, United Kingdom
One Liberty Plaza, 20th Floor, New York, NY 10006, USA
477 Williamstown Road, Port Melbourne, VIC 3207, Australia
314–321, 3rd Floor, Plot 3, Splendor Forum, Jasola District Centre,
New Delhi – 110025, India
79 Anson Road, #06–04/06, Singapore 079906

Cambridge University Press is part of the University of Cambridge.
It furthers the University’s mission by disseminating knowledge in the pursuit of
education, learning, and research at the highest international levels of excellence.

www.cambridge.org
Information on this title: www.cambridge.org/9781108424912
DOI: 10.1017/9781108348034
© Sergei Artemov and Melvin Fitting 2019
This publication is in copyright. Subject to statutory exception
and to the provisions of relevant collective licensing agreements,
no reproduction of any part may take place without the written
permission of Cambridge University Press.
First published 2019
Printed and bound in Great Britain by Clays Ltd, Elcograf S.p.A.
A catalogue record for this publication is available from the British Library.
Library of Congress Cataloging-in-Publication Data
Names: Artemov, S. N., author. | Fitting, Melvin, 1942- author.
Title: Justification logic : reasoning with reasons / Sergei Artemov
(Graduate Center, City University of New York), Melvin Fitting (Graduate
Center, City University of New York).
Description: Cambridge ; New York, NY : Cambridge University Press, 2019. |
Series: Cambridge tracts in mathematics ; 216 | Includes bibliographical
references and index.
Identifiers: LCCN 2018058431 | ISBN 9781108424912 (hardback : alk. paper)
Subjects: LCSH: Logic, Symbolic and mathematical. | Inquiry (Theory of
knowledge) | Science–Theory reduction. | Reasoning.
Classification: LCC QA9 .A78 2019 | DDC 511.3–dc23
LC record available at https://lccn.loc.gov/2018058431
ISBN 978-1-108-42491-2 Hardback
Cambridge University Press has no responsibility for the persistence or accuracy
of URLs for external or third-party internet websites referred to in this publication
and does not guarantee that any content on such websites is, or will remain,
accurate or appropriate.
To our wives, Lena and Roma.
Contents

Introduction page x
1 What Is This Book About? xii
2 What Is Not in This Book? xvii
1 Why Justification Logic? 1
1.1 Epistemic Tradition 1
1.2 Mathematical Logic Tradition 4
1.3 Hyperintensionality 8
1.4 Awareness 9
1.5 Paraconsistency 10
2 The Basics of Justification Logic 11
2.1 Modal Logics 11
2.2 Beginning Justification Logics 12
2.3 J0 , the Simplest Justification Logic 14
2.4 Justification Logics in General 15
2.5 Fundamental Properties of Justification Logics 20
2.6 The First Justification Logics 23
2.7 A Handful of Less Common Justification Logics 27
3 The Ontology of Justifications 31
3.1 Generic Logical Semantics of Justifications 31
3.2 Models for J0 and J 36
3.3 Basic Models for Positive and Negative Introspection 38
3.4 Adding Factivity: Mkrtychev Models 39
3.5 Basic and Mkrtychev Models for the Logic of Proofs LP 42
3.6 The Inevitability of Possible Worlds: Modular Models 42
3.7 Connecting Justifications, Belief, and Knowledge 45
3.8 History and Commentary 46

4 Fitting Models 48
4.1 Modal Possible World Semantics 48
4.2 Fitting Models 49
4.3 Soundness Examples 52
4.4 Canonical Models and Completeness 60
4.5 Completeness Examples 65
4.6 Formulating Justification Logics 72
5 Sequents and Tableaus 75
5.1 Background 75
5.2 Classical Sequents 76
5.3 Sequents for S4 79
5.4 Sequent Soundness, Completeness, and More 81
5.5 Classical Semantic Tableaus 84
5.6 Modal Tableaus for K 90
5.7 Other Modal Tableau Systems 91
5.8 Tableaus and Annotated Formulas 93
5.9 Changing the Tableau Representation 95
6 Realization – How It Began 100
6.1 The Logic LP 100
6.2 Realization for LP 103
6.3 Comments 108
7 Realization – Generalized 110
7.1 What We Do Here 110
7.2 Counterparts 112
7.3 Realizations 113
7.4 Quasi-Realizations 116
7.5 Substitution 118
7.6 Quasi-Realizations to Realizations 120
7.7 Proving Realization Constructively 126
7.8 Tableau to Quasi-Realization Algorithm 128
7.9 Tableau to Quasi-Realization Algorithm Correctness 131
7.10 An Illustrative Example 133
7.11 Realizations, Nonconstructively 135
7.12 Putting Things Together 138
7.13 A Brief Realization History 139
8 The Range of Realization 141
8.1 Some Examples We Already Discussed 141
8.2 Geach Logics 142
8.3 Technical Results 144
8.4 Geach Justification Logics Axiomatically 147
8.5 Geach Justification Logics Semantically 149
8.6 Soundness, Completeness, and Realization 150
8.7 A Concrete S4.2/JT4.2 Example 152
8.8 Why Cut-Free Is Needed 155
9 Arithmetical Completeness and BHK Semantics 158
9.1 Arithmetical Semantics of the Logic of Proofs 158
9.2 A Constructive Canonical Model for the Logic of Proofs 161
9.3 Arithmetical Completeness of the Logic of Proofs 165
9.4 BHK Semantics 174
9.5 Self-Referentiality of Justifications 179
10 Quantifiers in Justification Logic 181
10.1 Free Variables in Proofs 182
10.2 Realization of FOS4 in FOLP 186
10.3 Possible World Semantics for FOLP 191
10.4 Arithmetical Semantics for FOLP 212
11 Going Past Modal Logic 222
11.1 Modeling Awareness 223
11.2 Precise Models 225
11.3 Justification Awareness Models 226
11.4 The Russell Scenario as a JAM 228
11.5 Kripke Models and Master Justification 231
11.6 Conclusion 233
References 234
Index 244
Introduction

Why is this thus? What is the reason of this thusness?1

Modal operators are commonly understood to qualify the truth status of a
proposition: necessary truth, proved truth, known truth, believed truth, and so
on. The ubiquitous possible world semantics for it characterizes things in uni-
versal terms: □X is true in some state if X is true in all accessible states,
where various conditions on accessibility are used to distinguish one modal
logic from another. Then □(X → Y) → (□X → □Y) is valid, no matter what
conditions are imposed, by a simple and direct argument using universal quan-
tification. Suppose both □(X → Y) and □X are true at an arbitrary state. Then
both X and X → Y are true at all accessible states, whatever “accessible” may
mean. By the usual understanding of →, Y is true at all accessible states too,
and so □Y is true at the arbitrary state we began with. Although arguments like
these have a strictly formal nature and are studied as modal model theory, they
also give us some insights into our informal, everyday use of modalities. Still,
something is lacking.
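
The universal-quantification argument above can be made computational. The following small Python fragment is ours, not the book's: a minimal sketch, with a hypothetical three-state model (the names states, access, val, and true_at are our own), of how □ is evaluated by quantifying over accessible states, so that the distribution principle comes out true at every state no matter which accessibility relation is chosen.

# A hypothetical Kripke model: states, an accessibility relation, and a valuation.
states = {"w0", "w1", "w2"}
access = {"w0": {"w1", "w2"}, "w1": {"w1"}, "w2": set()}
val = {"X": {"w1", "w2"}, "Y": {"w1", "w2"}}

def true_at(state, formula):
    """Evaluate a formula given as nested tuples: ('atom', p), ('imp', a, b), ('box', a)."""
    kind = formula[0]
    if kind == "atom":
        return state in val[formula[1]]
    if kind == "imp":
        return (not true_at(state, formula[1])) or true_at(state, formula[2])
    if kind == "box":  # universal quantification over accessible states
        return all(true_at(s, formula[1]) for s in access[state])
    raise ValueError(kind)

# box(X -> Y) -> (box X -> box Y) is true at every state, whatever access is.
X, Y = ("atom", "X"), ("atom", "Y")
k_axiom = ("imp", ("box", ("imp", X, Y)), ("imp", ("box", X), ("box", Y)))
print(all(true_at(w, k_axiom) for w in states))  # prints True
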
Suppose we think of □ as epistemic, and to emphasize this we use K instead
of □ for the time being. For some particular X, if you assert the colloquial
counterpart of KX, that is, if you say you know X, and I ask you why you know
X, you would never tell me that it is because X is true in all states epistemically
compatible with this one. You would, instead, give me some sort of explicit
reason: “I have a mathematical proof of X,” or “I read X in the encyclopedia,”
or “I observed that X is the case.” If I asked you why K(X → Y) → (KX →
KY) is valid you would probably say something like “I could use my reason
for X and combine it with my reason for X → Y, and infer Y.” This, in effect,
would be your reason for Y, given that you had reasons for X and for X → Y.
1 Charles Farrar Browne (1834–1867) was an American humorist who wrote under the pen
name Artemus Ward. He was a favorite writer of Abraham Lincoln, who would read his
articles to his Cabinet. This quote is from a piece called Moses the Sassy, Ward (1861).


Notice that this neatly avoids the logical omniscience problem: that we know
all the consequences of what we know. It replaces logical omniscience with the
more acceptable claim that there are reasons for the consequences of what we
know, based on the reasons for what we know, but reasons for consequences are
more complicated things. In our example, the reason for Y has some structure
to it. It combines reasons for X, reasons for X → Y, and inference as a kind of
operation on reasons. We will see more examples of this sort; in fact, we have
just seen a fundamental paradigm.
In place of a modal operator, □, justification logics have a family of justifi-
cation terms, informally intended to represent reasons, or justifications. Instead
of □X we will see t:X, where t is a justification term and the formula is read “X
is so for reason t,” or more briefly, “t justifies X.” At a minimum, justification
terms are built up from justification variables, standing for arbitrary justifica-
tions. They are built up using a set of operations that, again at a minimum,
contains a binary operation ·. For example, x · (y · x) is a justification term,
where x and y are justification variables. The informal understanding of · is
that t · u justifies Y provided t justifies an implication with Y as its consequent,
and u justifies the antecedent. In justification logics the counterpart of

□(X → Y) → (□X → □Y)

is
t:(X → Y) → (u:X → [t · u]:Y)

where, as we will often do, we have added square brackets to enhance read-
ability. Note that this exactly embodies the informal explanation we gave in
the previous paragraph for the validity of K(X → Y) → (KX → KY). That
is, Y has a justification built from justifications for X and for X → Y using an
inference that amounts to a modus ponens application—we can think of the
· operation as an abstract representation of this inference. Other behaviors of
modal operators, □X → □□X for instance, will require operators in addition
to ·, and appropriate postulated behavior, in order to produce justification log-
ics that correspond to modal logics in which □X → □□X is valid. Examples,
general methods for doing this, and what it means to “correspond” all will be
discussed during the course of this book.
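
To make the shape of this principle more tangible, here is a small sketch of our own, not an official semantics from the book: justification terms and formulas are represented as Python data, the evidence dictionary is a hypothetical assignment of formulas to the terms t and u, and apply_terms implements the stated reading of the application operation.

from dataclasses import dataclass

# Formulas: atoms and implications, kept purely syntactic.
@dataclass(frozen=True)
class Atom:
    name: str

@dataclass(frozen=True)
class Imp:
    antecedent: object
    consequent: object

# A toy evidence store: which formulas each justification term is taken to justify.
evidence = {
    "t": {Imp(Atom("X"), Atom("Y"))},   # assume t justifies X -> Y
    "u": {Atom("X")},                   # assume u justifies X
}

def apply_terms(t, u):
    """Reading of application: t.u justifies Y whenever t justifies some
    implication X -> Y and u justifies its antecedent X."""
    results = set()
    for f in evidence.get(t, set()):
        if isinstance(f, Imp) and f.antecedent in evidence.get(u, set()):
            results.add(f.consequent)
    return results

print(apply_terms("t", "u"))  # {Atom(name='Y')}
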
One more important point. Suppose X and Y are equivalent formulas, that is,
we have X ↔ Y. Then in any normal modal logic we will also have □X ↔ □Y.
Let us interpret the modal operator epistemically again, and write KX ↔ KY.
In fact, KX ↔ KY, when read in the usual epistemic way, can sometimes be
quite an absurd assertion. Consider some astronomically complicated tautol-
ogy X of classical propositional logic. Because it is a tautology, it is equivalent
to P ∨ ¬P, which we may take for Y. Y is hardly astronomically complicated.
However, because X ↔ Y, we will have KX ↔ KY. Clearly, we know Y es-
sentially by inspection and hence KY holds, while KX on the other hand will
involve an astronomical amount of work just to read it, let alone to verify it.
Informally we see that, while both X and Y are tautologies, and so both are
knowable in principle, any justification we might give for knowing one, com-
bined with quite a lot of formula manipulation, can give us some justification
for knowing the other. The two justifications may not, indeed will not, be the
same. One is simple, the other very complex.
Modal logic is about propositions. Propositions are, in a sense, the content
of formulas. Propositions are not syntactical objects. “It’s good to be the king”
and “Being the king is good” express the same proposition, but not in the same
way. Justifications apply to formulas. Equivalent formulas determine the same
proposition, but can be quite different as formulas. Syntax must play a funda-
mental role for us, and you will see that it does, even in our semantics. Consider
one more very simple example. A → (A ∧ A) is an obvious tautology. We might
expect KA → K(A ∧ A). But we should not expect t:A → t:(A ∧ A). If t does,
in fact, justify A, a justification of A ∧ A may involve t, but also should in-
volve facts about the redundancy of repetition; t by itself cannot be expected
to suffice.
Modal logics can express, more or less accurately, how various modal opera-
tors behave. This behavior is captured axiomatically by proofs, or semantically
using possible world reasoning. These sorts of justifications for modal operator
behavior are not within a modal logic, but are outside constructs. Justification
logics, on the other hand, can represent the whys and wherefores of modal
behavior quite directly, and from within the formal language itself. We will
see that most standard modal logics have justification counterparts that can be
used to give a fine-grained, internal analysis of modal behavior. Perhaps, this
will help make clear why we used the quotation we did at the beginning of this
Introduction.

1 What Is This Book About?


How did justification logics originate? It is an interesting story, with reveal-
ing changes of direction along the way. Going back to the days when Gödel
was a young logician, there was a dream of finding a provability interpretation
for intuitionistic logic. As part of his work on that project, in Gödel (1933),
Gödel showed that one could abstract some of the key features of provabil-
ity and make a propositional modal logic using them. Then, remarkably but
naturally, one could embed propositional intuitionistic logic into the resulting
system. C. I. Lewis had pioneered the modern formal study of modal logics
(Lewis, 1918; Lewis and Langford, 1932), and Gödel observed that his system
was equivalent to the Lewis system S4. All modern axiomatizations of modal
logics follow the lines pioneered in Gödel’s note, while Lewis’s original for-
mulation is rarely seen today. Gödel showed that propositional intuitionistic
logic embedded into S4 using a mapping that inserted  in front of every sub-
formula. In effect, intuitionistic logic could be understood using classical logic
plus an abstract notion of provability: a propositional formula X is an intuition-
istic theorem if and only if the result of applying Gödel’s mapping is a theo-
rem of S4. (This story is somewhat simplified. There are several versions of
the Gödel translation—we have used the simplest one to describe. And Gödel
did not use the symbol □ but rather an operator Bew, short for beweisbar, or
provability in the German language. None of this affects our main points.) Un-
fortunately, the story breaks off at this point because Gödel also noted that S4
does not behave like formal provability (e.g., in arithmetic), by using the meth-
ods he had pioneered in his work on incompleteness. Specifically, S4 validates
□X → X, so in particular we have □⊥ → ⊥ (where ⊥ is falsehood). This is
equivalent to ¬□⊥, which is thus provable in S4. If we had an embedding of
S4 into formal arithmetic under which □ corresponded to Gödel’s arithmetic
formula representing provability, we would be able to prove in arithmetic that
falsehood was not provable. That is, we would be able to show provability of
consistency, violating Gödel’s second incompleteness theorem. So, work on an
arithmetic semantics for propositional intuitionistic logic paused for a while.

Although it did not solve the problem of a provability semantics for in-
tuitionistic logic, an important modal/arithmetic connection was eventually
worked out. One can define a modal logic by requiring that its validities are
those that correspond to arithmetic validities when reading □ as Gödel’s prov-
ability formula. It was shown in Solovay (1976) that this was a modal logic
already known in the literature, though as noted earlier, it is not S4. Today, the
logic is called GL, standing for Gödel–Löb logic. GL is like S4 except that the
T axiom □X → X, an essential part of S4, is replaced by a modal formula
abstractly representing Löb’s theorem: □(□X → X) → □X. S4 and GL are
quite different logics.

By now the project for finding an arithmetic interpretation of intuitionistic
logic had reached an impasse. Intuitionistic logic embedded into S4, but S4
did not embed into formal arithmetic. GL embedded into formal arithmetic,
but the Gödel translation does not embed intuitionistic logic into GL.

In his work on incompleteness for Peano arithmetic, Gödel gave a formula


Bew(x, y)

that represents the relation: x is the Gödel number of a proof of a formula with
Gödel number y. Then, a formal version of provability is
∃xBew(x, y)
which expresses that there is a proof of (the formula whose Gödel number is)
y. If this formula is what corresponds to □ in an embedding from a modal lan-
guage to Peano arithmetic, we get the logic GL. But in a lecture in 1938 Gödel
pointed out that we might work with explicit proof representatives instead of
with provability (Gödel, 1938). That is, instead of using an embedding trans-
lating every occurrence of □ by ∃xBew(x, y), we might associate with each oc-
currence of □ some formal term t that somehow represents a particular proof,
allowing different occurrences of □ to be associated with different terms t.
Then in the modal embedding, we could make the occurrence of □ associated
with t correspond to Bew(⌜t⌝, y), where ⌜t⌝ is a Gödel number for t. For each
occurrence of □ we would need to find some appropriate term t, and then each
occurrence of □ would be translated into arithmetic differently. The existential
quantifier in ∃xBew(x, y) has been replaced with a meta-existential quantifier,
outside the formal language. We provide an explicit proof term, rather than
just asserting that one exists. Gödel believed that this approach should lead to
a provability embedding of S4 into Peano arithmetic.
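
As an illustration, in our own notation rather than Gödel’s, the two readings of a single occurrence of □F are

∃xBew(x, ⌜F⌝)   versus   Bew(⌜t⌝, ⌜F⌝) for an explicit proof term t,

and under the explicit reading the distribution principle becomes the arithmetically verifiable

Bew(⌜s⌝, ⌜F → G⌝) ∧ Bew(⌜t⌝, ⌜F⌝) → Bew(⌜s · t⌝, ⌜G⌝),

where s · t stands for a suitable computable operation combining the two proofs; this operation is our anticipation of the application operation discussed earlier, not something attributed to Gödel’s lecture here.
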
Gödel’s proposal was not published until 1995 when Volume 3 of his col-
lected works appeared. By this time the idea of using a modal-like language
with explicit representatives for proofs had been rediscovered independently
by Sergei Artemov, see Artemov (1995, 2001). The logic that Artemov created
was called LP, which stood for logic of proofs. It was the first example of a
justification logic. What are now called justification terms were called proof
terms in LP.
Crucially, Artemov showed LP filled the gap between modal S4 and Peano
arithmetic. The connection with S4 is primarily embodied in a Realization
Theorem, which has since been shown to hold for a wide range of justification
logic, modal logic pairs. It will be extensively examined in this book. The con-
nection between LP and formal arithmetic is Artemov’s Arithmetic Complete-
ness Theorem, which also will be examined in this book. Its range is primarily
limited to the original justification logic, LP, and a few close relatives. This
should not be surprising, though. Gödel’s motivation for his formulation of S4
was that □ should embody properties of a formal arithmetic proof predicate.
This connection with arithmetic provability is not present for almost all modal
logics and is consequently also missing for corresponding justification logics,
when they exist. Nonetheless, the venerable goal of finding a provability in-
terpretation for propositional intuitionistic logic had been attained. The Gödel
translation embeds propositional intuitionistic logic into the modal logic S4.
The Realization Theorem establishes an embedding of S4 into the justification
logic LP. And the Arithmetic Completeness Theorem shows that LP embeds
into formal arithmetic.
It was recognized from the very beginning that the connection between S4
and LP could be weakened to sublogics of S4 and LP. Thus, there were justi-
fication logic counterparts for the standard modal logics, K, K4, T, and a few
others. These justification logics had arithmetic connections because they were
sublogics of LP. The use of proof term was replaced with justification term. Al-
though the connection with arithmetic was weaker than it had been with LP,
justification terms still had the role of supplying explicit justifications for epis-
temically necessary statements. One can consult Artemov (2008) and Artemov
and Fitting (2012) for survey treatments, though the present book includes the
material found there.
Almost all of the early work on justification logics was proof-theoretically
based. Realization theorems were shown constructively, making use of a se-
quent calculus. The existence of an algorithm to compute what are called re-
alizers is important, but this proof-theoretic approach limits the field to those
logics known to have sequent calculus proof systems. For a time it was hoped
that various extensions of sequent and tableau calculi would be useful and,
to some extent, this has been the case. The most optimistic version of this
hope was expressed in Artemov (2001) quite directly, “Gabbay’s Labelled De-
ductive Systems, Gabbay (1994), may serve as a natural framework for LP.”
Unfortunately this seems to have been too optimistic. While the formats had
similarities, the goals were different, and the machinery did not interact well.
A semantics for LP and its near relatives, not based on arithmetic provabil-
ity, was introduced in Mkrtychev (1997) and is discussed in Chapter 3. (A con-
structive version of the canonical model for LP with a completeness theorem
can be found already in Artemov (1995).) Mkrtychev’s semantics did not use
possible worlds and had a strong syntactic flavor. Possible worlds were added
to the mix in Fitting (2005), producing something that potentially applied much
more broadly than the earlier semantics. This is the subject of Chapter 4. Us-
ing this possible world semantics, a nonconstructive, semantic-based, proof of
realization was given. It was now possible to avoid the use of a sequent cal-
culus, though the algorithmic nature of realization was lost. More recently, a
semantics with a very simple structure was created, Artemov’s basic semantics
(Artemov, 2012). It is presented in Chapter 3. Its machinery is almost minimal
for the purpose. In this book, we will use possible world semantics to establish
very general realization results, but basic models will often be used when we
simply want to show some formula fails to be a theorem.
Though its significance was not properly realized at the time, in 2005 the
subject broadened when a justification logic counterpart of S5 was introduced
in Pacuit (2005) and Rubtsova (2006a, b), with a connecting realization theo-
rem. There was no arithmetical interpretation for this justification logic. Also
there is no sequent calculus for S5 of the standard kind, so the proof given for
realization was nonconstructive, using a version of the semantics from Fitting
(2005). The semantics needed some modification to what is called its evidence
function, and this turned out to have a greater impact than was first realized.
Eventually constructive proofs connecting S5 and its justification counterpart
were found. These made use of cut-free proof systems that were not exactly
standard sequent calculi. Still, the door to a larger room was beginning to open.
Out of the early studies of the logics of proofs and its variants a general
logical framework for reasoning about epistemic justification at large natu-
rally emerged, and the name, Justification Logic, was introduced (cf. Arte-
mov, 2008). Justification Logic is based on justification assertions, t:F, that are
read t is a justification for F, with a broader understanding of the word jus-
tification going beyond just mathematical proofs. The notion of justification,
which has been an essential component of epistemic studies since Plato, had
been conspicuously absent in the mathematical models of knowledge within
the epistemic logic framework. The Justification Logic framework fills in this
void.
In Fitting (2016a) the subject expanded abruptly. Using nonconstructive se-
mantic methods it was shown that the family of modal logics having justifica-
tion counterparts is infinite. The justification phenomenon is not the relatively
narrow one it first seemed to be. While that work was nonconstructive, there
are now cut-free proof systems of various kinds for a broader range of modal
logics than was once the case, and these have been used successfully to create
realization algorithms, in Kuznets and Goetschi (2012), for instance. It may be
that the very general proof methodologies of Fitting (2015) and especially Ne-
gri (2005) and Negri and von Plato (2001) will extend the constructive range
still further, perhaps even to the infinite family that nonconstructive methods
are known to work for. This is active current work.
Work on quantified justification logics exists, but the subject is considerably
behind its propositional counterpart. An important feature of justification log-
ics is that they can, in a very precise sense, internalize their own proofs. Doing
this for axioms is generally simple. Rules of inference are more of a problem.
Earlier we discussed a justification formula as a simple, representative exam-
ple: t:(X → Y) → (u:X → [t · u]:Y). This, in effect, internalizes the axiomatic
modus ponens rule. The central problem in developing quantified justification
logics was how to internalize the rule of universal generalization. It turned out
that the key was the clear separation between two roles played by individual
variables. On the one hand, they are formal symbols, and one can simply infer
∀xϕ(x) from a proof of ϕ(x). On the other hand, they can be thought of as open
for substitution, that is, throughout a proof one can replace free occurrences
of x with a term t to produce a new proof (subject to appropriate freeness of
substitution conditions, of course). These two roles for variables are actually
incompatible. It was the introduction of specific machinery to keep track of
which role a variable occurrence had that made possible the internalization of
proofs, and thus a quantified justification logic.
An axiomatic version of first-order LP was introduced in Artemov and Ya-
vorskaya (Sidon) (2011) and a possible world semantics for it in Fitting (2011a,
2014b). A connection with formal arithmetic was established. There is a con-
structive proof of a Realization Theorem, connecting first-order LP with first-
order S4. Unlike the propositional case, no nonconstructive proof is currently known.
The possible world semantics includes the familiar monotonicity condition on
world domains. It is likely that all this can be extended to a much broader range
of quantified modal logics than just first-order S4, provided monotonicity is
appropriate. A move to constant domain models, to quantified S5 in particu-
lar, has been made, and a semantics, but not yet a Realization Theorem, can
be found in Fitting and Salvatore (2018). Much involving quantification is still
uncharted territory.

This book will cover the whole range of topics just described. It will not do
so in the historical order that was followed in this Introduction, but will make
use of the clearer understanding that has emerged from study of the subject
thus far. We will finish with the current state of affairs, standing on the edge of
unknown lands. We hope to prepare some of you for the journey, should you
choose to explore further on your own.

2 What Is Not in This Book?


There are several historical works and pivotal developments in justification
logic that will not be covered in the book due to natural limitations, and in this
section we will mention them briefly. We are confident that other books and
surveys will do justice to these works in more detail.
Apart from Gödel’s lecture, Gödel (1938), which remained unpublished
until 1995 and thus could not influence development in this area, the first re-
sults and publications on the logic of proofs are dated 1992: a technical report,
Artemov and Straßen (1992), based on work done in January of 1992 in Bern,
and a conference presentation of this work at CSL’92 published in Springer
Lecture Notes in Computer Science as Artemov and Straßen (1993a). In this
work, the basic logic of proofs was presented: it had proof variables, and the
format t is a proof of F, but without operations on proofs. However, it already
had the first installment of the fixed-point arithmetical completeness construc-
tion together with an observation that, unlike provability logic, the logic of
proofs cannot be limited to one standard proof predicate “from the textbook”
or to any single-conclusion proof predicate.
This line was further developed in Artemov and Straßen (1993b), where the
logic of single-conclusion proof predicates (without operations on proofs) was
studied. This work introduced the unification axiom, which captures single-
conclusionness by propositional tools. After the full-scale logic of proofs with
operations had been discovered (Artemov, 1995), the logic of single-conclusion
proofs with operations was axiomatized in V. Krupski (1997, 2001). A similar
technique was used recently to characterize so-called sharp single-conclusion
justification models in Krupski (2018).
Another research direction pursued after the papers on the basic logic of
proofs was to combine provability and explicit proofs. Such a combination,
with new provability principles, was given in Artemov (1994). Despite its ti-
tle, this paper did not introduce what is known now as The Logic of Proofs,
but rather a fusion of the provability logic GL and the basic logic of proofs
without operations, but with new arithmetical principles combining proofs and
provability and an arithmetical completeness theorem. After the logic of proofs
paper (Artemov, 1995), the full-scale logic of provability and proofs (with op-
erations), LPP, was axiomatized and proved arithmetically complete in Sidon
(1997) and Yavorskaya (Sidon) (2001). A leaner logic combining provability
and explicit proofs, GLA, was introduced and proved arithmetically complete
in Nogina (2006, 2014b). Unlike LPP, the logic GLA did not use additional op-
erations on proofs other than those inherited from LP. Later, GLA was used to
find a complete classification of reflection principles in arithmetic that involve
provability and explicit proofs (Nogina, 2014a).
The first publication of the full-scale logic of proofs with operations, LP,
which became the first justification logic in the modern sense, was Artemov
(1995). This paper contains all the results needed to complete Gödel’s program
of characterizing intuitionistic propositional logic IPC and its BHK semantics
via proofs in classical arithmetic: internalization, the realization theorem for
S4 in LP, arithmetical semantics for LP, and the arithmetical completeness the-
orem. It took six years for the corresponding journal paper to appear: Artemov
(2001). In Goris (2008), the completeness of LP for the semantics of proofs in
Peano arithmetic was extended to the semantics of proofs in Buss’s bounded
arithmetic S¹₂. In view of applications in epistemology, this result shows that
explicit knowledge in the propositional framework can be made computation-
ally feasible. Kuznets and Studer (2016) extend the arithmetical interpretation
of LP from the original finite constant specifications to a wide class of con-
stant specifications, including some infinite ones. In particular, this “weak”
arithmetical interpretation captures the full logic of proofs LP with the total
constant specification.
Decidability of LP (with the total constant specification) was also estab-
lished in Mkrtychev (1997), and this opened the door to decidability and com-
plexity studies in justification logics using model-theoretic and other means.
Among the milestones are complexity estimates in Kuznets (2000), Brezh-
nev and Kuznets (2006), Krupski (2006a), Milnikel (2007), Buss and Kuznets
(2012), and Achilleos (2014a).
The arithmetical provability semantics for the Logic of Proofs, LP, natu-
rally generalizes to a first-order version with conventional quantifiers and to a
version with quantifiers over proofs. In both cases, axiomatizability questions
were answered negatively in Artemov and Yavorskaya (2001) and Yavorsky
(2001). A natural and manageable first-order version of the logic of proofs,
FOLP, has been studied in Artemov and Yavorskaya (Sidon) (2011), Fitting
(2014a), and Fitting and Salvatore (2018) and will be covered in Chapter 10.
Originally, the logic of proofs was formulated as a Hilbert-style axiomatic
system, but this has gradually broadened. Early attempts were tableau based
(which could equivalently be presented using sequent calculus machinery).
Tableaus generally are analytic, meaning that everything entering into a proof
is a subformula of what is being proved. This was problematic for attempts
at LP tableaus because of the presence of the · operation, which represented
an application of modus ponens, a rule that is decidedly not analytic. Suc-
cessful tableau systems, though not analytic, for LP and closely related logics
can be found in Fitting (2003, 2005) and Renne (2004, 2006). The analyticity
problem was overcome in Ghari (2014, 2016a). Broader proof systems have
been investigated: hypersequents in Kurokawa (2009, 2012), prefixed tableaus
in Kurokawa (2013), and labeled deductive systems in Ghari (2017). Indeed
some of this has led to new realization results (Artemov, 1995, 2001, 2002,
2006; Artemov and Bonelli, 2007; Ghari, 2012; Kurokawa, 2012).
Finding a computational reading of justification logics has been a natural
research goal. There were several attempts to use the ideas of LP for building
a lambda-calculus with internalization, cf. Alt and Artemov (2001), Artemov
(2002), Artemov and Bonelli (2007), Pouliasis and Primiero (2014), and oth-
ers. Corresponding combinatory logic systems with internalization were stud-
ied in Artemov (2004), Krupski (2006b), and Shamkanov (2011). These and
other studies can serve as a ground for further applications in typed program-
ming languages. A version of the logic of proofs with a built-in verification
predicate was considered in Protopopescu (2016a, b).
The aforementioned intuition that justification logic naturally avoids the log-
ical omniscience problem has been formalized and studied in Artemov and
Kuznets (2006, 2009, 2014). The key idea there was to view logical omni-
science as a proof complexity problem: The logical omniscience defect oc-
curs if an epistemic system assumes knowledge of propositions that have
no feasible proofs. Through this prism, standard modal logics are logically
omniscient (modulo some common complexity assumptions), and justification
logics are not logically omniscient. The ability of justification logic to track
proof complexity via time bounds led to another formal definition of logical
omniscience in Wang (2011a) with the same conclusion: Justification logic
keeps logical omniscience under control.
Shortly after the first paper on the logic of proofs, it became clear that the
logical tools developed are capable of evidence tracking in a general setting
and as such can be useful in epistemic logic. Perhaps, the first formal work in
this direction was Artemov et al. (1999), in which modal logic S5 was equiv-
alently modified and supplied with an LP-style explicit counterpart. Applica-
tions to epistemology have benefited greatly from Fitting semantics, which
connected justification logics to mainstream epistemology via possible worlds
models. In addition to applications discussed in this book, we would like to
mention some other influential work. Game semantics of justification logic
was studied in Renne (2008) and dynamic epistemic logic with justifications
in Renne (2008) and Baltag et al. (2014). In Sedlár (2013), Fitting semantics
for justification models was elaborated to a special case of the models of gen-
eral awareness. Multiagent justification logic and common knowledge has been
studied in Artemov (2006), Antonakos (2007), Yavorskaya (Sidon) (2008),
Bucheli et al. (2010, 2011), Bucheli (2012), Antonakos (2013), and Achilleos
(2014b, 2015a, b). In Dean and Kurokawa (2010), justification logic was used
for the analysis of Knower and Knowability paradoxes. A fast-growing and
promising area is probabilistic justification logic, cf. Milnikel (2014), Artemov
(2016b), Kokkinis et al. (2016), Ghari (2016b), and Lurie (2018).

We are deeply indebted to all contributors to the exciting justification logic
project, without whom there would not be this book.
Very special thanks to our devoted readers for their sharp eyes and their
useful comments: Vladimir Krupski, Vincent Alexis Peluce, and Tatiana Ya-
vorskaya (Sidon).

I think there is no sense in forming an opinion when there is no evidence to form it
on. If you build a person without any bones in him he may look fair enough to the
eye, but he will be limber and cannot stand up; and I consider that evidence is the
bones of an opinion.2

2 Mark Twain (1835–1910). The quote is from his last novel, Personal Recollections of Joan of
Arc, Twain (1896).
1
Why Justification Logic?

The formal details of justification logic will be presented starting with the next
chapter, but first we give some background and motivation for why the subject
was developed in the first place. We will see that it addresses, or at least par-
tially addresses, many of the fundamental problems that have been found in
epistemic logic over the years. We will also see in more detail how it relates
to our understanding of intuitionistic logic. And finally, we will see how it can
be used to mitigate some well-known issues that have arisen in philosophical
investigations.

1.1 Epistemic Tradition


The properties of knowledge and belief have been a subject for formal logic
at least since von Wright and Hintikka (Hintikka, 1962; von Wright, 1951).
Knowledge and belief are both treated as modalities in a way that is now very
familiar—Epistemic Logic. But of the celebrated three criteria for knowledge
(usually attributed to Plato), justified true belief (Gettier, 1963; Hendricks,
2005), epistemic modal logic really works with only two of them. Possible
worlds and indistinguishability model belief—one believes what is so under
all circumstances thought possible. Factivity brings a trueness component into
play—if something is not so in the actual world it cannot be known, only be-
lieved. But there is no representation for the justification condition. Nonethe-
less, the modal approach has been remarkably successful in permitting the de-
velopment of rich mathematical theory and applications (Fagin et al., 1995;
van Ditmarsch et al., 2007). Still, it is not the whole picture.
The modal approach to the logic of knowledge is, in a sense, built around
the universal quantifier: X is known in a situation if X is true in all situations
indistinguishable from that one. Justifications, on the other hand, bring an ex-
istential quantifier into the picture: X is known in a situation if there exists a
justification for X in that situation. This universal/existential dichotomy is a
familiar one to logicians—in formal logics there exists a proof for a formula
X if and only if X is true in all models for the logic. One thinks of models as
inherently nonconstructive, and proofs as constructive things. One will not go
far wrong in thinking of justifications in general as much like mathematical
proofs. Indeed, the first justification logic was explicitly designed to capture
mathematical proofs in arithmetic, something that will be discussed later.
In justification logic, in addition to the category of formulas, there is a sec-
ond category of justifications. Justifications are formal terms, built up from
constants and variables using various operation symbols. Constants represent
justifications for commonly accepted truths—axioms. Variables denote un-
specified justifications. Different justification logics differ on which operations
are allowed (and also in other ways too). If t is a justification term and X is a
formula, t:X is a formula, and is intended to be read

t is a justification for X.

One operation, common to all justification logics, is application, written
like multiplication. The idea is, if s is a justification for A → B and t is a
justification for A, then [s · t] is a justification for B.1 That is, the validity of the
following is generally assumed

s:(A → B) → (t:A → [s · t]:B). (1.1)

This is the explicit version of the usual distributivity of knowledge operators,
and modal operators generally, across implication

K(A → B) → (KA → KB). (1.2)

How adequately does the traditional modal form (1.2) embody epistemic
closure? We argue that it does so poorly! In the classical logic context, (1.2)
only claims that it is impossible to have both K(A → B) and KA true, but
KB false. However, because (1.2), unlike (1.1), does not specify dependencies
between K(A → B), KA, and KB, the purely modal formulation leaves room
for a counterexample.
The distinction between (1.1) and (1.2) can be exploited in a discussion of
the paradigmatic Red Barn Example of Goldman and Kripke; here is a simpli-
fied version of the story taken from Dretske (2005).
1 For better readability brackets will be used in terms, “[,]”, and parentheses in formulas, “(,).”
Both will be avoided when it is safe.
Suppose I am driving through a neighborhood in which, unbeknownst to me, papier-
mâché barns are scattered, and I see that the object in front of me is a barn. Because
I have barn-before-me percepts, I believe that the object in front of me is a barn. Our
intuitions suggest that I fail to know barn. But now suppose that the neighborhood
has no fake red barns, and I also notice that the object in front of me is red, so I
know a red barn is there. This juxtaposition, being a red barn, which I know, entails
there being a barn, which I do not, “is an embarrassment.”

In the first formalization of the Red Barn Example, logical derivation will
be performed in a basic modal logic in which □ is interpreted as the “belief”
modality. Then some of the occurrences of □ will be externally interpreted as
a knowledge modality K according to the problem’s description. Let B be the
sentence “the object in front of me is a barn,” and let R be the sentence “the
object in front of me is red.”

(1) □B, “I believe that the object in front of me is a barn.” At the metalevel,
by the problem description, this is not knowledge, and we cannot claim KB.
(2) □(B ∧ R), “I believe that the object in front of me is a red barn.” At the
metalevel, this is actually knowledge, e.g., K(B ∧ R) holds.
(3) □(B ∧ R → B), a knowledge assertion of a logical axiom. This is obviously
knowledge, i.e., K(B ∧ R → B).

Within this formalization, it appears that epistemic closure in its modal form
(1.2) is violated: K(B ∧ R) and K(B ∧ R → B) hold, whereas, by (1), we cannot
claim KB. The modal language here does not seem to help resolving this issue.
Next consider the Red Barn Example in justification logic where t:F is inter-
preted as “I believe F for reason t.” Let u be a specific individual justification
for belief that B, and v for belief that B ∧ R. In addition, let a be a justification
for the logical truth B ∧ R → B. Then the list of assumptions is

(i) u:B, “u is a reason to believe that the object in front of me is a barn”;
(ii) v:(B ∧ R), “v is a reason to believe that the object in front of me is a red
barn”;
(iii) a:(B ∧ R → B).

On the metalevel, the problem description states that (ii) and (iii) are cases of
knowledge, and not merely belief, whereas (i) is belief, which is not knowl-
edge. Here is how the formal reasoning goes:

(iv) a:(B ∧ R → B) → (v:(B ∧ R) → [a·v]:B), by principle (1.1);
(v) v:(B ∧ R) → [a·v]:B, from (iii) and (iv), by propositional logic;
(vi) [a·v]:B, from (ii) and (v), by propositional logic.
Notice that conclusion (vi) is [a · v]:B, and not u:B; epistemic closure holds.
By reasoning in justification logic it was concluded that [a·v]:B is a case of
knowledge, i.e., “I know B for reason a · v.” The fact that u:B is not a case of
knowledge does not spoil the closure principle because the latter claims knowl-
edge specifically for [a·v]:B. Hence after observing a red façade, I indeed know
B, but this knowledge has nothing to do with (i), which remains a case of belief
rather than of knowledge. The justification logic formalization represents the
situation fairly.
Tracking justifications represents the structure of the Red Barn Example in a
way that is not captured by traditional epistemic modal tools. The justification
logic formalization models what seems to be happening in such a case; closure
of knowledge under logical entailment is maintained even though “barn” is not
perceptually known.
One could devise a formalization of the Red Barn Example in a bimodal lan-
guage with distinct modalities for knowledge and belief. However, it seems that
such a resolution must involve reproducing justification tracking arguments in
a way that obscures, rather than reveals, the truth. Such a bimodal formal-
ization would distinguish u:B from [a · v]:B not because they have different
reasons (which reflects the true epistemic structure of the problem), but rather
because the former is labeled “belief” and the latter “knowledge.” But what
if one needs to keep track of a larger number of different unrelated reasons?
By introducing a multiplicity of distinct modalities and then imposing var-
ious assumptions governing the interrelationships between these modalities,
one would essentially end up with a reformulation of the language of justi-
fication logic itself (with distinct terms replaced by distinct modalities). This
suggests that there may not be a satisfactory “halfway point” between a modal
language and the language of justification logic, at least inasmuch as one tries
to capture the essential structure of examples involving the deductive nature of
knowledge.

1.2 Mathematical Logic Tradition


According to Brouwer, truth in constructive (intuitionistic) mathematics means
the existence of a proof, cf. Troelstra and van Dalen (1988). In 1931–34, Heyt-
ing and Kolmogorov gave an informal description of the intended proof-based
semantics for intuitionistic logic (Kolmogoroff, 1932; Heyting, 1934), which
is now referred to as the Brouwer–Heyting–Kolmogorov (BHK) semantics. Ac-
cording to the BHK conditions, a formula is “true” if it has a proof. Further-
more, a proof of a compound statement is connected to proofs of its compo-
nents in the following way:

• a proof of A ∧ B consists of a proof of proposition A and a proof of proposi-
tion B,
• a proof of A ∨ B is given by presenting either a proof of A or a proof of B,
• a proof of A → B is a construction transforming proofs of A into proofs of
B,
• falsehood ⊥ is a proposition, which has no proof; ¬A is shorthand for
A → ⊥.

This provides a remarkably useful informal way of understanding what is


and what is not intuitionistically acceptable. For instance, consider the classical
tautology (P ∨ Q) ↔ (P ∨ (Q ∧ ¬P)), where we understand ↔ as mutual
implication. And we understand ¬P as P → ⊥, so that a proof of ¬P would
amount to a construction converting any proof of P into a proof of ⊥. Because
⊥ has no proof, this amounts to a proof that P has no proof—a refutation of P.
According to BHK semantics the implication from right to left in (P ∨ Q) ↔
(P ∨ (Q ∧ ¬P)) should be intuitionistically valid, by the following argument.
Given a proof of P ∨ (Q ∧ ¬P) it must be that we are given a proof of one of
the disjuncts. If it is P, we have a proof of one of P ∨ Q. If it is Q ∧ ¬P, we
have proofs of both conjuncts, hence a proof of Q, and hence again a proof of
one of P ∨ Q. Thus we may convert a proof of P ∨ (Q ∧ ¬P) into a proof of
P ∨ Q.
On the other hand, (P ∨ Q) → (P ∨ (Q ∧ ¬P)) is not intuitionistically valid
according to the BHK ideas. Suppose we are given a proof of P ∨ Q. If we have
a proof of the disjunct P, we have a proof of P ∨ Q. But if we have a proof of
Q, there is no reason to suppose we have a refutation of P, and so we cannot
conclude we have a proof of Q ∧ ¬P, and things stop here.
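
Purely as an illustration of our own (this encoding is not in the text), the conversion just described can be written as a function on BHK-style proof objects, with tagged values for disjunction proofs, pairs for conjunction proofs, and Python functions for implication proofs:

# Toy encoding, assumed for illustration only:
#   a proof of A or B  is ("left", proof_of_A) or ("right", proof_of_B)
#   a proof of A and B is ("pair", proof_of_A, proof_of_B)
#   a proof of A -> B  is a Python function from proofs of A to proofs of B

def right_to_left(proof):
    """Convert a proof of P or (Q and not P) into a proof of P or Q."""
    tag = proof[0]
    if tag == "left":                 # given a proof of P
        return ("left", proof[1])
    if tag == "right":                # given a proof of Q and not P
        _, pair = proof
        q_proof = pair[1]             # extract the proof of Q; the refutation of P is unused
        return ("right", q_proof)
    raise ValueError("not a disjunction proof")

# No analogous total conversion exists in the other direction: from a proof
# ("right", q_proof) of P or Q, nothing yields the required refutation of P.
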
Kolmogorov explicitly suggested that the proof-like objects in his interpre-
tation (“problem solutions”) came from classical mathematics (Kolmogoroff,
1932). Indeed, from a foundational point of view this reflects Kolmogorov’s
and Gödel’s goal to define intuitionism within classical mathematics. From
this standpoint, intuitionistic mathematics is not a substitute for classical math-
ematics, but helps to determine what is constructive in the latter.
The fundamental value of the BHK semantics for the justification logic
project is that informally but unambiguously BHK suggests treating justifi-
cations, here mathematical proofs, as objects with operations.
In Gödel (1933), Gödel took the first step toward developing a rigorous
proof-based semantics for intuitionism. Gödel considered the classical modal
logic S4 to be a calculus describing properties of provability:
(1) Axioms and rules of classical propositional logic,
(2) □(F → G) → (□F → □G),
(3) □F → F,
(4) □F → □□F,
(5) Rule of necessitation: from ⊢ F infer ⊢ □F.

Based on Brouwer’s understanding of logical truth as provability, Gödel de-
fined a translation tr(F) of the propositional formula F in the intuitionistic lan-
guage into the language of classical modal logic: tr(F) is obtained by prefixing
every subformula of F with the provability modality . Informally speaking,
when the usual procedure of determining classical truth of a formula is applied
to tr(F), it will test the provability (not the truth) of each of F’s subformulas,
in agreement with Brouwer’s ideas. From Gödel’s results and the McKinsey-
Tarski work on topological semantics for modal logic (McKinsey and Tarski,
1948), it follows that the translation tr(F) provides a proper embedding of the
Intuitionistic Propositional Calculus, IPC, into S4, i.e., an embedding of intu-
itionistic logic into classical logic extended by the provability operator.

IPC ⊢ F ⇔ S4 ⊢ tr(F). (1.3)


Conceptually, this defines IPC in S4.
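As a concrete illustration of how the translation operates, tr is a structural recursion that wraps every subformula in the modality. The Haskell sketch below uses a formula datatype and constructor names of our own choosing; it is meant only to make the recursion explicit, not to reproduce notation from this book.

-- Formulas of propositional modal logic; Box stands for the provability modality □.
data Formula
  = Var String
  | Bot
  | And Formula Formula
  | Or  Formula Formula
  | Imp Formula Formula
  | Box Formula
  deriving (Show, Eq)

-- Gödel's translation: prefix every subformula with Box.
tr :: Formula -> Formula
tr (Var p)   = Box (Var p)
tr Bot       = Box Bot
tr (And f g) = Box (And (tr f) (tr g))
tr (Or  f g) = Box (Or  (tr f) (tr g))
tr (Imp f g) = Box (Imp (tr f) (tr g))
tr (Box f)   = Box (Box (tr f))   -- intuitionistic inputs contain no Box, so this case never fires

For instance, tr (Imp (Var "P") (Var "Q")) evaluates to Box (Imp (Box (Var "P")) (Box (Var "Q"))), in which the implication and both of its atoms have been prefixed by the modality.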
Still, Gödel’s original goal of defining intuitionistic logic in terms of classical provability was not reached, because the connection of S4 to the usual mathematical notion of provability was not established. Moreover, Gödel noted that the straightforward idea of interpreting the modality □F as “F is provable in a given formal system T” contradicted his second incompleteness theorem. Indeed, □(□F → F) can be derived in S4 by the rule of necessitation from the axiom □F → F. On the other hand, interpreting □ as the predicate of formal provability in T and taking F to be a contradiction ⊥ turns □(□⊥ → ⊥) into the false statement that the consistency of T is provable within T itself.
The situation after Gödel (1933) can be described by the following figure, where “X ↪ Y” should be read as “X is interpreted in Y”:

IPC ↪ S4 ↪ ? ↪ CLASSICAL PROOFS.

In a public lecture in Vienna in 1938, Gödel observed that using the format of
explicit proofs
t is a proof of F (1.4)
can help in interpreting his provability calculus S4 (Gödel, 1938). Unfortu-
nately, Gödel (1938) remained unpublished until 1995, by which time the