1. Knowledge
Metaphysics is the study of what there is, what exists, and how we know that it exists. The
ancients described it as the problem of Being. We cannot know what there is without knowing
how we can know anything.
These circular definitions and inferences need not be vicious circles. They may simply be
a coherent set of ideas that we use to describe ourselves and the external world. If the
descriptions are logically valid and/or verifiable empirically, we think we are approaching the
"truth" about things and acquiring knowledge.
How then do we describe knowledge itself, as an existing thing in our existing minds and in the existing external world? Information philosophy does so by basing everything on the abstract
but quantitative notion of information.
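To illustrate this quantitative notion: information is commonly measured in Shannon's bits, where an outcome's unpredictability determines how much information it carries. The following sketch is an added illustration, not part of the original argument:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin carries less, because its outcome is more predictable.
print(shannon_entropy([0.9, 0.1]))   # less than 1 bit
```

The more predictable a structure or message, the less new information it contains; this is the sense in which information is "abstract but quantitative."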
Information is stored or encoded in physical and biological structures. Structures in the world
build themselves, following natural laws, including physical and biological laws. Structures in
the mind are partly built by biological processes and partly built by human intelligence, which
is free, creative, and unpredictable.
For information philosophy, knowledge is information created and stored in minds and in human
artifacts like stories, books, and internetworked computers.
Knowledge is actionable information that forms the basis for thoughts and actions.
Knowledge includes all the cultural information created by human societies. We call it the Sum.
It includes the theories and experiments of scientists, who collaborate to establish our knowledge
of the external world. Scientific knowledge comes the closest of any knowledge to being
independent of any human mind, though it is still dependent on an open interdependent
community of fundamentally subjective inquirers.
Analytic language philosophers have a much narrower definition of knowledge. They identify it
with language, logic, and human beliefs. For them, epistemology has been reduced to the "truth"
of statements and propositions that can be logically analyzed and validated.
Our English word for knowledge comes from the Indo-European and later Greek gno as in
gnosis. In Greek it meant a mark or token that was familiar and immediately recognizable, with
an act of cognition or cognizance. It gives us the word ken (our close relatives are "kin"), the
German cognate kennen, and the French connaissance.
The analytic definition of knowledge as justified true belief faces several problems:
(i) A belief is an internal mental state beyond the full comprehension of expert external
observers. Even the subject herself has limited immediate access to all she knows or believes. On
deeper reflection, or consulting external sources of knowledge, she might "change her mind."
(ii) The truth about any fact in the world is vulnerable to skeptical or sophistical attack. The
concept of truth should be limited to uses within logical and mathematical systems of thought.
Real world "truths" are always fallible and revisable in the light of new knowledge.
(iii) The notion of justifying a belief by providing reasons is vague, circular, or an infinite regress. What reasons can be given that do not themselves require justification? In view of (i) and (ii), what value is there in a "justification" that is fallible or, worse, false?
(iv) Epistemologists have primarily studied personal or subjective beliefs. Fearful of competition
from empirical science and its method for establishing knowledge, they emphasize that
justification must be based on reasons internally accessible to the subject.
(v) The emphasis on logic has led some epistemologists to claim that knowledge is closed under (strict or material) implication. This assumes that the process of ordinary knowing is informed by logic, in particular that if a subject knows p, and knows that p implies q, then the subject knows (or is at least in a position to know) q.
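The closure principle (if a subject knows p and knows that p implies q, then the idealized subject also knows q) can be made concrete as the deductive closure of a belief set under modus ponens. The propositions below are invented for illustration, and the sketch deliberately idealizes: real knowers do not automatically draw every consequence of what they know.

```python
def deductive_closure(known, implications):
    """Repeatedly apply modus ponens: if p is in the set and the
    implication (p -> q) is available, add q, until nothing new
    can be derived."""
    closed = set(known)
    changed = True
    while changed:
        changed = False
        for p, q in implications:
            if p in closed and q not in closed:
                closed.add(q)
                changed = True
    return closed

# Toy example: knowing "rain" and the implications "rain -> wet"
# and "wet -> slippery" commits the idealized knower to both
# consequences.
facts = deductive_closure({"rain"}, [("rain", "wet"), ("wet", "slippery")])
print(sorted(facts))  # ['rain', 'slippery', 'wet']
```

The gap between this idealization and actual human knowing is exactly what the criticism in (v) points at.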
Perhaps overreacting to the threat from science as a demonstrably more successful method for
establishing knowledge, epistemologists have hoped to differentiate and preserve their own
philosophical approach. Some have held on to the goal of logical positivism (e.g., Russell,
early Wittgenstein, and the Vienna Circle) that philosophical analysis would provide an a priori normative ground for merely empirical scientific knowledge.
Logical positivist arguments for the non-inferential self-validation of logical atomic perceptions
like "red, here, now" have perhaps misled some epistemologists to think that personal
perceptions can directly justify some "foundationalist" beliefs.
The philosophical method of linguistic analysis (inspired by the later Wittgenstein) has not
achieved much more. It is unlikely that knowledge of any kind reduces simply to the
careful conceptual analysis of sentences, statements, and propositions.
Information philosophy distinguishes at least three kinds of knowledge, each requiring its own special epistemological analysis: subjective knowledge in individual minds, cultural knowledge stored in human artifacts (the Sum), and scientific knowledge of the external world.
This last kind of knowledge is based on the "scientific method," roughly defined as a combination of theories and the experiments that test them.
The totality of scientific knowledge gives us our most reliable "information" about the world.
How exactly do we acquire and maintain this knowledge?
When information is stored in any structure, whether in the world, in human artifacts like books
and the Internet, or in human minds, two fundamental physical processes occur. These are the
two parts of the cosmic creative process.
First is a collapse of a quantum mechanical wave function that is needed to create even a single
"bit" of new information in an experimental measurement.
Second is a local decrease in entropy corresponding to the increase in information (information is sometimes called negative entropy, or negentropy). Without this decrease, the new bit would be erased and the system would return to equilibrium. Entropy at least as great as the increase in information must be transferred away from the location of the new information to satisfy the second law of thermodynamics.
Leo Szilard calculated the mean value of the quantity of entropy produced by a 1-bit
measurement as
S = k ln 2,
where k is Boltzmann's constant and ln is the natural logarithm; the factor ln 2 reflects the binary, two-outcome decision. The amount of entropy generated by the measurement may, of course, always be greater than this fundamental amount, but not smaller, or the second law would be violated.
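As a numerical illustration, Szilard's minimum entropy for one bit can be computed from the standard SI value of Boltzmann's constant; the room-temperature energy figure (Landauer's related bound on bit erasure) is an added assumption for context, not part of the original text:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value since 2019)

# Szilard's minimum entropy produced by a one-bit measurement: S = k ln 2.
S_bit = k_B * math.log(2)
print(f"S = {S_bit:.4e} J/K")  # ~9.57e-24 J/K

# Landauer's corresponding minimum energy cost of erasing one bit
# at an assumed room temperature of T = 300 K: E = k T ln 2.
T = 300.0
E_bit = k_B * T * math.log(2)
print(f"E = {E_bit:.4e} J")    # ~2.87e-21 J
```

The tiny magnitude of these numbers shows why the thermodynamic cost of information is negligible in everyday life, yet it is a hard floor that no measurement or erasure can go below.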
These quantum level processes are susceptible to noise. Information stored may have errors.
When information is retrieved, it is again susceptible to noise, which may garble the information
content. In information science, noise is generally the enemy of information. But some noise is
the friend of freedom, since it is the source of novelty, of creativity and invention, and of
variation in the biological gene pool.
Biological systems have maintained and increased their invariant information content over
billions of generations. Humans increase our knowledge of the external world, despite logical,
mathematical, and physical uncertainty or indeterminacy. Both do it in the face of random
noise, bringing order (or cosmos) out of chaos. Both do it with sophisticated error detection and
correction schemes that limit the effects of chance.
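A minimal sketch of such an error-correction scheme is the three-fold repetition code: each bit is stored three times, and retrieval takes a majority vote over the copies. This toy example illustrates the principle of correcting random noise with redundancy; it is not the mechanism biological systems actually use.

```python
def encode(bits, n=3):
    """Repetition code: store each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode(bits, n=3):
    """Error correction: majority vote over each group of n copies."""
    return [1 if sum(bits[i:i + n]) > n // 2 else 0
            for i in range(0, len(bits), n)]

message = [1, 0, 1, 1]
codeword = encode(message)

# Simulate noise: flip one stored copy in two different groups.
corrupted = codeword[:]
corrupted[1] ^= 1
corrupted[9] ^= 1

# Majority voting corrects both single flips.
print(decode(corrupted) == message)  # True
```

Any single flip within a group of three is outvoted by the two intact copies; only two or more flips in the same group would defeat the scheme, which is how redundancy "limits the effects of chance."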