
EPISTEMOLOGY

1. Knowledge
Metaphysics is the study of what there is, what exists, and how we know that it exists. The ancients described it as the problem of Being. We cannot know what there is without knowing how we can know anything.

Knowing how we know is the sub-discipline of metaphysics called epistemology. The study of what there is is ontology.

Knowing how we know is a fundamentally circular problem when it is described in human language, normally as a set of logical propositions. And knowing something about what exists adds another complex circle, if the knowing being must itself be one of those things that exists.

These circular definitions and inferences need not be vicious circles. They may simply be
a coherent set of ideas that we use to describe ourselves and the external world. If the
descriptions are logically valid and/or verifiable empirically, we think we are approaching the
"truth" about things and acquiring knowledge.

How then do we describe knowledge itself, as an existing thing in our existent minds and in the existing external world? Information philosophy does so by basing everything on the abstract but quantitative notion of information.

Information is stored or encoded in physical and biological structures. Structures in the world
build themselves, following natural laws, including physical and biological laws. Structures in
the mind are partly built by biological processes and partly built by human intelligence, which
is free, creative, and unpredictable.

For information philosophy, knowledge is information created and stored in minds and in human
artifacts like stories, books, and internetworked computers.

Knowledge is actionable information that forms the basis for thoughts and actions.

Knowledge includes all the cultural information created by human societies. We call it the Sum.
It includes the theories and experiments of scientists, who collaborate to establish our knowledge
of the external world. Scientific knowledge comes the closest of any knowledge to being
independent of any human mind, though it is still dependent on an open interdependent
community of fundamentally subjective inquirers.

To the extent of the correspondence, the isomorphism, the one-to-one mapping, between information structures (and processes) in the world and representative structures and functions in the mind, information philosophy claims that we as individuals have quantifiable personal or subjective knowledge of the world.

To the extent of the agreement (again a correspondence or isomorphism) between information in the minds of an open community of inquirers seeking the best explanations for phenomena, information philosophy further claims that we have quantifiable inter-subjective knowledge of other minds and of an external world. Although science depends on their inter-subjective agreement, this is as close as we come to "objective" knowledge, and knowledge of objects, the Kantian "things in themselves." Empiricists like Locke thought the "primary" qualities of objects are inaccessible, our senses believed able to receive only "secondary" qualities. Information philosophy makes this a distinction without a difference.

Analytic language philosophers have a much narrower definition of knowledge. They identify it
with language, logic, and human beliefs. For them, epistemology has been reduced to the "truth"
of statements and propositions that can be logically analyzed and validated.

Epistemologists say persons have knowledge only 1) if a statement is true, 2) if they believe that a statement is true, and 3) if their belief is "justified," where justification may be because their belief was the consequence of a "reliable" cognitive process, or because the belief was "caused" by the facts in the world that the belief is about.

They trace their three-step conditions for knowledge back to Plato's Theaetetus and Aristotle's Posterior Analytics. Plato did talk about opinions, which could be true or false. The true or "right" opinions could be further supported by giving an "account" of the reasons why an opinion is "true" and not "false." But as in many Platonic dialogues, there was no resolution or agreement in the Theaetetus that these three elements could indeed produce knowledge. The Greek word Plato used for knowledge was episteme, which translates more nearly as "know how" than the "know that" associated with knowledge of the "facts" in propositions.

Our English word for knowledge comes from the Indo-European and later Greek gno, as in gnosis. In Greek it meant a mark or token that was familiar and immediately recognizable with an act of cognition or cognizance. It gives us the word ken (our close relatives are "kin"), the German cognate kennen, and the French connaissance.

Bertrand Russell distinguished "knowledge by acquaintance" as immediate (viz. non-mediated) direct awareness of a particular thing. He contrasted such basic knowledge with knowledge of concepts, ideas or "universals," which can be used to describe many particular things. He called this "knowledge by description." He included the sense data of "red, here, now" in immediate knowledge, knowledge we are less likely to doubt and that serves as a logical foundation.

All this works well for one idea of knowledge, but unfortunately for analytic language philosophy, the English language is philosophically impoverished, lacking another word for knowledge that is found in all other European languages, one based on words whose root means "to have seen."

Justified True Belief


Nevertheless, the modern field of epistemology has generally defined knowledge in three parts as
"justified true belief," specifically the truth of beliefs about statements or propositions. For
example,
S knows that P if and only if

(i) S believes that P,
(ii) P is true, and
(iii) S is justified in believing that P.

In the long history of the problem of knowledge, all three of these knowledge or belief "conditions" have proven very difficult for epistemologists. Among the reasons...

(i) A belief is an internal mental state beyond the full comprehension of expert external
observers. Even the subject herself has limited immediate access to all she knows or believes. On
deeper reflection, or consulting external sources of knowledge, she might "change her mind."

(ii) The truth about any fact in the world is vulnerable to skeptical or sophistical attack. The
concept of truth should be limited to uses within logical and mathematical systems of thought.
Real world "truths" are always fallible and revisable in the light of new knowledge.

(iii) The notion of justification of a belief by providing reasons is vague, circular, or an infinite regress. What reasons can be given that do not themselves require further reasons? In view of (i) and (ii), what value is there in a "justification" that is fallible, or worse, false?

(iv) Epistemologists have primarily studied personal or subjective beliefs. Fearful of competition
from empirical science and its method for establishing knowledge, they emphasize that
justification must be based on reasons internally accessible to the subject.

(v) The emphasis on logic has led some epistemologists to claim that knowledge is closed under
(strict or material) implication. This assumes that the process of ordinary knowing is informed by
logic, in particular that

(Closure) If S knows that P, and P implies Q, then S knows that Q.

We can only say that S is in a position to deduce Q if she is trained in logic.
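
For readers who prefer symbols, the closure principle can also be written in the notation of epistemic logic. This rendering is added here only for clarity and is not part of the original text:

```latex
% Closure, in epistemic-logic notation (K_S P reads "S knows that P"):
\[ \bigl(K_S P \wedge (P \rightarrow Q)\bigr) \rightarrow K_S Q \]
% The weaker claim suggested above would replace the consequent K_S Q with
% "S is in a position to know Q," rather than asserting that S already knows Q.
```
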
It is no surprise that epistemologists have failed in every effort to put knowledge on a sound
basis, let alone establish knowledge with apodeictic certainty, as Plato and Aristotle expected
and René Descartes thought he had established beyond any reasonable doubt.

Perhaps overreacting to the threat from science as a demonstrably more successful method for
establishing knowledge, epistemologists have hoped to differentiate and preserve their own
philosophical approach. Some have held on to the goal of logical positivism (e.g., Russell,
early Wittgenstein, and the Vienna Circle) that philosophical analysis would provide an a priori normative ground for merely empirical scientific knowledge.

Logical positivist arguments for the non-inferential self-validation of logical atomic perceptions
like "red, here, now" have perhaps misled some epistemologists to think that personal
perceptions can directly justify some "foundationalist" beliefs.

The philosophical method of linguistic analysis (inspired by the later Wittgenstein) has not achieved much more. It is unlikely that knowledge of any kind reduces simply to the careful conceptual analysis of sentences, statements, and propositions.

Information philosophy looks deeper than the surface ambiguities of language.

Information philosophy distinguishes at least three kinds of knowledge, each requiring its own
special epistemological analysis:

- Subjective or personal knowledge, including introspection and intuition, as well as communications with and perceptions of other persons ("other minds").
- Communal or social knowledge of cultural creations, including fiction, myths, conventions, laws, history, etc.
- Knowledge of a mind-independent physical external world.

This last kind of knowledge is based on the "scientific method," roughly defined as a
combination of

- Systematic observations of the external world.
- Arbitrary, even random, hypotheses (theories) that might explain the observations.
- Logical, rational deductions from the hypotheses that make (usually quantitative) predictions about further observations.
- Experiments (measurements) that can be reproduced by other scientists in an open-minded community of inquirers to confirm (verify) or deny (falsify) those predictions, and thus the theories.
- A combination of the theories to reduce their number. Theories that grow to explain greater and greater numbers of predictions are considered closer to the "truth" about reality, and are often described as "laws of nature".

The totality of scientific knowledge gives us our most reliable "information" about the world.
How exactly do we acquire and maintain this knowledge?

When information is stored in any structure, whether in the world, in human artifacts like books and the Internet, or in human minds, two fundamental physical processes occur. These are the two parts of the cosmic creative process.

First is a collapse of a quantum mechanical wave function that is needed to create even a single
"bit" of new information in an experimental measurement.

Second is a local decrease in the entropy corresponding to the increase in information. Without
this, the new bit would be erased and the system returned to equilibrium. Entropy greater than
the increase in information (negative entropy) must be transferred away from the location of the
new information to satisfy the second law of thermodynamics.

Leo Szilard calculated the mean value of the quantity of entropy produced by a 1-bit measurement as

S = k log 2,

where k is Boltzmann's constant and log is the natural logarithm; the 2 reflects the binary decision. The amount of entropy generated by the measurement may, of course, always be greater than this fundamental amount, but not smaller, or the second law would be violated.
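
As a rough numerical illustration of Szilard's bound (the room-temperature figure and the use of the Landauer relation Q = T·S are assumptions added here, not claims made in the text), a few lines of Python give the magnitudes involved:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

# Szilard's minimum entropy cost of acquiring one bit: S = k ln 2
S_min = k * math.log(2)      # about 9.57e-24 J/K

# Assumed here for illustration: at temperature T this entropy corresponds to
# a minimum heat Q = T * S that must be carried away (the Landauer limit).
T = 300.0                    # kelvin, roughly room temperature
Q_min = T * S_min            # about 2.87e-21 J

print(f"Minimum entropy per bit: {S_min:.3e} J/K")
print(f"Minimum heat dissipated at {T:.0f} K: {Q_min:.3e} J")
```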

These quantum-level processes are susceptible to noise, so stored information may contain errors. When information is retrieved, it is again susceptible to noise, which may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.

Biological systems have maintained and increased their invariant information content over billions of generations. Humans increase our knowledge of the external world, despite logical, mathematical, and physical uncertainty or indeterminacy. Both do it in the face of random noise, bringing order (or cosmos) out of chaos. Both do it with sophisticated error detection and correction schemes that limit the effects of chance.
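
The kind of error detection and correction at issue can be illustrated with the simplest possible scheme, a triple-repetition code with majority voting. This toy sketch is added purely for illustration; biological and engineered systems use far more sophisticated codes:

```python
import random

def encode(bits):
    """Triple-repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def add_noise(bits, flip_probability):
    """Model a noisy channel: flip each bit independently with the given probability."""
    return [bit ^ 1 if random.random() < flip_probability else bit for bit in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = add_noise(encode(message), flip_probability=0.05)
recovered = decode(received)

print("sent:     ", message)
print("recovered:", recovered)  # usually identical to the message, despite the noise
```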
