Emma Choe

19 April 2023

STEM Level 1: Using Mathematics to Try to Understand Information Math Talk

a. Name: Emma Choe

b. Presentation Topic/Title: Using Mathematics to Try to Understand Information

c. Name of Presenter: Keith Devlin

d. Date: 19 April 2023

e. Summary of presentation:

Between 1985 and 2000, an international group of researchers from several different disciplines

came to Stanford University's newly formed Center for the Study of Language and Information

(CSLI) to work on developing a scientific theory of information. Keith Devlin was invited to join

them in 1987, and has remained part of that research community ever since. This talk follows

one thread through the work Keith Devlin has been doing (with various others), long after the

original goal had been met, using the ideas developed in those early days. In the 14th century, the

word “information” first appeared, meaning the result of being informed. A “man of

information” meant a knowledgeable or learned person. In the mid 19th century, with the growth

of communications media such as (newspapers, dictionaries, reference books, libraries, postal

services, telegraph, telephone, etc.) “information” came to have a public meaning: something

that can be measured, shared, exchanged, and sold. In the mid-20th century, with the development of

computers, “information” came to be understood as referring to an abstract concept, separate

from the various media on which it is represented or encoded. This new kind of information

could be processed. In 1958, Hal Leavitt (1922-2007), a managerial psychologist, and Thomas

Whisler (1920-2006), professor of industrial relations, both at the University of Chicago,


published an article in the Harvard Business Review titled “Management in the 1980s.” In that

article, they introduced a new term: “information technology.” A new term was needed, they

said, because recent, rapid developments of electronic technologies had brought a revolution in

the way people worked and communicated with one another. They defined several different types

of “IT” (as it rapidly became known): techniques for the fast processing of information, the use

of statistical and mathematical models for decision-making, the “simulation of higher-order

thinking through computer programs.” Soon, journalists started using the term “The Information

Age.” So what, exactly, is information? Information seems to arise when people communicate “at a

distance” (across space or time). In the early 1980s, Jon Barwise and John Perry introduced a

new mathematical theory – situation theory – to support an analysis of the way things in the

world can represent and convey information. Applications of situation theory include natural

language semantics, text analysis, conversation analysis, education (“situated learning”),

developing/improving workplace communication protocols, product-line design, software

design, computer-chip design, design of office information systems, management structure for

large scale (global) engineering projects, intelligence analysis tools and real-time decision

making systems, understanding narrative, learning video-games design, and treatment of PTSD.

Situation theory’s basic framework (ontology) comprises individuals, relations, situations, types,

and infons. Situations are limited parts of the world; it may often be impossible to specify them

extensionally. Cognitive agents use types to classify the world. The key ideas of the talk are that

information depends on types; that rational action depends on type recognition; that, for a given

input stimulus, a rational agent produces an appropriate response; and that constraints connect

types of input stimuli to types of responses.
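That last idea, constraints linking types of input stimuli to types of responses, can be sketched in code. The sketch below is my own illustration and is not from the talk; the type predicates and response labels are all hypothetical.

```python
# A minimal sketch of the constraint idea from situation theory:
# an agent classifies an input stimulus by recognizing its type,
# and constraints connect stimulus types to response types.

# Hypothetical types an agent might use to classify stimuli.
def is_greeting(stimulus: str) -> bool:
    return stimulus.lower() in {"hello", "hi"}

def is_question(stimulus: str) -> bool:
    return stimulus.endswith("?")

# Constraints: pairs of (stimulus-type recognizer, response type).
CONSTRAINTS = [
    (is_greeting, "return-greeting"),
    (is_question, "give-answer"),
]

def respond(stimulus: str) -> str:
    """A rational agent produces a response appropriate to the
    recognized type of the input stimulus."""
    for recognizes, response_type in CONSTRAINTS:
        if recognizes(stimulus):
            return response_type
    return "no-action"

print(respond("hello"))       # return-greeting
print(respond("what time?"))  # give-answer
```

The point of the sketch is that the agent never needs to enumerate every possible stimulus; it only needs to recognize the type of a stimulus, and the constraints do the rest.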


f. Reflection: Please write about your impressions of the presentation. What did you find

particularly interesting or important? Explain why. Use examples or evidence from the

talk in your discussion. This section should be equivalent to at least ½ of a typed page.

The presentation on the development of a scientific theory of information at Stanford

University's Center for the Study of Language and Information (CSLI) was very enlightening. It

provided a fascinating glimpse into the evolution of the concept of "information" throughout

history and shed light on the profound impact of technological advancements on our

understanding of this abstract entity. One aspect that particularly captivated me was the historical

perspective on the term "information." Learning that it originated in the 14th century to describe

the outcome of being informed and referred to knowledgeable individuals as "men of

information" was intriguing. It highlighted how our perception of information has transformed

over time, especially with the growth of communication media in the mid-19th century. The shift

from a personal meaning to a public one, encompassing measurability, shareability,

exchangeability, and marketability, demonstrated the social and economic significance that

information has acquired. The introduction of various types of IT, such as fast information

processing techniques, statistical and mathematical decision-making models, and computer

program simulations, marked a turning point in how information is processed and manipulated. It

was intriguing to observe how these advancements paved the way for the popularization of the

term "The Information Age," highlighting the profound societal shift brought about by

technology. Equally fascinating was the exploration of situation theory, introduced by Jon

Barwise and John Perry in the early 1980s. This mathematical framework supports a detailed

analysis of how things in the world represent and convey information. The wide range of
applications for situation theory, including natural language semantics, conversation analysis,

workplace communication protocols, software design, and even large-scale engineering projects,

exemplified its versatility and practical implications. The key ideas presented, focusing on the

dependence of information on types, the role of type recognition in rational action, and the

connection between input stimuli and responses, provided valuable insights into the fundamental

aspects of information processing. The recognition of constraints as crucial elements linking

these components further highlighted the intricate nature of information. Overall, the

presentation deepened my understanding of the multifaceted nature of information and its

significance in our evolving society. From its historical roots to its technological transformations

and mathematical frameworks, the study of information offers a captivating journey through

human knowledge and the mechanisms that underlie our interactions with the world.

g. Current Article analysis:

I. Author: Peter Janich

II. Date: 18 May 2019

III. Title: “What is Information?”

IV. URL Link:

https://www.tandfonline.com/doi/abs/10.1080/02698595.2019.1615665?journalCode=cisp20

V. In summary, the author explores the concept of information and criticizes the common

belief that information is a natural object. The argument highlights four icons associated

with this belief: Norbert Wiener's statement on information, genetic information,

communications technology, and neuroscience. The author contends that these icons

contribute to the misguided idea that information resides solely in the material structures
of living beings and machines. The discussion further examines three legacies that have

shaped this perspective: the naturalization of natural science through empiricism, the

formalization of theory through formalism, and the mechanization of communication

through systems theory. The author asserts that these legacies, influenced by scientific

dogmatism, have led to errors in understanding information. The semiotics aspect, for

example, is criticized for giving primacy to syntax over semantics and pragmatics,

neglecting the complex structures required for meaningful communication. Similarly, in

the realm of cybernetics, metaphorical descriptions of cybernetic systems can lead to

confusion between information as a scientific practice and information as a metaphor.

The role of politics is also considered, particularly regarding the relationship between

Shannon information and entropy in thermodynamics. The author argues that while there

are similarities, the models of information theory and thermodynamics are fundamentally

different and should not be equated. Throughout the article, the author emphasizes the

need for clarity and precise understanding when discussing information, as the lack

thereof contributes to the naturalization of information and its associated misconceptions.

I believe this article relates well to the topic of the math talk I

attended.