19 April 2023
e. Summary of presentation:
Between 1985 and 2000, an international group of researchers from several different disciplines
came to Stanford University's newly formed Center for the Study of Language and Information
(CSLI) to work on developing a scientific theory of information. Keith Devlin was invited to join
them in 1987, and has remained part of that research community ever since. This talk follows
one thread through the work Keith Devlin has been doing (with various others), long after the
original goal had been met, using the ideas developed in those early days. In the 14th century, the
word “information” first appeared, meaning the result of being informed. A “man of
information” meant a knowledgeable or learned person. In the mid 19th century, with the growth
of communication media (postal services, telegraph, telephone, etc.), “information” came to have a public meaning: something
that can be measured, shared, exchanged, sold. In the mid 20th century, with the development of
computing technology, information came to be seen as something abstract, separable
from the various media on which it is represented or encoded. This new kind of information
could be processed. In 1958, Hal Leavitt (1922-2007), a managerial psychologist, and Thomas
Whisler published an article in the Harvard Business Review. In that
article, they introduced a new term: “information technology.” A new term was needed, they
said, because recent, rapid developments of electronic technologies had brought a revolution in
the way people worked and communicated with one another. They defined several different types
of “IT” (as it rapidly became known): techniques for the fast processing of information, the use
of statistical and mathematical methods in decision-making, and the simulation of higher-order
thinking through computer programs.” Soon, journalists started using the term, “The Information
Age.” So, what even is information? Information seems to arise when people communicate “at a
distance” (across space or time). In the early 1980s, Jon Barwise and John Perry introduced a
new mathematical theory – situation theory – to support an analysis of the way things in the
world can represent and convey information. Applications of situation theory include natural
language semantics, conversation analysis, workplace communication protocols, software
design, computer-chip design, design of office information systems, management structure for
large scale (global) engineering projects, intelligence analysis tools and real-time decision
making systems, understanding narrative, the design of video games for learning, and the treatment of PTSD.
Situation theory includes a basic framework (an ontology): individuals, relations, situations, types, and
infons. Situations are limited parts of the world; it may often be impossible to specify them
extensionally. Cognitive agents use types to classify the world. The key ideas of this talk are that
information depends on types, rational action depends on type recognition, for a given input
stimulus a rational agent produces an appropriate response, and constraints connect types of
situations.

What did you find particularly interesting or important? Explain why. Use examples or evidence from the
talk in your discussion. This section should be equivalent to at least ½ of a typed page.
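As an illustrative aside, the basic ontology from the summary (individuals, relations, situations, types, and infons, with constraints connecting types of situations) can be sketched as a toy data model. The class names, the "smoke means fire" constraint, and the encoding below are my own simplifications for illustration, not part of Devlin's formal theory.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Infon:
    """A basic item of information: a relation, its arguments, a polarity."""
    relation: str
    args: tuple
    polarity: bool = True

class Situation:
    """A limited part of the world, modeled here as the set of infons it supports."""
    def __init__(self, facts):
        self.facts = set(facts)

    def supports(self, infon):
        return infon in self.facts

# Types classify situations; in this sketch a type is simply a predicate.
def smoke_type(s):
    return s.supports(Infon("smoke", ("kitchen",)))

def fire_type(s):
    return s.supports(Infon("fire", ("kitchen",)))

# A constraint links types of situations ("smoke means fire"). An agent that
# recognizes a situation as being of the antecedent type can extract the
# information carried by the consequent type.
CONSTRAINTS = [(smoke_type, fire_type)]

def infer(situation):
    """Rational action as type recognition: apply every matching constraint."""
    return [consequent.__name__
            for antecedent, consequent in CONSTRAINTS
            if antecedent(situation)]

kitchen = Situation([Infon("smoke", ("kitchen",))])
print(infer(kitchen))  # → ['fire_type']
```

This captures, in miniature, the talk's key ideas: information depends on types (the predicates), and a rational agent's appropriate response to a stimulus comes from recognizing a situation's type and following the constraints that connect it to other types.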
The talk about the research on information carried out at Stanford
University's Center for the Study of Language and Information (CSLI) was very enlightening. It
provided a fascinating glimpse into the evolution of the concept of "information" throughout
history and shed light on the profound impact of technological advancements on our
understanding of this abstract entity. One aspect that particularly captivated me was the historical
perspective on the term "information." Learning that it originated in the 14th century to describe
a knowledgeable or learned person, as in the phrase "a man of
information," was intriguing. It highlighted how our perception of information has transformed
over time, especially with the growth of communication media in the mid-19th century. The shift
toward a public meaning of information, emphasizing its measurability,
exchangeability, and marketability, demonstrated the social and economic significance that
information has acquired. The introduction of various types of IT, such as fast information
processing, statistical and mathematical decision-making methods, and computer-program
simulations of higher-order thinking, marked a turning point in how information is processed and manipulated. It
was intriguing to observe how these advancements paved the way for the popularization of the
term "The Information Age," highlighting the profound societal shift brought about by
technology. Equally fascinating was the exploration of situation theory, introduced by Jon
Barwise and John Perry in the early 1980s. The mathematical framework provided a
comprehensive analysis of how the world represents and conveys information. The wide range of
applications for situation theory, including natural language semantics, conversation analysis,
workplace communication protocols, software design, and even large-scale engineering projects,
exemplified its versatility and practical implications. The key ideas presented, focusing on the
dependence of information on types, the role of type recognition in rational action, and the
connection between input stimuli and responses, provided valuable insights into the fundamental
nature of information. The interplay among
these components further highlighted the intricate nature of information. Overall, the
talk underscored the richness of the concept of information and its
significance in our evolving society. From its historical roots to its technological transformations
and mathematical frameworks, the study of information offers a captivating journey through
human knowledge and the mechanisms that underlie our interactions with the world.
https://www.tandfonline.com/doi/abs/10.1080/02698595.2019.1615665?journalCode=cisp20
V. In summary, the author explores the concept of information and criticizes the common
belief that information is a natural object. The argument highlights four icons, associated
with fields including communications technology and neuroscience. The author contends that these icons
contribute to the misguided idea that information resides solely in the material structures
of living beings and machines. The discussion further examines three legacies that have
shaped this perspective, among them the naturalization of natural science through empiricism and
another legacy rooted in systems theory. The author asserts that these legacies, influenced by scientific
dogmatism, have led to errors in understanding information. The semiotics aspect, for
example, is criticized for giving primacy to syntax over semantics and pragmatics.
The role of politics is also considered, particularly regarding the relationship between
Shannon information and entropy in thermodynamics. The author argues that while there
are similarities, the models of information theory and thermodynamics are fundamentally
different and should not be equated. Throughout the article, the author emphasizes the
need for clarity and precise understanding when discussing information, as the lack
of such precision can lead to confusion.

I believe this article is an adequate source that relates well to the topic of the math talk I
attended.