Information Age
From Wikipedia, the free encyclopedia

A laptop connects to the Internet to display information from Wikipedia; long-distance communication between
computer systems is a hallmark of the Information Age

The Information Age (also known as the Computer Age, Digital Age, Silicon Age,
or New Media Age) is a historical period that began in the mid-20th century,
characterized by a rapid epochal shift from traditional industry established by
the Industrial Revolution to an economy primarily based upon information
technology.[1][2][3][4] The onset of the Information Age has been associated with the
development of the transistor in 1947,[4] the basic building block of modern electronics,
the optical amplifier in 1957, the basis of long-distance fiber optic
communications,[5] and Unix time measured from the start of Jan. 1, 1970,[6] the basis
of Coordinated Universal Time and Network Time Protocol which now synchronizes all
computers connected to the Internet.
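Concretely (a minimal sketch, not from the article), Unix time is simply a count of seconds elapsed since 00:00:00 UTC on January 1, 1970, which is what lets it serve as a shared reference point for synchronizing clocks:

```python
# A minimal sketch of Unix time: seconds elapsed since the epoch,
# 1970-01-01 00:00:00 UTC, the reference point described above.
import time
from datetime import datetime, timezone

now = time.time()  # current Unix timestamp (float seconds since the epoch)
print(f"Seconds since the epoch: {now:.0f}")

# Converting the timestamp back to a calendar date in UTC:
print(datetime.fromtimestamp(now, tz=timezone.utc).isoformat())
```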
According to the United Nations Public Administration Network, the Information Age was
formed by capitalizing on computer microminiaturization advances,[7] which led
to modernized information systems and internet communications as the driving force
of social evolution.[2]
Contents

• 1 Overview of early developments
o 1.1 Library expansion and Moore's law
o 1.2 Information storage and Kryder's law
o 1.3 Information transmission
o 1.4 Computation
o 1.5 Genetic information
• 2 Different stage conceptualizations
• 3 Economics
o 3.1 Jobs and income distribution
o 3.2 Automation, productivity, and job gain
o 3.3 Information-intensive industry
• 4 Innovations
o 4.1 Transistors
o 4.2 Computers
o 4.3 Data
o 4.4 Personal computers
▪ 4.4.1 Apple II
o 4.5 Optical networking
• 5 Economy, society and culture
• 6 See also
• 7 References
• 8 Further reading
• 9 External links

Overview of early developments

A timeline of major milestones of the Information Age, from the first message sent by the Internet protocol
suite to global Internet access

Library expansion and Moore's law


Library expansion was calculated in 1945 by Fremont Rider to double in capacity every
16 years, if sufficient space were made available.[8] He advocated replacing bulky,
decaying printed works with miniaturized microform analog photographs, which could be
duplicated on-demand for library patrons and other institutions.
Rider did not foresee, however, the digital technology that would follow decades later to
replace analog microform with digital imaging, storage, and transmission media,
whereby vast increases in the rapidity of information growth would be made possible
through automated, potentially lossless digital technologies. Accordingly, Moore's law,
formulated around 1965, predicted that the number of transistors in a
dense integrated circuit doubles approximately every two years.[9][10]
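As a rough numerical sketch of this doubling (the Intel 4004's approximate count of 2,300 transistors, used here as a 1971 baseline, is an outside assumption, not a figure given in this article):

```python
# Hypothetical projection of Moore's law: transistor count doubling
# every two years, starting from the Intel 4004 (~2,300 transistors, 1971).
base_year, base_count = 1971, 2300

def moores_law(year, doubling_period=2.0):
    """Projected transistor count for a given year under a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{moores_law(year):,.0f} transistors")
```

A two-year doubling compounds to roughly a 33-million-fold increase over the fifty years shown, which is the right order of magnitude for modern chips with tens of billions of transistors.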
By the early 1980s, along with improvements in computing power, the proliferation of
smaller and less expensive personal computers allowed for immediate access to
information and the ability to share and store it. Connectivity between computers within
organizations enabled access to greater amounts of information.
Information storage and Kryder's law
Main articles: Data storage and Computer data storage

Hilbert & López (2011). The World's Technological Capacity to Store, Communicate, and Compute Information.
Science, 332(6025), 60–65. https://www.science.org/doi/pdf/10.1126/science.1200970

The world's technological capacity to store information grew from 2.6 (optimally
compressed) exabytes (EB) in 1986 to 15.8 EB in 1993, over 54.5 EB in 2000, and 295
(optimally compressed) EB in 2007.[11][12] This is the informational equivalent of less
than one 730-megabyte (MB) CD-ROM per person in 1986 (539 MB per person);
roughly four CD-ROMs per person in 1993; twelve CD-ROMs per person in the year
2000; and almost sixty-one CD-ROMs per person in 2007.[13] It is estimated that the
world's capacity to store information reached 5 zettabytes by 2014,[14] the
informational equivalent of 4,500 stacks of printed books from the earth to the sun.
The amount of digital data stored appears to be growing approximately exponentially,
reminiscent of Moore's law. Kryder's law describes the corresponding observation for
storage hardware: the amount of available storage space appears to grow
approximately exponentially.[15][16][17][10]
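A quick back-of-the-envelope check of the per-person CD-ROM figures above (a sketch; the world-population values are rough outside assumptions, not figures from the article):

```python
# Back-of-the-envelope check of the storage-per-person figures above.
# World-population values are approximate assumptions for each year.
EB = 10**18          # one exabyte in bytes (decimal)
CD = 730 * 10**6     # one 730 MB CD-ROM in bytes

data = {1986: (2.6 * EB, 4.9e9), 2007: (295 * EB, 6.6e9)}  # (total bytes, population)

for year, (total_bytes, population) in data.items():
    per_person = total_bytes / population
    print(f"{year}: {per_person / 1e6:,.0f} MB/person "
          f"= about {per_person / CD:.1f} CD-ROMs")
```

This reproduces the quoted figures: about 530 MB per person (under one CD-ROM) in 1986 and roughly 61 CD-ROMs per person in 2007.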
Information transmission
The world's technological capacity to receive information through one-way broadcast
networks was 432 exabytes of (optimally compressed) information in 1986; 715
(optimally compressed) exabytes in 1993; 1.2 (optimally compressed) zettabytes in
2000; and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per
person per day.[13]
The world's effective capacity to exchange information through two-
way telecommunication networks was 281 petabytes of (optimally compressed)
information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in
2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of 6
newspapers per person per day.[13] In the 1990s, the spread of the Internet caused a
sudden leap in access to and ability to share information in businesses and homes
globally. Technology was developing so quickly that a computer costing $3000 in 1997
would cost $2000 two years later and $1000 the following year.
Computation
The world's technological capacity to compute information with humanly guided general-
purpose computers grew from 3.0 × 10^8 MIPS in 1986, to 4.4 × 10^9 MIPS in 1993, to
2.9 × 10^11 MIPS in 2000, and to 6.4 × 10^12 MIPS in 2007.[13] An article featured in
the journal Trends in Ecology and Evolution in 2016 reported that:[14]
Digital technology has vastly exceeded the cognitive capacity of any single human
being and has done so a decade earlier than predicted. In terms of capacity, there are
two measures of importance: the number of operations a system can perform and the
amount of information that can be stored. The number of synaptic operations per
second in a human brain has been estimated to lie between 10^15 and 10^17. While
this number is impressive, even in 2007 humanity's general-purpose computers were
capable of performing well over 10^18 instructions per second. Estimates suggest that
the storage capacity of an individual human brain is about 10^12 bytes. On a per capita
basis, this is matched by current digital storage (5x10^21 bytes per 7.2x10^9 people).
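The per-capita claim at the end of the quoted passage can be checked directly from the numbers it gives (a sketch using only the quoted figures):

```python
# Checking the per-capita storage comparison from the quote above.
digital_storage = 5e21   # bytes of worldwide digital storage (quoted figure)
population = 7.2e9       # world population (quoted figure)
brain_capacity = 1e12    # estimated bytes per human brain (quoted figure)

per_capita = digital_storage / population
print(f"Digital storage per person: {per_capita:.2e} bytes")          # ~6.9e11
print(f"Fraction of one brain's ~10^12 bytes: {per_capita / brain_capacity:.2f}")
```

The result, roughly 7 × 10^11 bytes per person, is indeed close to the estimated 10^12-byte capacity of an individual brain.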

Genetic information
Genetic code may also be considered part of the information revolution. Now that
sequencing has been computerized, the genome can be rendered and manipulated as
data. This began with DNA sequencing, invented by Walter Gilbert and Allan
Maxam[18] in 1976–1977 and by Frederick Sanger in 1977; it grew steadily with the
Human Genome Project, initially conceived by Gilbert, and finally with the practical
applications of sequencing, such as gene testing, following the discovery by Myriad
Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank grew
from 606 genome sequences registered in December 1982 to 231 million genomes in
August 2021. An additional 13 trillion incomplete sequences were registered in the
Whole Genome Shotgun submission database as of August 2021. The information
contained in these registered sequences has doubled every 18 months.[19]
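As a rough check of these endpoints (a sketch; note that the 18-month figure in the text concerns the information contained in the sequences rather than the raw count of records, so the two measures need not agree):

```python
# Estimating GenBank's implied doubling time from the endpoints above:
# 606 sequences (Dec 1982) to 231 million sequences (Aug 2021).
import math

n0, n1 = 606, 231e6
years = (2021 + 8 / 12) - (1982 + 11 / 12)   # Dec 1982 to Aug 2021

doublings = math.log2(n1 / n0)               # ~18.5 doublings
print(f"{doublings:.1f} doublings over {years:.1f} years "
      f"-> one doubling every {12 * years / doublings:.0f} months")
```

Counted by sequence records alone, the implied doubling time comes out to roughly two years, consistent in order of magnitude with the quoted growth rate.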

Different stage conceptualizations


At rare times in human history, periods of innovation have transformed human life. The
Neolithic Age, the Scientific Age and the Industrial Age all, ultimately, induced
discontinuous and irreversible changes in the economic, social and cultural elements of
the daily life of most people. Traditionally, these epochs have taken place over
hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the
Information Age swept to all parts of the globe in just a few years. The reason for its
rapid adoption is the ever-increasing speed of information exchange.
Between 7,000 and 10,000 years ago during the Neolithic period, humans began to
domesticate animals, farm grains, and replace stone tools with ones made of metal.
These innovations allowed nomadic hunter-gatherers to settle down. Villages formed
along the Yangtze River in China in 6,500 B.C., in the Nile River region of Africa, and
in Mesopotamia (Iraq) in 6,000 B.C. Cities emerged between 6,000 B.C. and 3,500 B.C.
The development of written communication (cuneiform in Sumeria and hieroglyphs
in Egypt in 3,500 B.C., writing in Egypt in 2,560 B.C., and in Minoa and China around
1,450 B.C.) enabled ideas to be preserved for extended periods and to spread
extensively. In all, Neolithic developments, augmented
by writing as an information tool, laid the groundwork for the advent of civilization.
The Scientific Age began in the period between Copernicus's 1543 publication arguing
that the planets orbit the sun and Newton's publication of the laws of motion and gravity
in Principia in 1687. This age of discovery continued through the 18th century,
accelerated by widespread use of the movable type printing press invented by
Johannes Gutenberg.
The Industrial Age began in Great Britain in 1760 and continued into the mid-19th
century. It altered many aspects of life around the world. The invention of machines
such as the mechanical textile weaver by Edmund Cartwright, the rotating shaft steam
engine by James Watt, and the cotton gin by Eli Whitney, along with processes for mass
manufacturing, came to serve the needs of a growing global population. The Industrial
Age harnessed steam and waterpower to reduce the dependence on animal and human
physical labor as the primary means of production. Thus, the core of the Industrial
Revolution was the generation and distribution of energy from coal and water to
produce steam and, later in the 20th century, electricity.
The Information Age also requires electricity to power the global
networks of computers that process and store data. However, what dramatically
accelerated the pace of adoption of the Information Age, as compared to previous
ages, was the speed with which knowledge could be transferred and spread throughout
the entire human family in a few short decades. This acceleration came about with the
adoption of a new form of power. Beginning in 1972, engineers devised ways to
harness light to convey data through fiber optic cable. Today, light-based optical
networking systems at the heart of telecom networks and the Internet span the globe
and carry most of the information traffic to and from users and data storage systems.
Three stages of the Information Age

There are different conceptualizations of the Information Age. Some focus on the
evolution of information over the ages, distinguishing between the Primary Information
Age and the Secondary Information Age. Information in the Primary Information Age
was handled by newspapers, radio, and television. The Secondary Information Age was
developed by the Internet, satellite television, and mobile phones. The Tertiary
Information Age emerged from the media of the Primary Information Age
interconnecting with the media of the Secondary Information Age, as presently
experienced.[20][21][22]

Others classify it in terms of the well-established Schumpeterian long
waves or Kondratiev waves. Here authors distinguish three different long-term
metaparadigms, each with different long waves. The first focused on the transformation
of material, including stone, bronze, and iron. The second, often referred to as the
industrial revolution, was dedicated to the transformation of energy,
including water, steam, electric, and combustion power. Finally, the most recent
metaparadigm aims at transforming information. It started out with the proliferation
of communication and stored data and has now entered the age of algorithms, which
aim at creating automated processes to convert the existing information into actionable
knowledge.[23]

Economics
Eventually, information and communication technology (ICT)—
i.e. computers, computerized machinery, fiber optics, communication satellites,
the Internet, and other ICT tools—became a significant part of the world economy, as
the development of optical networking and microcomputers greatly changed many
businesses and industries.[24][25] Nicholas Negroponte captured the essence of these
changes in his 1995 book, Being Digital, in which he discusses the similarities and
differences between products made of atoms and products made of bits.[26]
Jobs and income distribution
The Information Age has affected the workforce in several ways, such as compelling
workers to compete in a global job market. One of the most evident concerns is the
replacement of human labor by computers that can do their jobs faster and more
effectively, thus creating a situation in which individuals who perform tasks that can
easily be automated are forced to find employment where their labor is not as
disposable.[27] This especially creates issues for those in industrial cities, where
solutions typically involve lowering working time, which is often highly resisted. Thus,
individuals who lose their jobs may be pressed to join the ranks of "mind workers"
(e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives,
journalists, consultants), who are able to compete successfully in the world market and
receive (relatively) high wages.[28]
Along with automation, jobs traditionally associated with the middle class (e.g. assembly
line, data processing, management, and supervision) have also begun to disappear as
a result of outsourcing.[29] Unable to compete with those in developing
countries, production and service workers in post-industrial (i.e. developed)
societies either lose their jobs through outsourcing, accept wage cuts, or settle for low-
skill, low-wage service jobs.[29] In the past, the economic fate of individuals was tied
to that of their nation. For example, workers in the United States were once well paid
in comparison to those in other countries. With the advent of the Information Age and
improvements in communication, this is no longer the case, as workers must now
compete in a global job market, where wages are less dependent on the success or
failure of individual economies.[29]
In creating a globalized workforce, the internet has also allowed for increased
opportunity in developing countries, making it possible for workers in such places to
provide in-person services and therefore compete directly with their counterparts in
other nations. This competitive advantage translates into increased opportunities and
higher wages.[30]
Automation, productivity, and job gain
The Information Age has affected the workforce in that automation and computerization
have resulted in higher productivity coupled with net job loss in manufacturing. In the
United States, for example, from January 1972 to August 2010, the number of people
employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing
value rose 270%.[31] Although it initially appeared that job loss in the industrial
sector might be partially offset by the rapid growth of jobs in information technology,
the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the
sector. This pattern of decrease in jobs continued until 2003,[32] but data has since
shown that, overall, technology creates more jobs than it destroys, even in the short
run.[33]
Information-intensive industry
Main article: Information industry
Industry has become more information-intensive and less labor- and capital-intensive.
This has important implications for the workforce, as workers have become increasingly
productive even as the value of their labor decreases. For the system of capitalism
itself, as the value of labor decreases, the value of capital increases.
In the classical model, investments in human and financial capital are important
predictors of the performance of a new venture.[34] However, as demonstrated by Mark
Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced
people with limited capital to succeed on a large scale.[35]

Innovations

A visualization of the various routes through a portion of the Internet.

The Information Age was enabled by technology developed in the Digital Revolution,
which was itself enabled by building on the developments of the Technological
Revolution.
Transistors
Main articles: Transistor, History of the transistor, and MOSFET
Further information: Semiconductor device
The onset of the Information Age can be associated with the development
of transistor technology.[4] The concept of a field-effect transistor was first theorized
by Julius Edgar Lilienfeld in 1925.[36] The first practical transistor was the point-contact
transistor, invented by the engineers Walter Houser Brattain and John Bardeen while
working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the
foundations for modern technology.[4] Shockley's research team also invented
the bipolar junction transistor in 1952.[37][36] The most widely used type of transistor is
the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed
M. Atalla and Dawon Kahng at Bell Labs in 1960.[38] The complementary MOS (CMOS)
fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.[39]
Computers
Main articles: Computer and History of computers
Further information: Integrated circuit, Invention of the integrated
circuit, Microprocessor, and Moore's law
Before the advent of electronics, mechanical computers, like the Analytical Engine in
1837, were designed to provide routine mathematical calculation and simple decision-
making capabilities. Military needs during World War II drove development of the first
electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry
Computer, Colossus computer, and ENIAC.
The invention of the transistor enabled the era of mainframe computers (1950s–1970s),
typified by the IBM 360. These large, room-sized computers provided data calculation
and manipulation that was much faster than humanly possible, but were expensive to
buy and maintain, so were initially limited to a few scientific institutions, large
corporations, and government agencies.
The germanium integrated circuit (IC) was invented by Jack Kilby at Texas
Instruments in 1958.[40] The silicon integrated circuit was then invented in 1959
by Robert Noyce at Fairchild Semiconductor, using the planar process developed
by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface
passivation method developed at Bell Labs in 1957.[41][42] Following the invention of
the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in
1959,[38] the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein
at RCA in 1962.[43] The silicon-gate MOS IC was later developed by Federico Faggin at
Fairchild Semiconductor in 1968.[44] With the advent of the MOS transistor and the MOS
IC, transistor technology rapidly improved, and the ratio of computing power to size
increased dramatically, giving direct access to computers to ever smaller groups of
people.
The first commercial single-chip microprocessor, the Intel 4004, was launched in 1971;
it was developed by Federico Faggin, using his silicon-gate MOS IC technology, along
with Marcian Hoff, Masatoshi Shima, and Stan Mazor.[45][46]
Along with electronic arcade machines and home video game consoles pioneered
by Nolan Bushnell in the 1970s, the development of personal computers like
the Commodore PET and Apple II (both in 1977) gave individuals access to the
computer. But data sharing between individual computers was either non-existent or
largely manual, at first using punched cards and magnetic tape, and later floppy disks.
Data
Further information: History of telecommunications, Computer memory, Computer data
storage, Data compression, Internet access, and Social media
The first developments for storing data were initially based on photographs, starting
with microphotography in 1851 and then microform in the 1920s, with the ability to store
documents on film, making them much more compact. Early information
theory and Hamming codes were developed about 1950, but awaited technical
innovations in data transmission and storage to be put to full use.
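As an illustrative aside (not drawn from the article), the Hamming(7,4) code of the kind introduced around 1950 protects four data bits with three parity bits and corrects any single-bit error:

```python
# A minimal Hamming(7,4) sketch: 4 data bits, 3 parity bits,
# single-bit error correction of the kind developed around 1950.

def encode(d):                     # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]        # parity over codeword positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]        # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]        # parity over positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):                    # c = 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # syndrome gives the 1-based error position
    if pos:
        c[pos - 1] ^= 1            # flip the erroneous bit
    return c

word = encode([1, 0, 1, 1])
word[4] ^= 1                       # inject a single-bit transmission error
print(correct(word))               # the original codeword is recovered
```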
Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947
and An Wang at Harvard University in 1949.[47][48] With the advent of the MOS transistor,
MOS semiconductor memory was developed by John Schmidt at Fairchild
Semiconductor in 1964.[49][50] In 1967, Dawon Kahng and Simon Sze at Bell Labs
described how the floating gate of an MOS semiconductor device could be used
for the cell of a reprogrammable ROM.[51] Following the invention of flash memory
by Fujio Masuoka at Toshiba in 1980,[52][53] Toshiba commercialized NAND flash memory
in 1987.[54][51]
Copper wire cables transmitting digital data connected computer
terminals and peripherals to mainframes in the 1960s, when special message-sharing
systems leading to email were also first developed. Independent computer-to-computer
networking began with ARPANET in 1969. This expanded to become
the Internet (a term coined in 1974). Access to the Internet improved with the invention
of the World Wide Web in 1991. The capacity expansion from dense wave division
multiplexing, optical amplification, and optical networking in the mid-1990s led to record
data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a
fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.[55]
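The quoted equivalence can be verified with simple arithmetic (a sketch; the per-stream bitrate is derived here, not stated in the source):

```python
# Sanity check of the 2018 figure above: 30.4 Tb/s per fiber pair
# versus 1.2 million simultaneous 4K video streams.
link = 30.4e12        # bits per second over one fiber pair
streams = 1.2e6       # simultaneous 4K streams

per_stream = link / streams
print(f"Implied bitrate per 4K stream: {per_stream / 1e6:.1f} Mb/s")  # ~25.3 Mb/s
```

The implied 25 Mb/s per stream sits within the typical bitrate range for compressed 4K video delivery, so the comparison is internally consistent.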
MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's
law,[56] led to computers becoming smaller and more powerful, to the point where they
could be carried. During the 1980s–1990s, laptops were developed as a form of
portable computer, and personal digital assistants (PDAs) could be used while standing
or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones
beginning in the late 1990s, providing mobile networking features to some computers.
Now commonplace, this technology is extended to digital cameras and other wearable
devices. Starting in the late 1990s, tablets and then smartphones combined and
extended these abilities of computing, mobility, and information sharing. Metal–oxide–
semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led
to the transition from analog to digital imaging, and from analog to digital cameras,
during the 1980s–1990s. The most common image sensors are the charge-coupled
device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor (CMOS
sensor).
Electronic paper, which has origins in the 1970s, allows digital information to appear as
paper documents.
Personal computers
Main article: History of personal computers
By 1976, there were several firms racing to introduce the first truly successful
commercial personal computers. Three machines, the Apple II, PET 2001, and TRS-
80, were all released in 1977,[57] becoming the most popular by late
1978.[58] Byte magazine later referred to Commodore, Apple, and Tandy as the "1977
Trinity".[59] Also in 1977, Sord Computer Corporation released the Sord M200 Smart
Home Computer in Japan.[60]
Apple II
Main article: Apple II

Apr. 1977: Apple II.

Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer
Club meetings, designed the single-board Apple I computer and first demonstrated it
there. With specifications in hand and an order for 100 machines at US$500 each
from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.
About 200 of the machines sold before the company announced the Apple II as a
complete computer. It had color graphics, a full QWERTY keyboard, and internal slots
for expansion, which were mounted in a high quality streamlined plastic case. The
monitor and I/O devices were sold separately. The original Apple II operating
system was only the built-in BASIC interpreter contained in ROM. Apple DOS was
added to support the diskette drive; the last version was "Apple DOS 3.3".
Its higher price and lack of floating point BASIC, along with a lack of retail distribution
sites, caused it to lag in sales behind the other Trinity machines until 1979, when it
surpassed the PET. It was again pushed into 4th place when Atari introduced its
popular Atari 8-bit systems.[61]
Despite slow initial sales, the Apple II's lifetime was about eight years longer than that
of other machines, and so it accumulated the highest total sales. By 1985, 2.1 million
had been sold, and more than 4 million Apple IIs had been shipped by the end of its
production in 1993.[62]
Optical networking
Further information: Optical communication, Image sensor, and Optical fiber
Optical communication plays a crucial role in communication networks. It provides the
transmission backbone for the telecommunications and computer networks that
underlie the Internet, the foundation for the Digital Revolution and Information Age.
The two core technologies are the optical fiber and light amplification (the optical
amplifier). In 1953, Bram van Heel demonstrated image transmission through bundles
of optical fibers with a transparent cladding. The same year, Harold
Hopkins and Narinder Singh Kapany at Imperial College succeeded in making image-
transmitting bundles with over 10,000 optical fibers, and subsequently achieved image
transmission through a 75 cm long bundle which combined several thousand fibers.
Gordon Gould invented the optical amplifier and the laser; he also established the first
optical telecommunications company, Optelecom, to design communication systems.
The firm was a co-founder of Ciena Corp., the venture that popularized the optical
amplifier with the introduction of the first dense wave division
multiplexing system.[63] This massive-scale communication technology has emerged as
the common basis of all telecommunication networks[64] and, thus, a foundation of the
Information Age.[65][66]

Economy, society and culture

Manuel Castells captures the significance of the Information Age in The Information
Age: Economy, Society and Culture when he writes of our global interdependence and
the new relationships between economy, state and society, what he calls "a new
society-in-the-making." He cautions that just because humans have come to dominate
the material world does not mean that the Information Age is the end of history:[67]
It is in fact, quite the opposite: history is just beginning, if by history we understand the
moment when, after millennia of a prehistoric battle with Nature, first to survive, then to
conquer it, our species has reached the level of knowledge and social organization that
will allow us to live in a predominantly social world. It is the beginning of a new
existence, and indeed the beginning of a new age, The Information Age, marked by the
autonomy of culture vis-à-vis the material basis of our existence.[68]

See also
• Attention economy
• Attention inequality
• Big data
• Cognitive-cultural economy
• Computer crime
• Cyberterrorism
• Cyberwarfare
• Datamation – first print magazine dedicated solely to
covering information technology[69]
• Digital dark age
• Digital detox
• Digital divide
• Digital transformation
• Digital world
• Imagination age, the hypothesized successor of the
information age: a period in which creativity and
imagination become the primary creators of
economic value
• Indigo Era
• Information explosion
• Information revolution
• Information society
• Internet governance
• Netocracy
• Social Age
• Technological determinism
• Telecommunications
• Zettabyte Era
• The Hacker Ethic and the Spirit of the Information
Age
• Information and communication technologies for
environmental sustainability

References
1. ^ Zimmerman, Kathy Ann (September 7, 2017). "History of
Computers: A Brief Timeline". livescience.com. Archived from
the original on August 9, 2020. Retrieved October 17, 2019.
2. ^ Jump up to:a b "The History of
Computers". thought.co. Archived from the original on 2020-
08-01. Retrieved 2019-10-17.
3. ^ "The 4 industrial revolutions". sentryo.net. February 23,
2017. Archived from the original on October 17, 2019.
Retrieved October 17, 2019.
4. ^ Jump up to:a b c d Manuel, Castells (1996). The information
age : economy, society and culture. Oxford:
Blackwell. ISBN 978-0631215943. OCLC 43092627.
5. ^ Grobe, Klaus; Eiselt, Michael (2013). Wavelength Division
Multiplexing: A Practical Engineering Guide. John Wiley &
Sons. p. 2.
6. ^ "General Concepts - Seconds Since the
Epoch". pubs.opengroup.org. Archived from the original on
2017-12-22. Retrieved 2022-08-29.
7. ^ Kluver, Randy. "Globalization, Informatization, and
Intercultural Communication". un.org. Archived from the
original on 19 July 2013. Retrieved 18 April 2013.
8. ^ Rider, Fremont (1944). The Scholar and the Future of the
Research Library. New York City: Hadham Press.
9. ^ "Moore's Law to roll on for another decade". Archived from
the original on 2015-07-09. Retrieved 2011-11-27. Moore also
affirmed he never said transistor count would double every 18
months, as is commonly said. Initially, he said transistors on a
chip would double every year. He then recalibrated it to every
two years in 1975. David House, an Intel executive at the
time, noted that the changes would cause computer
performance to double every 18 months.
10. ^ Jump up to:a b Roser, Max, and Hannah Ritchie. 2013.
"Technological Progress Archived 2021-09-10 at the Wayback
Machine." Our World in Data. Retrieved on 9 June 2020.
11. ^ Hilbert, M.; Lopez, P. (2011-02-10). "The World's
Technological Capacity to Store, Communicate, and Compute
Information". Science. 332 (6025): 60–65.
Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970.
ISSN 0036-8075. PMID 21310967. S2CID 206531385.
12. ^ Hilbert, Martin R. (2011). Supporting online material for the
world's technological capacity to store, communicate, and
compute information. Science/AAAS. OCLC 755633889.
13. ^ Jump up to:a b c d Hilbert, Martin; López, Priscila (2011). "The
World's Technological Capacity to Store, Communicate, and
Compute Information". Science. 332 (6025): 60–65.
Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970.
ISSN 0036-8075. PMID 21310967. S2CID 206531385.
14. ^ Jump up to:a b Gillings, Michael R.; Hilbert, Martin; Kemp,
Darrell J. (2016). "Information in the Biosphere: Biological and
Digital Worlds". Trends in Ecology & Evolution. 31 (3): 180–
189. doi:10.1016/j.tree.2015.12.013. PMID 26777788. S2CID
3561873. Archived from the original on 2016-06-04.
Retrieved 2016-08-22.
15. ^ Gantz, John, and David Reinsel. 2012. "The Digital
Universe in 2020: Big Data, Bigger Digital Shadows, and
Biggest Growth in the Far East Archived 2020-06-10 at
the Wayback Machine." IDC iView. S2CID 112313325. View
multimedia content Archived 2020-05-24 at the Wayback
Machine.
16. ^ Rizzatti, Lauro. 14 September 2016. "Digital Data Storage is
Undergoing Mind-Boggling Growth." EE Times. Archived from
the original on 16 September 2016.
17. ^ "The historical growth of data: Why we need a faster
transfer solution for large data sets Archived 2019-06-02 at
the Wayback Machine." Signiant. 2020. Retrieved 9 June
2020.
18. ^ Gilbert, Walter; Maxam, Allan (1977).
"Biochemistry". Proceedings of the National Academy of
Sciences, USA. Vol. 74, No. 2, pp. 560–64.
19. ^ Lathe III, Warren C.; Williams, Jennifer M.; Mangan, Mary
E.; Karolchik, Donna (2008). "Genomic Data Resources:
Challenges and Promises". Nature Education. Archived from
the original on 2021-12-06. Retrieved 2021-12-05.
20. ^ Iranga, Suroshana (2016). Social Media Culture. Colombo:
S. Godage and Brothers. ISBN 978-9553067432.
21. ^ Jillianne Code, Rachel Ralph, Kieran Forde et al. A
Disorienting Dilemma: Teaching and Learning in Technology
Education During a Time of Crisis, 14 September 2021,
PREPRINT (Version 1). https://doi.org/10.21203/rs.3.rs-
899835/v1
22. ^ Goodarzi, M., Fahimifar, A., Shakeri Daryani, E. (2021).
New Media and Ideology: A Critical Perspective. Journal of
Cyberspace Studies, 5(2), 137-162. doi:
10.22059/jcss.2021.327938.1065
23. ^ Hilbert, M. (2020). Digital technology and social change:
The digital transformation of society from a historical
perspective. Dialogues in Clinical Neuroscience, 22(2), 189–
194. https://doi.org/10.31887/DCNS.2020.22.2/mhilbert
24. ^ "Information Age Education Newsletter". Information Age
Education. August 2008. Archived from the original on 14
September 2015. Retrieved 4 December 2019.
25. ^ Moursund, David. "Information Age". IAE-
Pedia. Archived from the original on 1 August 2020.
Retrieved 4 December 2019.
26. ^ "Negroponte's articles". Archives.obs-us.com. 1996-12-
30. Archived from the original on 2011-09-04. Retrieved 2012-
06-11.
27. ^ Porter, Michael. "How Information Gives You Competitive
Advantage". Harvard Business Review. Archived from the
original on 23 June 2015. Retrieved 9 September 2015.
28. ^ Geiger, Christophe (2011), "Copyright and Digital
Libraries", E-Publishing and Digital Libraries, IGI Global,
pp. 257–272, doi:10.4018/978-1-60960-031-
0.ch013, ISBN 978-1-60960-031-0
29. ^ Jump up to:a b c McGowan, Robert. 1991. "The Work of
Nations by Robert Reich" (book review). Human Resource
Management 30(4):535–
38. doi:10.1002/hrm.3930300407. ISSN 1099-050X.
30. ^ Bhagwati, Jagdish N. (2005). In defense of Globalization.
New York: Oxford University Press.
31. ^ Smith, Fran. 5 Oct 2010. "Job Losses and Productivity
Gains Archived 2010-10-13 at the Wayback
Machine." Competitive Enterprise Institute.
32. ^ Cooke, Sandra D. 2003. "Information Technology Workers
in the Digital Economy Archived 2017-06-21 at the Wayback
Machine." In Digital Economy. Economics and Statistics
Administration, Department of Commerce.
33. ^ Yongsung, Chang; Jay H. Hong (2013). "Does Technology
Create Jobs?". SERI Quarterly. 6 (3): 44–53. Archived
from the original on 2014-04-29. Retrieved 29 April 2014.
34. ^ Cooper, Arnold C.; Gimeno-Gascon, F. Javier; Woo,
Carolyn Y. (1994). "Initial human and financial capital as
predictors of new venture performance". Journal of Business
Venturing. 9 (5): 371–395. doi:10.1016/0883-9026(94)90013-
2.
35. ^ Carr, David (2010-10-03). "Film Version of Zuckerberg
Divides the Generations". The New York Times. ISSN 0362-
4331. Archived from the original on 2020-11-14.
Retrieved 2016-12-20.
36. ^ Jump up to:a b Lee, Thomas H. (2003). "A Review of MOS
Device Physics" (PDF). The Design of CMOS Radio-
Frequency Integrated Circuits. Cambridge University
Press. ISBN 9781139643771. Archived (PDF) from the
original on 2019-12-09. Retrieved 2019-07-21.
37. ^ "Who Invented the Transistor?". Computer History Museum.
4 December 2013. Archived from the original on 13 December
2013. Retrieved 20 July 2019.
38. ^ Jump up to:a b "1960 - Metal Oxide Semiconductor (MOS)
Transistor Demonstrated". The Silicon Engine. Computer
History Museum. Archived from the original on 2019-10-27.
Retrieved 2019-07-21.
39. ^ "1963: Complementary MOS Circuit Configuration is
Invented". Archived from the original on 2019-07-23.
40. ^ Kilby, Jack (2000), Nobel lecture (PDF), Stockholm: Nobel
Foundation, archived (PDF) from the original on 29 May 2008,
retrieved 15 May 2008
41. ^ Lojek, Bo (2007). History of Semiconductor
Engineering. Springer Science & Business Media.
p. 120. ISBN 9783540342588.
42. ^ Bassett, Ross Knox (2007). To the Digital Age: Research
Labs, Start-up Companies, and the Rise of MOS Technology.
Johns Hopkins University Press.
p. 46. ISBN 9780801886393. Archived from the original on
2020-07-27. Retrieved 2019-07-31.
43. ^ "Tortoise of Transistors Wins the Race - CHM
Revolution". Computer History Museum. Archived from the
original on 10 March 2020. Retrieved 22 July 2019.
44. ^ "1968: Silicon Gate Technology Developed for
ICs". Computer History Museum. Archived from the original
on 29 July 2020. Retrieved 22 July 2019.
45. ^ "1971: Microprocessor Integrates CPU Function onto a
Single Chip". Computer History Museum. Archived from the
original on 12 August 2021. Retrieved 22 July 2019.
46. ^ Colinge, Jean-Pierre; Greer, James C.; Greer, Jim
(2016). Nanowire Transistors: Physics of Devices and
Materials in One Dimension. Cambridge University Press.
p. 2. ISBN 9781107052406. Archived from the original on
2020-03-17. Retrieved 2019-07-22.
47. ^ "1953: Whirlwind computer debuts core memory". Computer
History Museum. Archived from the original on 3 October
2019. Retrieved 31 July 2019.
48. ^ "1956: First commercial hard disk drive shipped". Computer
History Museum. Archived from the original on 31 July 2019.
Retrieved 31 July 2019.
49. ^ "1970: MOS Dynamic RAM Competes with Magnetic Core
Memory on Price". Computer History Museum. Archived from
the original on 26 October 2021. Retrieved 29 July 2019.
50. ^ Solid State Design - Vol. 6. Horizon House.
1965. Archived from the original on 2021-06-09.
Retrieved 2020-11-12.
51. ^ Jump up to:a b "1971: Reusable semiconductor ROM
introduced". Computer History Museum. Archived from the
original on 3 October 2019. Retrieved 19 June 2019.
52. ^ Fulford, Benjamin (24 June 2002). "Unsung
hero". Forbes. Archived from the original on 3 March 2008.
Retrieved 18 March 2008.
53. ^ US 4531203 Fujio Masuoka
54. ^ "1987: Toshiba Launches NAND Flash". eWeek. April 11,
2012. Retrieved 20 June 2019.
55. ^ Saarinen, Juha (January 24, 2018). "Telstra trial claims
world's fastest transmission speed". ITNews
Australia. Archived from the original on October 17, 2019.
Retrieved December 5, 2021.
56. ^ Sahay, Shubham; Kumar, Mamidala Jagadesh
(2019). Junctionless Field-Effect Transistors: Design,
Modeling, and Simulation. John Wiley &
Sons. ISBN 9781119523536. Archived from the original on
2019-12-21. Retrieved 2019-10-31.
57. ^ Chandler, Alfred Dupont; Hikino, Takashi; Nordenflycht,
Andrew Von; Chandler, Alfred D. (2009-06-30). Inventing the
Electronic Century. ISBN 9780674029392. Archived from the
original on 2022-01-18. Retrieved 11 August 2015.
58. ^ Schuyten, Peter J. (6 December 1978). "Technology; The
Computer Entering Home". Business & Finance. The New
York Times. p. D4. ISSN 0362-4331. Archived from the
original on 22 July 2018. Retrieved 9 September 2019.
59. ^ "Most Important Companies". Byte. September 1995.
Archived from the original on 2008-06-18. Retrieved 2008-06-
10.
60. ^ "M200 Smart Home Computer Series-Computer
Museum". Archived from the original on 2020-01-03.
Retrieved 2022-01-18.
61. ^ Reimer, Jeremy (14 December 2005). "Total share: 30
years of personal computer market share figures; The new
era (2001– )". Ars Technica. p. 9. Archived from the original
on 21 February 2008. Retrieved 13 February 2008.
62. ^ Reimer, Jeremy (December 2005). "Personal Computer
Market Share: 1975–2004". Ars Technica. Archived from the
original on 6 June 2012. Retrieved 13 February 2008.
63. ^ Markoff, John (March 3, 1997). "Fiber-Optic Technology
Draws Record Stock Value". The New York
Times. Archived from the original on November 9, 2021.
Retrieved December 5, 2021.
64. ^ Grobe, Klaus; Eiselt, Michael (2013). Wavelength Division
Multiplexing: A Practical Engineering Guide. John Wiley &
Sons. p. 2.
65. ^ Sudo, Shoichi (1997). Optical Fiber Amplifiers: Materials
Devices, and Applications. Artech House, Inc. pp. xi.
66. ^ George, Gilder (April 4, 1997). "Fiber Keeps its
Promise". Forbes ASAP.
67. ^ Castells, Manuel. The Power of Identity, The Information
Age: Economy, Society and Culture Vol. II. Cambridge, MA;
Oxford, UK: Blackwell.
68. ^ Castells, Manuel. The Power of Identity, The Information
Age: Economy, Society and Culture Vol. II. Cambridge, MA;
Oxford, UK: Blackwell.
69. ^ "Newspapers News and News Archive Resources:
Computer and Technology Sources". Temple
University. Archived from the original on 5 October 2017.
Retrieved 9 September 2015.

Further reading
• Oliver Stengel et al. (2017). Digitalzeitalter -
Digitalgesellschaft, Springer ISBN 978-3658117580
• Mendelson, Edward (June 2016). In the Depths of
the Digital Age, The New York Review of Books
• Bollacker, Kurt D. (2010) Avoiding a Digital Dark
Age, American Scientist, March–April 2010, Volume
98, Number 2, p. 106ff
• Castells, Manuel. (1996–98). The Information Age:
Economy, Society and Culture, 3 vols. Oxford:
Blackwell.
• Gelbstein, E. (2006) Crossing the Executive Digital
Divide. ISBN 99932-53-17-0

External links

Wikibooks has a book on the topic of: The Information Age

Wikiquote has quotations related to Information Age.

Wikimedia Commons has media related to Information Age.

• Articles on the impact of the Information Age on
business – at Information Age magazine
• Beyond the Information Age by Dave Ulmer
• Information Age Anthology Vol I by Alberts and Papp
(CCRP, 1997) (PDF)
• Information Age Anthology Vol II by Alberts and
Papp (CCRP, 2000) (PDF)
• Information Age Anthology Vol III by Alberts and
Papp (CCRP, 2001) (PDF)
• Understanding Information Age Warfare by Alberts
et al. (CCRP, 2001) (PDF)
• Information Age Transformation by Alberts (CCRP,
2002) (PDF)
• The Unintended Consequences of Information Age
Technologies by Alberts (CCRP, 1996) (PDF)
• History & Discussion of the Information Age
• Science Museum - Information Age Archived 2015-
10-04 at the Wayback Machine
