Information Age
From Wikipedia, the free encyclopedia
A laptop connects to the Internet to display information from Wikipedia; long-distance communication between
computer systems is a hallmark of the Information Age
The Information Age (also known as the Computer Age, Digital Age, Silicon Age,
or New Media Age) is a historical period that began in the mid-20th century,
characterized by a rapid epochal shift from traditional industry established by
the Industrial Revolution to an economy primarily based upon information
technology.[1][2][3][4] The onset of the Information Age has been associated with the
development of the transistor in 1947,[4] the basic building block of modern electronics,
the optical amplifier in 1957, the basis of long-distance fiber optic
communications,[5] and Unix time measured from the start of Jan. 1, 1970,[6] the basis
of Coordinated Universal Time and Network Time Protocol which now synchronizes all
computers connected to the Internet.
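The mechanics of Unix time are simple to illustrate. The sketch below (an illustrative addition using only the Python standard library, not code from any cited source) converts a calendar date to seconds elapsed since the epoch:

```python
from datetime import datetime, timezone

# Unix time counts seconds elapsed since the epoch: 1970-01-01 00:00:00 UTC.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

def to_unix_time(dt: datetime) -> int:
    """Convert a timezone-aware datetime to seconds since the Unix epoch."""
    return int((dt - epoch).total_seconds())

# One day after the epoch is exactly 86,400 seconds.
print(to_unix_time(datetime(1970, 1, 2, tzinfo=timezone.utc)))  # 86400
```

Because every networked machine can agree on this single integer count, protocols such as NTP can exchange and compare timestamps without ambiguity about calendars or time zones.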
According to the United Nations Public Administration Network, the Information Age was
formed by capitalizing on computer microminiaturization advances,[7] which led
to modernized information systems and internet communications as the driving force
of social evolution.[2]
A timeline of major milestones of the Information Age, from the first message sent by the Internet protocol
suite to global Internet access
Hilbert & López (2011). The World's Technological Capacity to Store, Communicate, and Compute Information.
Science, 332(6025), 60–65. https://www.science.org/doi/pdf/10.1126/science.1200970
Genetic information
Genetic code may also be considered part of the information revolution. Now that
sequencing has been computerized, genomes can be rendered and manipulated as
data. This began with DNA sequencing, invented by Walter Gilbert and Allan
Maxam[18] in 1976–1977 and by Frederick Sanger in 1977; it grew steadily with the Human
Genome Project, initially conceived by Gilbert, and finally reached practical applications
of sequencing, such as gene testing, after the discovery by Myriad Genetics of
the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from
606 sequences registered in December 1982 to 231 million genomes in
August 2021. An additional 13 trillion incomplete sequences were registered in the Whole
Genome Shotgun submission database as of August 2021. The information contained
in these registered sequences has doubled every 18 months.[19]
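The arithmetic of an 18-month doubling time is worth making explicit. The sketch below (an illustration added here, not a computation from the cited source) shows how quickly such growth compounds:

```python
# A doubling time of 18 months (1.5 years) means the data volume grows
# by a factor of 2**(t / 1.5) over t years.
def growth_factor(years: float, doubling_years: float = 1.5) -> float:
    return 2 ** (years / doubling_years)

# Over a single decade that is roughly a hundredfold increase.
print(round(growth_factor(10)))  # 102
```

Sustained over the nearly four decades since 1982, even this simple model makes clear why sequence databases have grown by many orders of magnitude.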
There are different conceptualizations of the Information Age. Some focus on the
evolution of information over the ages, distinguishing between the Primary Information
Age and the Secondary Information Age. Information in the Primary Information Age
was handled by newspapers, radio and television. The Secondary Information Age
developed with the Internet, satellite television and mobile phones. The Tertiary
Information Age emerged from the interconnection of the media of the Primary
Information Age with those of the Secondary Information Age, as presently experienced.[20][21][22]
Economics
Eventually, information and communication technology (ICT)—
i.e. computers, computerized machinery, fiber optics, communication satellites,
the Internet, and other ICT tools—became a significant part of the world economy, as
the development of optical networking and microcomputers greatly changed many
businesses and industries.[24][25] Nicholas Negroponte captured the essence of these
changes in his 1995 book, Being Digital, in which he discusses the similarities and
differences between products made of atoms and products made of bits.[26]
Jobs and income distribution
The Information Age has affected the workforce in several ways, such as compelling
workers to compete in a global job market. One of the most evident concerns is the
replacement of human labor by computers that can do their jobs faster and more
effectively, creating a situation in which individuals who perform tasks that can
easily be automated are forced to find employment where their labor is not as
disposable.[27] This especially creates issues for those in industrial cities, where solutions
typically involve lowering working time, which is often highly resisted. Thus, individuals
who lose their jobs may be pressed to join the "mind workers"
(e.g. engineers, doctors, lawyers, teachers, professors, scientists, executives, journalists,
consultants), who are able to compete successfully in the world market and receive
(relatively) high wages.[28]
Along with automation, jobs traditionally associated with the middle class (e.g. assembly
line, data processing, management, and supervision) have also begun to disappear as
a result of outsourcing.[29] Unable to compete with those in developing
countries, production and service workers in post-industrial (i.e. developed)
societies either lose their jobs through outsourcing, accept wage cuts, or settle for
low-skill, low-wage service jobs.[29] In the past, the economic fate of individuals was tied
to that of their nation. For example, workers in the United States were once well paid
in comparison to those in other countries. With the advent of the Information Age and
improvements in communication, this is no longer the case, as workers must now
compete in a global job market, in which wages are less dependent on the success or
failure of individual economies.[29]
By creating a globalized workforce, the internet has also allowed for increased
opportunity in developing countries, making it possible for workers in such places to
provide in-person services and therefore compete directly with their counterparts in other
nations. This competitive advantage translates into increased opportunities and higher
wages.[30]
Automation, productivity, and job gain
The Information Age has affected the workforce in that automation and computerization
have resulted in higher productivity coupled with net job loss in manufacturing. In the
United States, for example, from January 1972 to August 2010, the number of people
employed in manufacturing jobs fell from 17,500,000 to 11,500,000 while manufacturing
value rose 270%.[31] Although it initially appeared that job loss in the industrial
sector might be partially offset by the rapid growth of jobs in information technology,
the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the
sector. This pattern of decrease in jobs would continue until 2003,[32] but data have
shown that, overall, technology creates more jobs than it destroys, even in the short
run.[33]
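These figures imply a large jump in output per worker. The quick check below treats "manufacturing value rose 270%" as a 2.7-fold increase, which is one plausible reading of the source's phrasing (an assumption of this illustration, not a figure from the article):

```python
# U.S. manufacturing, Jan 1972 -> Aug 2010 (employment figures from the text).
jobs_1972, jobs_2010 = 17_500_000, 11_500_000
value_ratio = 2.7          # assumed reading of "value rose 270%"

# Output per worker scales as (value ratio) / (employment ratio).
per_worker_ratio = value_ratio * jobs_1972 / jobs_2010
print(round(per_worker_ratio, 2))  # 4.11
```

Under that reading, each remaining manufacturing worker produced roughly four times as much value in 2010 as in 1972, which is the productivity gain the paragraph describes.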
Information-intensive industry
Main article: Information industry
Industry has become more information-intensive and less labor- and capital-intensive.
This has important implications for the workforce: workers have become
increasingly productive even as the value of their labor decreases. For the system
of capitalism itself, as the value of labor decreases, the value of capital increases.
In the classical model, investments in human and financial capital are important
predictors of the performance of a new venture.[34] However, as demonstrated by Mark
Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced
people with limited capital to succeed on a large scale.[35]
Innovations
The Information Age was enabled by technology developed in the Digital Revolution,
which was itself enabled by building on the developments of the Technological
Revolution.
Transistors
Main articles: Transistor, History of the transistor, and MOSFET
Further information: Semiconductor device
The onset of the Information Age can be associated with the development
of transistor technology.[4] The concept of a field-effect transistor was first theorized
by Julius Edgar Lilienfeld in 1925.[36] The first practical transistor was the point-contact
transistor, invented by the engineers Walter Houser Brattain and John Bardeen while
working for William Shockley at Bell Labs in 1947. This was a breakthrough that laid the
foundations for modern technology.[4] Shockley's research team also invented
the bipolar junction transistor in 1952.[37][36] The most widely used type of transistor is
the metal–oxide–semiconductor field-effect transistor (MOSFET), invented by Mohamed
M. Atalla and Dawon Kahng at Bell Labs in 1960.[38] The complementary MOS (CMOS)
fabrication process was developed by Frank Wanlass and Chih-Tang Sah in 1963.[39]
Computers
Main articles: Computer and History of computers
Further information: Integrated circuit, Invention of the integrated
circuit, Microprocessor, and Moore's law
Before the advent of electronics, mechanical computers, like the Analytical Engine in
1837, were designed to provide routine mathematical calculation and simple decision-
making capabilities. Military needs during World War II drove development of the first
electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry
Computer, Colossus computer, and ENIAC.
The invention of the transistor enabled the era of mainframe computers (1950s–1970s),
typified by the IBM 360. These large, room-sized computers provided data calculation
and manipulation that was much faster than humanly possible, but were expensive to
buy and maintain, so were initially limited to a few scientific institutions, large
corporations, and government agencies.
The germanium integrated circuit (IC) was invented by Jack Kilby at Texas
Instruments in 1958.[40] The silicon integrated circuit was then invented in 1959
by Robert Noyce at Fairchild Semiconductor, using the planar process developed
by Jean Hoerni, who was in turn building on Mohamed Atalla's silicon surface
passivation method developed at Bell Labs in 1957.[41][42] Following the invention of
the MOS transistor by Mohamed Atalla and Dawon Kahng at Bell Labs in
1959,[38] the MOS integrated circuit was developed by Fred Heiman and Steven Hofstein
at RCA in 1962.[43] The silicon-gate MOS IC was later developed by Federico Faggin at
Fairchild Semiconductor in 1968.[44] With the advent of the MOS transistor and the MOS
IC, transistor technology rapidly improved, and the ratio of computing power to size
increased dramatically, giving direct access to computers to ever smaller groups of
people.
The first commercial single-chip microprocessor, the Intel 4004, launched in 1971; it
was developed by Federico Faggin using his silicon-gate MOS IC technology, along
with Marcian Hoff, Masatoshi Shima and Stan Mazor.[45][46]
Along with electronic arcade machines and home video game consoles pioneered
by Nolan Bushnell in the 1970s, the development of personal computers like
the Commodore PET and Apple II (both in 1977) gave individuals access to the
computer. But data sharing between individual computers was either non-existent or
largely manual, at first using punched cards and magnetic tape, and later floppy disks.
Data
Further information: History of telecommunications, Computer memory, Computer data
storage, Data compression, Internet access, and Social media
The first developments for storing data were based on photographs, starting
with microphotography in 1851 and then microform in the 1920s, with the ability to store
documents on film, making them much more compact. Early information
theory and Hamming codes were developed about 1950, but awaited technical
innovations in data transmission and storage to be put to full use.
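A Hamming(7,4) code, the kind of construction described in Hamming's 1950 work, can be sketched in a few lines (an illustrative implementation added for this edition, not code from any cited source): four data bits gain three parity bits, so any single flipped bit can be located and corrected.

```python
# Hamming(7,4): protect 4 data bits with 3 parity bits at positions 1, 2, 4.
def hamming_encode(d: list[int]) -> list[int]:
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def hamming_correct(c: list[int]) -> list[int]:
    # Each parity check covers the positions whose 1-based index has that
    # bit set; together the three checks spell the error position in binary.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3       # 0 means no error detected
    c = c[:]
    if pos:
        c[pos - 1] ^= 1              # flip the offending bit back
    return c

code = hamming_encode([1, 0, 1, 1])
corrupted = code[:]
corrupted[2] ^= 1                    # simulate a single-bit error in transit
assert hamming_correct(corrupted) == code
```

This is the sense in which the text says such codes "awaited technical innovations": the mathematics of reliable transmission existed decades before the hardware that made it routine.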
Magnetic-core memory was developed from the research of Frederick W. Viehe in 1947
and An Wang at Harvard University in 1949.[47][48] With the advent of the MOS transistor,
MOS semiconductor memory was developed by John Schmidt at Fairchild
Semiconductor in 1964.[49][50] In 1967, Dawon Kahng and Simon Sze at Bell Labs
described how the floating gate of an MOS semiconductor device could be used
for the cell of a reprogrammable ROM.[51] Following the invention of flash memory
by Fujio Masuoka at Toshiba in 1980,[52][53] Toshiba commercialized NAND flash memory
in 1987.[54][51]
Copper wire cables transmitting digital data connected computer
terminals and peripherals to mainframes, and special message-sharing systems leading
to email were first developed in the 1960s. Independent computer-to-computer
networking began with ARPANET in 1969. This expanded to become
the Internet (coined in 1974). Access to the Internet improved with the invention of
the World Wide Web in 1991. The capacity expansion from dense wave division
multiplexing, optical amplification and optical networking in the mid-1990s led to record
data transfer rates. By 2018, optical networks routinely delivered 30.4 terabits/s over a
fiber optic pair, the data equivalent of 1.2 million simultaneous 4K HD video streams.[55]
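The per-stream arithmetic behind that equivalence is straightforward; dividing the quoted capacity by the quoted stream count gives roughly 25 Mbit/s per 4K stream (the check below is an illustration, not a figure from the source):

```python
# 30.4 Tbit/s shared evenly across 1.2 million simultaneous streams.
fiber_tbps = 30.4
streams = 1.2e6

per_stream_mbps = fiber_tbps * 1e6 / streams   # Tbit/s -> Mbit/s per stream
print(round(per_stream_mbps, 1))  # 25.3
```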
MOSFET scaling, the rapid miniaturization of MOSFETs at a rate predicted by Moore's
law,[56] led to computers becoming smaller and more powerful, to the point where they
could be carried. During the 1980s–1990s, laptops were developed as a form of
portable computer, and personal digital assistants (PDAs) could be used while standing
or walking. Pagers, widely used by the 1980s, were largely replaced by mobile phones
beginning in the late 1990s, providing mobile networking features to some computers.
Now commonplace, this technology is extended to digital cameras and other wearable
devices. Starting in the late 1990s, tablets and then smartphones combined and
extended these abilities of computing, mobility, and information sharing. Metal–oxide–
semiconductor (MOS) image sensors, which first began appearing in the late 1960s, led
to the transition from analog to digital imaging, and from analog to digital cameras,
during the 1980s–1990s. The most common image sensors are the charge-coupled
device (CCD) sensor and the CMOS (complementary MOS) active-pixel sensor (CMOS
sensor).
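The pace of MOSFET scaling can be made concrete with a toy model of Moore's law. The sketch below assumes a two-year doubling time and starts from the roughly 2,300 transistors of the Intel 4004; both numbers are illustrative inputs, not data from the cited sources:

```python
# Toy Moore's-law model: transistor count doubles every two years.
def transistors(year: int, base_year: int = 1971, base_count: int = 2300) -> int:
    return int(base_count * 2 ** ((year - base_year) / 2))

# Ten doublings between 1971 and 1991: 2,300 * 1024.
print(transistors(1991))  # 2355200
```

Twenty years of this compounding turns thousands of transistors into millions, which is why the same span of time took computing from room-sized mainframes to devices that could be carried.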
Electronic paper, which has origins in the 1970s, allows digital information to appear as
paper documents.
Personal computers
Main article: History of personal computers
By 1976, there were several firms racing to introduce the first truly successful
commercial personal computers. Three machines, the Apple II, PET 2001 and
TRS-80, were all released in 1977,[57] becoming the most popular by late
1978.[58] Byte magazine later referred to Commodore, Apple, and Tandy as the "1977
Trinity".[59] Also in 1977, Sord Computer Corporation released the Sord M200 Smart
Home Computer in Japan.[60]
Apple II
Main article: Apple II
See also
• Attention economy
• Attention inequality
• Big data
• Cognitive-cultural economy
• Computer crime
• Cyberterrorism
• Cyberwarfare
• Datamation – first print magazine dedicated solely to
covering information technology[69]
• Digital dark age
• Digital detox
• Digital divide
• Digital transformation
• Digital world
• Imagination age, the hypothesized successor of the
information age: a period in which creativity and
imagination become the primary creators of
economic value
• Indigo Era
• Information explosion
• Information revolution
• Information society
• Internet governance
• Netocracy
• Social Age
• Technological determinism
• Telecommunications
• Zettabyte Era
• The Hacker Ethic and the Spirit of the Information
Age
• Information and communication technologies for
environmental sustainability
References
1. ^ Zimmerman, Kathy Ann (September 7, 2017). "History of
Computers: A Brief Timeline". livescience.com. Archived from
the original on August 9, 2020. Retrieved October 17, 2019.
2. ^ Jump up to:a b "The History of
Computers". thoughtco.com. Archived from the original on 2020-
08-01. Retrieved 2019-10-17.
3. ^ "The 4 industrial revolutions". sentryo.net. February 23,
2017. Archived from the original on October 17, 2019.
Retrieved October 17, 2019.
4. ^ Jump up to:a b c d Manuel, Castells (1996). The information
age : economy, society and culture. Oxford:
Blackwell. ISBN 978-0631215943. OCLC 43092627.
5. ^ Grobe, Klaus; Eiselt, Michael (2013). Wavelength Division
Multiplexing: A Practical Engineering Guide. John Wiley &
Sons. p. 2.
6. ^ "General Concepts - Seconds Since the
Epoch". pubs.opengroup.org. Archived from the original on
2017-12-22. Retrieved 2022-08-29.
7. ^ Kluver, Randy. "Globalization, Informatization, and
Intercultural Communication". un.org. Archived from the
original on 19 July 2013. Retrieved 18 April 2013.
8. ^ Rider, Fredmont (1944). The Scholar and the Future of the
Research Library. New York City: Hadham Press.
9. ^ "Moore's Law to roll on for another decade". Archived from
the original on 2015-07-09. Retrieved 2011-11-27. Moore also
affirmed he never said transistor count would double every 18
months, as is commonly said. Initially, he said transistors on a
chip would double every year. He then recalibrated it to every
two years in 1975. David House, an Intel executive at the
time, noted that the changes would cause computer
performance to double every 18 months.
10. ^ Jump up to:a b Roser, Max, and Hannah Ritchie. 2013.
"Technological Progress Archived 2021-09-10 at the Wayback
Machine." Our World in Data. Retrieved on 9 June 2020.
11. ^ Hilbert, M.; Lopez, P. (2011-02-10). "The World's
Technological Capacity to Store, Communicate, and Compute
Information". Science. 332 (6025): 60–
65. Bibcode:2011Sci...332...60H. doi:10.1126/science.120097
0. ISSN 0036-8075. PMID 21310967. S2CID 206531385.
12. ^ Hilbert, Martin R. (2011). Supporting online material for the
world's technological capacity to store, communicate, and
compute information. Science/AAAS. OCLC 755633889.
13. ^ Jump up to:a b c d Hilbert, Martin; López, Priscila (2011). "The
World's Technological Capacity to Store, Communicate, and
Compute Information". Science. 332 (6025): 60–
65. Bibcode:2011Sci...332...60H. doi:10.1126/science.120097
0. ISSN 0036-8075. PMID 21310967. S2CID 206531385.
14. ^ Jump up to:a b Gillings, Michael R.; Hilbert, Martin; Kemp,
Darrell J. (2016). "Information in the Biosphere: Biological and
Digital Worlds". Trends in Ecology & Evolution. 31 (3): 180–
189. doi:10.1016/j.tree.2015.12.013. PMID 26777788. S2CID
3561873. Archived from the original on 2016-06-04.
Retrieved 2016-08-22.
15. ^ Gantz, John, and David Reinsel. 2012. "The Digital
Universe in 2020: Big Data, Bigger Digital Shadows, and
Biggest Growth in the Far East Archived 2020-06-10 at
the Wayback Machine." IDC iView. S2CID 112313325. View
multimedia content Archived 2020-05-24 at the Wayback
Machine.
16. ^ Rizzatti, Lauro. 14 September 2016. "Digital Data Storage is
Undergoing Mind-Boggling Growth." EE Times. Archived from
the original on 16 September 2016.
17. ^ "The historical growth of data: Why we need a faster
transfer solution for large data sets Archived 2019-06-02 at
the Wayback Machine." Signiant. 2020. Retrieved 9 June
2020.
18. ^ Gilbert, Walter, and Allan Maxam.
"Biochemistry." Proceedings of the National Academy of
Sciences, USA. Vol. 74, No. 2, pp. 560–64.
19. ^ Lathe III, Warren C.; Williams, Jennifer M.; Mangan, Mary
E.; Karolchik, Donna (2008). "Genomic Data Resources:
Challenges and Promises". Nature Education. Archived from
the original on 2021-12-06. Retrieved 2021-12-05.
20. ^ Iranga, Suroshana (2016). Social Media Culture. Colombo:
S. Godage and Brothers. ISBN 978-9553067432.
21. ^ Jillianne Code, Rachel Ralph, Kieran Forde et al. A
Disorienting Dilemma: Teaching and Learning in Technology
Education During a Time of Crisis, 14 September 2021,
PREPRINT (Version 1). https://doi.org/10.21203/rs.3.rs-
899835/v1
22. ^ Goodarzi, M., Fahimifar, A., Shakeri Daryani, E. (2021).
New Media and Ideology: A Critical Perspective. Journal of
Cyberspace Studies, 5(2), 137-162. doi:
10.22059/jcss.2021.327938.1065
23. ^ Hilbert, M. (2020). Digital technology and social change:
The digital transformation of society from a historical
perspective. Dialogues in Clinical Neuroscience, 22(2), 189–
194. https://doi.org/10.31887/DCNS.2020.22.2/mhilbert
24. ^ "Information Age Education Newsletter". Information Age
Education. August 2008. Archived from the original on 14
September 2015. Retrieved 4 December 2019.
25. ^ Moursund, David. "Information Age". IAE-
Pedia. Archived from the original on 1 August 2020.
Retrieved 4 December 2019.
26. ^ "Negroponte's articles". Archives.obs-us.com. 1996-12-
30. Archived from the original on 2011-09-04. Retrieved 2012-
06-11.
27. ^ Porter, Michael. "How Information Gives You Competitive
Advantage". Harvard Business Review. Archived from the
original on 23 June 2015. Retrieved 9 September 2015.
28. ^ Geiger, Christophe (2011), "Copyright and Digital
Libraries", E-Publishing and Digital Libraries, IGI Global,
pp. 257–272, doi:10.4018/978-1-60960-031-
0.ch013, ISBN 978-1-60960-031-0
29. ^ Jump up to:a b c McGowan, Robert. 1991. "The Work of
Nations by Robert Reich" (book review). Human Resource
Management 30(4):535–
38. doi:10.1002/hrm.3930300407. ISSN 1099-050X.
30. ^ Bhagwati, Jagdish N. (2005). In defense of Globalization.
New York: Oxford University Press.
31. ^ Smith, Fran. 5 Oct 2010. "Job Losses and Productivity
Gains Archived 2010-10-13 at the Wayback
Machine." Competitive Enterprise Institute.
32. ^ Cooke, Sandra D. 2003. "Information Technology Workers
in the Digital Economy Archived 2017-06-21 at the Wayback
Machine." In Digital Economy. Economics and Statistics
Administration, Department of Commerce.
33. ^ Yongsung, Chang; Jay H. Hong (2013). "Does Technology
Create Jobs?". SERI Quarterly. 6 (3): 44–53. Archived
from the original on 2014-04-29. Retrieved 29 April 2014.
34. ^ Cooper, Arnold C.; Gimeno-Gascon, F. Javier; Woo,
Carolyn Y. (1994). "Initial human and financial capital as
predictors of new venture performance". Journal of Business
Venturing. 9 (5): 371–395. doi:10.1016/0883-9026(94)90013-
2.
35. ^ Carr, David (2010-10-03). "Film Version of Zuckerberg
Divides the Generations". The New York Times. ISSN 0362-
4331. Archived from the original on 2020-11-14.
Retrieved 2016-12-20.
36. ^ Jump up to:a b Lee, Thomas H. (2003). "A Review of MOS
Device Physics" (PDF). The Design of CMOS Radio-
Frequency Integrated Circuits. Cambridge University
Press. ISBN 9781139643771. Archived (PDF) from the
original on 2019-12-09. Retrieved 2019-07-21.
37. ^ "Who Invented the Transistor?". Computer History Museum.
4 December 2013. Archived from the original on 13 December
2013. Retrieved 20 July 2019.
38. ^ Jump up to:a b "1960 - Metal Oxide Semiconductor (MOS)
Transistor Demonstrated". The Silicon Engine. Computer
History Museum. Archived from the original on 2019-10-27.
Retrieved 2019-07-21.
39. ^ "1963: Complementary MOS Circuit Configuration is
Invented". Archived from the original on 2019-07-23.
40. ^ Kilby, Jack (2000), Nobel lecture (PDF), Stockholm: Nobel
Foundation, archived (PDF) from the original on 29 May 2008,
retrieved 15 May 2008
41. ^ Lojek, Bo (2007). History of Semiconductor
Engineering. Springer Science & Business Media.
p. 120. ISBN 9783540342588.
42. ^ Bassett, Ross Knox (2007). To the Digital Age: Research
Labs, Start-up Companies, and the Rise of MOS Technology.
Johns Hopkins University Press.
p. 46. ISBN 9780801886393. Archived from the original on
2020-07-27. Retrieved 2019-07-31.
43. ^ "Tortoise of Transistors Wins the Race - CHM
Revolution". Computer History Museum. Archived from the
original on 10 March 2020. Retrieved 22 July 2019.
44. ^ "1968: Silicon Gate Technology Developed for
ICs". Computer History Museum. Archived from the original
on 29 July 2020. Retrieved 22 July 2019.
45. ^ "1971: Microprocessor Integrates CPU Function onto a
Single Chip". Computer History Museum. Archived from the
original on 12 August 2021. Retrieved 22 July 2019.
46. ^ Colinge, Jean-Pierre; Greer, James C.; Greer, Jim
(2016). Nanowire Transistors: Physics of Devices and
Materials in One Dimension. Cambridge University Press.
p. 2. ISBN 9781107052406. Archived from the original on
2020-03-17. Retrieved 2019-07-22.
47. ^ "1953: Whirlwind computer debuts core memory". Computer
History Museum. Archived from the original on 3 October
2019. Retrieved 31 July 2019.
48. ^ "1956: First commercial hard disk drive shipped". Computer
History Museum. Archived from the original on 31 July 2019.
Retrieved 31 July 2019.
49. ^ "1970: MOS Dynamic RAM Competes with Magnetic Core
Memory on Price". Computer History Museum. Archived from
the original on 26 October 2021. Retrieved 29 July 2019.
50. ^ Solid State Design - Vol. 6. Horizon House.
1965. Archived from the original on 2021-06-09.
Retrieved 2020-11-12.
51. ^ Jump up to:a b "1971: Reusable semiconductor ROM
introduced". Computer History Museum. Archived from the
original on 3 October 2019. Retrieved 19 June 2019.
52. ^ Fulford, Benjamin (24 June 2002). "Unsung
hero". Forbes. Archived from the original on 3 March 2008.
Retrieved 18 March 2008.
53. ^ US 4531203 Fujio Masuoka
54. ^ "1987: Toshiba Launches NAND Flash". eWeek. April 11,
2012. Retrieved 20 June 2019.
55. ^ Saarinen, Juha (January 24, 2018). "Telstra trial claims
world's fastest transmission speed". ITNews
Australia. Archived from the original on October 17, 2019.
Retrieved December 5, 2021.
56. ^ Sahay, Shubham; Kumar, Mamidala Jagadesh
(2019). Junctionless Field-Effect Transistors: Design,
Modeling, and Simulation. John Wiley &
Sons. ISBN 9781119523536. Archived from the original on
2019-12-21. Retrieved 2019-10-31.
57. ^ Chandler, Alfred Dupont; Hikino, Takashi; Nordenflycht,
Andrew Von; Chandler, Alfred D. (2009-06-30). Inventing the
Electronic Century. ISBN 9780674029392. Archived from the
original on 2022-01-18. Retrieved 11 August 2015.
58. ^ Schuyten, Peter J. (6 December 1978). "Technology; The
Computer Entering Home". Business & Finance. The New
York Times. p. D4. ISSN 0362-4331. Archived from the
original on 22 July 2018. Retrieved 9 September 2019.
59. ^ "Most Important Companies". Byte. September 1995.
Archived from the original on 2008-06-18. Retrieved 2008-06-
10.
60. ^ "M200 Smart Home Computer Series-Computer
Museum". Archived from the original on 2020-01-03.
Retrieved 2022-01-18.
61. ^ Reimer, Jeremy (14 December 2005). "Total share: 30
years of personal computer market share figures; The new
era (2001– )". Ars Technica. p. 9. Archived from the original
on 21 February 2008. Retrieved 13 February 2008.
62. ^ Reimer, Jeremy (December 2005). "Personal Computer
Market Share: 1975–2004". Ars Technica. Archived from the
original on 6 June 2012. Retrieved 13 February 2008.
63. ^ Markoff, John (March 3, 1997). "Fiber-Optic Technology
Draws Record Stock Value". The New York
Times. Archived from the original on November 9, 2021.
Retrieved December 5, 2021.
64. ^ Grobe, Klaus; Eiselt, Michael (2013). Wavelength Division
Multiplexing: A Practical Engineering Guide. John Wiley &
Sons. p. 2.
65. ^ Sudo, Shoichi (1997). Optical Fiber Amplifiers: Materials
Devices, and Applications. Artech House, Inc. pp. xi.
66. ^ George, Gilder (April 4, 1997). "Fiber Keeps its
Promise". Forbes ASAP.
67. ^ Castells, Manuel. The Power of Identity, The Information
Age: Economy, Society and Culture Vol. II. Cambridge, MA;
Oxford, UK: Blackwell.
68. ^ Castells, Manuel. The Power of Identity, The Information
Age: Economy, Society and Culture Vol. II. Cambridge, MA;
Oxford, UK: Blackwell.
69. ^ "Newspapers News and News Archive Resources:
Computer and Technology Sources". Temple
University. Archived from the original on 5 October 2017.
Retrieved 9 September 2015.
Further reading
• Oliver Stengel et al. (2017). Digitalzeitalter -
Digitalgesellschaft, Springer ISBN 978-3658117580
• Mendelson, Edward (June 2016). In the Depths of
the Digital Age, The New York Review of Books
• Bollacker, Kurt D. (2010) Avoiding a Digital Dark
Age, American Scientist, March–April 2010, Volume
98, Number 2, p. 106ff
• Castells, Manuel. (1996–98). The Information Age:
Economy, Society and Culture, 3 vols. Oxford:
Blackwell.
• Gelbstein, E. (2006) Crossing the Executive Digital
Divide. ISBN 99932-53-17-0
• This page was last edited on 22 October 2022, at 20:34 (UTC).