
EVOLUTION OF CAMERA:

It wasn’t that long ago that access to cameras was a luxury. Nowadays, anybody with
a smartphone can take at least passable pictures with ease. But even when we still had to rely
on Kodak to commemorate our vacations, that was a massive improvement over the early
days of photography.

Long before traditional cameras, people employed camera obscuras. Images projected using
this natural phenomenon were larger than the original scene but inverted. Functionally, the
device was like our modern projectors. Some artists would use the projected image as a guide
or even trace it, much like a lightbox. Tracing was the only way to preserve the image once
the projection was gone.

In 1816, Nicéphore Niépce successfully made a partial photograph using a camera he built
and paper coated with silver chloride. Unfortunately, he hadn’t figured out how to remove
the unexposed silver chloride, so the image was eventually darkened completely by the very
light needed to view it. In the mid-1820s, Niépce used a new sliding wooden camera (created
by Charles and Vincent Chevalier) to experiment with bitumen of Judea. Only one of the
photos from this experiment, View from the Window at Le Gras, has survived.

Louis Daguerre continued to experiment with cameras after Niépce’s death in 1833,
resulting in the first practical photographic process in 1837. Dubbed the “daguerreotype,” it
used silver-plated copper treated with iodine vapor to create images. It was wildly successful
after debuting to the public in 1839, and together with the calotype it began introducing
photography to the masses. Previously, having portraits taken was an activity exclusive to the
upper classes; the cost and time needed to produce such works were unreasonable for most
working-class people. The speed of the camera, which only increased as time went on, made
it possible for anybody to have quality portraits.

When photographic film hit the market in 1888, cameras truly began to resemble the ones we
know today. George Eastman began selling his Kodak cameras, which were affordable and
small enough for the average consumer to consider. The film inside could hold up to 100
photos, and the entire device had to be sent back to the factory for processing and printing.
In 1900, he followed up with the Brownie camera, bringing the snapshot to the public. It was
during this time that movie cameras went from expensive toys to tools with legitimate
commercial use.

Another common type of film was 35mm. Typically, cheaper cameras used roll film, whereas
higher-end models used 35mm. Eventually, even lower-grade cameras were equipped with
35mm film, although the least expensive cameras still used roll film. The famous camera
company Canon had its start during this period, gaining popularity alongside other Japanese
cameras after the Korean War. Single-lens reflex (SLR) cameras were also gaining traction
with the public as the technology made them less bulky. Both SLR and twin-lens reflex
(TLR) cameras had been available previously, but mostly appealed to professionals due to
both size and cost.

Instant cameras, such as the famous Polaroid, first appeared on the market in 1948. Their
simplicity and the ability to print and view photos within minutes made them popular with
families, even at higher launch prices. Automatic exposure followed a similar story:
expensive at first, but quickly catching on among consumers and lowering in cost as time
went on. All of these elements combined would lead to the earliest digital cameras.

Unfortunately, the history of the digital camera is far too dense to cover in its entirety here,
but to give you a rough idea of where they started: the very first digital cameras stored their
images on floppy disks.

It’s hard to imagine where cameras will go from here, now that professional-grade images
can be produced with nothing but a timer button and a selfie stick. At the same time,
however, it’s fascinating to think that just over 200 years ago, photographs as we know them
were still experimental. Technology can change drastically in a short amount of time; maybe
in ten years’ time, we’ll have camera eyes.

CATEGORY:
FLOW MAP

HISTORY:
By the late 1980s, the technology required to produce truly commercial digital cameras
existed. The first true portable digital camera that recorded images as a computerized file was
likely the Fuji DS-1P of 1988, which recorded to a 2 MB SRAM memory card that used a
battery to keep the data in memory. This camera was never marketed to the public.

The first digital camera of any kind ever sold commercially was possibly the MegaVision
Tessera in 1987, though there is no extensive documentation of its sale. The first portable
digital camera that was actually marketed commercially was sold in December 1989 in
Japan: the DS-X by Fuji. The first commercially available portable digital camera in the
United States was the Dycam Model 1, first shipped in November 1990. It was originally
a commercial failure because it was black-and-white, low in resolution, and cost nearly
$1,000 (equivalent to $1,900 in 2018). It later saw modest success when it was re-sold as
the Logitech Fotoman in 1992. It used a CCD image sensor, stored pictures digitally, and
connected directly to a computer for download.

In 1991, Kodak brought to market the Kodak DCS (Kodak Digital Camera System), the
beginning of a long line of professional Kodak DCS SLR cameras that were based in part on
film bodies, often Nikons. It used a 1.3 megapixel sensor, had a bulky external digital storage
system, and was priced at $13,000 (equivalent to $24,000 in 2018). Upon the arrival of
the Kodak DCS-200, the original Kodak DCS was dubbed the Kodak DCS-100.
The move to digital formats was helped by the formation of the
first JPEG and MPEG standards in 1988, which allowed image and video files to be
compressed for storage. The first consumer camera with a liquid crystal display on the back
was the Casio QV-10, developed by a team led by Hiroyuki Suetaka in 1995. The first camera
to use CompactFlash was the Kodak DC-25 in 1996. The first camera that offered the ability
to record video clips may have been the Ricoh RDC-1 in 1995.

In 1995, Minolta introduced the RD-175, which was based on the Minolta 500si SLR with a
splitter and three independent CCDs. This combination delivered 1.75 megapixels. The
benefit of using an SLR base was the ability to use any existing Minolta AF-mount lens.
1999 saw the introduction of the Nikon D1, a 2.74 megapixel camera that was the first digital
SLR developed entirely from the ground up by a major manufacturer; at a cost of under
$6,000 (equivalent to $9,900 in 2018) at introduction, it was affordable for professional
photographers and high-end consumers. This camera also used Nikon F-mount lenses, which
meant film photographers could use many of the same lenses they already owned.
Digital camera sales continued to flourish, driven by technology advances. The digital market
segmented into different categories: compact digital still cameras, bridge cameras, mirrorless
compacts, and digital SLRs. One of the major technology advances was the development of
CMOS sensors, which helped drive sensor costs low enough to enable the widespread
adoption of camera phones.

Digital cameras have outsold film cameras since 2003. Kodak announced in January 2004
that it would no longer sell Kodak-branded film cameras in the developed world, and in 2012
it filed for bankruptcy after struggling to adapt to the changing industry. Smartphones now
routinely include high-resolution digital cameras.

EVOLUTION OF INTERNET:

The Internet is the digital information superhighway we use so ubiquitously today. The
term “online” has become synonymous with the Internet. We are actually almost always
online, sometimes without being aware of it. This is because of the transparency of service
that Internet Service Providers (ISPs) and cellular phone providers have given us. Our
Internet plans and smartphone service provide data access to the Internet 24/7/365.

We don’t just use the Internet for online shopping and general research. We also use it to pay
bills, RSVP to invitations, post photos from our daily lives, and even order groceries. It has
become such a necessity for modern living that it seems we cannot live without it. If the
Internet were to suddenly shut down, it would cause anxiety among people whose lives have
become very much dependent on it.

Bloggers and vloggers, social media influencers, and online gamers form a large percentage
of the online community. This goes to show how the Internet has become a big part of daily
life, due to commercialization and the way all aspects of modern life revolve around it. That
is what has led to the centralization of the Internet. This centralization is now controlled by
the big players who provide it as a service, yet the original Internet was not like this.

To understand why, consider that the original Internet was never planned to be centralized.
In fact, it was a project by the US DoD (Department of Defense) to establish a computer data
communications network that could withstand unforeseen events and disasters like war. It
therefore had to be decentralized, so that if one part of the system failed, the rest could still
function. It also had to communicate using peer-to-peer interconnectivity without relying on
a single computer. Another important consideration was that the computers had to be
interoperable among dissimilar systems, so that more devices could be part of the network.

It all started with ARPANET on October 29, 1969, when the first successful message was
sent from a computer at UCLA to another computer (also called a node) at the Stanford
Research Institute (SRI). These computers were called Interface Message Processors (IMPs).
In the beginning, ARPANET benefited not just the military but also research institutes, so it
had its origins in the academic community even though it was a military project. The system
evolved slowly, so it was not immediately adopted for commercial use. Instead, in the early
1980s it was adopted by universities and research institutes through an initiative by the NSF
(National Science Foundation). This was called the NSFNET Project, and its aim was to
promote research and education. The best way to do this was to use an interconnected
network of computers that could provide a way to collaborate and share information. This
provided a backbone that included the Computer Science Network (CSNET), which linked
computer science research among academics.

Eventually ARPANET and NSFNET would be decommissioned, thus paving the way for the
commercialization of the Internet. The name “Internet” comes from “internetwork,” a
network of interconnected networks, and it has been called the Internet ever since.
Commercialization would involve the development of standards maintained by the IETF
(Internet Engineering Task Force) with contributions from many organizations, including the
IEEE (Institute of Electrical and Electronics Engineers), the IESG (Internet Engineering
Steering Group), and the ISO (International Organization for Standardization).

CATEGORY:

FLOW MAP

HISTORY:

The history of the Internet has its origin in efforts to build wide area networks that began
in several computer science laboratories in the United States, the United Kingdom, and
France. The U.S. Department of Defense awarded contracts as early as the 1960s, including
for the development of the ARPANET project, directed by Robert Taylor and managed
by Lawrence Roberts. The first message was sent over the ARPANET in 1969 from computer
science Professor Leonard Kleinrock's laboratory at the University of California, Los
Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).
Packet switching networks such as the NPL network, ARPANET, Merit
Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s using
a variety of communications protocols.[2] Donald Davies first demonstrated packet switching
in 1967 at the National Physical Laboratory (NPL) in the UK, which became a testbed for UK
research for almost two decades.[3][4] The ARPANET project led to the development of
protocols for internetworking, in which multiple separate networks could be joined into a
network of networks. The design included concepts from the French CYCLADES project
directed by Louis Pouzin.
In the early 1980s the NSF funded the establishment of national supercomputing centers at
several universities, and provided interconnectivity in 1986 with the NSFNET project, which
also created network access to the supercomputer sites in the United States from research and
education organizations. Commercial Internet service providers (ISPs) began to emerge in the
very late 1980s. The ARPANET was decommissioned in 1990. Limited private connections
to parts of the Internet by officially commercial entities emerged in several American cities
by late 1989 and 1990, and the NSFNET was decommissioned in 1995, removing the last
restrictions on the use of the Internet to carry commercial traffic.
In the 1980s, research at CERN in Switzerland by British computer scientist Tim Berners-
Lee resulted in the World Wide Web, linking hypertext documents into an information
system, accessible from any node on the network. Since the mid-1990s, the Internet has had a
revolutionary impact on culture, commerce, and technology, including the rise of near-instant
communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP)
telephone calls, two-way interactive video calls, and the World Wide Web with its discussion
forums, blogs, social networking, and online shopping sites. The research and education
community continues to develop and use advanced networks such as JANET in the United
Kingdom and Internet2 in the United States. Increasing amounts of data are transmitted at
higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more.
The Internet's takeover of the global communication landscape was almost instant in
historical terms: it only communicated 1% of the information flowing through two-
way telecommunications networks in the year 1993, already 51% by 2000, and more than
97% of the telecommunicated information by 2007. Today the Internet continues to grow,
driven by ever greater amounts of online information, commerce, entertainment, and social
networking. However, the future of the global internet may be shaped by regional differences
in the world.

EVOLUTION OF SOCIAL MEDIA:

Social media has become a ubiquitous part of daily life, but this growth and evolution have
been in the works since the late ’70s. From the primitive days of newsgroups, listservs, and
the introduction of early chat rooms, social media has changed the way we communicate,
gather and share information, and has given rise to a connected global society.

According to an infographic from social media monitoring platform Simplify360, the
“Golden Era” of social media started in 2001. By this time there were already several chat
application iterations, including ICQ, and one of the earliest blog platforms, LiveJournal.

However, starting in 2001 there was a constant stream of social innovation, beginning with
the first crowdsourced encyclopedia, Wikipedia. Wikipedia was followed by Friendster,
MySpace, Facebook in 2004, and Twitter in 2006. While Facebook and Twitter are the two
top social media platforms today, MySpace has undergone several pivots and is still in quiet
operation.

While the infographic does include the launch of apps like WhatsApp and Snapchat, it
doesn’t note the impact of mobile on the increase in social media use around the world. And
there are plenty of networks that aren’t even mentioned, including Yik Yak,
Whisper, Tsu and Ello, all of which are perhaps part of the ongoing Golden Age of social
media.

CATEGORY:

TIMELINE

HISTORY:

In 1994, what could be considered one of the first social networks, GeoCities, was launched.
The original idea behind this space, which was later cloned by Tripod and, in Spain, by
Ciudad Futura, Demasiado.com, and Galeón (which is still active), was a service for creating
simple web pages that were grouped into “neighborhoods” according to their content.

A year later, in 1995, TheGlobe.com came to light. This social network gave its users the
opportunity to personalize their own online experiences by publishing their content and
interacting with other people who had similar interests. That same year Classmates appeared,
a website that helped people find their former classmates and coworkers.

Two years later it was time for SixDegrees, a space that some researchers consider the first
social network, or at least the one that best reflects the definition of a social network. In
fact, SixDegrees offered the possibility of creating personal profiles, inviting friends, and
visiting the profiles of other users. However, all these social networks were far from
becoming relevant spaces on the Internet in an era dominated by directories such as Yahoo
and search engines such as AltaVista.
On the other hand, it was in 1997 that AOL Instant Messenger was first launched: an instant
messaging program offering users basic chat services and a contact list. It can be considered
a precursor of one of the most influential social networks today, the WhatsApp instant
messaging service.

The downturn of the many technology companies that had grown up under the protection of
the economic bonanza, which ended with the bursting of the dot-com bubble, made it look as
though the growth of the Internet would be broken. Instead, it prompted the appearance of
new startups that would become part of the history of social networks.

Google deserves a mention for its repeated attempts to create a successful social network.
Apart from YouTube, Google launched Orkut, which was only moderately successful, mainly
in Brazil; tried Google Buzz, which had little success; and ended up with Google+, a social
network that has already seen several modifications and has never taken off.

The development of social networks in just a few years has been impressive. The history of
social networks is still in an incipient phase, yet it has already caused thousands of changes
in the world. In fact, social networks have completely changed the way people relate to each
other; communication is immediate in both personal and professional life. This is accentuated
in the new stage of the history of social networks now being written: mobile social networks,
where applications such as WhatsApp, Instagram, and Snapchat begin to overshadow the
giants of social networks.

EVOLUTION OF DIGITAL MARKETING:

Digital marketing is the marketing of products or services using digital technologies, mainly
on the Internet, but also including mobile phones, display advertising, and any other digital
medium. Digital marketing channels are systems based on the Internet that can create,
accelerate, and transmit product value from producer to terminal consumer through digital
networks.
Digital marketing's development since the 1990s and 2000s has changed the way brands and
businesses use technology for marketing. As digital platforms are increasingly incorporated
into marketing plans and everyday life, and as people use digital devices instead of visiting
physical shops, digital marketing campaigns are becoming more prevalent and efficient.
Digital marketing methods such as search engine optimization (SEO), search engine
marketing (SEM), content marketing, influencer marketing, content automation, campaign
marketing, data-driven marketing, e-commerce marketing, social media marketing, social
media optimization, e-mail direct marketing, display advertising, e-books, and optical
disks and games are becoming more common in our advancing technology. In fact, digital
marketing now extends to non-Internet channels that provide digital media, such as mobile
phones (SMS and MMS), callback, and on-hold mobile ring tones. In essence, this extension
to non-Internet channels helps differentiate digital marketing from online marketing, another
catch-all term for the marketing methods mentioned above, which strictly occur online.

CATEGORY:
TIMELINE

HISTORY:

The advent of digital marketing can be traced back to the 1980s.

This was the time when new innovations made computer systems advanced enough to store
customer information. In 1981, IBM came out with its first personal computer, and by 1989
the storage capacity of computers had increased to 100 MB.

The 1980s were also the time when companies recognized the importance of nurturing
customer relationships rather than just pushing products. Marketers dropped limited practices
such as list brokering in favor of database marketing. This was when maintaining a database
of prospects, customers, and commercial contacts started to become standard practice.
Consequently, in 1986, the customer management company ACT launched database
marketing software for the first time. This software allowed the storage of huge volumes of
customer information.

Robert Kestenbaum and Robert Shaw, known as the fathers of marketing automation,
together created several database marketing models that helped BT and Barclays. These
database marketing solutions contained many features, including sales channel automation,
campaign management, contact strategy optimization, marketing analytics, and marketing
resource management.

The 1980s saw the emergence of digital databases, which changed the dynamics of the
buyer-seller relationship.

They enabled companies to gather, store, and track customer information like never before.
The only catch was that the whole process was still manual. Further, in this period the launch
of personal computers and client/server architecture brought the revolution that would
change marketing technology within a decade. Customer Relationship Management (CRM)
software brought this revolution in the 1990s.

EVOLUTION OF DATA STORAGE:

Punch cards were the first effort at data storage in a machine language. They were used to
communicate information to equipment before computers were developed. The punched
holes originally represented a sequence of instructions for pieces of equipment such as textile
looms and player pianos; the holes acted as on/off switches. Basile Bouchon developed the
punch card as a control for looms in 1725.

In 1837, a little over 100 years later, Charles Babbage proposed the Analytical Engine, a
primitive calculator with moving parts that used punch cards for instructions and responses.
Herman Hollerith later developed this idea, having the holes represent not just a sequence of
instructions but stored data the machine could read.

He developed a punch card data processing system for the 1890 U.S. Census, and then started
the Tabulating Machine Company in 1896. By 1950, punch cards had become an integral part
of American industry and government. The warning “Do not fold, spindle, or mutilate”
originated from punch cards. Punch cards were still being used quite regularly until the
mid-1980s.

In 1948, Professor Frederic Williams and colleagues developed the first Random Access
Memory (RAM) for storing frequently used programming instructions, in turn increasing the
overall speed of the computer. Williams used an array of cathode-ray tubes (a form
of vacuum tube) to act as on/off switches and digitally store 1,024 bits of information.

Data in RAM (sometimes called volatile memory) is temporary; when a computer loses
power, the data is lost, and often frustratingly irretrievable. ROM (Read-Only Memory), on
the other hand, is permanently written and remains available after a computer has lost power.

Flash drives appeared on the market in late 2000. A flash drive plugs into computers with a
built-in USB plug, making it a small, easily removable, very portable storage device. Unlike
a traditional hard drive or an optical drive, it has no moving parts, but instead combines chips
and transistors for maximum functionality. Generally, a flash drive’s storage capacity ranges
from 8 to 64 GB. (Other sizes are available, but can be difficult to find.)

A flash drive can be rewritten a nearly limitless number of times and is unaffected by
electromagnetic interference (making it ideal for moving through airport security). Because
of this, flash drives have entirely replaced floppy disks for portable storage. With their large
storage capacity and low cost, flash drives are now on the verge of replacing CDs and DVDs.

Flash drives are sometimes called pen drives, USB drives, thumb drives, or jump drives.
Solid State Drives (SSDs) are sometimes referred to as flash drives, but they are larger and
less convenient to transport.

The Internet made the Cloud available as a service. Improvements such as the continuously
falling cost of storage capacity and improved bandwidth have made it more economical for
individuals and businesses to use the Cloud for data storage. The Cloud offers an essentially
infinite amount of data storage to its users, with near-infinite scalability and accessibility to
data from anywhere, at any time. It is often used to back up information initially stored on
site, making it available should a company’s own system suffer a failure. Cloud security is a
significant concern among users, and service providers have built security systems, such as
encryption and authentication, into the services they provide.

CATEGORY:

FLOW MAP

HISTORY:

Data storage capacity is measured in bits, and a Byte has an 8-bit capacity. An 18th-century
punch card stored about 100 Bytes. A thousand Bytes make a Kilobyte, equivalent to a static
web page. The floppy disks and CD-ROM disks a lot of us grew up with improved storage
capacity to the Megabyte; the complete works of Shakespeare fit on a 5 MB disk. Storage
capacity was still inadequate for large digital files, and thus came the Gigabyte. One of the
biggest files ever moved over the File Transfer Protocol weighed in at 600 Gigabytes. Today,
a single smartphone stores several Gigabytes of music alone.
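To make the jumps between these units concrete, here is a minimal Python sketch (the
function name and the choice of decimal, power-of-1,000 steps are assumptions for
illustration, not anything from the sources above) that converts a raw Byte count into a
human-readable figure:

    # Decimal (SI) storage units: each step up is a factor of 1,000.
    # Operating systems sometimes report binary units (factor of 1,024) instead.
    UNITS = ["Bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

    def human_readable(num_bytes: float) -> str:
        """Convert a raw Byte count into the largest sensible unit."""
        for unit in UNITS[:-1]:
            if num_bytes < 1000:
                return f"{num_bytes:.2f} {unit}"
            num_bytes /= 1000
        return f"{num_bytes:.2f} {UNITS[-1]}"  # Yottabytes and beyond

    print(human_readable(100))            # a punch card: 100.00 Bytes
    print(human_readable(5 * 10**6))      # Shakespeare's complete works: 5.00 MB
    print(human_readable(600 * 10**9))    # the 600-Gigabyte FTP transfer: 600.00 GB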

As if that were not enough, the Terabyte followed, with the storage potential of an entire
academic library. A Terabyte is far above the needs of the typical computer user; the printed
collection of the US Library of Congress amounts to less than 10 Terabytes. Advances in
technology outdid the Terabyte with the Petabyte. Eight Petabytes are equivalent to all the
information available on the World Wide Web, and all the hard disks manufactured in 1995
only equal a capacity of 10 Petabytes. Five Exabytes can store all words ever spoken by any
human being. The Zettabyte and the Yottabyte are potential storage capacities, though no
application uses the two currently. Given the history of data storage, both capacities will
likely be in use one day soon.

Data storage technology has gradually replaced storage formats or increased the data storage
capacity within the same format. The pioneering Compact Disc technology stored data in
sectors of 2,352 Bytes each on a disc 120 mm in diameter. In the same format, DVDs now
offer 4.7 GB of storage (a single layer on one side) or 17 GB (two layers on both sides).
Similarly, Blu-ray exceeds the capacity of a DVD in the same format, with 25 GB (single
layer on one side) or 50 GB (two layers on both sides).
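As a rough worked example of where those Compact Disc numbers lead (a sketch assuming
the commonly cited layout of a 74-minute disc; exact sector counts vary between discs),
multiplying the sector count by the per-sector capacity recovers the familiar CD-ROM figure:

    # Approximate CD-ROM capacity from its sector layout (74-minute disc).
    SECTOR_RAW_BYTES = 2_352   # raw Bytes per sector, as cited above
    SECTOR_DATA_BYTES = 2_048  # usable payload once error correction is set aside
    SECTORS = 74 * 60 * 75     # 75 sectors per second of playing time = 333,000

    raw_capacity = SECTOR_RAW_BYTES * SECTORS    # ~783 million Bytes
    data_capacity = SECTOR_DATA_BYTES * SECTORS  # ~682 million Bytes

    print(f"raw:  {raw_capacity / 10**6:.0f} MB")   # raw:  783 MB
    print(f"data: {data_capacity / 10**6:.0f} MB")  # data: 682 MB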

Today, data storage technology has already exceeded the estimated human brain capacity of
between 1 and 10 Terabytes. A 200 Petabyte store of digital magnetic tape was already being
developed in 1995. Again, the use of Zettabyte and Yottabyte capacities is predictable, given
the evolution of digital data storage in the last decade. The Information Age is pushing
towards a data city where a single apartment complex will need a Terabyte of storage space.
The world is already thinking beyond the potential of cloud storage: the highly anticipated
holographic storage will require only a single centimeter to store 1 Terabyte, and less than
10 centimeters to store the equivalent of a human brain.

One can only anticipate how data storage technology will soon advance in both capacity and
formats as we embrace the digital age.
