What is Big Data?

Big data refers to large, complex collections of structured and unstructured data that are generated and transmitted in real time from a variety of sources. These characteristics make up the three Vs of big data:
Volume: The massive amounts of data that must be stored.
Velocity: The breakneck speed at which data streams must be processed and analyzed.
Variety: The wide range of sources and formats from which data is collected, such as numbers, text, video, images, and audio.
When we open an app, search Google, or simply travel from place to place
with our mobile devices, data is constantly generated. What's the end
result? Large amounts of useful data that businesses and organizations
must manage, store, visualize, and analyze.
Because traditional data tools aren't built to handle this level of complexity
and volume, a plethora of specialized big data software and architecture
solutions have emerged to take on the strain.

How is big data used?

Big data is inherently complex because of its diversity, so it requires systems capable of handling its many structural and semantic differences.
Big data often calls for specialized NoSQL databases that can store data without requiring strict adherence to a fixed schema. This gives you the freedom to combine seemingly disparate sources of data and build a comprehensive picture of what is going on, how to react, and when to react.
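To make that schema flexibility concrete, here is a minimal sketch of storing and querying documents of different shapes in a single document-store collection. It assumes a hypothetical local MongoDB server and the pymongo driver; the database, collection, and field names are invented for the example.

    from pymongo import MongoClient

    # Assumption: a MongoDB server is running locally and pymongo is installed.
    client = MongoClient("mongodb://localhost:27017")
    events = client["demo_shop"]["events"]  # hypothetical database and collection

    # Documents from different sources do not have to share a schema.
    events.insert_many([
        {"source": "web", "user": "a1", "page": "/checkout", "ms_on_page": 5400},
        {"source": "pos", "user": "a1", "store_id": 12, "sku": "X-100", "qty": 2},
        {"source": "support", "user": "a1", "ticket": 881, "text": "late delivery"},
    ])

    # Query across the mixed documents to build a single view of one customer.
    for doc in events.find({"user": "a1"}):
        print(doc)

Because no table schema is declared up front, a new source with new fields can be added later without migrating the existing records.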
When gathering, processing, and analyzing large amounts of data, it's
common to categorize it as operational or analytical data and store it
accordingly.
Operational systems serve large batches of data across several servers and handle input such as inventory, customer data, and purchases – the day-to-day information within a business.
Analytical systems are more advanced than operational systems, capable
of managing complex data processing and delivering decision-making
insights to enterprises. To maximize data collection and usage, these
systems are frequently linked into existing processes and infrastructure.
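As a rough illustration of that split, the plain-Python sketch below treats individual purchases as the operational records and rolls them up into a daily revenue summary of the kind an analytical system would deliver; the records and field names are invented for the example.

    from collections import defaultdict

    # Operational data: individual transactions captured as they happen.
    purchases = [
        {"order_id": 1, "date": "2024-05-01", "customer": "a1", "total": 39.98},
        {"order_id": 2, "date": "2024-05-01", "customer": "b7", "total": 12.50},
        {"order_id": 3, "date": "2024-05-02", "customer": "a1", "total": 80.00},
    ]

    # Analytical step: aggregate the raw transactions into revenue per day,
    # the kind of summary used to support business decisions.
    revenue_by_day = defaultdict(float)
    for purchase in purchases:
        revenue_by_day[purchase["date"]] += purchase["total"]

    for day, revenue in sorted(revenue_by_day.items()):
        print(day, round(revenue, 2))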
Data is omnipresent, regardless of how it is labeled. Our phones, credit
cards, software applications, vehicles, records, websites, and the vast
majority of “things” in our environment are all capable of communicating
enormous amounts of data, which is quite valuable.
To discover patterns and trends, answer questions, acquire insights into
customers, and solve complicated problems, big data is employed in
practically every business. Companies and organizations utilize the data
for a variety of purposes, including expanding their operations, better
understanding customer decisions, improving research, forecasting, and
advertising to critical audiences.

History of Big Data

Data collection can be traced back to ancient civilizations that used tally sticks to count food, but the history of big data begins much later. Here's a rundown of some of the key events that have led us to where we are now.

1880: The 1880 census produces one of the first examples of data overload. The Hollerith Tabulating Machine is invented in response, reducing the time it takes to process census data from ten years to under a year.
1928: Fritz Pfleumer, a German-Austrian engineer, invents magnetic tape data storage, paving the way for how digital data would be stored in the twenty-first century.
1948: Claude Shannon publishes his information theory, laying the groundwork for today's information infrastructure.
1970: IBM mathematician Edgar F. Codd presents the relational database model, showing how information in large databases can be accessed without knowing how it is structured or where it is stored. Previously, this was only possible for experts or individuals with substantial computing skills.
1976: Material Requirements Planning (MRP) systems come into commercial use to organize and schedule information, becoming more widespread in daily business processes.
1989: The World Wide Web is created by Tim Berners-Lee.
2001: Doug Laney presents the "3 Vs of data," which become the core properties of big data. The term "software-as-a-service" is also coined that year.
2005: Hadoop, an open-source software framework for storing and processing massive datasets, is created.
2007: The Wired article "The End of Theory: The Data Deluge Makes the Scientific Method Obsolete" introduces the term "big data" to a general audience.
2008: The study "Big Data Computing: Creating Revolutionary
Breakthroughs in Commerce, Science, and Society," written by a group of
computer science researchers, describes how big data is radically
transforming the way firms and organizations do business.
2010: According to Google CEO Eric Schmidt, individuals create as much
information every two days as they did from the dawn of civilization until
2003.
2014: Businesses increasingly migrate their Enterprise Resource Planning (ERP) systems to the cloud. With an estimated 3.7 billion connected devices and items in use, exchanging massive volumes of data every day, the Internet of Things (IoT) becomes increasingly popular.
2016: The Obama administration announces the "Federal Big Data Research and Development Strategic Plan," which aims to accelerate research and development of big data applications that benefit society and the economy.
2017: According to IBM research, 2.5 quintillion bytes of data are created every day, and 90% of the world's data was created in the previous two years.
