
Describe the evolution of data analytics (1.0, 2.0, and 3.0) and their characteristics.

Analytics 1.0 is also known as the era of business intelligence. This was the era of the rising data warehouse, where new computing technology was key: data marts were used to capture information, and business intelligence software was used to query and report on it. The real advance lay in establishing an objective, deeper understanding of important business phenomena, giving managers the fact-based comprehension needed to go beyond intuition when planning.

A major drawback of the Analytics 1.0 era, however, was that IT staff and business analysts spent most of their time preparing data for analysis and relatively little time on the analytics itself. The work was slow, and results could take months to arrive, which also restricted the depth of analysis. In addition, business intelligence activities addressed only what had happened in the past and offered no predictions about the future.

Analytics 2.0 is also known as the era of big data. The drawbacks of the previous era became more prominent as companies stepped out to explore and pursue a sound analytical strategy. The need for more powerful tools, and the opportunity to profit by providing them, became very apparent. The term ‘big data’ was coined to distinguish it from small data, which is generated purely by a firm’s internal transaction systems. Big data that could not fit on, or be analyzed fast enough by, a centralized platform was processed with Hadoop, an open-source software framework for fast batch processing of data across parallel servers, either in the cloud or on premises. To deal with unstructured data, companies also turned to a new class of databases known as NoSQL. Other big data technologies introduced during this period include “in-memory” analytics, in which data is managed and processed in memory rather than on disk for faster analysis.
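
To make the batch-processing idea concrete, here is a minimal sketch of the map/shuffle/reduce programming model that Hadoop popularized, written in plain Python. It illustrates the pattern only, not Hadoop's actual API, and all names in it are invented for the example.

```python
from collections import defaultdict
from multiprocessing import Pool

# Map phase: each worker turns a chunk of raw records into (key, value) pairs.
def map_chunk(lines):
    pairs = []
    for line in lines:
        for word in line.split():
            pairs.append((word.lower(), 1))
    return pairs

def mapreduce_word_count(chunks, workers=4):
    # Run the map phase in parallel across worker processes,
    # mimicking Hadoop's distribution of work across servers.
    with Pool(workers) as pool:
        mapped = pool.map(map_chunk, chunks)

    # Shuffle phase: group all emitted values by key.
    groups = defaultdict(list)
    for pairs in mapped:
        for key, value in pairs:
            groups[key].append(value)

    # Reduce phase: aggregate each key's values.
    return {key: sum(values) for key, values in groups.items()}

if __name__ == "__main__":
    chunks = [["big data needs big tools"], ["small data fits in memory"]]
    print(mapreduce_word_count(chunks))
```

In a real Hadoop deployment the chunks would be blocks of a distributed file system and the workers would be separate servers; the map/shuffle/reduce structure, however, is the same.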

Massive amounts of data were being created at the edge of the network, and traditional ways of doing analytics were no longer viable. Machine learning was used to generate models from fast-moving data, and visualization moved from plain black-and-white reports to colorful, complex visuals. As the technology matured and gained automated data-management capabilities, analysts became key personnel who could report on progress and trends and make recommendations based on the data they were processing.
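
As an illustration of generating models from fast-moving data, the sketch below trains a model incrementally, one mini-batch at a time, rather than retraining in batch. It assumes scikit-learn is available (in older versions the loss is named "log" rather than "log_loss"), and the data stream here is simulated.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# A linear model that supports incremental (online) learning via partial_fit,
# so it can keep up with data that arrives faster than batch retraining allows.
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])

rng = np.random.default_rng(0)
for _ in range(100):  # simulate 100 mini-batches arriving from a stream
    X_batch = rng.normal(size=(32, 5))               # 32 events, 5 features
    y_batch = (X_batch.sum(axis=1) > 0).astype(int)  # toy labels
    model.partial_fit(X_batch, y_batch, classes=classes)

# The model is usable at any point mid-stream.
print(model.predict(rng.normal(size=(3, 5))))
```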

Analytics 3.0 is also known as the era of data-enriched offerings. The pioneering big data firms began investing in analytics to support customer-facing products, services, and features. They attracted viewers to their websites through better search algorithms, recommendations, suggestions for products to buy, and highly targeted ads, all driven by analytics rooted in enormous amounts of data. The next generation of quantitative analysts, who possessed both computational and analytical skills, came to be called data scientists. This era marks the point where large organizations started to follow suit. Companies can analyze these data sets for the benefit of customers and monetize them, and they can embed advanced analytics and optimization in near real time into every business decision made at the front lines of their operations.

Analytics 3.0 brought new challenges and opportunities, both for companies competing on analytics and for vendors that supplied data. In this era, organizations started to think critically about how their data and analytics practice could translate into meaningful change for employees, customers, and business processes.

What is the difference between descriptive, predictive, and prescriptive analytics?

The differences are as follows:

Descriptive Analytics: Insight into the past. Descriptive analytics “describes,” or summarizes, raw data to make it interpretable by humans; it is analytics about what has already happened. One common example is analyzing seasonal purchasing trends to determine the best time to launch a new product. Because consumers are creatures of habit, looking at historical data is an effective way to anticipate their responses.
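
As a minimal sketch of that seasonal-trend example, the pandas snippet below summarizes historical sales by calendar month; the data and column names are invented for illustration.

```python
import pandas as pd

# Toy historical sales data; in practice this comes from transaction systems.
sales = pd.DataFrame({
    "date": pd.to_datetime(["2023-01-15", "2023-06-20", "2023-11-25",
                            "2024-01-10", "2024-06-18", "2024-11-29"]),
    "units_sold": [120, 340, 910, 150, 360, 980],
})

# Descriptive analytics: summarize what already happened, per calendar month.
monthly = sales.groupby(sales["date"].dt.month)["units_sold"].mean()
print(monthly)  # the month with the highest average suggests a launch window
```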

Predictive Analytics: Understanding the future. Predictive analytics provides companies with actionable insights based on data, in the form of estimates about the likelihood of a future outcome. It is important to remember that no statistical algorithm can “predict” the future with 100% certainty. One common example is the use of predictive analytics to produce a credit score; financial services firms use these scores to determine the probability that customers will make future credit payments on time.
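
A credit score of this kind is essentially a probability estimate from a classification model. The sketch below is a simplified illustration, not any lender's actual method: it fits a logistic regression on invented features and maps the predicted probability of on-time payment onto a score range.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: [income, debt_ratio, late_payments] per customer.
X = np.array([[55, 0.20, 0], [32, 0.55, 3], [78, 0.10, 0],
              [41, 0.40, 2], [60, 0.30, 1], [25, 0.70, 4]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = paid on time historically

model = LogisticRegression().fit(X, y)

# Predictive analytics: estimate the likelihood of a future outcome.
applicant = np.array([[48, 0.35, 1]])
p_on_time = model.predict_proba(applicant)[0, 1]
score = int(300 + p_on_time * 550)  # map probability onto a 300-850 range
print(f"P(on-time) = {p_on_time:.2f}, score = {score}")
```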

Prescriptive Analytics: Advice on optimal actions. Prescriptive analytics attempts to quantify the effect of future decisions in order to advise on possible outcomes before the decisions are made. It goes beyond descriptive and predictive analytics by recommending one or more possible courses of action. Prescriptive analytics is relatively complex to administer, and most companies do not yet use it in their daily course of business. Use prescriptive analytics whenever you need to advise users on what action to take.
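
To show what recommending a course of action can look like in practice, here is a small linear-programming sketch using scipy.optimize.linprog. The products, prices, and capacity figures are all invented; the point is only that the output is a recommended decision rather than a description or a forecast.

```python
from scipy.optimize import linprog

# Decide how many units of products A and B to produce next week.
# Maximize profit 40*A + 30*B (linprog minimizes, so negate the objective).
c = [-40, -30]

# Constraints: 2A + 1B <= 100 machine-hours, 1A + 2B <= 80 labor-hours.
A_ub = [[2, 1], [1, 2]]
b_ub = [100, 80]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

# Prescriptive analytics: the output is a recommended action.
units_a, units_b = result.x
print(f"Produce {units_a:.0f} of A and {units_b:.0f} of B; "
      f"expected profit ${-result.fun:.0f}")
```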

How do traditional firms (such as Bosch, Schneider Electric, and GE) use analytics?

The Bosch Group, based in Germany, is 127 years old, but it is hardly stuck in the last century in its application of analytics. The company has undertaken a series of initiatives to apply data analytics and give its customers so-called intelligent offerings. These include intelligent fleet management, intelligent vehicle-charging infrastructure, intelligent energy management, intelligent security-video analysis, and many more. Bosch created a Software Innovations group that focuses heavily on big data, analytics, and the “Internet of Things” to identify and deliver innovative products.

Schneider Electric, a 170-year-old company based in France, originally manufactured iron, steel, and armaments; it now focuses on energy management, including energy optimization, smart-grid management, and building automation. It has acquired and developed a variety of software and data ventures in Silicon Valley. Its Advanced Distribution Management System (ADMS) handles energy distribution for utility companies: it monitors and controls network devices, manages service outages, and dispatches crews. ADMS gives utilities the ability to integrate millions of data points on network performance and lets engineers use visual analytics to understand the state of the network.

General Electric is undergoing one of the most dramatic conversions to offering data and analytics. GE’s manufacturing businesses are increasingly becoming providers of asset and operations optimization services. With sensors streaming data from turbines, locomotives, and jet engines, GE can determine the most efficient and effective service intervals for those machines. GE has invested more than $2 billion in new software and analytics to enable its employees, and it is now selling the technology to other companies for use in managing big data and analytics.

UPS, at 107 years old, offers perhaps the best example of pushing analytics to the forefront: the delivery routing process. The company captures information on the millions of packages it delivers daily using recently installed telematics sensors in more than 46,000 company trucks, which track metrics including speed, direction, braking, and drivetrain performance. The waves of incoming data not only show daily performance but are also informing a major redesign of drivers’ routes.
