
Republic of the Philippines
NUEVA VIZCAYA STATE UNIVERSITY
Bayombong, Nueva Vizcaya
INSTRUCTIONAL MODULE
IM No.: IM-IST1_1-2NDSEM-2020-2021

COLLEGE: COLLEGE OF ARTS AND SCIENCES
CAMPUS: BAYOMBONG CAMPUS
DEGREE PROGRAM: BS in Information Systems
SPECIALIZATION: N/A
COURSE NO.: IST1_1
COURSE TITLE: Fundamentals of Business Analytics
YEAR LEVEL: 2nd Year
TIME FRAME: 5 hrs.
WK NO.: 1-5
IM NO.: 01

I. UNIT TITLE/CHAPTER TITLE

Chapter 1: Overview of Big Data and Business Analytics

II. LESSON TITLE

What is a Business Tool?

III. LESSON OVERVIEW

This module covers introductory concepts of Big Data and Business Analytics.

IV. DESIRED LEARNING OUTCOMES

By the end of this module, the student should be able to:

1. Discuss the basic concepts on business intelligence, big data and business analytics;
2. Trace the evolution of business analytics; and
3. Give examples of big data service providers.

V. LESSON CONTENT

I. Introduction to Business Analytics

Business analytics is a powerful tool in today’s marketplace. Across industries, organizations are
generating vast amounts of data which, in turn, has heightened the need for professionals who know how
to interpret and analyze that information.

Companies worldwide are using data to:

• Boost process and cost efficiency (60 percent)
• Drive strategy and change (57 percent)
• Monitor and improve financial performance (52 percent)

Research also shows that, over the next three years and beyond, 71 percent of global enterprises predict
their investments in analytics will accelerate.

In light of this trend, gaining an in-depth understanding of business analytics can be a way to advance
your career and make better decisions in the workplace.

“Using data analytics is a very effective way to have influence in an organization,” said Harvard Business
School Professor Jan Hammond, who teaches the online course Business Analytics, in a previous
interview. “If you’re able to go into a meeting, and other people have opinions, but you have data to
support your arguments and your recommendations, you’re going to be influential.”

NVSU-FR-ICD-05-00 (081220) Page 1 of 24



Before diving into the benefits of data analysis, it’s important to understand what the term “business
analytics” means.

What is Business Analytics?

Business analytics is the process of using quantitative methods to derive meaning from data in order to
make informed business decisions.

There are three primary methods of business analysis:

• Descriptive: The interpretation of historical data to identify trends and patterns
• Predictive: The use of statistics to forecast future outcomes
• Prescriptive: The application of testing and other techniques to determine which outcome will yield
the best result in a given scenario

Deciding which method to employ depends on the business situation at hand.
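As a rough illustration, the three methods can be sketched on a toy monthly-sales series in Python. All of the numbers (sales figures, margins, the price-cut effect) are invented for illustration:

```python
# Toy monthly-sales series; numbers are made up for illustration.
sales = [100, 110, 120, 130, 140, 150]  # units sold per month

# Descriptive: summarize historical data to expose the trend.
average = sum(sales) / len(sales)
monthly_growth = (sales[-1] - sales[0]) / (len(sales) - 1)

# Predictive: extrapolate the observed trend one month ahead.
forecast = sales[-1] + monthly_growth

# Prescriptive: compare candidate actions and pick the better outcome.
# Hypothetical scenario: a price cut lifts volume 20 percent, but unit
# margin falls from $5.00 to $4.50.
profit_keep_price = forecast * 5.00
profit_cut_price = forecast * 1.20 * 4.50
best_action = "cut price" if profit_cut_price > profit_keep_price else "keep price"

print(average, forecast, best_action)
```

The same data supports all three questions; only the question being asked changes.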

To better understand how data insights can drive organizational performance, here’s a look at some of
the ways firms have benefitted from using business analytics:

Benefits of Business Analytics

1. More informed decision making

Business analytics can be a valuable resource when approaching an important strategic decision.

When ride-hailing company Uber upgraded its Customer Obsession Ticket Assistant (COTA) in
early 2018—a tool that uses machine learning and natural language processing to help agents
improve their speed and accuracy when responding to support tickets—it used prescriptive
analytics to examine whether the new iteration of the product would be more effective than its
initial version.

Through A/B testing—a method of comparing the outcomes of two different choices—the
company was able to determine that the updated product led to faster service, more accurate
resolution recommendations, and higher customer satisfaction scores. These insights not only
streamlined Uber’s ticket resolution process, but saved the company millions of dollars.
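A/B testing of this kind can be sketched in a few lines of Python. The handling-time figures below are invented for illustration (not Uber's actual data), and the rough t statistic shown is only one simple way to gauge whether a difference is likely real rather than noise:

```python
# Compare ticket-resolution times for a control group (old tool) and a
# treatment group (new tool). All numbers are invented.
from statistics import mean, stdev
from math import sqrt

control = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5]    # minutes per ticket, old tool
treatment = [11.9, 12.4, 11.2, 13.0, 12.1, 12.6]  # minutes per ticket, new tool

# Observed improvement in mean handling time.
lift = mean(control) - mean(treatment)

# A rough two-sample t statistic (equal-size groups): how large the
# difference is relative to the sampling noise.
n = len(control)
pooled_se = sqrt(stdev(control) ** 2 / n + stdev(treatment) ** 2 / n)
t_stat = lift / pooled_se

print(round(lift, 2), round(t_stat, 2))
```

A large t statistic suggests the new variant's improvement is unlikely to be chance alone; a real experiment would also fix a significance level and sample size in advance.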

2. Greater revenue

Companies that embrace data and analytics initiatives can experience significant financial returns.

Research shows that organizations that invest in big data yield a six percent average increase in
profits, which jumps to nine percent for investments spanning five years.

Businesses that are able to quantify their gains from analyzing data report an average eight percent
increase in revenues and a 10 percent reduction in costs.

These findings illustrate the clear financial payoff that can come from a robust business analysis
strategy—one that many firms can stand to benefit from as the big data and analytics market
continues to grow.

3. Improved operational efficiency

Beyond financial gains, analytics can be used to fine-tune business operations.

Many firms are now using predictive analytics to anticipate maintenance and operational issues
before they become larger problems.

A mobile network operator surveyed noted that it leverages data to foresee outages seven days
before they occur. Armed with this information, the firm can prevent outages by more effectively
timing maintenance, enabling it to not only save on operational costs, but ensure that it’s keeping
assets at optimal performance levels.

Why Study Business Analytics?

Taking a data-driven approach to business can come with tremendous upside, but many companies
report that skilled employees in analytics roles are in short supply.

LinkedIn lists business analysis as one of the skills companies need most in 2019, and the Bureau of
Labor Statistics projects operations research analyst jobs to grow by 27 percent through 2026—a rate
much faster than the average for all occupations.

“A lot of people can crunch numbers, but I think they’ll be in very limited positions unless they can help
interpret those analyses in the context in which the business is competing,” said Hammond in a previous
interview.

If you’re interested in capitalizing on the need for data-minded professionals, taking an online business
analytics course can be a way to broaden your skill set and take your career to the next level.

Through learning how to recognize trends, test hypotheses, and draw conclusions from population
samples, you can build an analytical framework that can be applied in your everyday decision-making
and help your organization thrive.

“If you don’t use the data, you’re going to fall behind,” Hammond said. “People that have those
capabilities—as well as an understanding of business contexts—are going to be the ones that will add
the most value and have the greatest impact.”

II. History of Business Analytics

Historically speaking, a simple definition of Analytics is “the study of analysis.” A more useful, more
modern description would suggest “Data Analytics” is an important tool for gaining business insights and
providing tailored responses to customers. Data Analytics, sometimes abbreviated to “Analytics,” has
become increasingly important for organizations of all sizes. The practice of Data Analytics has gradually
evolved and broadened over time, providing many benefits.

The use of Analytics by business can be found as far back as the 19th century, when Frederick Winslow
Taylor initiated time management exercises. Another example is when Henry Ford measured the speed
of assembly lines. In the late 1960s, Analytics began receiving more attention as computers became
decision-making support systems. With the development of Big Data, Data Warehouses, the Cloud, and
a variety of software and hardware, Data Analytics has evolved, significantly. Data Analytics involves the
research, discovery, and interpretation of patterns within data. Modern forms of Data Analytics have
expanded to include:

• Predictive Analytics
• Big Data Analytics
• Cognitive Analytics
• Prescriptive Analytics
• Descriptive Analytics
• Enterprise Decision Management
• Retail Analytics
• Augmented Analytics
• Web Analytics
• Call Analytics


Statistics and Computers

Data Analytics is based on statistics. It has been surmised statistics were used as far back as Ancient
Egypt for building pyramids. Governments worldwide have used statistics based on censuses, for a
variety of planning activities, including taxation. After the data has been collected, the goal of discovering
useful information and insights begins. For example, an analysis of population growth by county and city
could determine the location of a new hospital.

The development of computers and the evolution of computing technology has dramatically enhanced
the process of Data Analytics. In 1880, prior to computers, it took over seven years for the U.S. Census
Bureau to process the collected information and complete a final report. In response, inventor Herman
Hollerith produced the “tabulating machine,” which was used in the 1890 census. The tabulating machine
could systematically process data recorded on punch cards. With this device, the 1890 census was
finished in 18 months.

Relational Databases and Non-Relational Databases

Relational Databases were invented by Edgar F. Codd in the 1970s and became quite popular in the
1980s. Relational database management systems (RDBMSs), in turn, allowed users to write queries in
SQL (originally called SEQUEL) and retrieve data from their databases. Relational Databases and SQL
provided the advantage of being able to analyze data on demand, and are still used extensively. They
are easy to work with, and very useful for maintaining accurate records. On the negative side, RDBMSs
are generally quite rigid and were not designed to handle unstructured data.
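The on-demand querying that relational databases and SQL provide can be shown with Python's built-in sqlite3 module. The table and values here are invented for illustration:

```python
# An in-memory relational database, queried on demand with SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("Ana", 120.0), ("Ben", 75.5), ("Ana", 60.0)],
)

# Analyze the data on demand: total spend per customer.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)
conn.close()
```

Note that the schema is fixed up front (`customer TEXT, amount REAL`), which is exactly the rigidity that makes relational systems a poor fit for unstructured data.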

During the mid-1990s, the internet became extremely popular, but relational databases could not keep
up. The immense flow of information, combined with the variety of data types coming from many different
sources, led to non-relational databases, also referred to as NoSQL. A NoSQL database can quickly
handle data in many different languages and formats, and avoids SQL’s rigidity by replacing its
“organized” storage with greater flexibility.

The development of NoSQL was followed by changes on the internet. Larry Page and Sergey Brin
designed Google’s search engine to search a specific website while processing and analyzing Big Data
across distributed computers. Google’s search engine can respond in a few seconds with the desired results.
The primary points of interest in the system are its scalability, automation, and high performance. A 2004
white paper on the topic of MapReduce inspired several engineers and attracted an influx of talent to
focus on the challenges of processing Big Data (Data Analytics).
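The MapReduce idea from that white paper can be sketched locally in Python. Real MapReduce distributes the map and reduce phases across many machines; this toy word count only shows the data flow:

```python
# A toy MapReduce-style word count, run locally to show the phases.
from collections import defaultdict

documents = ["big data big insight", "data drives decisions"]

# Map phase: emit one (word, 1) pair per occurrence.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the pairs by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: sum the counts for each key.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts)
```

The appeal of the model is that the map and reduce steps are independent per key, so the framework can scale them out and automate fault handling, which is exactly the scalability and automation noted above.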

Data Warehouses

In the late 1980s, the amount of data being collected continued to grow significantly, in part due to the
lower costs of hard disk drives. During this time, the architecture of Data Warehouses was developed to
help in transforming data coming from operational systems into decision-making support systems. Data
Warehouses are normally part of the Cloud, or part of an organization’s mainframe server. Unlike
relational databases, a Data Warehouse is normally optimized for a quick response time to queries. In a
data warehouse, data is often stored using a timestamp, and operation commands, such as DELETE or
UPDATE, are used less frequently. If all sales transactions were stored using timestamps, an
organization could use a Data Warehouse to compare the sales trends of each month.
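The monthly sales comparison described above can be sketched in Python; the timestamped transactions are invented for illustration:

```python
# Timestamped sales transactions, aggregated into monthly totals,
# as a data warehouse query might do. Data is invented.
from collections import defaultdict

transactions = [
    ("2021-01-05", 200.0),
    ("2021-01-20", 150.0),
    ("2021-02-03", 300.0),
    ("2021-02-25", 100.0),
]

monthly_sales = defaultdict(float)
for timestamp, amount in transactions:
    month = timestamp[:7]  # "YYYY-MM" prefix of the timestamp
    monthly_sales[month] += amount

print(dict(monthly_sales))
```

Because the rows are append-only and keyed by time, trends fall out of a simple group-by; no UPDATE or DELETE is needed.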

Business Intelligence

The term Business Intelligence (BI) was first used in 1865, and was later adapted by Howard Dresner at
Gartner in 1989, to describe making better business decisions through searching, gathering, and
analyzing the accumulated data saved by an organization. Using the term “Business Intelligence” as a
description of decision-making based on data technologies was both novel and far-sighted. Large
companies first embraced BI in the form of analyzing customer data systematically, as a necessary step
in making business decisions.

Data Mining

Data Mining began in the 1990s and is the process of discovering patterns within large data sets.
Analyzing data in non-traditional ways provided results that were both surprising and beneficial. The use
of Data Mining came about directly from the evolution of database and Data Warehouse technologies.
The new technologies allow organizations to store more data, while still analyzing it quickly and efficiently.
As a result, businesses started predicting the potential needs of customers, based on an analysis of their
historical purchasing patterns.

However, data can be misinterpreted. Someone in the trades, having purchased two pairs of blue jeans
online, probably won’t want to buy jeans for another two or three years. Targeting this person with blue
jean advertisements is both a waste of time and an irritant to the potential customer.

Big Data

In 2005, Big Data was given that name by Roger Magoulas. He was describing a large amount of data,
which seemed almost impossible to cope with using the Business Intelligence tools available at the time.
In the same year, Hadoop, which could process Big Data, was developed. Hadoop’s foundation was
based on another open-source software framework called Nutch, which was then merged with Google’s
MapReduce.

Apache Hadoop is an open-source software framework, which can process both structured and
unstructured data, streaming in from almost all digital sources. This flexibility allows Hadoop (and its
sibling open-source frameworks) to process Big Data. During the late 2000s, several open source
projects, such as Apache Spark and Apache Cassandra came about to deal with this challenge.

Analytics in the Cloud

In its early form, the Cloud was a phrase used to describe the “empty space” between users and provider.
Then, in 1997, Emory University professor Ramnath Chellappa described Cloud Computing as a new
“computing paradigm where the boundaries of computing will be determined by economic rationale,
rather than technical limits alone.”

In 1999, Salesforce provided a very early example of how to use Cloud Computing successfully. Though
primitive by today’s standards, Salesforce used the concept to develop the idea of delivering software
programs by way of the internet. Programs (or applications) could be accessed or downloaded by any
person with internet access. An organization manager could purchase software in a cost-effective, on-
demand method without leaving the office. As businesses and organizations gained a better
understanding of the Cloud’s services and usefulness, it gained in popularity.

The Cloud has evolved significantly since 1999, with customers “renting the services,” rather than
acquiring hardware and software for the same purpose. Vendors are now responsible for all the trouble-
shooting, backups, administration, capacity planning, and maintenance. And, for several business
projects, the Cloud is simply easier and more efficient to use. The Cloud now has significantly large
amounts of storage, availability to multiple users simultaneously, and the ability to handle multiple
projects.

Predictive Analytics

Predictive Analytics is used to make forecasts about trends and behavior patterns. Predictive Analytics
uses several techniques taken from statistics, Data Modeling, Data Mining, Artificial Intelligence, and
Machine Learning to analyze data in making predictions. Predictive models can analyze both current and
historical data to understand customers, purchasing patterns, procedural problems, and in predicting
potential dangers and opportunities for an organization.

Predictive Analytics first started in the 1940s, as governments began using the early computers. Though
it has existed for decades, Predictive Analytics has now developed into a concept whose time has come.
With more and more data available, organizations have begun using Predictive Analytics to increase
profits and improve their competitive advantage. The continuous growth of stored data, combined with
an increasing interest in using data to gain Business Intelligence, has promoted the use of Predictive
Analytics.
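As a minimal sketch of predictive modeling, a least-squares trend line can be fit to historical demand and used to forecast the next period. The quarterly demand figures are invented for illustration:

```python
# Fit y = intercept + slope * x by least squares and forecast quarter 6.
# Demand figures are invented.
quarters = [1, 2, 3, 4, 5]
demand = [100, 108, 118, 131, 140]

n = len(quarters)
mean_x = sum(quarters) / n
mean_y = sum(demand) / n

# Standard least-squares slope: covariance over variance of x.
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(quarters, demand))
    / sum((x - mean_x) ** 2 for x in quarters)
)
intercept = mean_y - slope * mean_x

forecast_q6 = intercept + slope * 6
print(round(slope, 2), round(forecast_q6, 1))
```

Production predictive models layer far more technique on top (seasonality, regressors, machine learning), but the shape is the same: learn a pattern from history, then extrapolate it.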

Cognitive Analytics

Most organizations deal with unstructured data. Making sense of this unstructured data is not something
humans can easily do. Cognitive Analytics merges a variety of applications to provide context and
answers. Organizations can collect data from several different sources, and cognitive analytics can
examine the unstructured data in-depth, offering decision-makers a better understanding of their internal
processes, customer preferences, and customer loyalty.

Augmented Analytics

Augmented Analytics provides automated Business Intelligence (and insights) by using Natural
Language Processing and Machine Learning. It “automates” Data Preparation and enables data sharing.
Augmented Analytics provides clear results, and access to sophisticated tools, allowing researchers and
managers to make daily decisions with a high degree of confidence. It allows decision-makers to gain
insights and act quickly and confidently.

Ultimately, Augmented Analytics attempts to reduce the work of Data Scientists by automating the steps
used in gaining insights and Business Intelligence. An Augmented Analytics engine will automatically
process an organization’s data, clean the data, analyze it, and then produce insights leading to
instructions for executives or salespeople.

III. Big Data and Business Analytics

What is Big Data?

Big data is data that contains greater variety, arrives in increasing volumes, and moves with more
velocity. These three attributes are known as the three V’s.

Put simply, big data is larger, more complex data sets, especially from new data sources. These data
sets are so voluminous that traditional data processing software just can’t manage them. But these
massive volumes of data can be used to address business problems you wouldn’t have been able to
tackle before.
The 3 V’s of Big Data

1. Volume

The amount of data matters. With big data, you’ll have to process high volumes of low-density,
unstructured data. This can be data of unknown value, such as Twitter data feeds, clickstreams
on a web page or a mobile app, or sensor-enabled equipment. For some organizations, this might
be tens of terabytes of data. For others, it may be hundreds of petabytes.

2. Velocity

Velocity is the fast rate at which data is received and (perhaps) acted on. Normally, the highest-
velocity data streams directly into memory rather than being written to disk. Some internet-enabled
smart products operate in real time or near real time and require real-time evaluation and
action.

3. Variety

Variety refers to the many types of data that are available. Traditional data types were structured
and fit neatly in a relational database. With the rise of big data, data comes in new unstructured
data types. Unstructured and semi-structured data types, such as text, audio, and video, require
additional preprocessing to derive meaning and support metadata.

The Value and Truth of Big Data


Two more V’s have emerged over the past few years: value and veracity. Data has intrinsic value. But
it’s of no use until that value is discovered. Equally important: How truthful is your data—and how much
can you rely on it?

Today, big data has become capital. Think of some of the world’s biggest tech companies. A large part
of the value they offer comes from their data, which they’re constantly analyzing to produce more
efficiency and develop new products.

Recent technological breakthroughs have exponentially reduced the cost of data storage and compute,
making it easier and less expensive to store more data than ever before. With an increased volume of
big data now cheaper and more accessible, you can make more accurate and precise business decisions.

Finding value in big data isn’t only about analyzing it (which is a whole other benefit). It’s an entire
discovery process that requires insightful analysts, business users, and executives who ask the right
questions, recognize patterns, make informed assumptions, and predict behavior.

But how did we get here?

The History of Big Data

Although the concept of big data itself is relatively new, the origins of large data sets go back to the 1960s
and ‘70s when the world of data was just getting started with the first data centers and the development
of the relational database.

Around 2005, people began to realize just how much data users generated through Facebook, YouTube,
and other online services. Hadoop (an open-source framework created specifically to store and analyze
big data sets) was developed that same year. NoSQL also began to gain popularity during this time.

The development of open-source frameworks, such as Hadoop (and more recently, Spark) was essential
for the growth of big data because they make big data easier to work with and cheaper to store. In the
years since then, the volume of big data has skyrocketed. Users are still generating huge amounts of
data—but it’s not just humans who are doing it.

With the advent of the Internet of Things (IoT), more objects and devices are connected to the internet,
gathering data on customer usage patterns and product performance. The emergence of machine
learning has produced still more data.

While big data has come far, its usefulness is only just beginning. Cloud computing has expanded big
data possibilities even further. The cloud offers truly elastic scalability, where developers can simply spin
up ad hoc clusters to test a subset of data. And graph databases are becoming increasingly important as
well, with their ability to display massive amounts of data in a way that makes analytics fast and
comprehensive.

Benefits of Big Data

• Big data makes it possible for you to gain more complete answers because you have more
information.

• More complete answers mean more confidence in the data—which means a completely different
approach to tackling problems.

Big Data Use Cases


Big data can help you address a range of business activities, from customer experience to analytics.
Here are just a few:

1. Product Development

Companies like Netflix and Procter & Gamble use big data to anticipate customer demand. They
build predictive models for new products and services by classifying key attributes of past and
current products or services and modeling the relationship between those attributes and the
commercial success of the offerings. In addition, P&G uses data and analytics from focus groups,
social media, test markets, and early store rollouts to plan, produce, and launch new products.

2. Predictive Maintenance

Factors that can predict mechanical failures may be deeply buried in structured data, such as the
year, make, and model of equipment, as well as in unstructured data that covers millions of log
entries, sensor data, error messages, and engine temperature. By analyzing these indications of
potential issues before the problems happen, organizations can deploy maintenance more cost
effectively and maximize parts and equipment uptime.
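A hedged sketch of this idea in Python: flag equipment whose latest sensor reading has crossed, or is drifting toward, an assumed safe limit. The threshold, equipment names, and readings are all invented:

```python
# Flag equipment for maintenance based on a simple temperature rule.
# The threshold and readings are invented for illustration.
THRESHOLD = 90.0  # degrees; assumed safe operating limit

engine_temps = {
    "pump-A": [82, 84, 85, 86],
    "pump-B": [85, 88, 91, 94],  # already past the limit
    "pump-C": [78, 79, 78, 80],
}

def needs_maintenance(readings, threshold=THRESHOLD):
    """Flag if the latest reading exceeds the limit, or the recent drift
    suggests the next reading will."""
    latest = readings[-1]
    drift = readings[-1] - readings[0]  # rough change over the window
    return latest > threshold or latest + drift > threshold

flagged = sorted(name for name, temps in engine_temps.items()
                 if needs_maintenance(temps))
print(flagged)
```

Real predictive-maintenance systems replace this hand-set rule with models learned from failure history, but the payoff is the same: intervene before the outage, not after.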

3. Customer Experience

The race for customers is on. A clearer view of customer experience is more possible now than
ever before. Big data enables you to gather data from social media, web visits, call logs, and other
sources to improve the interaction experience and maximize the value delivered. Start delivering
personalized offers, reduce customer churn, and handle issues proactively.

4. Fraud and Compliance

When it comes to security, it’s not just a few rogue hackers—you’re up against entire expert
teams. Security landscapes and compliance requirements are constantly evolving. Big data helps
you identify patterns in data that indicate fraud and aggregate large volumes of information to
make regulatory reporting much faster.

5. Machine Learning

Machine learning is a hot topic right now. And data—specifically big data—is one of the reasons
why. We are now able to teach machines instead of program them. The availability of big data to
train machine learning models makes that possible.

6. Operational Efficiency

Operational efficiency may not always make the news, but it’s an area in which big data is having
the most impact. With big data, you can analyze and assess production, customer feedback and
returns, and other factors to reduce outages and anticipate future demands. Big data can also be
used to improve decision-making in line with current market demand.

7. Drive Innovation

Big data can help you innovate by studying interdependencies among humans, institutions,
entities, and processes, and then determining new ways to use those insights. Use data insights to
improve decisions about financial and planning considerations. Examine trends and what
customers want in order to deliver new products and services. Implement dynamic pricing. There are
endless possibilities.

Big Data Challenges


While big data holds a lot of promise, it is not without its challenges.

First, big data is…big. Although new technologies have been developed for data storage, data volumes
are doubling in size about every two years. Organizations still struggle to keep pace with their data and
find ways to effectively store it.

But it’s not enough to just store the data. Data must be used to be valuable, and that depends on curation.
Clean data, or data that’s relevant to the client and organized in a way that enables meaningful analysis,
requires a lot of work. Data scientists spend 50 to 80 percent of their time curating and preparing data
before it can actually be used.
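A toy example of that curation work in Python: deduplicate records, drop incomplete rows, and normalize casing and units before analysis. The records and field names are invented:

```python
# Minimal data-curation pass: dedupe, drop incomplete rows, normalize.
# Records and field names are invented for illustration.
raw = [
    {"id": 1, "city": "Bayombong", "temp_f": 86.0},
    {"id": 1, "city": "Bayombong", "temp_f": 86.0},  # duplicate row
    {"id": 2, "city": "Solano", "temp_f": None},     # missing value
    {"id": 3, "city": "bambang", "temp_f": 84.2},    # inconsistent casing
]

seen = set()
clean = []
for record in raw:
    if record["id"] in seen or record["temp_f"] is None:
        continue  # drop duplicates and incomplete rows
    seen.add(record["id"])
    clean.append({
        "id": record["id"],
        "city": record["city"].title(),                       # normalize casing
        "temp_c": round((record["temp_f"] - 32) * 5 / 9, 1),  # normalize units
    })

print(clean)
```

Even this tiny pass makes three separate decisions (what counts as a duplicate, how to treat missing values, which unit is canonical), which is why curation consumes so much of a data scientist's time.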

Finally, big data technology is changing at a rapid pace. A few years ago, Apache Hadoop was the
popular technology used to handle big data. Then Apache Spark was introduced in 2014. Today, a
combination of the two frameworks appears to be the best approach. Keeping up with big data technology
is an ongoing challenge.

How Big Data Works

Big data gives you new insights that open up new opportunities and business models. Getting started
involves three key actions:

1. Integrate

Big data brings together data from many disparate sources and applications. Traditional data
integration mechanisms, such as extract, transform, and load (ETL), generally aren’t up to the
task. Analyzing big data sets at terabyte, or even petabyte, scale requires new strategies and
technologies.

During integration, you need to bring in the data, process it, and make sure it’s formatted and
available in a form that your business analysts can get started with.
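A minimal extract-transform-load sketch in Python, with in-memory stand-ins for the source systems; all field names and values are invented:

```python
# ETL sketch: extract from two "sources", transform to common keys and
# types, and load a merged, analyst-ready view. Data is invented.
raw_crm = [{"name": " Ana ", "spend": "120.50"}, {"name": "Ben", "spend": "75"}]
raw_web = [{"user": "ana", "visits": 3}, {"user": "ben", "visits": 7}]

# Extract + transform: normalize names and convert string amounts to numbers.
customers = {r["name"].strip().lower(): float(r["spend"]) for r in raw_crm}
visits = {r["user"]: r["visits"] for r in raw_web}

# Load: one merged record per customer, ready for analysis.
merged = [
    {"customer": c, "spend": customers[c], "visits": visits.get(c, 0)}
    for c in sorted(customers)
]
print(merged)
```

At big data scale the same three steps survive, but the pipelines run distributed and often reorder to ELT, loading raw data first and transforming inside the target system.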

2. Manage

Big data requires storage. Your storage solution can be in the cloud, on premises, or both. You
can store your data in any form you want and bring your desired processing requirements and
necessary process engines to those data sets on an on-demand basis. Many people choose their
storage solution according to where their data is currently residing. The cloud is gradually gaining
popularity because it supports your current compute requirements and enables you to spin up
resources as needed.

3. Analyze

Your investment in big data pays off when you analyze and act on your data. Get new clarity with
a visual analysis of your varied data sets. Explore the data further to make new discoveries. Share
your findings with others. Build data models with machine learning and artificial intelligence. Put
your data to work.

Big Data Best Practices

1. Align big data with specific business goals

More extensive data sets enable you to make new discoveries. To that end, it is important to ground
new investments in skills, organization, or infrastructure in a strong business-driven context to
guarantee ongoing project investments and funding. To determine if you are on the right track,
ask how big data supports and enables your top business and IT priorities. Examples include
understanding how to filter web logs to understand ecommerce behavior, deriving sentiment from
social media and customer support interactions, and understanding statistical correlation methods
and their relevance for customer, product, manufacturing, and engineering data.

2. Ease skills shortage with standards and governance

One of the biggest obstacles to benefiting from your investment in big data is a skills shortage.
You can mitigate this risk by ensuring that big data technologies, considerations, and decisions
are added to your IT governance program. Standardizing your approach will allow you to manage
costs and leverage resources. Organizations implementing big data solutions and strategies
should assess their skill requirements early and often and should proactively identify any potential
skill gaps. These can be addressed by training/cross-training existing resources, hiring new
resources, and leveraging consulting firms.

3. Optimize knowledge transfer with a center of excellence

Use a center of excellence approach to share knowledge, control oversight, and manage project
communications. Whether big data is a new or expanding investment, the soft and hard costs can
be shared across the enterprise. Leveraging this approach can help increase big data capabilities
and overall information architecture maturity in a more structured and systematic way.

4. Top payoff is aligning unstructured with structured data

It is certainly valuable to analyze big data on its own. But you can bring even greater business
insights by connecting and integrating low density big data with the structured data you are
already using today.

Whether you are capturing customer, product, equipment, or environmental big data, the goal is
to add more relevant data points to your core master and analytical summaries, leading to better
conclusions. For example, there is a difference in distinguishing all customer sentiment from that
of only your best customers. That is why many see big data as an integral extension of their
existing business intelligence capabilities, data warehousing platform, and information
architecture.

Keep in mind that the big data analytical processes and models can be both human- and machine-
based. Big data analytical capabilities include statistics, spatial analysis, semantics, interactive
discovery, and visualization. Using analytical models, you can correlate different types and
sources of data to make associations and meaningful discoveries.
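The alignment of unstructured and structured data described above can be sketched in a few lines of Python. All names and data here are illustrative, not from the module: a crude lexicon-based sentiment score stands in for real text analytics, and the "customers" list stands in for a master data table.

```python
# Hypothetical sketch: enriching structured customer records with a signal
# derived from unstructured text (here, a naive sentiment score per review).

POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"slow", "broken", "bad"}

def sentiment_score(text):
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Structured master data: one row per customer.
customers = [
    {"id": 1, "segment": "best", "annual_spend": 12000},
    {"id": 2, "segment": "regular", "annual_spend": 800},
]

# Low-density unstructured data: free-text reviews tagged with a customer id.
reviews = [
    {"customer_id": 1, "text": "Great product, love the support"},
    {"customer_id": 2, "text": "Delivery was slow and the box was broken"},
]

# Derive a signal from the text, then join it onto the structured records.
scores = {}
for r in reviews:
    scores[r["customer_id"]] = scores.get(r["customer_id"], 0) + sentiment_score(r["text"])

for c in customers:
    c["sentiment"] = scores.get(c["id"], 0)

# "Sentiment of only your best customers" is now a simple structured filter.
best_sentiment = [c["sentiment"] for c in customers if c["segment"] == "best"]
print(best_sentiment)  # [2]
```

Once the unstructured signal is attached to the master record, it can be sliced by any structured attribute (segment, region, spend), which is the payoff the section describes.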

5. Plan your discovery lab for performance

Discovering meaning in your data is not always straightforward. Sometimes we don’t even know
what we’re looking for. That’s expected. Management and IT need to support this “lack of
direction” or “lack of clear requirement.”

At the same time, it’s important for analysts and data scientists to work closely with the business
to understand key business knowledge gaps and requirements. To accommodate the interactive
exploration of data and the experimentation of statistical algorithms, you need high-performance
work areas. Be sure that sandbox environments have the support they need—and are properly
governed.

6. Align with the cloud operating model

Big data processes and users require access to a broad array of resources for both iterative
experimentation and running production jobs. A big data solution includes all data realms
including transactions, master data, reference data, and summarized data. Analytical sandboxes
should be created on demand. Resource management is critical to ensure control of the entire
data flow including pre- and post-processing, integration, in-database summarization, and
analytical modeling. A well-planned private and public cloud provisioning and security strategy
plays an integral role in supporting these changing requirements.


IV. Big Data Investments by the Numbers

The Role of Big Data in Investing

The following are excerpts from the interview with investment professionals across Goldman Sachs Asset
Management (GSAM) Quantitative Investment Strategies team on the role of big data in potential
investments.

• Osman Ali, Portfolio Manager, Quantitative Investment Strategies, GSAM
• Takashi Suwabe, Portfolio Manager, Quantitative Investment Strategies, GSAM
• Dennis Walsh, Portfolio Manager, Quantitative Investment Strategies, GSAM

1. Can you explain your investment philosophy and how access to big data has impacted how you
invest?

Osman Ali: We are focused on creating data-driven investment models that can objectively
evaluate public companies globally through fundamentally-based and economically-motivated
investment themes. These models have historically utilized a large set of company-specific data
like publicly available financial statements, as well as market data like prices, returns, volumes,
etc. With the growth and availability of non-traditional data sources such as internet web traffic,
patent filings and satellite imagery, we have been using more nuanced and sometimes
unconventional data to help us gain an informational advantage and make more informed
investment decisions.

2. What types of data are you analyzing and how does it differ from what you were looking at before
the Data Revolution?

Takashi Suwabe: We identify strong businesses with attractive valuations, positive sentiment and
a strong connection with positive themes that are trending in the markets. The types of data we
analyze now are quite a bit more expansive than what we used 10 years ago. In the past,
computers could only analyze structured data, or data that is easily quantifiable and organized in
a set form. New technologies allow us to analyze unstructured data, or data that is not as easily
quantified. These innovations enable us to interpret information from a much wider variety of
sources, including language, images and speech for the first time.

Access to new types of data, along with the ability to capture and process that data quickly, has
given us new ways to capture investment themes such as momentum, value, profitability and
sentiment.

The Quantitative Investment Strategies Approach to Identifying Investment Opportunities

Big Data Investment Approach:

• Momentum – Use machine learning techniques to identify the connections between companies
based on industry sentiment, stock movements, and correlations in economic factors.
• Value – Analyze a large universe of industry-specific data that extends beyond a company’s
financial statements to determine its “intrinsic value.”
• Profitability – Evaluate a company’s web traffic patterns to identify businesses that are gaining
e-commerce market share in real time.

3. How has your technology and infrastructure evolved to keep up with big data?

Dennis Walsh: New data storage technologies have created the infrastructure needed to capture,
analyze and make informed decisions from new forms of real-time data. For example, the growth
of distributed databases, where data is stored across several platforms in place of a single
platform via a centralized database, allows for highly-scalable parallel processing of vast amounts
of data. This can decrease processing time by several orders of magnitude for many applications.
Unstructured data storage also allows for greater flexibility in onboarding and retrieving data from
non-traditional sources and in managing large amounts of text-based information.
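The split-apply-combine idea behind the distributed databases mentioned above can be illustrated with a small sketch: partition the data, process each partition in parallel, then merge the partial results. A real system (Hadoop or Spark, for example) does this across machines; the example below only uses threads on one machine, with made-up data, to show the pattern.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(partition):
    """'Map' step: compute a partial word count for one partition."""
    counts = Counter()
    for line in partition:
        counts.update(line.lower().split())
    return counts

# Data already split across "nodes" (here, just three lists of lines).
partitions = [
    ["big data big insight"],
    ["data flows in real time"],
    ["big time"],
]

# Process every partition in parallel.
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(count_words, partitions))

# 'Reduce' step: merge the partial results into one global count.
total = sum(partials, Counter())
print(total["big"], total["data"], total["time"])  # 3 2 2
```

Because each partition is processed independently, adding more partitions (or machines) scales the work out rather than up, which is where the order-of-magnitude speedups described above come from.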

4. How do portfolio managers interact with the models that are analyzing data and making
recommendations?

Takashi Suwabe: Data is the basis of our investment model, but the research and portfolio
construction processes still require human judgement. Portfolio managers exercise their judgment
when selecting the data and analytics that we use in investing, and also when reviewing and
approving each trade in every portfolio. This is to ensure that all portfolio positions make sense—
that they are economically intuitive and appropriately sized given current market conditions. We
do not have a computer in the corner simply shooting out trades with no human interaction.

We are researching new factors and analytics that have an impact on stock prices, and our
portfolio managers drive that research. Research success for us is not finding a new stock to
invest in, but rather finding a new investment factor that can help improve the way we select
stocks. Investment factors should be fundamentally-based and economically-motivated, and the
data enables us to empirically test our investment hypotheses. We would never work in the
opposite direction—observing relationships in the data that we would seek to justify or explain
after-the-fact.

Practically speaking, portfolio managers also rely on their own practitioner experience and market
knowledge to assess the future success of any investment factor. Certain market trends or risk
environments may bode well for particular factors and poorly for others. This awareness allows
our portfolio managers to more effectively assess risk on a real-time basis.

5. What kinds of boundaries are you pushing now and what do you see as the future of big data-
driven investment approaches like yours?

Dennis Walsh: Active management has always been about uncovering opportunities before they
are priced in by the broader market. The exponential growth in data is fueling our investment
decisions and research agenda. We’re seeking to push boundaries by moving beyond
conventional data sources and leveraging alternative forms of data to gain an informational edge.

Today, we’re able to process more data more quickly, in an effort to uncover insights and
connections that aren’t as obvious to other investors. Given new data availability and the
development of machine learning techniques to learn quickly from such data, we are only at the
beginning of this Data Revolution that we believe is transforming every industry globally.

6. What kinds of machine learning data analysis techniques do you use?

Osman Ali: Machine learning techniques allow us the flexibility to create dynamic models that
adapt to the data. Quantitative techniques in the past relied on more simplistic rules for ranking
companies based on certain pre-determined metrics (take price-to-book, for example), whereas
newer machine learning techniques allow algorithms to learn and adapt from constantly changing data.

Natural language processing, or NLP, uses computers to read and interpret vast amounts of text,
enabling us to incorporate textual data in multiple languages from a variety of sources. One of the
more obvious NLP applications is to gauge sentiment in the text—is the tone in the news articles
or research reports being published on a company positive or negative? An extension of NLP is
topic modeling—summarizing a large body of text into topics and themes that are easily
understood by humans, but can also be used for systematic analysis in statistical and machine
learning applications. For example, what subjects did company management focus on in their
earnings call this quarter versus last quarter?
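The topic-comparison question above ("what did management focus on this quarter versus last quarter?") can be sketched with a toy example. This is not GSAM's actual method: real topic modeling (LDA, for instance) learns topics from the text, while the sketch below just counts hand-picked topic keywords in two made-up transcripts.

```python
# Hand-picked topic lexicons (illustrative only).
TOPICS = {
    "supply_chain": {"inventory", "logistics", "suppliers"},
    "ai": {"ai", "models", "automation"},
}

def topic_counts(transcript):
    """Count how often each topic's keywords appear in a transcript."""
    words = transcript.lower().split()
    return {t: sum(w in kws for w in words) for t, kws in TOPICS.items()}

q1 = "we invested in suppliers and logistics to stabilize inventory"
q2 = "our ai models now drive automation across logistics"

# Positive shift = topic emphasized more this quarter than last quarter.
shift = {t: topic_counts(q2)[t] - topic_counts(q1)[t] for t in TOPICS}
print(shift)  # {'supply_chain': -2, 'ai': 3}
```

Even this crude version captures the idea: the quarter-over-quarter change in topic emphasis becomes a number that a systematic model can consume.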

NLP also allows us to pick up on subtle relationships between companies that might otherwise go
unnoticed—we call this intercompany momentum. Traditional momentum focuses on the
persistence of price movements for a single security, while intercompany momentum seeks to
understand how the movement in price of one security might impact, albeit subtly, the movement
in price of other related securities. These not-so-obvious relationships can be assembled from the
clustering of companies in text-based data, appearing together in news articles, regulatory filings
or research reports.
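The clustering of companies in text-based data can be sketched as a co-occurrence count: companies that repeatedly appear together in the same articles get linked, and those links can then carry price signals between related securities. The company names and articles below are fabricated for illustration.

```python
from collections import Counter
from itertools import combinations

# Sets of (fictional) tickers mentioned in each news article.
articles = [
    {"AcmeChips", "FooPhones"},
    {"AcmeChips", "FooPhones", "BarOS"},
    {"BarOS", "FooPhones"},
    {"AcmeChips", "FooPhones"},
]

# Count how often each unordered pair of companies co-occurs.
pairs = Counter()
for mentioned in articles:
    for a, b in combinations(sorted(mentioned), 2):
        pairs[(a, b)] += 1

# The most frequent pair is the strongest candidate relationship.
strongest = pairs.most_common(1)[0]
print(strongest)  # (('AcmeChips', 'FooPhones'), 3)
```

In a real pipeline the pair counts would be normalized and fed into a model, so that a price move in one company can inform expectations for its most strongly linked peers.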

7. What is your approach to big data in Emerging Markets?

Dennis Walsh: We feel that the information asymmetry in emerging markets may create
opportunities for data-driven investors like ourselves. A lack of available data is a sign of
mispricing and uncertainty, and investors who are diligent enough to analyze and uncover
potential opportunities in this environment may be rewarded. With 4,000 companies in the
emerging market universe, spanning 23 countries across 6 continents, it can be a challenge to
capture and digest vast amounts of disparate information, especially since the quality of data or
reporting governance standards in some of these countries is lacking. Our experience and
sophisticated techniques make us well-positioned to act in this space and analyze potential
investments without necessarily requiring us to have analysts locally based around the world. This
centralization of data processing is more scalable and allows us to cover a wider breadth of
companies when compared to traditional methods.

V. Providers of Big Data Services

1. ScienceSoft
2. Xplenty
3. IBM
4. HP Enterprise
5. Teradata
6. Oracle
7. SAP
8. EMC
9. Amazon
10. Microsoft
11. Google
12. VMware
13. Splunk
14. Alteryx
15. Cogito

1. Alteryx

Alteryx software is for the business user rather than the data scientist. Alteryx gives analysts
the ability to meet their organization’s analytics needs by delivering a platform for self-service
data analytics. It can access and integrate data from big data environments such as Hadoop,
SAP HANA, Microsoft Azure SQL Database, etc.

Prepare and blend data inside and outside the Big Data environment.

Big data analytics gives organizations the opportunity to derive new insights from new sources
of data. Alteryx allows organizations to take advantage of data from a big data environment;
this data can then be integrated with external datasets to gain the maximum value from the
corresponding data sources.

2. Amazon

Amazon.com was founded in 1994 and is headquartered in Washington. As of May 2017, it had a
market capitalization of $427 billion and sales of $135.99 billion per the Forbes list, with a
total employee headcount of 341,400.

Amazon is well known for its cloud platform, but it also offers big data products; its main
product is the Hadoop-based Elastic MapReduce. The DynamoDB NoSQL big data database and the
Redshift data warehouse also work with Amazon Web Services.

Big data analytics applications can be built and deployed quickly using Amazon Web Services,
which provides fast and easy access to low-cost IT resources. AWS helps to collect, store,
process, analyze, and visualize big data on the cloud.

3. Cogito

NVSU-FR-ICD-05-00 (081220) Page 13 of 24


Republic of the Philippines
NUEVA VIZCAYA STATE UNIVERSITY
Bayombong, Nueva Vizcaya
INSTRUCTIONAL MODULE
IM No.:_________________________________
IM-IST1_1-2NDSEM-2020-2021

Cogito is known for its behavioral analytics technology. Cogito analyzes voice signals in phone
calls, customer emails, social media behavior, and other interactions to improve communication.

Cogito also detects human signals and provides guidance to improve the quality of every
interaction. It assists with phone support and helps organizations manage agent performance.
Real-time guidance increases call efficiency and captures customer feedback and perception after
every call.

4. EMC

Dell EMC helps businesses store, analyze, and protect their data. It provides an infrastructure
to get business outcomes from big data and helps organizations understand customer behavior,
risk, and operations. Dell EMC has seen over 50% growth in data analytics.

Data is stored in one centralized repository, which simplifies analytics and management. The
powerful infrastructure gives your organization a competitive edge and increased revenue. The
Dell EMC big data portfolio includes the products listed below:

• Isilon
• ECS
• Boomi
• PowerEdge for Hadoop

5. Google

Google was founded in 1998 and is headquartered in California. It has a market capitalization of
$101.8 billion and $80.5 billion in sales as of May 2017. Around 61,000 employees are currently
working with Google across the globe.

Google provides integrated, end-to-end big data solutions based on innovation at Google, helping
organizations capture, process, analyze, and transfer data in a single platform. Google is
expanding its big data analytics; BigQuery is a cloud-based analytics platform that analyzes
huge sets of data quickly.

BigQuery is a serverless, fully managed, low-cost enterprise data warehouse, so it does not
require a database administrator and there is no infrastructure to manage. BigQuery can scan
terabytes of data in seconds and petabytes of data in minutes.

6. HP Enterprise

HP Enterprise’s software business, including Vertica, was acquired by Micro Focus.

Micro Focus has built up a strong portfolio in Big Data products in a very short time span. The
Vertica Analytics Platform is designed to manage a large volume of structured data and it has the
fastest query performance on Hadoop and SQL Analytics. Vertica delivers 10-50x faster
performance or more compared to legacy systems.

With the help of Big Data software, it enables different organizations to store, analyze and explore
data irrespective of the source of data, type of data or location of data.

7. IBM

International Business Machines (IBM) is an American company headquartered in New York. IBM is
listed at #43 in the Forbes list with a market capitalization of $162.4 billion as of May 2017.
The company’s operations span 170 countries, and it is among the largest employers, with around
414,400 employees.

IBM has sales of around $79.9 billion and a profit of $11.9 billion. As of 2017, IBM had held
the record for the most patents generated by a business for 24 consecutive years.


IBM is the biggest vendor for big data-related products and services. IBM Big Data solutions
provide features for storing, managing, and analyzing data.

This data comes from numerous sources and is accessible to all users: business analysts, data
scientists, etc. DB2, Informix, and InfoSphere are popular IBM database platforms that support
big data analytics. IBM also offers well-known analytics applications such as Cognos and SPSS.

8. Microsoft

Microsoft is a US-based software and programming company, founded in 1975 with headquarters in
Washington. Per the Forbes list, it has a market capitalization of $507.5 billion and $85.27
billion in sales, and it currently employs around 114,000 people across the globe.

Microsoft’s big data strategy is wide and growing fast. This strategy includes a partnership
with Hortonworks, a big data startup; the partnership provides the HDInsight tool for analyzing
structured and unstructured data on the Hortonworks Data Platform (HDP).

Recently, Microsoft acquired Revolution Analytics, whose big data analytics platform is based on
the R programming language and is used for building big data apps that do not require the skills
of a data scientist.

9. Oracle

Oracle offers fully integrated cloud applications and platform services, with more than 420,000
customers and 136,000 employees across 145 countries. It has a market capitalization of $182.2
billion and sales of $37.4 billion per the Forbes list.

Oracle is one of the biggest players in the big data area and is also well known for its
flagship database. Oracle leverages the benefits of big data in the cloud and helps
organizations define a data strategy and approach that includes big data and cloud technology.

It provides a business solution that leverages Big Data Analytics, applications, and infrastructure
to provide insight for logistics, fraud, etc. Oracle also provides Industry solutions which ensure
that your organization takes advantage of Big Data opportunities.

Oracle’s Big Data industry solutions address the growing demand for different industries such as
Banking, Health Care, Communications, Public Sector, Retail, etc. There are a variety of
Technology solutions such as Cloud Computing, Application Development, and System
Integration.

10. SAP

SAP is the largest business software company, founded in 1972 with headquarters in Walldorf,
Germany. It has a market capitalization of $119.7 billion with a total employee count of 84,183
as of May 2017.

Per the Forbes list, SAP has sales of $24.4 billion and a profit of around $4 billion, with
345,000 customers. It is the largest provider of enterprise application software and a leading
cloud company with 110 million cloud subscribers.

SAP provides a variety of analytics tools, but its main big data tool is HANA, an in-memory
relational database. This tool integrates with Hadoop and can run on 80 terabytes of data.

SAP helps the organization to turn a huge amount of Big Data into real-time insight with Hadoop.
It enables distributed data storage and advanced computation capabilities.

11. ScienceSoft


ScienceSoft is a US-based provider of big data solutions and services with 31+ years of
experience in data analytics and data science.

ScienceSoft’s expertise covers a wide list of big data technologies, including:

• Hadoop ecosystem
• Apache Spark
• Apache Hive
• Apache Cassandra
• Amazon big data ecosystem (Amazon EMR, Amazon S3, Amazon DynamoDB, Amazon Kinesis, Amazon
Redshift, etc.)
• Microsoft Azure big data ecosystem (Azure Data Lake Storage, Azure Cosmos DB, Azure Stream
Analytics, Azure Synapse Analytics, etc.)

12. Splunk

Splunk Enterprise started as a log analysis tool and has expanded its focus to machine data
analytics, which makes machine-generated data usable by anyone.

It helps in monitoring end-to-end online transactions, monitoring security threats, studying
customer behavior, and performing sentiment analysis on social platforms. Using Splunk Big Data,
you can search, explore, and visualize data in one place.

Splunk’s Big Data solutions include:

• Splunk Analytics for Hadoop
• Splunk ODBC Driver
• Splunk DB Connect

13. Teradata

Teradata was founded in 1974 and is headquartered in Dayton, Ohio. Teradata has more than 10,000
employees across 43 countries and around 1,400 customers, with a market capitalization of $7.7
billion. It has an extensive 35+ years of experience in innovation and leadership. Teradata
Corp. provides an analytic data platform, marketing and consulting services, and analytics
applications.

Teradata helps different companies to get value from their data. Teradata’s Big Data Analytical
solutions and a team of experts help different organizations to gain the advantage of data.
Teradata portfolio includes various Big Data applications such as Teradata QueryGrid, Teradata
Listener, Teradata Unity, and Teradata Viewpoint.

14. VMware

VMware was founded in 1998 and is headquartered in Palo Alto, California. Around 20,000
employees are working there, and it has a market capitalization of $37.8 billion as of May 2017.
Per Forbes data, it has sales of around $7.09 billion.

VMware is well known for its cloud and virtualization offerings, but nowadays it is becoming a
big player in big data. Virtualizing big data enables simpler big data infrastructure management
and delivers results quickly and cost-effectively. VMware Big Data is simple, flexible,
cost-effective, agile, and secure.

Its product VMware vSphere Big Data Extensions enables users to deploy, manage, and control
Hadoop deployments. It supports Hadoop distributions including Apache, Hortonworks, MapR, etc.
With the help of this extension, resources can be used efficiently on new and existing hardware.

15. Xplenty

Xplenty is a cloud-based data integration, ETL, and ELT platform that streamlines data
processing. It brings all your data sources together and lets you create simple, visualized data
pipelines to your data lake.

Xplenty’s big data processing cloud service provides immediate results for your business, such
as designing data flows and scheduling jobs. It can process both structured and unstructured
data.

Through this platform, organizations can integrate, process, and prepare data for analysis on
the cloud. Xplenty ensures that businesses can quickly and easily benefit from big data
opportunities without investing in hardware, software, or related personnel.

Every organization can immediately connect to a variety of data stores, and companies get a rich
set of out-of-the-box data transformation components with Xplenty.

Xplenty has a team of top data experts, engineers, and DevOps specialists who provide a data
integration platform with a simplified data processing service. Xplenty has solutions for
marketing, sales, support, and developers.
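The extract-transform-load pattern that platforms like Xplenty automate at scale can be sketched in miniature. The CSV source, the currency rates, and the "warehouse" list below are all made up for illustration; a real pipeline would read from live sources and load into an actual data warehouse.

```python
import csv
import io

# Extract: read rows from a raw CSV source (a string standing in for a file).
RAW = """customer,amount,currency
alice,100,USD
bob,2500,JPY
"""
rows = list(csv.DictReader(io.StringIO(RAW)))

# Transform: normalize every amount to USD (illustrative fixed rates).
RATES = {"USD": 1.0, "JPY": 0.007}
for r in rows:
    r["amount_usd"] = float(r["amount"]) * RATES[r["currency"]]

# Load: write the prepared records to the analytics target (here, a list
# standing in for a warehouse table).
warehouse = [(r["customer"], round(r["amount_usd"], 2)) for r in rows]
print(warehouse)  # [('alice', 100.0), ('bob', 17.5)]
```

Each stage is independent of the others, which is what lets ETL tools visualize pipelines as connected steps and schedule them as jobs.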

In this section, we have seen the top big data companies. This is not an exhaustive list; there
are many other companies that are startups now but have the capability to grow fast, which will
be a challenge for their rivals.

VI. LEARNING ACTIVITIES

Activity #1: Finish the Python Course using the Codecademy Go App
Directions:

1. Open the Codecademy Go App.

2. Login to the app using your registered account.


3. After logging in, you will see a screen similar to the image below.

4. Select the Python course from the list.

5. Scroll down to see all lessons under the Python course. You will see that there are 13 lessons
in all. You are required to finish all lessons except lesson 12.

6. Scroll up to Lesson 1. Tap on the lesson to expand. You will see 3 sections; Review, Practice,
and Cheatsheet. You will see these 3 sections for every lesson.


7. Open the Review section and start reading the content.

8. When you finish the lesson, you will see a screen similar to the image below.


9. You may tap on the “continue to practice” button to take the exercises under the lesson or you
can tap on the back navigation button (see encircled in red) to go back to the lessons page.

10. For demonstration, I will now take the exercises under lesson 1. I will go to the lessons page
and tap on lesson 1 and tap on the Practice section.


11. Notice the progress bar on the upper part of the exercise screen. Answer the exercises until
you complete your progress.


12. If you are stuck on a question, you might want to tap on the hint button for help.

13. Upon finishing the exercises, you should see a summary screen similar to the image below.
Take a screenshot of this summary and keep it as we are going to use it for documentation later
on.

This icon means you did not answer correctly on your first attempt at that particular exercise.
You are not given a score for this icon.

Your screenshot should include your entire screen; the clock should be visible as well as the
other details.

14. Do the same steps for the other lessons.



VII. ASSIGNMENT

Assignment #1: Register for a Codecademy Account


Directions:

1. Go to https://www.codecademy.com

2. Enter your NVSU Email. Choose and create your own password different from that of your
NVSU Email. Take note of your password to avoid forgetting it later. Click/tap Sign Up.

DO NOT USE THESE LINKS IN CREATING YOUR ACCOUNT


Assignment #2:

Watch the video lecture "Introduction to Big Data and Business Analytics" by Prof. Erik Paolo
Capistrano. The video is available in our Facebook Group Chat and in our Moodle LMS.

VIII. EVALUATION (Note: Not to be included in the student’s copy of the IM)

(Long Quiz covering Chapter 1-3 on the 6th week).

IX. REFERENCES

Top 13 best big data companies of 2021. (2021, March 27). Retrieved from
https://www.softwaretestinghelp.com/big-data-companies/

Gavin, M. (2019, July 16). Business analytics: What it is and why it’s important. Harvard
Business School Online. Retrieved from https://online.hbs.edu/blog/post/importance-of-business-analytics

Foote, K. D. (2018, September 25). A Brief History of Analytics. DATAVERSITY. Retrieved from
https://www.dataversity.net/brief-history-analytics/.

UPOUNet. (2018, May 31). Introduction to big data and business analytics | Prof. Erik Paolo
Capistrano [Video]. Retrieved from
https://www.youtube.com/watch?v=nVifzj0zM1Y&feature=emb_imp_woyt

The role of big data in investing. (n.d.). Retrieved from
https://www.gsam.com/content/gsam/global/en/market-insights/gsam-insights/gsam-perspectives/2016/big-data/gsam-roundtable.html

What is big data? (n.d.). Retrieved from https://www.oracle.com/ph/big-data/what-is-big-data/
