
CONCEPT 1: PORTER’S FIVE FORCES:

The tool was created by Harvard Business School professor Michael Porter to analyze an industry's
attractiveness and likely profitability. Since its publication in 1979, it has become one of the most
popular and highly regarded business strategy tools.

Porter recognized that organizations likely keep a close watch on their rivals, but he encouraged
them to look beyond the actions of their competitors and examine what other factors could impact
the business environment. He identified five forces that make up the competitive environment, and
which can erode your profitability. These are:

1. Competitive Rivalry- This looks at the number and strength of your competitors. How many
rivals do you have? Who are they, and how does the quality of their products and services
compare with yours?

Where rivalry is intense, companies can attract customers with aggressive price cuts and
high-impact marketing campaigns. Also, in markets with lots of rivals, your suppliers and
buyers can go elsewhere if they feel that they're not getting a good deal from you. On the
other hand, where competitive rivalry is minimal, and no one else is doing what you do, then
you'll likely have tremendous strength and healthy profits.

2. Supplier Power- This is determined by how easy it is for your suppliers to increase their
prices. How many potential suppliers do you have? How unique is the product or service
that they provide, and how expensive would it be to switch from one supplier to another?

The more you have to choose from, the easier it will be to switch to a cheaper alternative.
But the fewer suppliers there are, and the more you need their help, the stronger their
position and their ability to charge you more. That can impact your profit.

3. Buyer Power- Here, you ask yourself how easy it is for buyers to drive your prices down.
How many buyers are there, and how big are their orders? How much would it cost them to
switch from your products and services to those of a rival? Are your buyers strong enough to
dictate terms to you?

When you deal with only a few savvy customers, they have more power, but your power
increases if you have many customers.

4. Threat of Substitution- This refers to the likelihood of your customers finding a different
way of doing what you do. For example, if you supply a unique software product that
automates an important process, people may substitute it by doing the process manually or
by outsourcing it. A substitution that is easy and cheap to make can weaken your position
and threaten your profitability.

5. Threat of New Entry- Your position can be affected by people's ability to enter your market.
So, think about how easily this could be done. How easy is it to get a foothold in your
industry or market? How much would it cost, and how tightly is your sector regulated?

If it takes little money and effort to enter your market and compete effectively, or if you have little
protection for your key technologies, then rivals can quickly enter your market and weaken your
position. If you have strong and durable barriers to entry, then you can preserve a favorable position
and take fair advantage of it.
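As an illustrative (entirely assumed) exercise, the five forces can be turned into a rough scoring model: rate each force's pressure on profitability on a 1-5 scale and invert the average to get an attractiveness score. The force names, scale, and example ratings below are not from Porter, just a sketch of how one might quantify the analysis:

```python
# Hypothetical sketch: score each of Porter's five forces on a 1-5 scale,
# where 5 means the force strongly erodes profitability. The scale and
# example ratings are illustrative assumptions, not prescribed values.

FORCES = [
    "competitive_rivalry",
    "supplier_power",
    "buyer_power",
    "threat_of_substitution",
    "threat_of_new_entry",
]

def industry_attractiveness(ratings):
    """Return a rough attractiveness score: 5 = attractive, 1 = hostile.

    ratings maps each force name to a 1-5 pressure score; higher pressure
    means lower attractiveness, so the average is inverted.
    """
    missing = [f for f in FORCES if f not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    avg_pressure = sum(ratings[f] for f in FORCES) / len(FORCES)
    return 6 - avg_pressure  # invert: high pressure -> low attractiveness

# Example: a market with intense rivalry but weak suppliers and buyers.
example = {
    "competitive_rivalry": 5,
    "supplier_power": 2,
    "buyer_power": 2,
    "threat_of_substitution": 3,
    "threat_of_new_entry": 4,
}
score = industry_attractiveness(example)
```

A real analysis would weight the forces by industry context rather than averaging them equally; the equal weighting here is purely for illustration.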

CONCEPT 2: STRATEGIC ALIGNMENT GRID (IT & BUSINESS MODEL)


To develop an appropriate IT management strategy for our organization, we must first look at two
critical dimensions. The first is the strategic impact of the firm’s IT initiatives on its current
operations (vertical axis). The second is the strategic impact of IT initiatives on the firm’s future
sustainable business advantage (horizontal axis). Let's discuss these two variables in greater detail.
CONCEPT 3: MCFARLAN’S STRATEGIC GRID
The Strategic Grid for IT from McFarlan, McKenney & Pyburn ('83) is a tool that can be used to assess
the current operational dependence on information systems (low, high) versus the future potential
strategic impact of information systems (low, high). Combining the two views in a matrix results in
four possible combinations:

• Support (currently low, low in the future too). IT has little relevance and simply supports
some processes. Firms or systems in this quadrant will place the least amount of emphasis
on IT (planning) in terms of senior management concern and involvement.

• Turnaround (currently low, high in the future). IT will be a key feature of future strategic
planning. Significant top management involvement in IT (planning) must be established.

• Factory (currently high, low in the future). IT is important in terms of day-to-day operations,
but it is not felt that there are any major IT developments on the horizon that will
fundamentally alter the nature of the business. The level of senior management involvement
is decreasing.

• Strategic (high, high). IT strategy is very important and plays a critical role in the
formulation of the overall business strategy. There is a high level of involvement of top
management in IT strategy.

One can use the McFarlan Grid to analyze the impact of information systems as a whole for an
organization or for a division or department of an organization. But the strategic grid can also be
used to analyze the dependence and impact of individual IT applications or systems.

Any organization would be wise to have an IT Strategy, even if it is in the support quadrant.
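Because the grid is a simple 2x2, it can be captured as a tiny lookup table. The sketch below is illustrative code, not part of the framework; the quadrant names follow the text:

```python
# Minimal sketch of McFarlan's Strategic Grid as a lookup: the two
# dimensions (current operational dependence, future strategic impact)
# each take "low" or "high", yielding one of four quadrants.

QUADRANTS = {
    ("low", "low"): "Support",
    ("low", "high"): "Turnaround",
    ("high", "low"): "Factory",
    ("high", "high"): "Strategic",
}

def mcfarlan_quadrant(current_dependence, future_impact):
    """Classify a firm, division, or individual IT system on the grid."""
    key = (current_dependence.lower(), future_impact.lower())
    if key not in QUADRANTS:
        raise ValueError("each dimension must be 'low' or 'high'")
    return QUADRANTS[key]
```

The same function works whether the unit of analysis is the whole organization, a division, or a single application, mirroring the note above.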
CONCEPT 4: TYPES OF CHANGE
There are two main ways of categorizing change. The first defines change as either proactive or
reactive. Proactive change is usually undertaken in order to make the organisation, its systems,
and/or its people more effective in dealing with demands from its environment. Such demands may
be either existing or anticipated but the drive to change comes from within the organisation.
Reactive change is typically undertaken in response to environmental demands. Reactive change is
adaptive. It is aimed at making the organisation better able to deal with its environment. In this case,
the drive to change comes from outside the organisation.

The second change dimension looks at whether change is incremental or quantum. Incremental
changes do not question the basic nature of the system or of the organisation. Incremental change is
linear, orderly, slow, and ongoing, although small changes may over time accumulate to create large
effects. Quantum change by contrast is discontinuous, chaotic, fast, and temporary. It is marked by a
shift in paradigm of how the organisation and its people think about themselves, their organisation,
and how they do business. Quantum change is the sudden and radical punctuation of the normal
equilibrium.

Combining the two change dimensions results in four categories of change. These are
transformational, revolutionary, evolutionary, and developmental.

• Transformational Change- Transformational change is both proactive and quantum. The
organisation may go through a period of death and rebirth, arising phoenix-like from its own
ashes. The organisation is transformed by design from what it was, and how its members
thought and acted, into a new organisation with qualitatively different methods. In this
approach, organisations have to match the complex adaptive systems found in nature in
order to cope with the newly chaotic business environment. One reason transformational
change is needed is that organisations can be limited by their own successes. Success is the
product of deep grooves, which then destroy adaptability. The better the organisation
becomes at a particular way of operating, or the more successful a particular product
becomes, the more likely it is that this success will lead to entrenched ways of operating and
thinking that will hinder future adaptability. To transform the organisation, techniques must
be used to break out of these grooves, to break the organisation's frame of its way of
thinking and operating.
• Revolutionary Change- This type of change is quantum and reactive. Organisations have to
adapt to chance events that have large effects on their environment. Those who fail to make
the major changes necessary will likely not survive. Required for success are
revolutionaries with a qualitatively different conception of the organisation and its
environment who can win control and force massive and quick change. Manufacturers of
inexpensive watches had to make the fundamental shift from mechanical to electronic in
order to survive when physics made the quantum leap to transistors and integrated circuits.
• Evolutionary Change- Evolutionary change is reactive and incremental. Organisations
undergoing this type of change mimic the adaptive methods found in nature. They respond
to feedback from the environment, learn from experience, and specialize without getting
stuck in rigidity.
• Developmental Change- This category of change is proactive and incremental. The
organisation's aim is to develop in small, step-by-step increases, so that at each step it is
slightly better than it was before. Many of the methods used by organisations for their own
improvement fall into this change category.

Note: According to Sir’s diagram, proactive is top-down and reactive is bottom-up. Also,
developmental is synonymous with tactical. The rest remains the same.
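The 2x2 typology above reduces to a lookup over the two dimensions. A minimal sketch (the code itself is illustrative, the category names follow the text):

```python
# The two change dimensions -- drive (proactive/reactive) and magnitude
# (incremental/quantum) -- combined into the four categories of change.

CHANGE_TYPES = {
    ("proactive", "quantum"): "transformational",
    ("reactive", "quantum"): "revolutionary",
    ("reactive", "incremental"): "evolutionary",
    ("proactive", "incremental"): "developmental",  # a.k.a. tactical
}

def change_category(drive, magnitude):
    """Classify a change by where its drive comes from and how big it is."""
    key = (drive.lower(), magnitude.lower())
    if key not in CHANGE_TYPES:
        raise ValueError("drive must be proactive/reactive; "
                         "magnitude must be incremental/quantum")
    return CHANGE_TYPES[key]
```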

CONCEPT 5: DELTA MODEL (ANALYTICS)


The DELTA Plus Model Framework encompasses the five foundational elements of a successful
analytics program (Data, Enterprise, Leadership, Targets, and Analysts) and introduces two new
elements (Technology and Analytical Techniques) required for high performance.

As enterprises of all shapes and sizes commit to harnessing the power of data and analytics to
transform all aspects of their businesses, leadership will inevitably ask these questions:

• How good are we at using data and analytics throughout our enterprise? Are we actually as good
as we think?

• Are we ahead of or behind our nearest competitors? Are other industries ahead of ours?

• Are we moving toward becoming an analytical competitor?

• How can we set a path forward without knowing where we stand today?

The Five Stages of Analytics Maturity and the DELTA Model have become the industry standard
frameworks for assessing analytics maturity. The five stages of analytics maturity were introduced in
2007 by Tom Davenport and Jeanne Harris in their book, Competing on Analytics: The New Science
of Winning. The DELTA Model was introduced in 2010 by Tom Davenport, Jeanne Harris and Bob
Morison in their book, Analytics at Work: Smarter Decisions, Better Results. Both frameworks were
updated by Tom Davenport and Jeanne Harris in their 2017 revision of Competing on Analytics. Two
new components were added to the DELTA model, creating the DELTA Plus model.
DATA- For meaningful analytics, data must be organized, unique, integrated, accessible, and of high
quality. Of course, not all organizations have an environment that encompasses all the important
elements of data, but it’s important to know what to pursue to create the largest opportunity. The
way an organization’s data is structured influences the type of analysis that can be done. The ability
of an organization to structure and leverage unstructured data also influences the type and value of
analytics that is done. The same is true for data uniqueness – if an organization can also gather
unique data outside what other companies have access to, then they have an analytical edge and
more opportunity in their analyses. Organizations also need to integrate their data across
organizational silos and boundaries. Most organizations have multiple transaction systems in
different business units and functions, and to fully understand organizational performance data from
all of them needs to be combined and harmonized. It is also no secret that many organizations face
data quality issues. Once data has been cleaned and integrated, it must be made accessible to the
organization for analytical purposes. Simply put, analysis cannot be done if the data cannot be
located and accessed. Data warehouses or Hadoop-based data lakes are the primary means to allow
analysts and non-analysts to access data. These repositories can be deployed on premise, in the
cloud or in a hybrid mix of the two. Finally, if an enterprise is becoming more mature within all
aspects of its data environment, it implements a dynamic governance strategy to ensure high-quality
and well-managed data across the organization.

ENTERPRISE- Analytical competitors take an enterprise approach to managing systems, data and
people. They have coordinated approaches relying on enterprise-level organizational structures,
resource allocations and plans. To embrace this approach a company must advocate a single and
consistent perspective for analytics across the organization. This is accomplished by setting an
analytics strategy and building a road map for strategy implementation. Integrating data and
managing a unified data and analytics platform are key components of an analytics road map, as is
cultivating a culture of analytics across the organization. Perspectives from individual managers and
business units/functions that do not support or advance the enterprise view must be discouraged
and replaced with a single, enterprise wide view of analytics. If analytics goals are not centrally
established, organizational silos can develop and lead to duplicated efforts and tools, errors in
analysis, ineffective use of resources, conflict among different groups, and increased complexity with
analytics projects. An enterprise approach to analytics will greatly increase the organization’s
competitiveness.

LEADERSHIP- Analytical organizations have leaders who fully embrace analytics and lead company
culture toward data-driven decision-making. Beyond the CEO or other top executives, all levels of
leadership within the organization should support analytics. This is important for cultural acceptance
of analytics across the enterprise, as well as the accomplishment of analytics initiatives. In Analytics
at Work, authors Davenport, Harris and Morison note 12 traits that analytical leaders exhibit in
analytically competitive organizations.

1. Possess people skills

2. Push for more data and analysis

3. Hire smart people, and give them credit

4. Set a hands-on example

5. Sign up for results

6. Teach

7. Set strategy and performance expectations

8. Look for leverage

9. Demonstrate persistence over time

10. Build an analytical ecosystem

11. Work along multiple fronts

12. Know the limits of analytics

TARGETS- Virtually no organization can afford to be equally analytical in all parts of its business.
Analytics efforts must be aligned with specific, strategic targets that are also aligned with corporate
objectives. Organizations will get lost in all the business opportunities that analytics can support if
they do not focus on a few initial and purposeful use cases and applications. Choosing these targets
based on the organization’s strategic plan is helpful, but not always easy. What is typically required
is a group of executives that understands both the business and the analytical possibilities for
improving it. Enterprises can also survey internal employees for ideas, as well as external groups to
help understand industry and analytical trends. Looking beyond one’s industry is also helpful to find
opportunities in common, cross-industry applications. When determining what targets to choose,
leaders should narrow in on the best options. This often requires several steps. Leaders need to
think about the big picture for where the business is going, create a systematic inventory of
possibilities, and then prioritize potential uses of analytics based on the benefit for and capabilities
of the organization. Once an enterprise is mature, its targets become embedded in the strategic
planning process, and are considered business initiatives, not just analytics initiatives. If the
organization is successful with analytics, its targets can broaden over time.

ANALYSTS- Organizations require analytical talent that covers a range of skills from employees
capable of basic spreadsheets to accomplished data scientists. In Analytics at Work, four analytical
types of people are defined, all of whom play an important role in an organization: analytical
champions, analytical professionals (now often known as data scientists), analytical semi-
professionals, and analytical amateurs. Recruiting analysts and data scientists can be quite difficult
today and retaining these employees even more challenging. Such professionals must have
quantitative and technical skills, business knowledge, interpersonal skills, and the ability to coach
others who may not understand analytics. They also must be adept at navigating new analytics
techniques such as machine learning and AI. Once the right people are in place, keeping them
motivated with creative and challenging projects is crucial. Of course, the perfect analyst or data
scientist with all the necessary skills for a specific project may be practically impossible to find. Some
may hire businesspeople with the potential to be great analysts, and others may hire analysts and
develop their business acumen along the way. Other companies employ teams to marshal the
required range of skills. Because analytical skills are often in short supply, organizational structures
and processes are critical for using them effectively. Both organizing and hiring analysts will have an
impact on how the analytics strategy is deployed across the organization, and on recruiting and
retention approaches.

Five Stages of Analytics Maturity

Organizations mature their analytical capabilities as they develop in
the seven areas of DELTA Plus. The maturity model, described in Competing on Analytics and
developed in Analytics at Work, helps companies measure their growth across the seven DELTA
elements. This model enables an organization to assess which elements are strengths and which are
weaknesses. For example, an organization may achieve a stage 4 in analytics leadership maturity,
but achieve only a stage 3 in its management and use of data. This assessment enables targeted
investment to mature analytics weaknesses based on the DELTA Model.

Stage 1: Analytically Impaired. These companies rely primarily on gut feel to make decisions, and
they have no formal plans for becoming more analytical. They aren’t asking analytics questions
and/or they lack the data to answer them. Their leaders may be unaware of analytics and what can
be done with them.

Stage 2: Localized Analytics. Analytics or reporting at these companies exist within silos. There is no
means or structure for collaborating across organizational units or functions in the use of analytics.
This often leads to “multiple versions of the truth” across a company.

Stage 3: Analytical Aspirations. These companies see the value of analytics and intend to improve
their capabilities for generating and using them. Thus far, however, they have made little progress in
doing so.

Stage 4: Analytical Companies. Companies in this category are good at multiple aspects of analytics.
They are highly data-oriented, have analytical tools and make wide use of analytics with some
coordination across the organization. However, there remains a lack of commitment to fully
compete on analytics or use them strategically.

Stage 5: Analytical Competitors. These companies use analytics strategically and pervasively across
the entire enterprise. They view their analytical capabilities as a competitive weapon, and they have
already seen some competitive advantage result from analytics.
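A hedged sketch of how an organization might self-assess against the seven DELTA Plus elements and the five stages. The convention of taking the minimum element score as the overall stage, and flagging the weakest elements for investment, is an assumption of this example, not the book's prescribed method:

```python
# Illustrative DELTA Plus self-assessment. Scores are 1-5, matching the
# five maturity stages; taking the minimum as the overall stage (the
# weakest element constrains the whole) is this sketch's own convention.

STAGES = {
    1: "Analytically Impaired",
    2: "Localized Analytics",
    3: "Analytical Aspirations",
    4: "Analytical Companies",
    5: "Analytical Competitors",
}

DELTA_PLUS = ["data", "enterprise", "leadership", "targets",
              "analysts", "technology", "techniques"]

def assess(scores):
    """Return (overall stage name, list of weakest elements)."""
    if set(scores) != set(DELTA_PLUS):
        raise ValueError("score all seven DELTA Plus elements")
    weakest = min(scores.values())
    laggards = sorted(e for e, s in scores.items() if s == weakest)
    return STAGES[weakest], laggards

# Hypothetical organization: strong leadership, lagging technology.
example = {"data": 3, "enterprise": 3, "leadership": 4, "targets": 4,
           "analysts": 3, "technology": 2, "techniques": 3}
stage, laggards = assess(example)
```

This mirrors the point made above: an organization can be stage 4 in leadership yet stage 2 or 3 elsewhere, and the per-element view shows where targeted investment belongs.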
CONCEPT(S) 6: CLUSTERING, DECISION TREES, LOGISTIC REGRESSION
(Read online, these are analytical concepts)

Which technique to use?

Depends on desired outcome, nature of variable (nominal, binary etc), nature of data
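That choice can be phrased as a rough rule of thumb. The mapping below is an illustrative simplification (assumed, not from the notes); real technique selection also weighs data size, linearity, and interpretability needs:

```python
def suggest_technique(target_type, want_rules=False):
    """Rough, illustrative rule of thumb for choosing among the three
    techniques named above. Not a complete guide.

    target_type: None (no labeled outcome), "binary", or "nominal".
    want_rules: prefer human-readable if/then rules in the output.
    """
    if target_type is None:
        return "clustering"            # unsupervised: group similar cases
    if target_type == "binary" and not want_rules:
        return "logistic regression"   # models probability of a binary outcome
    return "decision tree"             # nominal targets / interpretable rules
```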

CONCEPT 7: OLTP vs OLAP


What is OLAP?

Online Analytical Processing (OLAP) is a category of software tools that provides analysis of data for
business decisions. OLAP systems allow users to analyze database information from multiple database
systems at one time. The primary objective of OLAP is data analysis, not data processing.

What is OLTP?

Online transaction processing, known as OLTP, supports transaction-oriented applications in a
3-tier architecture. OLTP administers the day-to-day transactions of an organization. The primary
objective of OLTP is data processing, not data analysis.

Example of OLAP

Any data warehouse system is an OLAP system. Uses of OLAP are as follows:


A company might compare its mobile phone sales in September with sales in October, then
compare those results with another location, which may be stored in a separate database. Amazon
analyzes purchases by its customers to come up with a personalized homepage with products likely
to interest each customer.

Example of OLTP system

An example of an OLTP system is an ATM. Assume that a couple has a joint account with a bank.
One day, both reach different ATMs at precisely the same time and want to withdraw the total
amount present in their bank account.

KEY DIFFERENCE between OLTP and OLAP:

Online Analytical Processing (OLAP) is a category of software tools that analyze data stored in a
database whereas Online transaction processing (OLTP) supports transaction-oriented applications
in a 3-tier architecture.

• OLAP creates a single platform for all type of business analysis needs which includes
planning, budgeting, forecasting, and analysis while OLTP is useful to administer day to day
transactions of an organization.
• OLAP is characterized by a large volume of data while OLTP is characterized by large
numbers of short online transactions.
• In OLAP, a data warehouse is created to integrate different data sources into a consolidated
database, whereas OLTP uses a traditional DBMS.
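The contrast can be sketched with Python's built-in sqlite3 module. This is a toy: one in-memory table stands in for both a transactional DBMS and a warehouse, and the sales figures are invented. It only illustrates the workload shapes (many short writes vs. one aggregating query):

```python
import sqlite3

# Toy stand-in: real OLTP runs on a transactional DBMS and real OLAP
# against a data warehouse; here one in-memory table plays both roles.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (city TEXT, month TEXT, amount REAL)")

rows = [("NYC", "Sep", 100.0), ("NYC", "Oct", 150.0),
        ("SFO", "Sep", 80.0), ("SFO", "Oct", 60.0)]

# OLTP style: many short, row-level write transactions (one sale at a time).
for r in rows:
    with conn:  # each iteration commits its own small transaction
        conn.execute("INSERT INTO sales VALUES (?, ?, ?)", r)

# OLAP style: a single analytical query aggregating across the whole table,
# e.g. comparing September with October sales per city.
cur = conn.execute(
    "SELECT city, month, SUM(amount) FROM sales GROUP BY city, month")
summary = {(city, month): total for city, month, total in cur}
```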

CONCEPT 8: TYPES OF DATA


1. Time series data - It is a collection of observations (behavior) for a single subject (entity) at
different time intervals (generally equally spaced).

Example - Max temperature, humidity, and wind (all three behaviors) in New York City (single entity)
collected on the first day of every year (multiple intervals of time).

2. Cross-sectional data - It is a collection of observations (behavior) for multiple subjects (entities)
at a single point in time.

Example - Max temperature, humidity, and wind (all three behaviors) in New York City, SFO, Boston,
and Chicago (multiple entities) on 1/1/2015 (single instance).

3. Panel data (longitudinal data) - It is usually called cross-sectional time-series data, as it is a
combination of the above-mentioned types, i.e., a collection of observations for multiple subjects
at multiple instances.

Example - Max temperature, humidity, and wind (all three behaviors) in New York City, SFO, Boston,
and Chicago (multiple entities) on the first day of every year (multiple intervals of time).
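The three shapes can be illustrated with a plain dict keyed by (entity, time): fix the entity and you get a time series, fix the time and you get a cross-section, keep everything and you have panel data. The temperature values below are invented:

```python
# Hypothetical max-temperature readings keyed by (entity, time).
observations = {
    ("NYC", 2014): 1.0, ("NYC", 2015): 3.0,
    ("SFO", 2014): 11.0, ("SFO", 2015): 12.0,
}

def time_series(data, entity):
    """One entity observed over many time points."""
    return {t: v for (e, t), v in data.items() if e == entity}

def cross_section(data, time_point):
    """Many entities observed at a single time point."""
    return {e: v for (e, t), v in data.items() if t == time_point}

# Panel (longitudinal) data is the full table: many entities x many times.
panel = observations
```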
CONCEPT 9: TYPES OF ANALYTICS

DESCRIPTIVE ANALYTICS- 90% of organizations today use descriptive analytics, which is the most
basic form of analytics. The simplest way to define descriptive analytics is that it answers the
question “What has happened?”. This type of analytics analyses real-time and
historical data for insights on how to approach the future. The main objective of descriptive
analytics is to find out the reasons behind previous success or failure in the past. The ‘Past’ here
refers to any particular time at which an event occurred, and this could be a month ago or even
just a minute ago. The vast majority of big data analytics used by organizations falls into the category
of descriptive analytics. A business learns from past behaviours to understand how they will impact
future outcomes. Descriptive analytics is leveraged when a business needs to understand the
overall performance of the company at an aggregate level and describe the various aspects.

Descriptive analytics are based on standard aggregate functions in databases, which just require
knowledge of basic school math. Most of the social analytics are descriptive analytics. They
summarize certain groupings based on simple counts of some events. The number of followers, likes,
posts, fans are mere event counters. These metrics are used for social analytics like average
response time, average number of replies per post, %index, number of page views, etc. that are the
outcome of basic arithmetic operations.

The best example to explain descriptive analytics is the results that a business gets from the web
server through Google Analytics tools. The outcomes help understand what actually happened in the
past and validate if a promotional campaign was successful or not based on basic parameters like
page views.
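Since descriptive analytics rests on standard aggregate functions, a few lines suffice to illustrate it; the page-view and reply counts below are invented:

```python
# Descriptive analytics as simple aggregates over event counters,
# e.g. social metrics. All figures are made up for illustration.
page_views = [120, 95, 143, 87, 155]   # daily page views
replies_per_post = [2, 0, 5, 1]        # replies on four posts

total_views = sum(page_views)                              # simple count
avg_replies = sum(replies_per_post) / len(replies_per_post)  # average
```

These are exactly the "basic arithmetic operations" the text describes: counts and averages that summarize what has already happened.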

PREDICTIVE ANALYTICS- The next step up from descriptive analytics is predictive analytics. Analysing
past data patterns and trends can accurately inform a business about what could happen in the
future. This helps in setting realistic goals for the business, effective planning and restraining
expectations. Predictive analytics is used by businesses to study the data and gaze into the crystal
ball to find answers to the question “What could happen in the future based on previous trends
and patterns?”. Organizations collect contextual data and relate it with other customer user
behaviour datasets and web server data to get real insights through predictive analytics. Companies
can predict business growth in future if they keep things as they are. Predictive analytics provides
better recommendations and more future looking answers to questions that cannot be answered by
BI. Predictive analytics helps predict the likelihood of a future outcome by using various statistical
and machine learning algorithms but the accuracy of predictions is not 100%, as it is based on
probabilities. To make predictions, algorithms take data and fill in the missing data with best
possible guesses. This data is pooled with historical data present in the CRM systems, POS Systems,
ERP and HR systems to look for data patterns and identify relationships among various variables in
the dataset. Organizations should capitalise on hiring a group of data scientists who can
develop statistical and machine learning algorithms to leverage predictive analytics and design an
effective business strategy.
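The core idea of projecting a past trend forward can be sketched with a hand-rolled least-squares line. The monthly sales figures are toy numbers, and real predictive analytics uses far richer statistical and machine learning models; this only shows the mechanics of extrapolating a trend:

```python
# Fit y = intercept + slope * x by ordinary least squares, then
# extrapolate one month ahead. Toy, perfectly linear data.
months = [1, 2, 3, 4]
sales = [10.0, 12.0, 14.0, 16.0]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(sales) / n

slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales))
slope_den = sum((x - mean_x) ** 2 for x in months)
slope = slope_num / slope_den
intercept = mean_y - slope * mean_x

# Predict month 5 by projecting the fitted trend forward.
forecast_month5 = intercept + slope * 5
```

As the text notes, such predictions are probabilistic, never 100% accurate; a real model would also report uncertainty around the forecast.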

PRESCRIPTIVE ANALYTICS- Big data might not be a reliable crystal ball for predicting the exact
winning lottery numbers but it definitely can highlight the problems and help a business understand
why those problems occurred. Businesses can use the data-backed and data-found factors to create
prescriptions for the business problems, that lead to realizations and observations. Prescriptive
analytics is the next step of predictive analytics that adds the spice of manipulating the future.
Prescriptive analytics advises on possible outcomes and results in actions that are likely to maximise
key business metrics. It basically uses simulation and optimization to ask “What should a business
do?”

Simulating the future, under various set of assumptions, allows scenario analysis - which when
combined with different optimization techniques, allows prescriptive analysis to be performed.
Prescriptive analysis explores several possible actions and suggests actions depending on the results
of descriptive and predictive analytics of a given dataset.

Prescriptive analytics is a combination of data and various business rules. The data for prescriptive
analytics can be both internal (within the organization) and external (like social media data).
Business rules are preferences, best practices, boundaries and other constraints. Mathematical
models include natural language processing, machine learning, statistics, operations research, etc.

Prescriptive analytics are comparatively complex in nature and many companies are not yet using
them in day-to-day business activities, as it becomes difficult to manage. Prescriptive analytics if
implemented properly can have a major impact on business growth. Large scale organizations use
prescriptive analytics for scheduling the inventory in the supply chain, optimizing production, etc. to
optimize customer experience.
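The simulate-then-optimize loop can be sketched as a brute-force search over candidate actions under probability-weighted scenarios. All prices, probabilities, and the linear demand model below are illustrative assumptions, not a real prescriptive system:

```python
# Prescriptive sketch: simulate candidate prices under assumed demand
# scenarios and pick the one maximizing expected profit. Every number
# here is invented for illustration.

scenarios = [  # (name, probability, base units demanded at price 10)
    ("low_demand", 0.3, 80),
    ("high_demand", 0.7, 120),
]
unit_cost = 5.0

def expected_profit(price):
    """Probability-weighted profit under a toy linear demand model."""
    total = 0.0
    for _name, prob, base_units in scenarios:
        units = max(0, base_units - 4 * (price - 10))  # demand falls with price
        total += prob * units * (price - unit_cost)
    return total

# "What should the business do?" -- search the candidate actions.
candidate_prices = [14.0, 18.0, 21.0, 24.0, 28.0]
best_price = max(candidate_prices, key=expected_profit)
```

Real prescriptive analytics swaps this brute-force loop for proper optimization techniques (linear programming, simulation optimization) and layers in business rules as constraints, but the ask-"what should we do?" structure is the same.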
CONCEPT 10: CUSTOMER DATA STRATEGIES

Please note that the minimize costs quadrant has been replaced by after-sales, according to the
framework taught by the prof.

When taking into account the characteristics of the industry in which a given firm competes, and the
product and services that it offers, the decision matrix enables simultaneous consideration of the
two critical dimensions and can be used to find a matching customer service strategy.

A firm may or may not fit neatly into one of the matrix’s four quadrants. Yet, this matrix will aid in the
evaluation of the advantages and disadvantages of each general strategy and, more importantly, the
natural fit of each of the four approaches to the firm’s characteristics.

Minimize costs- When there is little likelihood of repeat business and few options for customization,
then the firm should focus on minimizing costs. Transactions should be efficient and uneventful. The
customer is served at minimum cost and neither party expects to have a future encounter. There
seems to be little potential for crafting a strategy around customer data. This is because very little
data will likely be generated and managers’ hands are tied with respect to what they can do with it.
A chain of budget or limited service tourist hotels in an exclusive fly-in destination (e.g., Hawaii, Fiji)
offers an apt example. Mid-scale hotels in these locations are generally a “window on an
experience” rather than the experience itself and their value proposition is to offer guests an
affordable opportunity to experience a great location. Because of the time commitment and cost of
reaching these destinations repurchase is relatively infrequent. Under these conditions the firm is
better off focusing on efficiency and low prices. Data collection and analysis should be aimed at
finding ways to reduce costs. Cost allocation and reporting systems will help the firm fine-tune its
revenue management.

Personalize interactions- A typical service personalization or product customization strategy is most
appropriate for firms competing in industries characterized by both a high theoretical repurchase
frequency and a high degree of customizability. Under these conditions the potential is there to
collect significant individual level data because of the repeated interactions the firm has with its
returning customers. Moreover, because of the high degree of customization management has
many opportunities to use this information to tailor the product or service to the specific needs –
learned or inferred – of the returning customers. Thus the firm can use the information to modify its
operations and differentiate its product or services. The Ritz Carlton, with its use of the CLASS
system, has provided a prime example of this strategy over the years. Event planning may also be a
good example of an industry that fits in this quadrant – particularly those firms that work closely
with customers who need the organization of many recurrent events (e.g., large investment banks).

Reward loyalty- A rewards strategy is predicated on the notion that the firm’s product and service
will be purchased frequently but these same products are fairly standardized and it is difficult for the
organization’s managers to tailor them to specific customer requests. Under these circumstances the
firm can use customer data to evaluate the profitability of each customer – actual and potential –
and use this information to reward behavior in an effort to increase customer loyalty or boost share
of wallet (i.e., make sure that customers consolidate their purchase behavior in the industry by
sourcing from the company rather than competitors). The firm can also use the individual level data
collected to generate accurate reports and improve its operations (e.g., grocery stores performing
basket analyses). Note that this means understanding customer profitability as well as their
propensity to repurchase without incentive – a strategy much more complex and sophisticated than
the “buy nine coffee cups and receive the tenth one free” that many firms seem to settle for. The
passenger air transportation industry is a classic example for this quadrant.

Acquire customers- Much conventional thinking about strategies based on customer data seems to
imply that when an industry has little potential for repurchase (i.e., low theoretical repurchase
frequency) customer data is not worth using. This could not be further from the truth. Even in the
face of low theoretical repurchase frequency, a firm in an industry with a high degree of
customization may benefit from an acquisition strategy. Following this approach the firm collects
exhaustive data about its current customers in an effort to profile them and develop predictive
models to identify and attract new profitable customers while avoiding non-profitable and marginal
ones. The availability of such deep business intelligence becomes all the more important during slow
periods when marketing budgets get slashed and efficiency in attracting new profitable customers
becomes paramount. A good example of an industry that falls in this quadrant is the wedding
reception business – an industry offering highly customizable products but typically enjoying low
repurchase frequency.
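The decision matrix reduces to a lookup over its two dimensions, theoretical repurchase frequency and degree of customizability. The sketch below is illustrative code; quadrant names follow the text (the low/low quadrant is relabeled "after-sales" in the prof's version):

```python
# Customer data strategy matrix: (repurchase frequency, customizability)
# each low/high, mapped to the four general strategies described above.

STRATEGIES = {
    ("low", "low"): "minimize costs",        # "after-sales" per the prof
    ("high", "high"): "personalize interactions",
    ("high", "low"): "reward loyalty",
    ("low", "high"): "acquire customers",
}

def customer_data_strategy(repurchase_frequency, customizability):
    """Map a firm's position on the two dimensions to a general strategy."""
    key = (repurchase_frequency.lower(), customizability.lower())
    if key not in STRATEGIES:
        raise ValueError("each dimension must be 'low' or 'high'")
    return STRATEGIES[key]
```

For instance, the wedding reception business from the text (low repurchase, high customizability) maps to the acquisition strategy.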

CONCEPT 11: TRACEABILITY OF GOODS IN SUPPLY CHAIN (from blockchain)

Traceability System Evaluation Aspects

Breadth- Breadth is the number of transactions recorded by the system. The evaluation is done by
creating a program that automatically generates transactions.

Depth- Depth is how far, or how many nodes deep, the supply chain extends from upstream to downstream.

Precision- Precision is how the system is able to show precisely the movements and characteristics
of a particular product. The system built is able to meet the precision aspects as long as the data
entered is successfully stored to the blockchain, including data from the ingredients.

Access- Access is how quickly information can be communicated to members of the supply chain and
how quickly that information can be disseminated.
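A minimal, assumed sketch of the blockchain mechanism behind such traceability: each supply-chain event's hash covers the previous event's hash, so the chain's length reflects depth (nodes from upstream to downstream) and any tampering with a stored record breaks verification (precision). The event data and field names are invented:

```python
import hashlib
import json

def record_hash(record, prev_hash):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Chain supply-chain events from upstream to downstream."""
    chain, prev = [], "0" * 64  # genesis: all-zero previous hash
    for rec in records:
        h = record_hash(rec, prev)
        chain.append({"record": rec, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    """Recompute every hash; any tampered record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["hash"] != record_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

# Hypothetical three-node supply chain (depth = 3).
events = [
    {"node": "farm", "product": "coffee", "qty": 100},
    {"node": "roaster", "product": "coffee", "qty": 100},
    {"node": "retailer", "product": "coffee", "qty": 100},
]
chain = build_chain(events)
```

A production system would distribute this ledger across the supply-chain members rather than building it in one process; the sketch only shows why stored movements become tamper-evident.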
