
(https://www.gartner.com/home) LICENSED FOR DISTRIBUTION

Critical Capabilities for Business Intelligence and Analytics Platforms


Published: 02 March 2017 ID: G00303250
Analyst(s): Cindi Howson, Rita L. Sallam, James Laurence Richardson, Thomas W. Oestreich, Joao
Tapadinhas, Carlie J. Idoine

Summary
The BI market has shifted to more user-driven, agile development of visual, interactive dashboards
with data from a broader range of sources. Data and analytics leaders should augment or upgrade
traditional BI platforms to modern platforms that improve business value and speed time to insight.

Overview
Key Findings
Core capabilities from traditional BI vendors have largely caught up to data discovery vendors who
initially disrupted this market, although differences remain at the subcriteria level and in the
degree of excellence exhibited.

The next wave of disruption in the form of smart data discovery has begun, with larger vendors
innovating first or acquiring startups.

Although this is a crowded market, significant differences remain in functionality, and in which
products are most appropriate for a given use case.

Recommendations
Data and analytics leaders looking to modernize their business intelligence and analytics should:

Expand BI and analytics tool portfolios beyond traditional BI platforms, either by augmenting or by
evaluating improved capabilities and product roadmaps of incumbent vendors.

Assess which BI and analytics products are best for your organization based on the use cases
needed by each business function and the sweet spot for that product.
Establish the degree to which centralized IT teams can keep up with demand for new data
sources and analyses, along with the skills levels and readiness of business units to perform
more of their own data preparation and analytics.

Embrace easier-to-use, more-agile tools as greater responsibility for analytics shifts to lines of
business.

Assess the measures that directly influence customer satisfaction with a BI and analytics vendor
on top of an evaluation of functionality, integration and cost-of-ownership requirements.

Strategic Planning Assumptions
By 2020, smart, governed, Hadoop/Spark-, search- and visual-based data discovery capabilities will
converge into a single set of next-generation data discovery capabilities as components of modern
BI and analytics platforms.

By 2021, the number of users of modern BI and analytics platforms that are differentiated by smart
data discovery capabilities will grow at twice the rate of those that are not, and will deliver twice the
business value.

By 2020, natural-language generation and artificial intelligence will be a standard feature of 90% of
modern BI platforms.

By 2020, 50% of analytic queries will be generated using search, natural-language processing or
voice, or will be autogenerated.

By 2020, organizations that offer users access to a curated catalog of internal and external data will
realize twice the business value from analytics investments as those that do not.

Through 2020, the number of citizen data scientists will grow five times faster than the number of
data scientists.

What You Need to Know


This Critical Capabilities research is a companion to the 2017 "Magic Quadrant for Business
Intelligence and Analytics Platforms."

The BI and analytics platform market has undergone a fundamental shift away from more IT-centric
solutions to business-user-driven solutions. Although traditional ad hoc query tools allow power
users to author reports, they often still require an upfront IT modeling effort, often via a semantic
layer and data warehouse. In contrast, modern BI and analytic platforms often use a self-contained
in-memory engine with minimal to no upfront modeling requirements. This architecture allows a
wider range of business users to perform interactive analysis, without the need for advanced
technical or data science skills. As demand from business users for pervasive access to data
discovery capabilities grows, IT wants to deliver on this requirement without sacrificing governance
— in a managed or governed data discovery mode.

The need for governed reporting using an agile centralized BI provisioning model to run businesses
remains. However, there is a significant change in how companies are satisfying their governed
reporting requirements: Companies are asking whether a data discovery tool can fulfill the full
spectrum of BI and analytic requirements. They would like to leverage its greater ease of use, higher
business benefits and lower cost of ownership to deliver enterprise reporting. Here
we have seen vendors that started out as point solutions for individual analysts in a decentralized
use case evolve their capabilities to handle enterprise governance, with better report
distribution and KPI alerting.
As data discovery startups evolve their capabilities to meet a broader range of use
cases, traditional BI vendors are striking back, hoping to be early in the next wave of
disruption in the form of smart data discovery.

Smart data discovery leverages machine learning to prepare and cleanse data more intelligently,
automatically generate the most important insights, and interpret charts via natural-language
generation. Vendors are at different levels of product maturity.
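As a toy illustration of the "automatically generate the most important insights" idea, the sketch below scans a small dataset for its strongest pairwise correlation and narrates the finding. The data, function names and ranking heuristic are all invented for this example; they do not describe any vendor's actual implementation.

```python
# Hypothetical sketch of one smart data discovery step: rank column pairs by
# correlation strength and describe the strongest one in natural language.
import math

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def top_insight(columns):
    """Find the most strongly correlated pair of columns and narrate it."""
    names = list(columns)
    pairs = [(a, b, correlation(columns[a], columns[b]))
             for i, a in enumerate(names) for b in names[i + 1:]]
    a, b, r = max(pairs, key=lambda p: abs(p[2]))
    direction = "rises" if r > 0 else "falls"
    return f"{b} {direction} with {a} (r = {r:.2f})"

data = {
    "ad_spend": [10, 20, 30, 40, 50],
    "revenue": [105, 198, 305, 410, 490],
    "temperature": [61, 55, 70, 58, 66],
}
print(top_insight(data))
```

A production feature would extend this ranking to clusters, outliers and segments, and hand the findings to a natural-language generation engine rather than a template string.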

The degree to which a customer must mix and match capabilities from multiple vendors to get the
best of both worlds varies significantly, depending on the vendor and product in question. The
modern BI platform includes the broad range of capabilities for agile, interactive visual exploration,
as well as governance and report distribution. Customers may start with a decentralized analytics
use case, and later look for governance and promotability. In other instances, customers may
immediately start with a governed BI use case, essentially trying to replace the former IT-centric
reporting platform with a more agile, modern solution.

Analysis
Critical Capabilities Use-Case Graphics
Figures 1 through 5 show aggregate product scores across the 15 critical capabilities that have
been weighted for each use case. Each of the products/services has been evaluated on the critical
capabilities (and their subcriteria) on a scale of 1 to 5:

1 = Poor or Absent: Most or all defined requirements for a capability are not achieved.

2 = Fair: Some requirements are not achieved.

3 = Good: Meets requirements.

4 = Excellent: Exceeds some requirements.

5 = Outstanding: Significantly exceeds requirements.

Weightings have been applied to individual subcriteria to determine the score for each capability.
Scores and weightings represent a combination of customer survey results and analyst opinion.

Definitions of the critical capabilities and the subcriteria evaluated are provided in the Critical
Capabilities Definition and Use Cases sections. Capability weightings, scores by capability by
vendor, and scores by use case by vendor are shown in Figures 6 through 8. Each vendor section
details which platform product components were evaluated for each vendor to arrive at a composite
score.
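To make this two-level roll-up concrete, here is a minimal sketch of how subcriteria ratings might be combined into a capability score, and capability scores in turn into a use-case score. Every weight, subcriteria name and rating below is invented for illustration; Gartner's actual weightings are available only in the interactive version of this research.

```python
# Illustrative two-level weighted scoring. All weights and ratings here are
# hypothetical; the sketch only demonstrates the roll-up mechanics.

def weighted_score(scores, weights):
    """Combine component scores (1-5 scale) using normalized weights."""
    total_weight = sum(weights.values())
    return sum(scores[name] * weights[name] for name in weights) / total_weight

# Level 1: subcriteria ratings roll up into one capability score.
cloud_bi = weighted_score(
    scores={"hybrid connectivity": 4, "multitenancy": 5, "security certifications": 3},
    weights={"hybrid connectivity": 0.5, "multitenancy": 0.3, "security certifications": 0.2},
)

# Level 2: capability scores, weighted per use case, roll up into a use-case score.
extranet_score = weighted_score(
    scores={"cloud BI": cloud_bi, "mobile": 4.0, "governance": 3.5},
    weights={"cloud BI": 0.4, "mobile": 0.2, "governance": 0.4},
)
print(round(cloud_bi, 2), round(extranet_score, 2))
```

Changing the level-2 weights per use case is exactly what moves a product up or down between the rankings in Figures 1 through 5.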
Although Gartner has provided recommended weightings for each critical capability and use case,
individual customer requirements vary greatly. Customers are advised to use the web-based
interactive version of this Critical Capabilities research to set their own weightings. Further, the
rankings in Figures 1 through 5 may provide a useful prioritization, but customers should study the
differences in capability scores in Figure 6 to assess each vendor's strengths, weaknesses and
acceptable trade-offs.
Figure 1. Vendors' Product Scores for the Agile Centralized BI Provisioning Use Case

Source: Gartner (March 2017)

Figure 2. Vendors' Product Scores for the Decentralized Analytics Use Case
Source: Gartner (March 2017)

Figure 3. Vendors' Product Scores for the Governed Data Discovery Use Case
Source: Gartner (March 2017)

Figure 4. Vendors' Product Scores for the OEM or Embedded BI Use Case
Source: Gartner (March 2017)

Figure 5. Vendors' Product Scores for the Extranet Deployment Use Case
Source: Gartner (March 2017)

Scores in Figure 6 reflect a combination of analyst opinion and customer opinion based on products
released before 15 January 2017.
Figure 6. Product/Service Rating on Critical Capabilities
Source: Gartner (March 2017)

Scores in Figure 7 represent analyst opinion only, and specifically exclude customer opinions, which
can sometimes inflate or suppress product capability scores as customers work with
different definitions and expectations. Gartner clients may also want to consult Gartner Peer Insights
(https://www.gartner.com/reviews/market/business-intelligence-analytics-platforms) for additional
opinions from customers in cases where vendors did not supply a list of customer references.
Figure 7. Product/Service Rating on Critical Capabilities (Analyst Opinion Only)
Source: Gartner (March 2017)

Vendors
Alteryx
Alteryx offers a workflow-based platform for data preparation and building of parameterized
analytic applications. Alteryx Designer is a desktop application that can be used for stand-alone
advanced analytics or for self-service data preparation that can then be output to partner
applications such as Tableau, Qlik Sense or Microsoft Power BI. Alteryx Server enables data
engineers to publish datasets for governance and sharing. Alteryx Analytics Gallery is a cloud-based
application for sharing analytic apps and supporting browser-based interactivity within the
dashboards.
Alteryx is on an annual major release cycle with minor releases throughout the year, typically
quarterly. In the last year, Alteryx has added support for more cloud and big data sources, such as
Amazon Aurora, Google Sheets and Adobe Analytics. Its location-based analytics, which were
already strong, have further improved with support for more international regions. Version 10.6 is
the focus of this evaluation.
Decentralized analytics is the primary use case for Alteryx (73% of surveyed customers), and 43% of
customers use it for agile centralized BI provisioning.
Strengths
Self-service data preparation: Alteryx allows power users, such as citizen data scientists, to
combine data from multiple data sources while also transforming and cleansing data. Surveyed
customers report using an average of nine data sources per application, putting the product in the
top third of the vendors included in this research for this metric. It provides connectivity to a broad
range of data sources, including JSON, XML, direct HDFS, Spark, Impala, Google BigQuery, and a
broad range of relational databases. For data scalability, Alteryx supports push-down processing
to a number of leading databases. Alteryx is in the top quartile for fastest time to create complex
reports, and customers rate it the easiest to use for authoring complex reports.
Advanced and location analytics: Alteryx's embedded advanced analytics are rated Outstanding
overall. It supports forecasting and clustering via a menu-driven interface, along with more than
60 R-based functions, allowing these to be used either in the data preparation process or as
output columns for an application. Models can also be output to PMML or R for refinement in
other data science platforms. Alteryx has its origins with the U.S. Census Bureau, and supports
spatial analytics using a range of maps down to street level for a number of world regions. It also
supports drive time and radius geospatial calculations.
Scheduled reports: Alteryx allows formatted reports to be distributed in a variety of formats such
as PDF, PowerPoint and Excel on a scheduled basis, with email notification. While this capability is
typical in traditional BI platforms, it is lacking in many of the modern BI and analytic products.
These schedules can be set based on system or business events, such as low inventory.
Areas of Improvement
Visual exploration for consumers: Alteryx rates only Fair for its visual exploration capabilities. The
ability to manipulate data or author content is only supported in the desktop interface, not via the
browser. In this regard, information consumers mainly interact with a highly parameterized
dashboard, as opposed to performing more free-form exploration. Alteryx lacks the ability to
automatically display numeric values as percentages, link multiple visualizations on a page, or
create groups via a point-and-click interface. Particular chart types must be specified at design
time, with no support for trellis or histogram charts.
No native mobile: Alteryx does not offer native mobile apps, nor specific support for mobility
outside of generic, browser-based access. While there may not be high demand for mobile
support for the development of data blending workflows, the Alteryx Analytics Gallery could
benefit from improvements to its content-consumption experience through support for responsive
design when creating content, or through the addition of native mobile apps.

Other gaps: Alteryx does not natively support dashboard layouts. This capability still
rates Fair to Good because of its support for subcriteria related to mapping. Cloud capabilities
are limited to AWS deployment for the Analytics Gallery, with no support for hybrid
connectivity to on-premises data sources and no additional software security certifications.
Within the publish, share and collaborate capability, Alteryx lacks discussion threads, storytelling
and the ability to rate content.
Birst
Birst provides a full range of data management and analytic capabilities on multitenant cloud
architecture through a software as a service (SaaS)-based delivery model. Birst Enterprise Cloud
can be deployed in a public or private cloud or in a customer's data center; the same underlying
product — branded as Birst Enterprise Virtual Appliance — is also offered for on-premises
deployments.

In 2016, Birst added enhanced functionality for self-service data preparation and the ability to utilize
Exasol as a high-performance in-memory MPP data store, and increased its use of responsive
design techniques as part of a "design once, use everywhere" approach to multiple device types.

Birst delivers a major release every three months, with a minor release every two weeks. The focus
of this evaluation is version 6.
Birst is most often deployed for the agile centralized BI provisioning use case (43%), followed by
OEM or embedded BI (34%).

Strengths

Cloud native: Birst's multitenant, cloud-architected platform offers strong support for every
aspect of the cloud BI critical capability, with the exception of a marketplace, which it is planning
to offer in future. A particular strength is its ability to federate queries and support hybrid data
connections between cloud and on-premises data sources in a way that is transparent to the end
user. Birst also offers prepackaged applications called Solution Accelerators that bundle prebuilt
connectors to cloud data sources (Salesforce, Marketo, NetSuite and Google Analytics, for
example) with prebuilt metadata, transformations, and prebuilt reports and dashboards in an out-
of the-box solution that customers are able to customize to meet their specific needs.

Functional breadth: As a result of its broad capabilities, Birst scored in the top quartile for four out
of the five use cases assessed. Its highest scores were for the OEM or embedded BI and extranet
deployment use cases. This was largely driven by its broad range of SDKs and APIs used to
embed analytic content (where it rated Outstanding), and the capabilities of its cloud architecture.
The work the company has done in the area of self-service data preparation in Birst 6 (launched
November 2016) further bolsters its suitability for governed data discovery and decentralized
analytic use cases. Birst's strength in metadata management within a two-tier architecture (an
enterprise data tier and a user data tier) enables the definition and management of a semantic
layer for central governance, while also enabling decentralized use in a controlled manner. The
addition of what Birst labels "networked BI" instances builds on this core strength by
connecting independent user objects to the centrally defined semantic layer, enabling agile, user-
driven growth and expansion of metadata.

Mobile: The Birst Mobile module capabilities are rated Excellent to Outstanding. Going beyond
responsive design, Birst supports offline exploration, and mobile collaboration and interaction
(expanding and sorting columns on a chart, filtering, revisualization, drilling, notifications, and
annotations [text and drawings]). The only area of mobile functionality missing is full GPS
integration, which is planned.

Areas of Improvement
Embedded advanced analytics: Gaps in the capabilities for embedded advanced analytics remain,
and Birst scored as Limited in this capability. While the platform does offer core statistical
functions natively, it lacks the robust library of embedded advanced algorithms, functions and
visualizations that are required for more-complex use cases. The majority of this functionality is
instead enabled through its integration with R and Weka, which provides users with an option to
build and run models that leverage the underlying Birst data model.

Smart data discovery: Like other vendors covered in this report, Birst has yet to address the
growing need for smart data discovery in its platform, and lacks this functionality. Birst can (and
was one of the first BI offerings to) automatically process data and generate dashboards with
KPIs, charts and visualizations when new data is loaded. However, to be considered "smart," it
must automatically generate advanced analytic visualizations (such as the ability to visualize
correlations or clusters in a dataset, or display a decision tree) and automatically generate models
(including forecasting, trends, predictions, clustering, segments, correlations and factor analysis).
Neither has it yet addressed natural-language input or output.

Social and collaboration: The publish, share and collaborate critical capability remains an area of
relative weakness for Birst, where it scores as Good. While functionality for diverse output
formats, content search, alerting, and printing is mature and complete, other areas are developing.
With the release of Birst 6, the company added live discussion threads within the platform, which
are shown on a timeline. However, Birst currently offers no support for user ratings of the value of
BI content, and lacks system-created recommendations of BI content. It does provide integration
with Salesforce Chatter, and customers can embed Birst inside Jive.

Board International
Board delivers an integrated system that provides BI, analytics and corporate performance
management (CPM) capabilities in a single platform. The focus is to deliver a unified
information platform as a basis for analytics, planning and budgeting, and consolidation. A key
differentiation is the hybrid in-memory self-contained platform built on Board's Hybrid Bitwise
Memory Pattern (HBMP) algorithms. Board's platform is available on-premises and in the cloud,
including a public cloud service offering. Board provides its own proprietary library of advanced
analytics functions, Board Enterprise Analytics Modeling (BEAM).
Board modernized its user interface with a mobile-first design ethos, and improved the storytelling
and collaboration features on the platform. Board's current release cycle is one major release per
year, and one minor release per quarter. The current version is 10. Board has started to invest in
smart data discovery features and collaboration on its near-term roadmap.

The two most prominent use cases in the survey for Board are agile centralized BI provisioning
(67%) and traditional IT-centric reporting (57%), followed by decentralized analytics (50%). The
average deployment size increased over previous years to just over 1,800 users and is slightly above
survey average now (1,182), indicating broader deployments across its customer base.

Strengths
Mature platform: With its long-standing legacy of developing a single platform, several critical
capabilities are strengths for Board. Platform administration features, such as
authentication and authorization, scalability and performance, received Excellent scores. As a single
platform, it also supports multiple workflows and allows them to be customized. Board natively
supports a broad range of relational and multidimensional data sources. Connectors also exist for
several enterprise applications, whether on-premises or in the cloud. Other web sources, such as
Twitter or Facebook, are supported through the OData connector. Board also supports Hadoop
and NoSQL sources.

Self-contained in-memory platform: Board's hybrid data platform offers Good to Excellent
capabilities for built-in data storage and data loading. In the capability for self-service data
preparation, Board's platform is Good at supporting business-user data modeling and data
mashup, as well as supporting data inference and data enrichment. Data Fast Track enables
business users to develop their own data models independent from the platform and to
seamlessly promote them to the platform for reuse.
Analytic content creation and exploration: Board offers a comprehensive set of embedded
advanced analytics functions through its proprietary library of advanced analytics functions,
Board Enterprise Analytics Modeling (BEAM). Board continues to introduce new advanced
algorithms and functions through BEAM. Interactive visual exploration is well-supported on the
platform, for instance, with a broad range of chart types, global filters, binning or linking
visualizations. Conditional formatting, color selection and features to enable color consistency
are also supported.
Areas of Improvement

Embedded BI: Similar to last year, only 10% of surveyed clients indicated that they use Board for
embedded analytics, and the product rates Fair to Good for this critical capability, which is not a focus
point for Board's development. Several important software development kit (SDK) capabilities are
currently not supported by the platform, such as creating, copying and deleting reports or analytic
content, adding users, changing security settings, and performance management and monitoring.
Portal integration is supported through iframes and a native Microsoft SharePoint Web Part.

Dimensional model: Board supports a wide range of data sources, but customers are limited by
the vendor's "cube" concept. Board's core cube architecture is based on multidimensional online
analytical processing (MOLAP) or relational online analytical processing (ROLAP), organized by
facts and dimensions. A more-flexible data model is not supported.

Content authoring and analysis: Integration with R is not available, nor is PMML supported, so
clients have to rely on the BEAM library and cannot leverage advanced analytics models
developed by data scientists outside the platform. Board has started to invest in developing smart
data discovery features, such as its "cognitive search," but does not yet automatically generate
insights or support natural-language generation. Geospatial and location intelligence capabilities are
still limited. Geocoding requires a third-party solution, and no out-of-the-box integration is provided.
Only OpenStreetMap maps are fully integrated in Board, down to street level.

ClearStory Data
ClearStory Data is a cloud-based BI and analytics platform that allows for smart data preparation
and integration, data storytelling, and collaboration in a single platform. It uses Spark-based
processing to handle large data volumes. ClearStory is well-suited to business users that need to
combine, harmonize, blend, and explore multiple and varied data sources, including personal, cloud,
streaming and syndicated data.
As a cloud platform, ClearStory delivers new releases every three weeks. For major releases,
customers can choose not to have an upgrade implemented in their tenant. Major new capabilities
delivered in the last year include support for smart data discovery; insights can be generated
automatically and include natural-language generation through optional integration with Narrative
Science. A number of additional application connections with built-in data inference were added in
2016, including support for Google Analytics, Zendesk and Jira. ClearStory can also act as a data
source to Tableau and Microsoft Power BI.
ClearStory is mainly used for decentralized analytics, with 63% of surveyed customers deploying for
this use case, followed by 58% for governed data discovery.

Strengths
Smart data inference and harmonization: ClearStory was recently awarded a U.S. patent for its
smart data inference and harmonization, which uses machine learning. ClearStory can ingest
from traditional personal and relational data sources, but can also harmonize these data sources
with Hadoop-based and other NoSQL data sources — including Google BigQuery and IBM
BigInsights, log files, and streaming data sources. Data is processed using Spark for high levels of
query and analytic performance on granular data. The smart data inference will recommend how
best to blend and cleanse data but, in addition, the product will suggest other public and premium
datasets that are mashable, a capability it refers to as "data you may like."
Ease of use: ClearStory Data received the highest customer reference score for ease of
implementation and administration, as well as ease of content consumption. Across other ease-
of-use drivers — for content creation and visual appeal — ClearStory Data scored in the top
quartile. Harmonized datasets can be arranged into an interactive, visually appealing storyboard.
In building storyboards, the platform intelligence only exposes functions based on what is in the
data. For example, year over year will not be exposed if the data does not include time series.
There are also smart recommendations attached to text alerts for some functions that guide
users, such as "you did x, now try y." Furthermore, "smart visualizations" will automatically render
data using the best-fit visualization.
Metadata management: The metadata catalog contains everything ClearStory learns about a
source dataset, its lineage and refreshes, and the user activity in a story. It also tracks the accuracy
of inferred dimensions and the accuracy with which multiple datasets can be combined.
It also infers importance by tracking the types and frequency of questions users are asking, how
insights are explored, and how users collaborate and augment their analysis.
Areas of Improvement
Mainly cloud: ClearStory Data is primarily a cloud BI and analytics solution. It lacks hybrid
connectivity for live query of on-premises data sources, although customers can connect to on-
premises data sources. This may make the product less suitable for customers with large-scale,
on-premises data sources that do not want their data in the cloud. In addition, ClearStory Data
relies on its own physical data centers; on a case-by-case basis, ClearStory will work with
customers who want to deploy in Amazon Web Services (AWS) or on-premises. The data centers
have a number of security certifications — such as SOC2, the Federal Information Security
Management Act (FISMA) and ISO 27001 — but this approach limits its geographic reach and
provides less flexibility than competitive products that will also allow customers to rely on other
cloud infrastructure providers.
Embedded advanced analytics: ClearStory only scores Fair to Good for the embedded advanced
analytics capability. It supports Spark-based statistical and machine-learning functions, but does
not support the ability to call R functions or other third-party libraries. It also lacks native support
for menu-driven forecasting and decision trees. Support for K-means clustering was added in the
last year, and is supported from within the storyboard.
No native mobile: ClearStory uses HTML5 for content consumption and authoring on
smartphones and tablets. It does not support native apps that would further allow for location-
based analytics, annotations and offline interactivity. There is no out-of-the-box support for
integration with third-party mobile device management platforms.

Datameer
Datameer specializes in big data analytics, targeting organizations investing in data lakes and other
types of big data environments supporting analytics. The company offers a modern BI and analytics
front end with the potential to solve complex problems, leveraging the native query engines for
Hadoop and Spark, with support for an expanding range of connectors to other types of data
(including SQL relational databases, several file formats, cloud storage platforms, NoSQL databases
and web services ranging from enterprise applications to social media and consumer APIs).
Datameer customers report using the platform primarily for agile centralized BI provisioning (60%)
and decentralized analytics (60%) use cases. Governed data discovery is less common, with 45% of
customers deploying for this use case.
Datameer 6, a major release announced in May 2016, enhanced the user experience and
further optimized the smart query engine. A single front end now provides access to multiple steps of
the workflow, including data access, preparation, analytics and visualization. Spark was also added
to Datameer's smart execution engine, which automatically determines the best compute
framework, or combination of frameworks for various big data analytics jobs across both small and
large datasets. The product evolves at a rate of one major release per year, with quarterly minor
releases.

Strengths

Big data analytics: Datameer has strong capabilities in self-contained ETL and data storage, and
native connectors to big data sources. One of the key differentiators of the platform is a patent-
pending smart execution framework, capable of identifying the right query processing engine for
each analytics task, from Tez to Spark and others, in a way that is transparent to the user. The
platform can ingest and process data from multiple sources, but is optimized for big data. The
tool also offers a breadth of data sources made available to analysts, including support for SQL-
based data sources as well as more-complex data such as IoT and typical digital marketing
datasets — including social media.

Complex analytics environments: Some of Datameer's strengths are related to the support of
different types of challenging data sources (mostly big data related), Excellent self-contained ETL
and data storage, and Good to Excellent capabilities to embed analytic content. The multiple
performance enhancement features offered by Datameer (such as its smart execution engine,
and smart sampling) enable it to address challenging data environments, and help position the
tool as the go-to solution where other mainstream data discovery solutions might not meet data
scalability and performance requirements.

Fast content development: Datameer customers report favorable development times for content
across different levels of complexity when compared to the averages of products included in this
report. This is particularly impressive given the complexity of the data and analysis done by
content authors with Datameer — which often includes big data sources. This ability will appeal to
roles such as the citizen data scientist that need to speed the exploration process and time to
insight.

Areas of Improvement
Low ease of use and visual appeal: Despite a significant update to the front-end user
experience in Datameer 6, customers report below-average ease of use and low visual appeal, in the
bottom quartile of vendors in this Critical Capabilities research. Although the composite rating is
Good overall, this is not enough to drive buying in the modern BI and analytics market.

Capabilities gaps: Datameer has gaps in a number of key capabilities that are expected features
of modern BI and analytics platforms. In particular, interactive visual exploration, analytic
dashboards, mobile exploration and authoring, publishing, sharing and collaboration are weaker
than with most other products, while self-service data preparation and metadata management
show some limitations. A possible way to overcome these shortcomings would be to leverage
Datameer's data environment and big data engine on the back end, while using a partner product
for the visual exploration (such as Tableau or Microsoft Power BI).

Narrow focus on big data: Although Datameer is investing to expand the product's scope, it is still
clearly targeting native functionality to support big data use cases (through specialized self-
contained ETL and data storage) that rely on technologies such as Hadoop and Spark. Emerging
capabilities that are driving innovative offerings in the market — such as smart data discovery,
natural-language processing or more-automated data preparation — are not on the vendor's near-
term roadmap.

Domo
Domo is a cloud-based business intelligence and analytics platform targeted at senior executives
and business users, and is well-suited to management-style interactive dashboards. Domo enables
rapid deployment through its native cloud architecture, an extensive set of data connectors and
prebuilt content, and an intuitive, modern user experience. Strong social and collaborative features
in DomoBuzz make it possible for users to discuss findings, follow alerts, collaboratively develop
content, and rate dashboards from the web or mobile devices, including smartphones. Domo
includes a web-based and business-oriented data preparation tool called Magic for combining
cloud-based and on-premises data sources. Domo Workbench is a desktop tool for loading
additional on-premises data sources into Domo datasets in the cloud. These datasets are stored in
an OEM version of a cloud-based columnar database. The product runs on Amazon Web Services.
Release cycles are continuous (as with most cloud vendors).
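The columnar layout used by a store of this kind keeps each field's values contiguous, so an aggregate over one measure scans only that column. A toy illustration in generic terms (the sample data is invented):

```python
# Toy illustration of row vs. columnar layout. A columnar store keeps
# each field contiguous, so an aggregate over one column touches only
# that column's values instead of every full row.

rows = [
    {"region": "east", "sales": 100},
    {"region": "west", "sales": 250},
    {"region": "east", "sales": 175},
]

# Pivot the row layout into a columnar one: one list per field.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# Aggregating "sales" now scans a single contiguous list.
total = sum(columns["sales"])
print(total)  # 525
```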

During the past year, Domo formalized its channel program and launched developer.domo.com on a
newly branded Domo Business Cloud platform, and an Appstore for Domo and its ecosystem of
partners to sell vetted Domo connectors and apps. Also, Domo now supports live customer
instances on Amazon Australia, Microsoft Azure and an Equinix colocation — in addition to Amazon
US-East previously supported. Amazon Ireland is on the roadmap for 2017.

A higher percentage (74%) of Domo's customer references report using it primarily for decentralized
use cases, more than most of the other vendors in this Critical Capabilities research. This is consistent with
how business people primarily use Domo — for management-style dashboards often deployed in the
line of business with little or no support from IT.
Strengths

Rapid deployment of management-style dashboards and infographics: Domo offers business
people an easy and intuitive interface in which to build interactive "cards" (views) and store them
either in "collections" (a way to visually organize cards — instead of in folders) or assemble them
into "pages" (Domo's equivalent of dashboards). Users can also build visually appealing
infographics using App Design Studio, which leverages Adobe Illustrator. Domo's reference
customers rate it highly for its intuitive user experience when combining a large number of data
sources into business-friendly dashboards, and a large percentage of reference customers (in the
top quartile) report selecting Domo for its ease of use. Domo ranks in the top quartile for ease of use
and visual appeal, with an overall Excellent rating for this capability.

Collaboration, alerting and scheduling: Domo's "design or assign" abilities enable collaborative
content development between a content author and a content consumer. In the assign mode, a
user can assign the creation of content to someone else with more skills, which automatically
populates a template in both users' favorite pages. This way, the author and the consumer can
iterate and discuss until the content is what the user wants. With DomoBuzz, users can
participate in discussion threads or groups, and can follow other users. Users are also presented
with recommendations based on the behavior of other users as a native part of the product
workflow. Users can also rate dashboards and follow content created by particular users.
Business-user-defined dynamic alerting and scheduling is extensive and intertwined with the
platform's collaborative and social capabilities. Users can create their own scheduled reports as
well as add and remove metrics that are shown in a "favorites" tab.
Native mobile with a focus on smartphones: Domo's native mobile applications for iOS and
Android (Windows Phone devices are supported only through HTML5) optimally render content on
both tablets and smartphones. They are well-integrated with DomoBuzz for chat, collaboration
and alerts. However, integration with enterprise mobile device management (MDM) security providers and offline analysis on a
mobile device are not supported.
Areas of Improvement

Advanced data exploration and embedded advanced analytics: Domo's analyst-oriented visual
exploration and user-based data manipulation features are limited. For example, users can only
create reusable groups as new dimensions, or automatically bin data in the dashboard authoring
and analysis environment, through a custom calculation. Moreover, while Domo's formula
expression editor (referred to as "beast mode") lets users create their own calculations, there are
limited automated suggestions that would make it easier for analysts to use the correct syntax.
Finally, while business users can create forecasts in a card via a drag-and-drop feature, and
analysts can integrate R and Python scripts into calculations, embedding advanced statistical
functions in calculations requires the use of JavaScript libraries. There is limited support for other
common drag-and-drop functions (such as for clustering and correlations). Users can create
linked charts to filter and drill to detail, or drill across to a related card, but there is no way to
centrally define, automatically generate or infer hierarchies (time or geography, for example) to
support free-form drilling. Consistent with Domo's primary use for simpler management
dashboards, reference customers scored Domo's complexity of analysis in the bottom third.
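Binning of the kind described above, deriving a discrete dimension from a continuous measure, can be sketched generically as follows; the bucket edges are arbitrary examples, not Domo functionality:

```python
# Generic sketch of binning a continuous measure into discrete buckets,
# the kind of derived dimension the text says Domo exposes only through
# a custom calculation. Bucket edges are arbitrary illustrative values.

def bin_value(value: float, edges: list[float]) -> str:
    """Return a label for the half-open bin [lo, hi) that value falls in."""
    for lo, hi in zip(edges, edges[1:]):
        if lo <= value < hi:
            return f"{lo}-{hi}"
    return f"{edges[-1]}+"  # values beyond the last edge

ages = [23, 37, 64, 71]
print([bin_value(a, [0, 30, 50, 70]) for a in ages])
# → ['0-30', '30-50', '50-70', '70+']
```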
Self-service data preparation a work in progress: While Magic, Domo's self-service capability,
provides a web-based point-and-click design process for accessing web-based and on-premises
relational data sources to load data into Domo, its data inference and profiling capabilities are a
work in progress. Users can promote, collaborate on and reuse a dataset, but they can't reuse
individual metadata objects. Workbench — Domo's desktop tool for administrators to load on-
premises data into its cloud — doesn't have the same ease of use and visual appeal as its web-
based dashboards and data-loading process. Workbench supports manual data mashups,
creation of additional calculations, and some data transformations. It lacks a point-and-click
graphical user interface to build a query or extract data. According to reference customers, IT
and/or tech-savvy business users who are familiar with SQL can handle this part of the data
loading process.
Cloud-centric approach: Domo's approach requires all data — whether from on-premises sources
or cloud applications — to reside in its cloud for visualization and analysis, which may not suit
organizations with primarily on-premises data sources. Domo has recently introduced an on-
premises version (with limited adoption to date) for when this is a requirement. Hybrid data
connectivity to on-premises data is on the roadmap. Domo offers a desktop tool specifically built
for admin users to load on-premises data into its cloud, but this tool is less business friendly than
other components of the platform.
IBM (Cognos Analytics)
IBM Cognos Analytics is one of two product offerings provided by IBM that, together, offer a broad
range of BI and analytic capabilities. Cognos Analytics is version 11 of the Cognos Business
Intelligence product line, and is a much-improved, redesigned, modern product offering.
Over the past year, IBM has delivered on adapting its BI and analytics offerings to more-closely align
with the market. Cognos Analytics was first released in December 2015. The product is on a
continuous release cycle and averages one release per quarter. There were, however, five new
releases in 2016 with version 11.0.5 becoming available in November 2016. The product combines
both IT-authored content and content authored by business users within one platform. In addition,
several modern design elements from Watson Analytics have been incorporated, resulting in an
easier-to-use, more-visually-appealing experience. Cognos Analytics can be deployed either on-
premises or via the IBM Cloud.

Cognos Analytics is most often used to support the agile centralized BI provisioning use case, as
represented by 74% of survey references. It is also often used to support decentralized analytics
(47%) and traditional IT-centric reporting (47%).

Strengths
Platform management: Robust BI platform administration, security and architecture capabilities
provide a solid foundation for the IBM Cognos Analytics solution. The product enables scalability
and performance via extensive load balancing features, tunables for managing resource
availability, and performance optimizations such as function shipping to the database, multilevel
caching and aggregate awareness. The platform supports a number of operating systems
including AIX, Linux, Solaris and Windows, and supports a number of open standards. Users are
authenticated by directly connecting to any LDAP v3 directory, including Active Directory. Cognos Analytics
also provides an out-of-the-box audit solution to monitor when and how users interact with the
platform.
Interactive visual exploration and analytic dashboards: Visual exploration and analytic
dashboards are rated Good. The extensive chart library leverages IBM's RAVE visualization
technology and includes standard chart types as well as heat maps, tree maps, packed bubble
charts, geographic maps and more. Each chart is automatically interactive and, with Release
11.0.5, able to be animated. Release 11.0.5 also introduced enhanced mapping and geospatial
analytics enabled via a partnership with Mapbox and Pitney Bowes. Users can create their own
groups to form new dimensions, interactively display numbers as values or percentages, and
easily rank values. Additionally, multiple charts on a page are automatically linked for brushing
and filtering.

Visually appealing and easy to use: Borrowing from IBM Watson Analytics, IBM Cognos Analytics
has a new clean and attractive user interface that enables access to both IT-authored content, and
business-user-authored content, via one portal. In addition, several of the authoring interfaces in
previous versions have been combined and streamlined in Cognos Analytics. Reports can be
authored against existing Framework Manager models and, with release 11.0.5, an author has
direct access to relational Framework Manager packages in dashboards. "Smart" capabilities,
including smart searches and joins, automatic inferences, and representation of time and location
data, as well as recommended tasks and visualizations, further enhance the user experience and
reduce time to insight.

Areas of Improvement

Deficient self-contained ETL and data source connectivity: The trend in analytic platforms is to
not only extend analytic capabilities to include more-advanced analysis, but to also extend the
ability to acquire, organize and store the data for analysis. IBM Cognos Analytics lacks many of
the features needed to effectively move from ingesting data to providing insight. Nor does
Cognos Analytics provide a true columnar in-memory data store; instead, it relies on the creation of
datasets using file caching. As a result, self-contained ETL and data storage rated only as Fair.
IBM Cognos Analytics does not provide access to unstructured/semistructured data sources. In
addition, it has no native connectivity to enterprise applications.

Gaps in collaboration: As analytics proliferates throughout the organization, the ability to share
and collaborate around analytic content is paramount. The ability to create discussion threads,
have real-time collaboration and integrate with other social platforms is currently unavailable. The
ability to rate and recommend analytic content based on user ratings or usage patterns is also
lacking.
Limited extension to additional analytic users and analytics in the cloud: Smart data discovery is
a growing trend in the analytics space, allowing the use of the analytic tools by new types of users
as well as enabling faster time to insight. Limited smart data discovery capabilities, as well as a
lack of embedded advanced analytics, limit the extension of the platform. In addition, the limited
ability to embed analytic content in other applications further hampers the ability to extend the
use of analytics within both internal- and external-facing applications; embedding is limited to
URLs within web portals, which was added in Release 11.0.5. Platform integration and workflow
integration are a work in progress. Cognos Analytics' cloud offering is somewhat limited,
minimizing the opportunity to extend analytics via the platform. Multitenancy is not yet supported,
although it is on the roadmap for 2017. Lack of a marketplace and packaged content, and minimal
self-service administration capabilities also minimize the ability to jump-start analytics in the
cloud using Cognos Analytics.

IBM (Watson Analytics)


IBM Watson Analytics is one of two products that comprise IBM's BI and analytics offering. Watson
Analytics continues to pioneer the next-generation, machine-learning-enabled user experience for
analytics, including automated pattern detection, support for natural-language queries and
generation, and embedded advanced analytics, via a cloud-only solution.

Over the past year, IBM has delivered on adapting its offerings to more-closely align with the market,
with a continuous release cycle averaging one release per quarter. Integration with Cognos
Analytics is currently limited to the ability to bring in a Framework Manager package, a list report or
a dataset for deeper exploration. Watson Analytics appeals primarily to individual users and
workgroups who need to perform smart data discovery. It is also a distinct product from the Watson
cognitive solutions offered by IBM (such as Watson for Oncology and Watson Discovery Advisor);
there is no integration between Watson Analytics and these latter Watson systems.

IBM Watson Analytics is most often used for decentralized analytics as represented by 74% of the
survey respondents. An additional 26% of survey respondents use the platform for governed data
discovery.

Strengths
Easy to use and visually appealing: Watson Analytics provides integrated data access,
exploration and dashboarding capabilities within one platform, and enables interaction via
natural-language dialogue, making the platform easy to work with. It rated Good to Excellent
for ease of use and visual appeal. In 2016,
IBM developed 20 "storybooks" that provide analytic templates for addressing specific business
problems. In addition, an expert storybook can be created by an author, which then guides a user
by selecting analyses, questions and visualizations that lead to faster analytic insight. Expert
storybooks are accessible via Watson Analytics' Analytics Exchange.
Smart data discovery and exploration: Watson Analytics is a pioneer in continuing to push the
boundaries for smart data discovery. The platform automates many of the steps in the data-
access and preparation process, including scoring the data on readiness for analysis and
highlighting potential data issues, as well as providing semantic recognition of concepts such as
time, place and revenue. In addition, Watson Analytics provides recommended starting points for
analysis and targets for prediction by automatically detecting patterns in the data as it is loaded
by determining strong correlations and associations. Statistical information about the models,
and which algorithms were used, can also be viewed, enabling validation of the model.
Cloud-based architecture: Watson Analytics is fully cloud-enabled and accessed via a web
browser. The data and content within the platform are hosted in the SoftLayer cloud and data is
stored using IBM dashDB, which combines columnar, in-memory capabilities with embedded, in-
database analytics. The cloud platform provides robust self-service administration and elasticity
capabilities for monitoring, managing and scaling the solution as needed by the end-user
organization. The administrator user interface allows monitoring of user licenses, data source
connections, space utilized and space available.
Areas of Improvement

Minimal self-contained ETL: A precursor to performing data discovery is the ability to extract,
transform and load (ETL) the data easily within the platform in preparation for analysis. The self-
contained ETL capabilities available within IBM Watson Analytics rated Poor to Fair, the weakest
of the vendors included in this Critical Capabilities. The platform does not use existing analytic
storage (such as the data warehouse or third-party in-memory engines) but instead requires
loading the data into Watson Analytics' own data storage environment. Incremental data loads
and parallel/multithread loading of data are also not currently supported, nor is scheduling and
monitoring of active data loads.
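The incremental-load pattern that Watson Analytics lacks is straightforward in principle: track a high-water mark and fetch only newer records on each run. An illustrative sketch with invented sample data:

```python
# Sketch of the incremental-load pattern noted as missing: on each run,
# load only records newer than a stored watermark, then advance it.
# The in-memory "source" and its timestamps are made-up sample data.

source = [
    {"id": 1, "updated": "2017-01-01"},
    {"id": 2, "updated": "2017-02-01"},
    {"id": 3, "updated": "2017-03-01"},
]

def incremental_load(watermark: str):
    """Return records updated after the watermark, plus the new watermark."""
    fresh = [r for r in source if r["updated"] > watermark]
    new_mark = max((r["updated"] for r in fresh), default=watermark)
    return fresh, new_mark

batch, mark = incremental_load("2017-01-15")
print(len(batch), mark)  # 2 2017-03-01
```

Without this, each refresh is a full reload, which is costly for large or frequently updated sources.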

Lacking comprehensive data source connectivity and metadata management: IBM Watson
Analytics' data source connectivity capabilities continue to expand, but are still deficient relative
to other vendors. Although the platform does provide access to flat files, many relational data
stores, and Twitter data, it does not support OLAP connectivity or other data sources such as
XML, RSS feeds or JSON. It also supports connectivity to Cloudant, Apache Hive, Cloudera
Impala and Hortonworks HDFS, but does not include Spark. Support for native connectivity to
enterprise applications is also limited, with access only to Salesforce. The platform does not
provide data lineage or impact analysis.

Gaps in sharing findings: The ultimate value of an analytic platform is its ability to share and
collaborate throughout the analytic process. IBM Watson Analytics rated Fair for its ability to
embed analytic content and its ability to publish, share and collaborate. SDKs are limited to data
loading, administration and building connectors. There are no SDKs for printing, parameterization,
building workflows, custom visualizations/analytic web applications, or create, copy and delete
capabilities. The platform does not support white labeling, and provides portal integration solely
via iframes. Chart extensions to support third-party chart libraries, and the ability to access all BI
content as an embeddable report via an API, are also currently unavailable. The platform supports
limited output formats (including PDF, PPT and image) and does not support scheduling or alerts.
The platform also does not support discussion threads, real-time collaboration and timelines,
ratings for content, or recommendations based on rating or usage patterns.
Information Builders
This report covers Information Builders' InfoAssist+ product. This is part of the company's
integrated WebFOCUS business intelligence and analytics platform, but can be used stand-alone.
InfoAssist+ is a combination of visual data discovery, reporting, rapid dashboard creation,
interactive publishing, mobile content and the Hyperstage in-memory engine.

During 2016, Information Builders improved the self-service data preparation and visual exploration
capabilities in InfoAssist+. It also made changes in packaging and distribution, and is now leading
with InfoAssist+ as the introductory edition to all three editions of the WebFOCUS platform (the
business user edition, application edition and enterprise edition).
Information Builders releases major new features annually with maintenance releases quarterly. The
focus of this evaluation is on InfoAssist+ 8.2.
According to the reference customers surveyed, InfoAssist+ is most often deployed for the agile
centralized BI and decentralized analytics use cases (both 52%), closely followed by governed data
discovery (46%).

Strengths
Platform administration: BI platform administration is rated Excellent to Outstanding. This
includes architecture, scalability and performance, and disaster recovery. WebFOCUS is Section
508-compliant, and the server runs on Linux, Unix, IBM (iSeries and System Z), and VMS. For the
self-contained ETL and data storage critical capability, it is rated Excellent. InfoAssist+ has its
own columnar, in-memory data store, which is included in the product. Unsurprisingly (given
Information Builders' heritage, and the data integration capabilities of iWay), data source
connectivity is rated Excellent to Outstanding. Connectivity to a range of relational data sources,
enterprise applications, big data sources and personal data sources is a core strength. The
product supports XML, RSS, SOA, REST, JSON, flat files, Excel and other data sources. In addition,
it has native adapters for Facebook, Twitter and Salesforce to consume data for social/sentiment
analysis and other reporting.
Embed analytic content: Information Builders was rated Excellent to Outstanding in the embed
analytic content capability. Its web services REST API allows developers to call a broad range of
functionality from another application. Reports and dashboards can be embedded within other
portals including Microsoft SharePoint via Web Parts, JSR 168 portlets and iframes. This strength
makes InfoAssist+ on WebFOCUS a good match with the OEM and embedded BI and extranet
deployment use cases covered in this report, but relatively few customers surveyed use
InfoAssist+ in this way, perhaps using Information Builders' WebFOCUS enterprise edition instead.
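Iframe-style embedding, one of the portal integration options noted above, amounts to generating a parameterized report URL and wrapping it in a frame tag. A hypothetical sketch follows; the URL, report ID and parameter names are invented for illustration, not Information Builders' actual API:

```python
# Hypothetical sketch of iframe-based embedding: build a parameterized
# report URL and wrap it in an iframe tag for inclusion in a host page.
# The base URL, report ID and filter names are invented examples.

from urllib.parse import urlencode

def embed_iframe(base_url: str, report_id: str, filters: dict) -> str:
    """Build an iframe tag pointing at a parameterized report URL."""
    query = urlencode({"report": report_id, **filters})
    return f'<iframe src="{base_url}?{query}" width="800" height="600"></iframe>'

print(embed_iframe("https://bi.example.com/run", "sales_q1", {"region": "east"}))
```

A REST API, by contrast, lets the host application call individual functions (create, add, delete objects) rather than just render finished content.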

Mobile: Information Builders supports native mobile apps on a broad range of devices, including
iOS and Android, and via HTML5 for browser-based access. The Mobile Faves native app allows users
to interact with dashboards and reports using touch and in offline mode. Information Builders
supports interactive disconnected analytics via its patented Active Technologies.

Areas of Improvement
User-centric functionality: Although strong in IT-led areas, InfoAssist+ has less-capable
functionality in the areas that are needed to meet the requirements of our decentralized analytics
use case. It gained Good ratings for the interactive visual exploration and analytic dashboards
capabilities, areas that are key buying requirements in this space. It has some gaps — for
example, the fact that visually driven binning is not supported, or that only a small subset of the
chart types available in analytic dashboards are available in InfoAssist+ when doing interactive
visual exploration. While rated Limited to Good for the self-service data preparation capability, the
company made progress here in 2016, adding automatic generation of metadata, sample
visualizations, reports, interactive documents and formatted dashboards directly from a user's
data. In the smart data discovery capability, InfoAssist+ lacks the level of automatic generation of
visualizations and analytic models needed, and so scores as Limited. Information Builders has
partnered with natural-language specialist Yseop, and at the time of writing was about to make
natural-language generation functionality generally available.

Cloud BI: Although Information Builders has a number of cloud-based partners, adoption has
been slow, and it rated only Limited to Good for this capability. The product is architected to be
multitenant, but otherwise lacks a turnkey solution for cloud deployment or hybrid connectivity to
on-premises data sources. During 2016, it has established new partnerships with Microsoft Azure,
Amazon Web Services (AWS) and IBM SoftLayer, better positioning it to take advantage of
growing cloud adoption intentions, although it still does not offer a SaaS model directly.

Less easy to use: Consistent with prior research, for ease of use, Information Builders' customers
rated the vendor as Good overall, but in the bottom quartile relative to the rest of the products
covered. In a market in which ease of use significantly influences buying, Good is often not
enough. In part, the rating stems from the product's multiple authoring interfaces (for a
dashboard, for a visualization, for a report, for a chart) with inconsistent capabilities in each.
InfoAssist+ has a UI style like the Microsoft Office ribbon, which may also be viewed as outmoded
by end users. In this regard, its visual appeal also ranked in the bottom quartile. The only area
where the customers surveyed ranked InfoAssist+ above the bottom quartile for ease of use was
in content development.
Logi Analytics
Logi Analytics' BI platform Logi Suite is composed of Logi Info, Vision and DataHub. Logi Analytics
is best-known for Logi Info, which is commonly used to embed analytic content in websites and
applications, and to enable end-user organizations to extend their BI access externally to customers,
partners and suppliers. Logi Vision is the company's data discovery tool, which enables business
users to prepare, analyze and share data. Logi's DataHub is a data preparation and columnar data
store that ingests, blends and enriches data from multiple sources. Logi Info and Vision can both
use DataHub for self-service data preparation.

In 2016, Logi Analytics added enhanced functionality to make self-service data discovery
embeddable into analytical applications — for shared authoring of analytics and applications — and
expanded data preparation with greater data joining and blending fidelity, as well as faster query
performance.

Logi has two major and two minor releases per year. The focus of this evaluation is on version 12.2.
Logi Analytics is most often deployed for the OEM and embedded BI use case (49%), followed by
traditional IT-centric reporting (46%).

Strengths
Embedded BI: Logi is deployed in an embedded use case by more of its customers than any other
vendor surveyed. From a product perspective, Logi rates as Excellent to Outstanding for the
embed analytic content critical capability. The Logi Embedded Reports API allows developers to
embed content in other applications and call functions to create, add and delete objects.
Individual visualizations and/or reports are fully interactive within the third-party application.
Portal integration includes JSR 168-compliant portlets, iframes, Microsoft Web Parts and Oracle
BPEL portlets.
Sharing, collaboration and visualization: Logi offers good functions to drive adoption. It scores
Excellent in the publish, share and collaborate critical capability. The Logi Vision Info Board is a
modern user interface through which users visualize their most important content based on
usage statistics and ratings. Users can pin content to a board. A visual activity stream also shows
who is producing new content and making changes. Integrated discussion threads allow users to
collaborate on findings and reference users within a comment. The size of a particular
visualization is automatically adjusted in the dashboard to reflect its usage and importance.
Overall, Logi Analytics' interactive visual exploration is rated Excellent to Outstanding. Users can
filter, sort, lasso and drill; and all common chart types are offered. Geographic mapping
capabilities automatically interpret location based on names, and data does not need to be
geocoded in advance by latitude and longitude. Data manipulation — such as binning and display
as percentage variances — is intuitively supported. However, the ability to create new custom
groups remains less intuitive.
Data connectivity: Logi's self-contained ETL and data storage capabilities are rated as Excellent. It
offers a wide range of relational, OLAP, and other data sources (such as XML, RSS, JSON feeds)
and connectors to apps. Logi DataHub is used for caching and performance, but customers can
also use the in-database processing to run complex SQL in the data source. Its self-service data
preparation capabilities are also rated as Good (handling user mashups, data modeling, joins and
profiling).
Areas of Improvement
Smart data discovery: While Logi's embedded advanced analytics functions have improved
(adding support for decision trees and menu-driven forecasting) and are now rated Good to
Excellent, its smart data discovery functions are immature, lacking the ability to automatically
generate forecasting, trends, predictions, clustering, segments, correlations and factor analysis on
data load. While not supporting natural-language queries, Logi is working on natural-language
generation with partner Yseop.
Metadata handling: Logi's concept of a metadata layer is different for each of its main interfaces
and, as such, Logi scores Fair to Good on our metadata management critical capability. When
working with Logi's Self-Service Reporting Module (SSM), report authors only need to decide what
in the dataset (tables and columns, for example) they want to make available to the end user, as
well as the types of joins they want to allow between those data sources. Within Logi's SSM, the
Metadata Builder automatically introspects the database catalog and infers relationships between
tables and columns. Logi Vision, meanwhile, does not use the same Metadata Builder. The degree
that these models can be reused across applications and promoted by users is a work in
progress. The optional Logi DataHub module can be used as a unified metadata layer.
Mobile: Mobile exploration and authoring could be improved with a better touchscreen experience
associated with native mobile device support, and also offline exploration. Currently, all mobile
access is via browser-based HTML5. The benefit of this approach, though, is that content can be
authored, consumed and interacted with from any tablet or smartphone. Logi does support
responsive design (which allows content to be smartly re-rendered depending on the screen
dimensions).

Microsoft
Microsoft offers a broad range of BI and analytics capabilities with its Power BI suite, delivered via
the Azure cloud. (Microsoft Reporting Services and Analysis Services are covered in our "Market
Guide for Enterprise-Reporting-Based Platforms," as on-premises offerings.) Excel is also widely
used for data analysis and, while it is not considered here as a BI and analytics tool per se, the
integration with Power BI has continued to improve. A number of Excel add-ins that were part of
earlier releases of Power BI are native and supported in Office 2016 (Power Query, Power Pivot,
Power View and Power Map).

Power BI supports browser-based authoring and visual exploration for cloud data sources, but when
authoring complex data mashups involving on-premises data sources, the desktop interface is
required. Power BI offers data preparation, data discovery and interactive dashboards via a single
design tool. The Cortana Intelligence Suite (reviewed in the "Magic Quadrant for Data Science
Platforms" and its companion Critical Capabilities) includes Power BI.
Power BI Desktop can be used as a stand-alone, on-premises option for individual users as part of
the decentralized analytics use case, for which 46% of surveyed customers have deployed it. Agile
centralized BI provisioning is the predominant use case (61%). With this use case, central BI teams
are modeling the data and publishing the dashboards and reports to the cloud-based Power BI
server. Microsoft does support hybrid connectivity to on-premises data sources, but all dashboards
and reports must be published to the Microsoft Azure cloud for sharing and collaboration. An option
to publish Power BI reports to an on-premises SQL Server Reporting Services deployment is a major
roadmap item due in 2017. The on-premises version of Power BI will not support the full range of
data sources and features provided in the SaaS version.
Throughout 2016, Microsoft has continued on its monthly release cadence. Major improvements in
the last year include the addition of a number of key data sources, embedded BI capabilities, and
enterprise features that include row-level security and usage monitoring.

Strengths
Ease of use and visual appeal: Microsoft's ease of use rates Excellent overall and ranks in the
top quartile. Visual appeal is an aspect of this, and inquiry customers have frequently said that
business users chose Microsoft in competitive proofs of concept partly for its appealing first
impressions. A number of specific capabilities contribute to its ease of use, including that
Power BI is a cloud-based product in which Microsoft takes care of the infrastructure for the
data storage, processing and sharing environment. The product also offers Q&A, which provides a
searchlike interface for users to generate visualizations, and Quick Insights, a basic form of
smart data discovery that automatically generates the most-meaningful charts. Despite the high
ease-of-use scores, the time for users to create reports and dashboards is higher than the survey
average for all report types (simple, moderately complex, complex), with complex report
development time ranking in the bottom third.
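This report does not document Quick Insights' internals; purely as a conceptual illustration (not Microsoft's algorithm — all function and field names below are hypothetical), an "auto-insight" ranker can be sketched as a toy heuristic that favors measures with high relative spread:

```python
# Toy sketch of "auto-insight" chart ranking -- NOT Microsoft's actual
# Quick Insights algorithm; a conceptual illustration only.
from statistics import mean, pstdev

def rank_measures(rows, measures):
    """Score each numeric measure by relative spread (coefficient of
    variation); higher spread suggests a more 'interesting' chart."""
    scores = {}
    for m in measures:
        values = [row[m] for row in rows]
        mu = mean(values)
        scores[m] = pstdev(values) / mu if mu else 0.0
    return sorted(scores, key=scores.get, reverse=True)

sales = [
    {"region": "East", "revenue": 120, "units": 10},
    {"region": "West", "revenue": 480, "units": 11},
    {"region": "North", "revenue": 60, "units": 9},
]
# revenue varies far more than units, so it is suggested first
print(rank_measures(sales, ["revenue", "units"]))
```

A real implementation would weigh many more signals (correlations, outliers, seasonality), but the ranking-then-charting shape is the same.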
In-memory engine with data preparation: The Power BI in-memory engine has its origins in
Microsoft SQL Server Analysis Services tabular data models. These provide both a flexible and
high-performance analytic tier in the cloud. Microsoft limits storage to 10GB per user. Power BI
also supports DirectQuery mode for the most-popular data sources, in which data is not replicated
into the in-memory engine for greater data scalability. A robust self-contained in-memory engine
allows users to mash multiple data sources together in a reusable dataset. The data preparation
capabilities, which rate Good overall, allow data modelers to clean and transform the data as they
load it.
Data sources and prebuilt apps: Microsoft scores Excellent to Outstanding for its breadth of data
sources. Microsoft has continued to expand the range of relational data sources supported, with
SAP HANA added in the last year, and Informix in beta. It also natively supports the Hadoop
Distributed File System (HDFS), with support for Spark (both third-party and Microsoft Azure
HDInsight) also in beta.
Prebuilt applications (called "content packs") include data connectors, models, metadata and out-
of-the-box dashboards. Power BI users can quickly connect to their accounts in SaaS applications
(for example, in Salesforce, Marketo, Zendesk, QuickBooks Online and Google Analytics) and see
their data through prebuilt, live dashboards and interactive reports. Microsoft owns and delivers a
number of these content packs, and its partner network provides others.

Areas of Improvement
Embedded advanced analytics: Microsoft has a number of key ingredients for advanced analytic
capabilities across several products. However, none of the advanced analytic capabilities are
available out of the box with Power BI. Microsoft gives users the ability to install a local R instance
with Power BI Desktop, and to call and embed an R script directly from within Power BI Desktop.
However, there is not a straightforward way to perform basic forecasting via a menu-driven option
in Power BI. More-advanced visualizations and analytics, such as decision trees and clustering,
are not natively supported, but could be possible via extensions in the marketplace. The range of
statistical functions natively supported is limited to Data Analysis Expressions (DAX).

Lacking important basics: Surprisingly, pivot tables continue to be lacking in Power BI. Even table
displays with subtotals are still not supported. While several modern BI products lacked these
capabilities in version 1 releases, Power BI Desktop has now been in the market for more than 18
months. For many customers, this may be a showstopper. Microsoft's work-around is for
customers to create the tables in Excel and import the content into Power BI. However, this is a
less-than-ideal workflow. The product also lacks a number of formatting options.

Disjointed platform components and workflow: The scale-up options for more data storage and
processing — whether to HDInsight or Azure Analysis Services — are not straightforward or
clear. For the advanced analytics capabilities, the features may also be spread across Excel, Azure
Machine Learning and R. While it's reasonable for a vendor to have a distinct product for data
scientists, even basic advanced capabilities for business users are disjointed. Hence, Microsoft
scores only Limited to Good for this capability.

MicroStrategy
MicroStrategy combines self-service data preparation, visual data discovery and big data
exploration with enterprise BI. Version 10, a major release, added substantially enhanced interactive
visual exploration capabilities, better promotability of user-built data models and content, improved
self-service data preparation, and direct support for HDFS as well as a range of personal data
sources. This range of capabilities, delivered in a single integrated platform, makes it better-suited
to large-scale system-of-record reporting as well as governed data discovery deployments for larger
and more-complex datasets than many other offerings.
MicroStrategy's front-end interface — MicroStrategy Web — is used for data preparation, visual-
based exploration, and report and dashboard authoring deployed on MicroStrategy Server.
MicroStrategy's Parallel Relational In-Memory Engine (PRIME) is an embedded, in-memory, column-
oriented, distributed, analytic data store. MicroStrategy Cloud is a hosted service that includes
MicroStrategy BI, an analytical database (which supports Actian Matrix, Microsoft SQL Server,
Netezza and Teradata, among others), and data integration capabilities (such as
Informatica PowerCenter, Informatica Cloud, and SQL Server Integration Services [SSIS]).
MicroStrategy Cloud now runs on Amazon Web Services (AWS). MicroStrategy Mobile — part of the
MicroStrategy Server — is a code-free environment for building native mobile apps for iOS, Android
and Windows devices.

MicroStrategy is on a quarterly release cadence. Version 10 point releases in 2016 have introduced
a new dossier client and "workstation" to simplify the creation, sharing and viewing of analytic
dossiers and briefing books. Workstation capabilities also streamline the configuration and
administration of enterprise deployments, including dynamic scaling of MicroStrategy Cloud on
AWS. The new workstation is built on new REST APIs in an effort to make the platform more
attractive for OEM and embedded use cases. A free desktop version of MicroStrategy 10, available
since 3Q16, is well-suited for visual-based data discovery and gives users a risk-free way to try the
product.
MicroStrategy is deployed primarily for traditional IT-centric BI (59%), followed by agile centralized BI
(57%), and decentralized analytics (48%).
Strengths
Fully featured integrated product for all use cases: MicroStrategy has top-quartile product ratings
for three of the given use cases. It is an enterprise-grade platform including security, scheduling
and distribution, with strong capabilities in support of governed data discovery (including self-
service data preparation of complex data models). It is well-suited to companies that need large-
scale system-of-record reporting, mobile, dashboards and robust, business-oriented data
discovery on large complex datasets in a single platform. Outstanding scores for BI
administration, architecture, security, data source connectivity and platform workflow integration
anchor this rating.
Extensive and widely deployed mobile: MicroStrategy has also been an early innovator in mobile
BI, with some of the most-comprehensive, highly rated and widely adopted mobile capabilities.
Customers choose MicroStrategy for mobile more often than most other vendors. MicroStrategy
Mobile is a fully featured and native mobile development and consumption environment for iOS,
Android and BlackBerry. It supports advanced and less-common features such as disconnected
analysis, write-back, multifactor authentication, biometric security, GPS and camera integration, although
authoring from a mobile device is not supported. MicroStrategy Mobile can be deployed with the
MicroStrategy platform, as a stand-alone mobile solution, or as a complement to other BI
platforms. This is a differentiator versus the mobile solutions of other BI platforms.

Enterprise-grade platform with modern capabilities: MicroStrategy 10 can support large-scale,
trusted, self-service BI. It offers a seamless workflow for promoting business-user-generated data
models and content to enterprise sources in MicroStrategy Web. When user data models are
promoted to the enterprise, common dimensions are automatically remapped to inherit row-level
security. These dashboards and datasets can then leverage other enterprise administration,
scalability and distribution features in MicroStrategy Server. Advanced data manipulation
capabilities support multisource self-service data preparation that recognizes geographical and
time data, and is able to automatically generate the hierarchical elements not available in the
source. From a visualization, users can build on-the-fly groups and hierarchies and do drag-and-
drop forecasting. MicroStrategy natively connects to HDFS with queries executed in Hadoop or
Spark engines, and then stages the data in memory for fast interactive visualization of large
datasets and models that natively span modeled and un-modeled relational, personal and Hadoop
sources. Version 10 supports Spark SQL and native connectors to big data platforms such as
Amazon Elastic MapReduce and IBM BigInsights. Extensive geospatial capabilities are available
in the platform (via an OEM version of Esri for free) — although specialized geospatial algorithms
are not supported.
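The security-inheritance workflow described above can be pictured with a minimal sketch — this is not MicroStrategy's actual API, only an assumed illustration of remapping a common dimension to an enterprise row-level security filter:

```python
# Conceptual sketch of row-level security inheritance on promotion --
# not MicroStrategy's API; names and structures here are hypothetical.
# When a user-built dataset is promoted, its shared "region" dimension
# is remapped so enterprise security filters apply automatically.
ENTERPRISE_SECURITY = {"region": {"alice": {"East"}, "bob": {"East", "West"}}}

def apply_row_level_security(rows, user, dimension="region"):
    """Return only the rows the user's enterprise filter permits."""
    allowed = ENTERPRISE_SECURITY[dimension].get(user, set())
    return [r for r in rows if r[dimension] in allowed]

promoted_dataset = [
    {"region": "East", "revenue": 120},
    {"region": "West", "revenue": 480},
]
# alice's filter only grants East, so she sees one row after promotion
print(apply_row_level_security(promoted_dataset, "alice"))
```

The point of the sketch is the inheritance: the dataset author writes no security logic; the platform applies the enterprise filter once the common dimension is recognized.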
Areas of Improvement
Collaboration and smart data discovery: MicroStrategy is rated highly across most of Gartner's
critical capabilities for this market; however, smart data discovery, and publish, share and
collaborate score lower. Smart data discovery features — such as natural-language query,
automated insight generation and integrated natural-language generation/narration — are missing
in the current product, although recommendations based on user context, interest and usage are
on the roadmap.

Gaps in cloud: MicroStrategy's single-tenant cloud solution lacks packaged domain and vertical
content, and a robust content marketplace for customers and partners. Although MicroStrategy
was early to invest in the cloud, it also has among the highest percentage of its reference
customers reporting that they have no plans to consider deploying it in this manner.

Ease of use: Although an ongoing area of focus for MicroStrategy development, customers still
rate its platform as more difficult to administer, and consider the content it builds to have less-
favorable visual appeal than competing products. For both ease of use and visual appeal,
MicroStrategy's reference scores place it in the bottom quartile of vendors included in this
research. Although the composite rating is Good to Excellent for this capability, the relative
weakness impacts customer buying in the modern market. While a desktop deployment is
relatively easy to download and use, an enterprise deployment still requires significant IT
involvement. Similar feedback is reflected in Gartner's Peer Insights ratings and in client inquiries.

Oracle
Oracle offers a broad range of BI and analytic capabilities, both on-premises and in the Oracle cloud.
Oracle Data Visualization (ODV), Oracle's modern BI offering, is available as part of the Oracle
Business Intelligence Cloud Service (OBICS), as a stand-alone cloud service, as a desktop offering,
or as an optional component to Oracle Business Intelligence 12c (deployed on-premises). ODV (the
focus of this Critical Capabilities assessment) offers integrated data preparation, data discovery
(with advanced exploration) and interactive dashboards via a single design tool supporting both
desktop and web-based authoring.
ODV is particularly attractive to organizations that have deployed Oracle applications and Oracle
information management technology. Smart connectors inherit Oracle security, there are an
extensive set of content packs for Oracle enterprise applications, and the platform offers semantic
layer access to Oracle 12c and Oracle BI SaaS.
While Oracle was late to respond to the shift in the market toward modern BI and analytics, its
modern BI and analytics components continue to gain traction and are now starting to appeal to the
market — particularly within its own installed base. Oracle is also investing early in machine-
learning-enabled smart data discovery, including automated pattern detection and integrated
search/natural-language processing.

ODV is used for decentralized analytics (45% of customers surveyed) as well as agile centralized BI
provisioning (44%). ODV is updated in four major releases a year.
Strengths
Packaged content for Oracle applications: ODV appeals to IT departments that have already
implemented Oracle's traditional BI platform capabilities, and to lines of business that have
deployed Oracle BI SaaS operational reporting on top of Oracle enterprise applications. Oracle
packaged, domain-specific content packs include connectors, dashboards and KPIs for finance,
HR, supply chain management and CRM. Users are also able to conduct "what if" and scenario
analysis within OBICS or ODV Cloud Service via Oracle's Essbase Service.

Global and hybrid cloud offerings: Oracle BI can be deployed on-premises or in its global cloud,
with the ability to directly query on-premises data from the cloud or migrate and extend on-
premises data models and content to the cloud (and vice versa) using a common interface for
content development across on-premises and cloud. Oracle's support for hybrid cloud
deployments and data gives its on-premises BI customers a glide path to transition to the cloud.
Oracle BI (including ODV) leverages the Oracle BI server's historical strengths in function shipping
queries to the underlying database to support and optimize querying of data left in place.
Embedded advanced analytics and visual appeal: In addition to offering core visual exploration
features for light interactive analysis, ODV supports advanced exploration and data manipulation,
with the ability to create custom groups while visualizing (as well as in the data preparation layer)
and drag-and-drop advanced analytic functions (such as forecasting, clustering, trending and
outliers) via an easy-to-install plug-in. The ability to bin measures is also available as a function. An
extensive set of statistical functions is natively available in the platform and through R integration,
although Python is not yet supported. ODV's use of motion as part of the interactivity experience
adds to the platform's modern visual appeal and contributes to its above-average scores.
Areas of Improvement
Self-service data preparation and data connectivity: ODV offers integrated self-service data
preparation for harmonizing a range of relational and big data sources but lacks support for JSON
and XML. Unlike competing products — where this basic data cleansing is done automatically —
when ingesting an Excel or .CSV file, the data must be hyper clean: free from blanks, spaces, nulls and
N/As. In the data preparation interface, analysts can create groups, build calculations and add filters,
and there are a number of transforms available, but users cannot create custom hierarchies.
Creating mashups from multiple sources is possible, but user-generated objects are not reusable
— only the individual datasets are available to others. In addition, multiple tables can't be joined in
one data source connection without writing SQL statements. Regarding inference, data types are
often incorrectly inferred and must be manually adjusted. Join and hierarchy inferences — such as
those for date and geography — are also not yet supported. While data lineage and impact
analysis are strong in OBICS and 12c, these capabilities are still limited in ODV. Finally, packaged
content including native connectors for cloud data sources is limited to Oracle (CRM, ERP, SCM,
HCM, including acquisitions — NetSuite, Taleo, RightNow and Eloqua) and Salesforce
applications, which is a gap when compared to most other cloud BI vendors.
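To illustrate why "hyper clean" input matters, here is a hedged sketch — a hypothetical ingestion step, not Oracle's code — of naive column type inference, where an uncleaned "N/A" would otherwise demote a numeric column to text:

```python
# Hypothetical ingestion step (not ODV's implementation): naive type
# inference over the raw strings of a CSV column. Tools that do not
# pre-clean the data will classify a numeric column as text as soon
# as they meet a blank or an "N/A" token.
def infer_type(column_values):
    cleaned = [v for v in column_values if v.strip() not in ("", "N/A")]
    try:
        for v in cleaned:
            float(v)
        # Without the cleaning step above, "N/A" would force 'text'
        return "numeric" if cleaned else "text"
    except ValueError:
        return "text"

print(infer_type(["1.5", "2.0", "N/A"]))  # numeric, thanks to cleaning
print(infer_type(["1.5", "2.0", "abc"]))  # text
```

A tool that skips the cleaning pass pushes that burden onto the user, which is the workflow friction the bullet above describes.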
Mobile, collaboration and alerts: Native mobile apps for iOS and Android are supported in OBICS
and 12c, but with a different rendering experience and less interactivity for ODV content versus
Answers and Publisher content; features such as disconnected analysis from a mobile device are
not supported for ODV content. Similarly, alerting is not available in ODV — although basic
scheduling is offered. Collaboration features such as discussion threads for discussing findings
with others within a view or dashboard, and integration of discussion threads within social
platforms, are also not supported.
Gaps remain for advanced interactivity and smart data discovery: There is no way to define a
custom hierarchy in ODV — although drilling through a hierarchy created in the Oracle BI semantic
layer and accessed as a data source in ODV is supported. Other interactive features such as
display as a percentage must be defined as a calculation, and the ability to create parameters is
not yet supported. Regarding geospatial capabilities, ODV does not yet support auto geocoding or
out-of-the-box location and distance calculations. With respect to smart data discovery, ODV
offers integrated search-based natural-language query as a way to generate views, but the ability
to automatically generate insights once a variable is selected is minimal in the data preparation
interface. More extensive features are on the roadmap.
Pentaho
The Pentaho Business Analytics platform offers a broad range of functionality across data
preparation, self-service and advanced analytics, with a particular focus on big data access and
integration. Mature data access and data transformation capabilities are provided by Pentaho Data
Integration (PDI) and advanced analytic capabilities by its Data Science Pack. Pentaho is a Hitachi
Group company. Lumada, Hitachi's IoT platform introduced in 2Q16, includes capabilities from
Pentaho.

In November 2016, Pentaho released version 7.0, which combined its data integration and business
analytics servers to support integrated analytic workflow, added visual data access and preparation,
and enhanced its capabilities for accessing diverse data sources with improved governance to solve
large-scale and complex analytic workflows.

Pentaho has one minor and one major release per year. The focus of this evaluation is on versions
6.1 and 7.0.
The reference customer group surveyed revealed that Pentaho is most often deployed for the OEM
or embedded BI use case (45%). Customers also reported using Pentaho for agile centralized BI
provisioning and decentralized analytics in more than a quarter of cases.

Strengths
Data-to-analytics scope: Pentaho rated Excellent to Outstanding for both embedded advanced
analytics and for its ability to embed analytic content. The machine learning capabilities offered
by its Data Science Pack offer a rich set of options for the development of BI applications with
embedded analytic capabilities. Pentaho provides commercial licenses for Weka — an open-
source, machine-learning framework with hundreds of analytic functions — and the ability to
embed Weka, R, Python and PMML into orchestrated processes. When used in concert, Pentaho's
range of data handling and advanced analytics capabilities deliver a more-integrated experience
than offered by the use of separate data preparation and data science tools. Version 7 extended
this strength to governed data and analytic workflows, with visually enabled data access and
preparation supporting developer-to-business, closed-loop data and analytic usage.
Data integration: PDI offers a modern user interface and is the key platform component offering
access to data sources, data transformation capabilities, embedding of analytic models and BI
outputs (such as reports). Pentaho received among the top scores for the data source
connectivity and self-contained ETL and data storage critical capabilities, with an Excellent rating
in both. Users can analyze a broad range of data sources that can be blended and transformed in
PDI and analyzed in Pentaho Analyzer. Datasets can also be written back to a data repository for
analysis. Unique capabilities in PDI include the possibility of pushing native queries to NoSQL
databases, and an extensive array of data transformation functions using menus and formulas.
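The idea of pushing a native query — rather than translating SQL — can be sketched with a MongoDB aggregation pipeline. This is a generic illustration of the technique, not PDI's internal mechanism, and no server connection is made here:

```python
# Illustration of "native query pushdown": instead of translating SQL,
# a tool can send the store's own query language so the work runs in
# the database. Below is a MongoDB aggregation pipeline equivalent to
#   SELECT region, SUM(revenue) AS total_revenue
#   FROM sales GROUP BY region ORDER BY total_revenue DESC
# (pipeline shape only -- hypothetical collection name, no server).
pipeline = [
    {"$group": {"_id": "$region", "total_revenue": {"$sum": "$revenue"}}},
    {"$sort": {"total_revenue": -1}},
]

# With pymongo this would run server-side as: db.sales.aggregate(pipeline)
print(pipeline[0]["$group"]["_id"])
```

Because the grouping and sorting execute inside the NoSQL engine, only the aggregated result travels back to the BI tier — the scalability benefit the bullet above alludes to.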
Scalable platform: Pentaho's open-source heritage, broad developer community and embedding
expertise mean the platform has tested, mature, corporate-grade core capabilities. As such, it
placed among the top scores for the admin, security and architecture capability. Scalability
features — such as node clustering and load balancing for high availability — make Pentaho an
excellent fit for large user deployments. Pentaho placed in the top quartile for user deployment
size, with an average deployment of almost 4,000 users, well above the 1,182 average deployment
size of surveyed customers.
Areas of Improvement
Ease of use: Customer reference survey data reports ongoing issues with ease of use. Pentaho
customers ranked it last overall for ease of use in administration and implementation, content
development and visual appeal, and in the bottom quartile for end-user content consumption ease
of use. When used as an enabler for OEM or embedded BI, ease-of-use limitations may be less of
an issue; however, they are more important for other use cases that are driving much of the new
buying activity.
Data discovery: Pentaho's functional capabilities remain less-suited for decentralized analytics
and governed data discovery use cases (only 5% of the customers surveyed used Pentaho for this
type of usage). Pentaho rates as Good for interactive visual exploration and analytic dashboards,
and is rated as Fair in the publish, share and collaborate capability. To be specific: in interactive
visual exploration, it lacks custom groupings and bins, and does not support natural-language
search; in publish, share and collaborate, it misses out-of-the-box functionality for data
storytelling, discussion threads, integration with social platforms, real-time collaboration via
shared sessions, timelines and content rating and recommendations. These capabilities drive
user engagement and adoption.

Emerging functional areas: Pentaho scored as Fair in the capabilities that Gartner sees as the
emerging buying drivers for modern BI. In the cloud space, Pentaho does not provide its own
cloud hosting service, instead adopting a strategy of "bring your own license." Customers can
deploy the platform in AWS, Microsoft Azure or other cloud providers, which is then managed in
the same way as an on-premises solution, but running on a remote (cloud) server in a virtual
machine. For the smart data discovery area, it does not offer the capability to automatically
generate advanced analytic visualizations or models, or to operationalize auto-generated models.
Pyramid Analytics
Pyramid Analytics offers a modern BI and analytics platform with a broad and balanced range of
analytics capabilities, including ad hoc analysis, interactive visualization, analytic dashboards,
mobile, collaboration, automated distribution and alerts. The solution is well-suited to governed data
discovery through features such as BI content watermarking, reusability and sharing of datasets,
metadata management, and data lineage.

Continuing a long-term partnership, Pyramid Analytics remains highly integrated with Microsoft's BI
offerings. The platform offers an enterprise analytics front end to Microsoft SQL Server Analysis
Services (SSAS), while Microsoft Power BI can publish to the Pyramid BI Office server to deliver
Power BI content on-premises.

Pyramid's customers report deployments across a diverse range of use cases. Agile centralized BI
provisioning and decentralized analytics are both mentioned by 56% of its customers, and 55% use
the platform for traditional IT-centric reporting, while 51% reference governed data discovery.
In March 2016, Pyramid Analytics released major version 6, featuring additional connectors to new
data sources, improved data preparation features, user experience enhancements on data
discovery, and evolution of storytelling, mobile and publication capabilities. Since then, the product
has evolved to version 6.33, with new functionality in several areas and problem-correction updates.
A good example of functionality released throughout the year is the certified SAP connector,
opening access to SAP HANA and SAP Business Warehouse content. Product release cadence
stands at one major release per year, three minor versions and bug fixes as required.
Throughout 2017, Pyramid Analytics is focused on rearchitecting its platform to become partner-
agnostic. This may inhibit innovation in leading-edge capabilities such as smart data discovery,
natural-language processing, native big data querying or deeper integration with advanced analytics
solutions.
Strengths

Integration with Microsoft: Pyramid Analytics offers tight integration with the Microsoft BI stack
(including Microsoft Analysis Services) and the more-recent Power BI offering. BI Office has the
highest percentage of deployments on top of Microsoft-based enterprise data warehouses, at
74% of its customer references. This is even higher than Microsoft's own result (47%). It is also
one of the top platforms used with Microsoft's ERP and CRM solutions. Pyramid does offer a tight
and extensive integration with the Microsoft environment and should therefore be assessed when
that is a top requirement, although the stated roadmap is to make the platform agnostic and
available to any data warehouse.

Broad range of integrated capabilities: Pyramid received among the highest scores for platform
and integrated workflow capabilities, rated Outstanding, derived from a broad set of features
delivered under a single product and unified user experience. The user experience follows the look
and feel of Microsoft Office, contributing to an easier learning curve.
Rapid content development: Content development using Pyramid Analytics is quicker than on
most other tools covered in this report. Regardless of complexity — simple, moderate and
complex content — the platform delivers results in less time than most of its competitors.
Excellent ratings on critical capabilities such as data source connectivity, interactive visual
exploration, and publish, share and collaborate explain these results.
Areas of Improvement
Dependence on the Microsoft stack: What was already described as one of the tool's strengths —
a tight integration with Microsoft products — can also be seen as an area for improvement.
Pyramid's dependence on Microsoft is strong, and organizations leveraging databases and BI
components from other vendors will find it difficult to justify using the platform. Aware of
this, and knowing that Microsoft will soon offer the ability to publish Power BI content on-
premises without needing Pyramid Analytics, the vendor's product roadmap is clearly aimed at making
the platform more agnostic, to increase its appeal to non-Microsoft shops.

Modernization required to improve ease of use and visual appeal: Pyramid's customers rate the
product's ease of use below average relative to other products in this research, although it is rated
close to Excellent (3.8) in aggregate terms. Ease of use for business consumers was the lowest
rated in our survey, and this group is arguably the most-important user segment. Visual
appeal also ranked in the bottom quartile.
Limited cloud offering: Although offering a cloud BI option that includes hybrid connectivity to on-
premises and cloud-based data, Pyramid Analytics has gaps where it lags behind competitors.
The solution has limited packaged content and does not offer a marketplace with partners' or
Pyramid's add-ons to BI Office. Also, the offering is mainly a Microsoft Azure marketplace option.
There is no optimized version for AWS — the leading cloud provider — although
customers could decide to operate on a bring-your-own-license model to deploy it there.
Qlik
Qlik offers governed data discovery and analytics either as a stand-alone application or
(increasingly) embedded in other applications. Qlik Sense is the vendor's lead product and is sold to
most new customers, while QlikView continues to be enhanced and makes up a larger portion of the
company's installed customer base. The Qlik Analytics Platform is the product that developers can
use for embedded BI and is the platform on which Qlik Sense has been developed.
The in-memory engine and associative analytics allow customers to build robust, interactive
applications and to visualize patterns in data in ways that are not readily achievable with straight
SQL. NPrinting, which provides report scheduling and distribution, was added to Qlik Sense in 2016
(previously only available for QlikView), enabling Qlik to provide interactive visual discovery and also
Mode 1 BI in an agile way.
In the last year, Qlik has improved the data preparation process, making it more visual, and launched
Qlik Sense Cloud Business for up to 50 users in the cloud, with an enterprise edition planned for
2017. The addition of NPrinting to Qlik Sense gives Qlik the ability to provide scheduled, formatted
report distribution as well as visual exploration in a cohesive product, supporting both Mode 1 and
Mode 2 styles of analytics. Qlik Sense 3.1 (the focus of this evaluation) was released in 3Q16.
QlikView continues to be supported and enhanced but, as it is primarily offered to existing (rather
than net new) customers, it is not assessed in this note.
Qlik Sense is used primarily for agile centralized BI provisioning, with 64% of customers deploying
this way, followed by decentralized analytics (48%).
Strengths
Robust applications: Qlik Indexing Engine (QIX) allows customers to use Qlik Sense as a pseudo
data mart that supports multiple data sources, complex calculations and robust applications.
There is an in-database option, referred to as Direct Discovery, for customers who have invested in
an analytic database, but this is less widely used. The in-memory, associative engine supports
complex data models such as multiple fact tables. It also provides data scalability and in-memory
compression, with Qlik ranking in the top third of Critical Capabilities vendors for data volumes
from HDFS.

Ease of use and visual appeal: Ease of use is a key buying requirement in the modern BI and
analytics market, covering ease of implementation as well as ease of building content.
While QlikView and Qlik Sense both support rapid implementation, ease of use and visual
appeal are markedly better in Qlik Sense. Qlik Sense rates Excellent for this critical capability. The
subcriterion for visual appeal ranks in the top quartile. Qlik's "smart search" contributes to the
ease and power of the application; a user can enter a search term and Qlik Sense will
automatically present a list of dimensions and measures to filter the current dashboard by these
keywords. Values with no association to the currently displayed dataset are grayed out; in this
way, a user can see, for example, which products are not selling in a particular region.
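The associative gray-out behavior described above can be illustrated with a small sketch. The data and function below are hypothetical, for illustration only; they do not represent Qlik's actual data model or API:

```python
# Illustrative sketch of associative filtering: values linked to the current
# selection stay active, while unrelated values are "grayed out".
# Hypothetical sales rows; not Qlik's engine or API.
SALES = [
    {"region": "East", "product": "Widget"},
    {"region": "East", "product": "Gadget"},
    {"region": "West", "product": "Widget"},
]

def associated_values(rows, field, selection):
    """Split a field's values into associated vs. excluded for a selection."""
    matching = [r for r in rows if all(r[k] == v for k, v in selection.items())]
    associated = {r[field] for r in matching}
    excluded = {r[field] for r in rows} - associated
    return associated, excluded

# Which products sell (active) and do not sell (grayed out) in the West?
active, grayed = associated_values(SALES, "product", {"region": "West"})
```

With this toy data, "Widget" remains active for the West selection while "Gadget" would be grayed out, mirroring the "products not selling in a region" example.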
Embedded BI: Open APIs were a key design tenet of Qlik Sense, and for the embed analytic
content capability the product rates Excellent to Outstanding. Partners can "white label" the
product in an OEM arrangement, or can embed it in custom applications. Partner Host Analytics in
the CPM market uses Qlik as its dashboard and visual exploration front end. In addition, Qlik was
one of the first to showcase natural-language generation with partner Narrative Science. Here, the
open APIs allowed Narrative Science to develop an extension that customers can easily install.
The Narrative Science extension then appears automatically in the dashboard design interface to
allow an author to add a text box to the page. A textual explanation of a chart automatically writes
itself as a user filters within the chart. This is just one example of many extensions that the
partner network and customers themselves have developed.
Areas of Improvement
No predictive: Qlik scores only Fair for its embedded advanced analytics. Increasingly, business
users expect menu-driven ability to do simple forecasting and clustering within a BI tool, neither of
which is supported in Qlik Sense. While there are some statistical functions as part of the Qlik
Sense function library, there is no ability to call out to an R script. This is on the product roadmap.
Likewise, advanced chart types such as decision trees are not natively supported but could be
added via an extension. Smart data discovery, in which users want insights generated
automatically from the software, is not supported.
Mobile limitations: Qlik Sense supports the consumption of content on mobile devices via
HTML5. This approach supports authoring as well as optimal rendering of content across
different device types via responsive design. However, Qlik Sense does not support native apps, a
point of difference from QlikView and key competitors. The lack of a native app means that
capabilities such as offline access, push notifications, location awareness, and intuitive native
iOS selectors and charts are not supported. These are roadmap items.
Cloud and other gaps: Qlik launched its cloud solution in 2015, initially positioned for individuals.
Qlik Sense Cloud Business (for small to midsize organizations and up to 50 users) was released
at the end of January 2017. The current offering does not support hybrid connectivity to on-premises
data sources and lacks a number of administrative and security features that enterprises require.
An enterprise release is planned. Prebuilt content for cloud-data sources is a subcriterion that Qlik
also lacks. While Qlik scores well at the capability level for visual exploration and publishing, it
lacks specific features in these categories. For example, there is no out-of-the-box support for
trellis charts, display as percentages, or binning. Collaboration in the form of discussion threads is
also not supported. While ease of use is excellent overall, the product still lacks a point-and-click
interface for building expressions, although a type-ahead feature was introduced in the last year.
Salesforce
Salesforce is in the process of integrating its Wave Analytics offering with its recent acquisition
(September 2016) of BeyondCore. Wave is a platform for creating point-and-click interactive
visualizations, dashboards and analysis with integrated self-service data preparation. BeyondCore, a
Visionary on last year's Magic Quadrant, is a market disruptor that uses machine learning under the
covers to automatically find, visualize and narrate important findings in data, without requiring users
to build models or write algorithms. Salesforce Wave is sold as a stand-alone platform and also as
the foundation of packaged, closed-loop, front-office analytic applications for sales, marketing and
service. The platform is natively mobile and offers collaboration through integration with Salesforce
Chatter. BeyondCore, now rebranded as Salesforce Analytics Cloud Smart Data Discovery, will also
be sold stand-alone and as an optional component to Wave, Wave apps and to Salesforce-branded
Einstein applications. Ultimately, the plan is for BeyondCore's automated insight and narrative
generation to be a seamless part of the Einstein-enabled Wave platform, applications and
experience. Given that this work is already partially implemented, the capabilities are considered
jointly rather than as separate product evaluations.
Salesforce continues to primarily cater to its installed base.
Salesforce Wave has three major releases per year in October, February and June. Wave is used
primarily for the decentralized analytics (37%) and OEM or embedded BI use cases (37%).
BeyondCore is used primarily for governed data discovery (33%) and decentralized analytics (33%).
Strengths
Well positioned for the next BI and analytics disruption: Embedded advanced analytics and smart
data discovery are clear strengths for Salesforce, which received among the top ratings for these critical
capabilities. Salesforce has also earned among the highest scores for ease of use, and the
highest for visual appeal from its reference customers. BeyondCore is used to automatically
evaluate every data combination to identify meaningful patterns and areas of potential interest
that warrant further exploration. BeyondCore automatically creates a deterministic piece-wise
regression model that finds complex relationships that are difficult to express with traditional
regression models. For example, affinity analysis is used in diagnostic graphs and in the narrative
text of descriptive and diagnostic graphs. Time series data is automatically identified, and
bottom-up forecast analysis is conducted. BeyondCore conducts statistical tests on its findings
and insights to ensure that they are statistically significant and actionable before generating
graphs and corresponding narratives. It then automatically generates a narrative (textual as well
as voice) explaining the key insights in each graph and relationships between graphs. The
underlying R code for diagnostic and predictive analysis can be exported as an R model for data
scientists to validate and extend the models as needed. Initial integration between Wave and
BeyondCore allows Wave users to access datasets using BeyondCore connectors (not currently
available in Wave) and smart data preparation features. Users can also generate a Wave app from
insights automatically generated by BeyondCore.
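The deterministic piece-wise regression idea mentioned above can be sketched as follows. This is a simplified illustration under stated assumptions (ordinary least squares per segment, a single breakpoint chosen by minimum squared error); BeyondCore's actual algorithm is proprietary and not public:

```python
# Minimal sketch of piece-wise regression: fit separate linear segments and
# pick the breakpoint that minimizes total squared error.
# Illustrative only; not BeyondCore's actual implementation.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
    return slope, my - slope * mx

def sse(xs, ys):
    slope, intercept = fit_line(xs, ys)
    return sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))

def best_breakpoint(xs, ys, min_seg=2):
    """Index where the second segment starts, minimizing combined error."""
    candidates = range(min_seg, len(xs) - min_seg + 1)
    return min(candidates,
               key=lambda i: sse(xs[:i], ys[:i]) + sse(xs[i:], ys[i:]))

# Toy series with a kink at x = 4: slope +1 before, slope -1 after.
xs = [0, 1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 1, 2, 3, 4, 3, 2, 1, 0]
split = best_breakpoint(xs, ys)
```

Fitting two segments lets the model capture the change in trend at x = 4 that a single straight-line regression would smear across the whole series.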
Optimized for Salesforce: Wave packaged applications for sales, marketing and service are a key
differentiator and a major reason why customers buy the platform. Salesforce business
consumers gain integrated, contextualized insights from within the Salesforce Application
workflow (particularly when using the packaged Wave-based analytics apps). Wave is natively
integrated with Salesforce security, collaboration and metadata, including simplified access to
Salesforce application tables through an intuitive wizard. Users can invoke Salesforce actions
from within Wave (such as data quality, new campaigns and targeted outreach) and can
collaborate using Chatter. Due to a focus on Salesforce optimizations and customer-facing Wave-
enabled apps, Wave continues to appeal primarily to the Salesforce installed base who have most
of their data in the cloud, particularly in Salesforce, and want to augment it with on-premises data.
Extensibility and embeddability: Salesforce Analytics Wave has a broad partner ecosystem that
includes many ETL, predictive analytics vendors, and system integrators. Wave exposes services
via REST-based APIs, which can be consumed by other distributed services and used to create
and extend new applications based on the Wave platform. Its developer marketplace,
AppExchange, provides a platform for independent software vendors and developers to build and
sell custom content (including datasets, lenses, metadata and applications). It also provides a
market for developer skills, making them widely available despite Wave's recent entry into the
market. Salesforce also offers OEM-specific packaging and licensing for Wave.
Areas of Improvement
Advanced interactive data exploration and manipulation and analytic dashboards: Over the past
year, Salesforce has added a number of good interactive features including quick calcs for
common types of analysis (display as a percentage, for example) and good visual linking, and
both Wave and BeyondCore automatically generate best-practice visualizations including colors
and sorts, which is a positive step. However, advanced interactive exploration — such as grouping,
binning in the visual analysis and dashboard environment, and extensive geospatial capabilities —
is still limited. Users can interact with maps by country, region, state and city, but automatic
geocoding is only available for Salesforce data, and there are no out-of-the-box geospatial
algorithms. Currently, this is only possible with Salesforce SOQL Geolocate API. Some chart types
are missing out of the box (trellis, for example), with some on the 2017 roadmap or possible via
extensions.
Self-service data preparation: Over the past year, Salesforce added a new self-service data
preparation interface — Dataset Designer — to Wave, which provides basic capabilities for
accessing and manipulating data for analysis. Data lineage is available via the Wave bulk API,
including derived fields and the new data recipe feature, which self-records transformations to the
data, and is exposed in an information panel in each widget on a dashboard. While improved,
there are limited features for data profiling and manipulation; building calculations, groups, bins,
hierarchies and full join operations are not yet supported. BeyondCore can automatically discover
data types such as date, text, and measures, even on datasets with data quality issues. During the
"lookup data" (join) process, BeyondCore recommends the join key if it can detect a match. This
can be leveraged in Wave.
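A join-key recommendation of the kind described above can be sketched as a simple value-overlap heuristic. The scoring rule and data below are illustrative assumptions; BeyondCore's actual matching logic is not public:

```python
# Hedged sketch of join-key recommendation: rank column pairs by how much of
# the lookup table's values appear in the base table.
# Illustrative heuristic only; not BeyondCore's actual algorithm.
def recommend_join_key(base, lookup):
    """base/lookup: dicts mapping column name -> list of values."""
    best, best_score = None, 0.0
    for b_col, b_vals in base.items():
        for l_col, l_vals in lookup.items():
            if not l_vals:
                continue
            # Fraction of distinct lookup values found in the base column.
            score = len(set(b_vals) & set(l_vals)) / len(set(l_vals))
            if score > best_score:
                best, best_score = (b_col, l_col), score
    return best, best_score

orders = {"order_id": [1, 2, 3], "cust": ["A", "B", "A"]}
customers = {"cust_id": ["A", "B", "C"], "city": ["NY", "SF", "LA"]}
key, score = recommend_join_key(orders, customers)
```

Here the overlap between `cust` and `cust_id` makes that pair the recommended join key, which is the kind of suggestion a "lookup data" step can surface to the user.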
Data source connectivity: Data is loaded into Wave's proprietary data store for analysis — hybrid
data connectivity and direct query of on-premises data sources left in place are not yet supported
in Wave. These will be supported via the integration of BeyondCore, which provides this
capability. Connection to non-Salesforce enterprise application and other data currently requires
third-party partner tools. Native Wave connectors to a range of other enterprise applications
sources will be delivered through an OEM relationship. Informatica connectors are planned for
2017. Regarding personal data sources, connectors are limited to .CSV and Microsoft Excel files;
XML, JSON or RSS feeds are not yet supported. BeyondCore can be used to access most major
databases, as well as big data sources through Hive and partner sources such as SAP Hana.
SAP (BusinessObjects Cloud)
SAP delivers a broad range of BI and analytic capabilities for both large, IT-managed enterprise
reporting deployments and business-user-driven data discovery deployments. Companies often
choose SAP as their enterprise BI standard, especially if they also standardize on SAP applications.
SAP BusinessObjects Cloud, introduced in October 2015 (formerly called SAP Cloud for Analytics),
is a purely cloud-based deployment, built on SAP's Hana cloud platform. The focus of this
evaluation is on version 2016.25.
SAP BusinessObjects Cloud combines data discovery, predictive analytics and planning in an
integrated, cloud-based product running on the SAP Hana Cloud Platform, SAP's platform-as-a-
service offering. SAP's new Digital Boardroom solution is built on the SAP BusinessObjects Cloud
platform. The Digital Boardroom includes stories from SAP BusinessObjects Cloud and additional
capabilities, most notably the ability to display stories on a three-panel, wall-size display that is
touch-enabled. In addition, there are unique capabilities such as an agenda builder, value driver tree,
and simulation or what-if analysis. As a pure cloud product, all data modeling, administration and
authoring of content are done via a browser. The introduction of guided machine discovery
represents a remarkable improvement toward smart data discovery as the next wave of disruption
in the BI and analytics platform market.
The predominant use case for SAP BusinessObjects Cloud is decentralized analytics (73%),
followed by agile centralized BI provisioning (46%). SAP BusinessObjects Cloud is currently updated
on a biweekly cycle.
Strengths
Modern BI platform with smart data discovery: Guided machine discovery is a capability on the
cloud platform, enabling business users to gain insights automatically, such as key influencers,
classification and regressions. The insights are visualized with automatically selected graphs and
a narrative to explain the most important drivers of a particular metric. A further option exists to
integrate the SAP Hana text analytics and natural-language services.
Part of a comprehensive cloud platform: The cloud platform also provides planning and
predictive analytics components (at extra license cost), and is the core platform for the Digital
Boardroom solution. SAP offers a unified platform for analytics, predictive and planning
capabilities, which has so far been a niche segment. The platform leverages Hana capabilities, as
it is built on the Hana Cloud Platform (HCP). Live access for in-database queries is available for
on-premises SAP Hana and SAP Hana Cloud Platform, with hybrid connectivity to Business
Warehouse and S/4HANA planned for 1Q17. Incremental data loading is available for other SAP
and cloud data sources, such as SAP BPC, SAP BusinessObjects Universes, Google Sheets, SAP
SuccessFactors, SAP ECC and Salesforce.
Ease of use and data preparation: SAP BusinessObjects Cloud achieved a Good to Excellent
score for the self-service data preparation critical capability. Several subcriteria in this critical
capability are fully supported, such as enabling business user joins, data mashup, data modelling
and data enrichment. It provides a modern, visually appealing user interface, and scores in the top
third among all products. Survey respondents rated the platform in the top quartile for ease of use
regarding administration and content creation; and in the top third for content consumption, with
an overall Excellent rating for this critical capability.
Areas of Improvement
Modern and new, but less mature: One out of five survey respondents indicated absent or weak
functionality as the primary platform problem for SAP BusinessObjects Cloud. For the critical
capabilities interactive visual exploration as well as analytic dashboards, this product is less
mature than Lumira and other modern BI platforms. Several subcriteria are not fully
supported, such as information visualization, binning, advanced chart types or chart formatting
options, animation and playback, or disconnected exploration.
Mobile: Mobile exploration and authoring is currently the weakest critical capability for SAP
BusinessObjects Cloud. Native support for iOS and Android mobile devices is a roadmap item,
unlike SAP BusinessObjects Lumira, which already offers full support here, including offline
consumption. The current iOS app can only be used for collaboration with colleagues, viewing
events and tasks, and notifications, but not to explore or even author analytics. Responsive
design, a critical feature to adapt to different screen sizes for generic mobile device usability, is a
roadmap item.
Embedding: Embedding analytical content is rather weak, with only Poor to Fair scores for SAP
BusinessObjects Cloud in this critical capability. Software development kits for printing,
parametrization, creation, deletion, copying, portal integration and administration, among others, are
currently not available. Several of the mentioned features are on SAP's roadmap and planned for
near-term updates. Given SAP's frequent update cycle, clients wanting to embed analytics should
check regularly for these updates.
SAP (BusinessObjects Lumira)
SAP delivers a broad range of BI and analytic capabilities for both large, IT-managed enterprise
reporting deployments and business-user-driven data discovery deployments. Companies often
choose SAP as their enterprise BI standard, especially if they also standardize on SAP applications.
SAP BusinessObjects Enterprise and Lumira are primarily for on-premises deployments.
Similar to last year, several of SAP's BI and analytic components are described in the "Market Guide
for Enterprise-Reporting-Based Platforms" such as SAP BusinessObjects Design Studio, SAP
BusinessObjects Dashboards, SAP Crystal Reports, SAP BusinessObjects Web Intelligence and SAP
BusinessObjects Analysis for Office.
The following components of SAP BusinessObjects Enterprise were considered for the Magic Quadrant and this Critical Capabilities research:
SAP BusinessObjects Business Intelligence platform (current version is 4.2).
SAP BusinessObjects Lumira, server for BI platform (current version is 1.31).
SAP BusinessObjects Lumira desktop (current version is 1.31).
Lumira Edge, a small-to-midsize enterprise (SME) offering (current version is 1.31).
SAP BusinessObjects Lumira currently has a release cycle of one major release every 18 months
and one minor release every quarter. As of 3Q15, SAP discontinued future updates to Lumira server
on Hana (replaced by Lumira server for BI platform) and Lumira Cloud (replaced by
SAP BusinessObjects Cloud). End of maintenance for Lumira server on Hana and Lumira Cloud was
30 September 2016.
The most-significant effort with respect to SAP BusinessObjects Lumira is the announced
merger of the two products, SAP BusinessObjects Lumira and SAP BusinessObjects Design Studio,
even as SAP continued to improve the robustness and maturity of Lumira. Advances
in SAP BusinessObjects Lumira were made with web-based authoring of SAP Hana-based stories
and support for offline mobile consumption.
For SAP BusinessObjects Lumira, survey responses indicate an almost equal distribution across
three use cases: agile centralized BI provisioning (63%), decentralized
analytics (60%) and governed data discovery (58%).
Strengths
Information portal and analytics workbench scenarios: SAP BusinessObjects Lumira's product
strength across the critical capabilities lies in the BI platform administration, and its interactive
visualization and analytic content creation capabilities. The former is based on features of the
mature SAP BusinessObjects BI platform, with integration to the metadata layer (universe). SAP
has continued to enhance the product's interactive visual exploration capabilities to include the
most important chart types, custom groups and hierarchies, applying filters, drilling down and up,
binning, and sorting.
Visual exploration and further simplification: Early versions of SAP BusinessObjects Lumira
lacked some of the core chart types and data manipulation features that have continued to
improve with each release. The product now supports the most-popular chart types out of the
box, including grouping, binning and display as percentages. Version 2 will merge the two
components of SAP BusinessObjects Lumira and SAP BusinessObjects Design Studio into one
product with two authoring interfaces positioned for business user authors, and IT developers,
respectively. Ramp-up is planned to start in April 2017, with general availability planned for
late 2Q17 or early 3Q17. This could further strengthen the interoperability between
centrally provisioned and decentrally developed analytics, and minimize confusion over which
authoring tool to use for analytic dashboards.
Self-service data preparation and mapping: SAP BusinessObjects Lumira was one of the first
data discovery products to include menu-driven self-service data preparation. It achieved a Good
score for our self-service data preparation capability, with full support for subcriteria, such as
enabling business user joins, data mashup, data modelling and data enrichment. Basic geospatial
capabilities are well-supported, leveraging the Esri partnership and Navteq geo database;
geocoding and reverse geocoding are available, based on latitude/longitude data. Clients can build
geographic hierarchies, and point, bubble, pie and choropleth visualizations are
supported.
Areas of Improvement
Smart and advanced analytics: SAP BusinessObjects Lumira offers fewer features in the critical
capabilities of embedded advanced analytics and smart data discovery (which are stronger in
SAP BusinessObjects Cloud). It only achieved Poor to Fair scores for embedded advanced
analytics, and Fair scores for smart data discovery. For embedded advanced analytics, some
features are not available, such as advanced analytics visualizations or advanced predictive
analysis. Menu-driven options for influence analysis and forecasting are supported. However,
clustering is not natively supported, nor is integration with third-party statistical languages such
as R, with only limited built-in statistical functions. For smart data discovery, the guided machine
discovery features are not available, nor is natural-language Q&A or natural-language generation
available as a feature.
Limitations in platform integration and workflow: Despite the common name, the interoperability
between SAP BusinessObjects Enterprise and SAP BusinessObjects Cloud is limited. While there
is some interoperability on data source and data access level, content created by one platform
cannot be promoted to or reused by the other. Clients should regard both as distinct platforms,
even requiring different licenses. Further, even within the SAP BusinessObjects Enterprise
platform, Lumira does not inherit all of the platform capabilities. For example, scheduling and
alerting are not supported.
Product maturity still work in progress: Clients continue to raise concerns about this product's
maturity. One out of three clients using SAP BusinessObjects Lumira identified absent or weak
functionality among their two most-significant platform problems. Software quality was the most
significant limitation for one out of five clients in the survey. Lumira achieved only a Fair score in
the publishing, sharing and collaboration critical capability, and only a Fair to Good score in data
source connectivity.
SAS
SAS offers a range of BI and analytic capabilities with SAS Visual Analytics providing interactive
discovery, embedded advanced analytics, and dashboards for mainstream business users as well
as data scientists.
SAS Office Analytics includes SAS Enterprise Guide and Microsoft Office integration with Excel,
PowerPoint, Word and Outlook. SAS Office Analytics allows Visual Analytics content to be
dynamically refreshed and interacted with via PowerPoint and other Office tools. SAS Enterprise
Guide is a desktop product that allows power users to perform self-service data preparation and
advanced analytics that can then be published to SAS Visual Analytics server.
SAS Visual Analytics can be deployed on-premises or via the cloud, although only a small
percentage of surveyed customers have deployed in the cloud. According to SAS, Visual Analytics is
deployed by over 13,000 organizations. As a server-based product, Visual Analytics is most often
used for decentralized analytics (64% of customers surveyed), with 40% using it for agile centralized
BI provisioning.
SAS Visual Analytics runs on the SAS LASR Analytic Server, an in-memory offering that uses Hadoop
for its persistence layer, and that has the ability to handle large datasets.
SAS Visual Analytics 7.3 was released in August 2015 with no new releases in 2016 as SAS works
to rearchitect the product to run on the new SAS Viya architecture. SAS Viya is an open, cloud-ready,
microservice-based architecture. The user interface has been completely redesigned. Existing SAS
Visual Analytics customers will be able to upgrade to version 7.4 on their current SAS 9.4 platform
or eventually upgrade to SAS Visual Analytics running on Viya. The first release of SAS Visual
Analytics to run on Viya will be version 8.1, due in early 2017. As this product is not yet released, it is
not within the scope of this product evaluation.
Strengths
Embedded advanced analytics: With the company's origins in advanced analytics, it is not
surprising that SAS rates between Good and Excellent for its embedded advanced analytics
capabilities. SAS supports multiple forecasting models via a point-and-click interface. Clustering
and decision trees are also differentiators, as well as a number of advanced visualizations
including Sankey diagram, network analysis and correlation matrix. Although SAS has high scores
for this capability, the automated generation of analytics and natural-language generation are
lacking. These are roadmap items.
Interactive visual exploration: SAS Visual Analytics Explorer interface rates Excellent to
Outstanding for interactive visual exploration. As users drag and drop elements onto a canvas,
Visual Analytics automatically renders the data using the optimal chart display and color settings.
Users can choose to redisplay measures as percentages, time period variances or period growth,
all via menu options, and without needing to create specific expressions. Binning for time periods
is supported, but not for numeric values. Users can create custom categories for new groups of
dimension values.
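The numeric binning that the paragraph notes is missing is a generic operation; a sketch of what such a step does (with arbitrary example cut points, not anything SAS-specific) looks like this:

```python
# Generic numeric binning: assign each value to a labeled bucket based on
# ordered cut points. Bin edges and labels here are arbitrary examples.
from bisect import bisect_right

def bin_value(value, edges, labels):
    """Assign a value to a labeled bin; edges are ascending upper bounds."""
    return labels[bisect_right(edges, value)]

edges = [100, 500]                  # two cut points -> three bins
labels = ["low", "mid", "high"]
buckets = [bin_value(v, edges, labels) for v in (42, 100, 250, 900)]
```

Using `bisect_right` makes each edge an inclusive upper bound, so a value equal to a cut point falls into the next bin; flipping to `bisect_left` would make edges exclusive instead.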
Broad BI platform capabilities: SAS competed in the traditional BI platform market with its
Enterprise BI Server product and introduced SAS Visual Analytics in 2012. Often, newly introduced
products have not inherited the full platform capabilities. Although SAS Visual Analytics was
released as a new product, it supports the full platform capabilities of enterprise-grade
administration, scheduling and publishing, and support for mobile.
Areas of Improvement
Less easy to use, administer and deploy: SAS Visual Analytics scores in the bottom quartile for
ease of use and visual appeal, but still rates Good overall. Of customers surveyed, 17% cited
difficulty in deploying the platform. In authoring, ease of use suffers from SAS Visual Analytics
having multiple design interfaces for exploration and formatted reports. There is interoperability
between the two, but the workflow is not as seamless as in competitive products.
SDK and APIs: SAS Visual Analytics has Fair scores for embedded analytic content, and only 11%
of customers are using it for the OEM or embedded BI use case. There are no separate SDKs.
Individual charts and dashboards cannot be published directly to third-party portals. SAS does
support integration of Visual Analytics reports (a particular content type) as a Microsoft
SharePoint Web Part. The SAS Visual Analytics viewer can be embedded in an iframe. Many of
these limitations should be addressed via the new open architecture.
Self-service data preparation not cohesive product/workflow: Self-service data preparation is an
important characteristic of a modern BI and analytics platform as users must be able to access
and manipulate data from new data sources that may not have been curated by a data warehouse
team. SAS scored between Good and Excellent for this capability as it is possible to prepare data
via a number of options including the SAS Enterprise Guide and the browser-based Data Builder.
However, there is room for improvement in its smart preparation capabilities, data profiling and a
streamlined workflow. Chart types are not consistent between the visualize and report interfaces.
These are all areas of improvement planned for SAS Visual Analytics 8.1 on Viya.
Sisense
Sisense offers a single platform with a self-contained, in-chip, columnar database engine that allows
for visual exploration of web-based dashboards. Data modeling and preparation are done via a desktop
interface, while dashboards are authored via a browser-based interface. Security and administration
are also browser-based.
The vendor's in-chip approach allows for data scalability and high performance. It has introduced a
number of innovative capabilities in the last year, the most noteworthy being the voice integration
with Amazon Alexa and BI bots. Users can pose analytic queries using voice, and Alexa answers
with key metrics and trends. In addition, integration with LIFX lightbulbs is able to light a room or a
desktop lamp in particular colors depending on how key metrics are performing.
More fundamental improvements in the product include the ability to leave data in place, making it
now more suitable for customers that have already invested in a high-performing analytic database.
Scheduling and alerts were also added in 4Q16.
Sisense has a major release every quarter with minor releases monthly. The focus of this evaluation
is on version 6.6.
Sisense is most often deployed for the OEM or embedded BI use case (43%), followed by agile
centralized BI provisioning (33%).
Strengths
In-memory, in-chip and in-database: ElastiCube is Sisense's in-memory and columnar storage
engine that supports analysis of complex data models, coming from multiple data sources. While
many of the products in this Critical Capabilities research use in-memory and columnar storage,
Sisense further differentiates itself on in-chip processing to provide even faster performance and
data scalability. With in-chip processing, large datasets are broken into smaller sets of data and
cached at the CPU level, giving more scalability than if all data had to be loaded in-memory only. In
addition, the engine supports multiple fact tables, with joins dynamically performed at query time,
on the fly, to minimize the upfront data modeling. For customers that have already invested in
high-performing databases, as of version 6.5 released in 4Q16, Sisense now also supports in-
database processing for several leading platforms including Amazon Redshift and HPE Vertica.
Embedded analytics: Sisense rated Excellent for the embed analytic content critical capability.
The platform is open with an SDK for a broad range of platform capabilities including mobile, print
and security. Dashboards and data in the ElastiCube are accessible via a REST API. Sisense
includes its own chart library that can be further extended with access to D3 visualization
libraries. As well, Sisense charts can be rendered as JavaScript to embed in any third-party
applications, with full support for interactivity such as dashboard filtering. Iframes are also
supported. OEM and extranet customers can white label the product and apply their own color
schemes and logos.
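Iframe-based embedding of the kind described above amounts to composing an embed URL and tag for a dashboard. The host, URL pattern and parameters below are hypothetical placeholders, not Sisense's documented scheme:

```python
# Sketch of iframe-based dashboard embedding: build the embed URL and the
# iframe tag a host page would include. URL pattern and parameters are
# hypothetical, not Sisense's documented API.
from urllib.parse import urlencode

def embed_iframe(host, dashboard_id, width=800, height=600, **params):
    query = ("?" + urlencode(params)) if params else ""
    src = f"https://{host}/dashboards/{dashboard_id}{query}"
    return (f'<iframe src="{src}" width="{width}" height="{height}" '
            f'frameborder="0"></iframe>')

tag = embed_iframe("bi.example.com", "sales-q4", theme="light")
```

JavaScript-rendered charts go further than an iframe by exposing interactivity (such as dashboard filtering) directly to the host application, but the URL-composition step is similar.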
Elements of smart data discovery: Natural-language query and natural-language generation are
two aspects of smart data discovery that Sisense newly supports. With the BI bots, users can
engage in conversations to ask questions and receive a natural-language answer and
recommended action. These bots can be embedded within third-party collaboration tools such as
Slack. Sisense also has integration with Narrative Science for generating a text explanation of
results. Pulse is a new capability that supports anomaly detection and alert notification using
machine learning.
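The kind of anomaly detection a feature like Pulse automates can be illustrated with a minimal sketch. The z-score test below is a deliberately simple stand-in; Sisense's actual machine-learning approach is not disclosed in this research.

```python
# Illustrative only: a simple z-score anomaly check, standing in for the
# undisclosed machine-learning method behind a feature like Pulse.
from statistics import mean, stdev

def detect_anomalies(series, threshold=3.0):
    """Flag indexes more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# The spike at the end of this series is the only point flagged.
alerts = detect_anomalies([10, 11, 9, 10, 12, 10, 11, 95], threshold=2.0)
```

An alerting feature would then notify subscribed users for each flagged index rather than requiring them to watch the dashboard.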

Areas of Improvement

Central model: Mashing data from multiple data sources is the responsibility of the cube
designer, which limits flexibility for individual users and for the decentralized analytics use case.
From a governance perspective, data lineage in Sisense is a two-step, disconnected process. On
the ETL/data integration level, all data sources and how they are accessed can be seen. On the
dashboard level, the underlying data model can be seen. Impact analysis of changes is not
supported. Watermarking is also not available for datasets, nor is the ability to outline and
manage the certification or sanctioning of datasets or content according to the clients'
governance policies.

Embedded advanced analytics not out of the box: Sisense added support for R in 2015 and uses
extensions to provide advanced analytics. Extensions may be built by the vendor or contributed to
by the community. For example, for menu-driven forecasting or clustering, the administrator must
download the extension to the server to be able to invoke it. A similar approach is used for
advanced chart types such as decision trees and network charts that are not native to the
product.

Cloud limitations: Sisense is offered in the Amazon Marketplace or can be hosted in a single-
tenant option that the vendor maintains in Amazon Web Services (AWS). The vendor supports a
bring-your-own-license approach for customers to deploy in other cloud platforms such as
Microsoft Azure or Rackspace, but does not offer a SaaS option itself. Further, the software has
not been certified for the cloud from a security perspective. Access to cloud-based applications is
a subcriterion within this capability, and here, Sisense provides a number of prebuilt industry
templates with data models and dashboards. However, these templates are not specific to
particular business applications (such as Salesforce for CRM).

Tableau
Tableau delivers an easy-to-use, visualization-based analytic workflow experience to business users
to access, prepare and analyze their data without IT support through three primary products:
Tableau Desktop, Tableau Server and Tableau Online (Tableau's cloud offering). Tableau Desktop is
a desktop-based design tool. Tableau Server and Tableau Online are used to share content and
scale deployments to a broader range of users. Tableau Server represents the on-premises option.
Tableau Online is the cloud-based option, which is hosted in Tableau's data center but is now also
running on AWS. Additionally, Tableau Server can be hosted on any of the major public cloud
platforms: AWS, Microsoft Azure and Google Cloud Platform. In both deployment models, data
model authoring and full formatting options are available within Tableau Desktop and then
published to the server or the cloud. From within a browser-based environment, users can interact
with published dashboards and create new dashboards and sheets.

Tableau 10, released in August 2016, added data federation capabilities and makes data mashups
more reusable. Drag-and-drop advanced analytics for clustering and segmentation are added to
existing capabilities for forecasting and trending. Tableau 10 now has improved JavaScript and
REST APIs. Also new in Tableau 10 is the ability to author a dashboard and workbook-level format
from within the browser in Tableau Server. Tableau is deployed primarily for decentralized analytics
(50%), followed by agile centralized BI provisioning (41%).

Strengths

Intuitive interactive visual exploration and dashboards: Tableau's core product strengths
continue to be its intuitive interactive visualization and exploration and analytic dashboard
capabilities for almost any data source, as confirmed by its Excellent rating for these capabilities.
Tableau enables rapid advanced exploration and content creation for core users by automating
routine tasks, such as geocoding and the creation of time hierarchies (month, quarter or year, for
example) on data fields and adding type ahead for formula building. Users can also create and
analyze data by custom geographic territories by lassoing or clicking on marks on a map. For
advanced analytics, new drag-and-drop clustering in Tableau 10 builds on existing advanced
analytics functions for forecasting, and trends that are available from simple menus make
insights from advanced analytics available to business users. R and newly added Python
integration are also supported for incorporating additional algorithms into Tableau calculations.
Tableau's reference customers score its ease of use among the highest of all these vendors.
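The shape of this R/Python integration is that an external function is evaluated over a column of values and must hand one result back per row. The sketch below mimics that contract with a naive one-dimensional two-group split; it is not TabPy's or Rserve's actual API, and real clustering in Tableau 10 uses more robust methods.

```python
# Illustrative only: mimics the per-row contract of Tableau's external-script
# integration — a column of values in, one label out per row. The midpoint
# split is a toy stand-in for real clustering.
def assign_clusters(values):
    """Label each value 'low' or 'high' relative to the midpoint of the range."""
    midpoint = (min(values) + max(values)) / 2
    return ["high" if v >= midpoint else "low" for v in values]

labels = assign_clusters([5, 7, 40, 42, 6])
```

The point of the drag-and-drop features described above is that business users get this kind of result from a menu, without writing any such function themselves.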
Breadth of accessible data sources: Tableau allows business users to interact with a broad
range of data sources by using an extensive set of data connectors with both in-memory and
direct query access for larger datasets. Data source connectivity is a strength of Tableau, and
an area that is further improved in version 10. It natively supports a broad range of relational
databases, Hadoop distributions, NoSQL sources, personal files and statistical package output
formats (including IBM SPSS, SAS and R data files). A web data connector provides access to
web services via available APIs. New in Tableau 10 is support for Marketo, SAP Hana, MemSQL,
QuickBooks Online, Oracle Stored Procedures and Google Sheets, to name only a few of the
additions.

Broad enterprise mobile support with responsive design: Tableau content can be rendered
natively with full interactivity on iOS and Android tablets and phones. Moreover, while offline
analysis is not yet supported, improvements to Tableau 10's mobile features address enterprise
security needs, with capabilities to integrate with third-party mobile device management
platforms, such as those of VMware (AirWatch) and MobileIron, while also enhancing responsive
design for all mobile screen sizes.
Areas of Improvement

Basic data models and multiple fact tables: Each data source is constrained to a single star
schema; multiple fact tables within the same physical data source are not supported and must be
created elsewhere when needed. Tableau's overall metadata management rating is impacted by
limitations in its ability to reuse data-source-specific metadata objects (such as calculated
measures, custom groups, hierarchies) across workbooks and underlying data extracts. It is not
well-suited to more-difficult data mashups with inconsistent codes in join fields. This also affects
Tableau's ability to offer support for data lineage and impact analysis. When preparing data, Data
Interpreter will make its best guess at unpivoting cross-tabs and removing empty cells to create a
tabular dataset from spreadsheet data, and users can do some data cleansing tasks through
formulas as they load data, but extensive, point-and-click or automated data profiling and
transformations are not supported. Tableau has announced its plans to release a stand-alone,
self-service data preparation tool, codenamed Project Maestro, in 2017 to address its customers'
challenges with large and complex data.

Gaps in enterprise features and collaboration: Tableau is prioritizing enterprise features to close
current gaps — such as adding support for Linux, and improving APIs for better embeddability and
extensibility. Event-based scheduling, conditional alerting, printing to PDF and PowerPoint, and
collaboration and social platform integration are also works in progress. Collaboration capabilities
for threaded discussions are not supported in Tableau 10; there is no integration with third-party
collaboration platforms. Capabilities for users to rate (like) reports are also limited, and
recommendation of content based on those ratings is on the roadmap.
Limited scale and variable performance of Tableau data extracts: When Tableau first came on
the market, it primarily connected to data sources live — a key benefit for those customers with an
analytic database that do not want to replicate data in a proprietary, in-memory solution. The Tableau in-
memory engine, in the form of Tableau Data Extracts, was added in 2010 as an optional
performance enhancement. This part of the product is less mature and less scalable than competing
in-memory engines. Poor performance for large in-memory extracts often requires modeling in a
separate data repository that is directly queried from Tableau. In March 2016, Tableau acquired
HyPer to improve the engine's scalability. HyPer is a high-performance in-memory database
developed as a research project at the Technical University of Munich (TUM) that was not
launched as a commercial product. Tableau plans to use HyPer to replace the Tableau data
extracts in order to support larger datasets.

ThoughtSpot
ThoughtSpot is a visual analytics platform whose main differentiator is its search-based interface,
with a high-performance in-memory, columnar database deployed as an appliance. Users explore
data via a type-ahead feature that will also recommend the most popular search terms, existing
datasets, charts and dashboards. Once a chart is generated, users can assemble them into
dashboards (or "pinboards" as the vendor refers to them). ThoughtSpot's software can be deployed
on a preconfigured hardware appliance, as an on-premises virtual machine with VMware, or
optionally in the cloud via Microsoft Azure or Amazon Web Services.

Although ThoughtSpot's origins are in search-based analytics, the vendor has introduced smart data
discovery, branded as A3 (Auto Awesome Analytics). With A3, visualizations are
generated automatically with a short textual explanation about the most important outliers and
trends. The vendor also recently added support for scheduling, discussion threads (first seen in
version 4 released in 4Q16), and a native mobile app (version 4.1, 1Q17). ThoughtSpot has a major
release annually and minor releases each quarter.

ThoughtSpot is most often deployed for decentralized analytics (61%), followed by agile centralized
BI provisioning (52%).

Strengths
Ease of use via search: ThoughtSpot uses a search-based interface that allows users to explore
and visualize data with an overall rating of Excellent. Machine learning algorithms provide a type-
ahead ability to make it easier for business users to find the most relevant search term. Rather
than having to drag and drop data elements onto a page, as is the predominant query flow in most
BI tools, users simply type in words, such as "sales by state" or "sales by product if color is red."
The data modeler can add a range of synonyms for each measure or dimension. After a
visualization is automatically generated from the search-based query, a user can pin it to create a
dashboard.
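The type-ahead flow described above can be illustrated with a minimal prefix match over column names and modeler-defined synonyms. This is a deliberately simple sketch of the concept; ThoughtSpot's actual implementation ranks suggestions with machine learning and popularity signals, which this does not attempt.

```python
# Illustrative only: prefix-based type-ahead over terms and their synonyms.
# ThoughtSpot's real suggestion engine is ML-ranked; this shows only the idea.
def suggest(prefix, terms, synonyms):
    """Return canonical terms whose name or any synonym starts with `prefix`."""
    prefix = prefix.lower()
    matches = []
    for term in terms:
        names = [term] + synonyms.get(term, [])
        if any(n.lower().startswith(prefix) for n in names):
            matches.append(term)
    return matches

# Typing "rev" surfaces the "sales" measure via its "revenue" synonym.
hits = suggest("rev", ["sales", "state", "product"], {"sales": ["revenue", "turnover"]})
```

This is why the modeler's synonym list matters: it lets business users find measures under the vocabulary they actually use.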
In-memory engine: The product uses a columnar, in-memory engine that indexes all of the
searchable data to ensure fast performance on large datasets. The engine is deployed as an
appliance on commodity hardware. Of customers surveyed, 24% are analyzing 1TB or more of
data in a single application, ranking ThoughtSpot in the top third for this metric. The in-memory
engine supports loading of data from multiple data sources. An OEM arrangement with
Informatica as part of ThoughtSpot "Data Connect" module allows data to be transformed and
cleansed as it is imported.

Scheduling and discussions: As of the version 4 release, users can schedule a dashboard to be
distributed via PDF or .CSV. As this capability is lacking in many modern BI products, it provides a
useful differentiator. However, the product does not yet support business events alerts, although
this is a roadmap item. In addition, support for discussions was recently added within the
dashboard or optionally via integration with Slack, a third-party collaboration platform.

Areas of Improvement
Lack of core exploration and advanced analytics: ThoughtSpot presents data in the most popular
chart formats but lacks many of the advanced charts and data manipulation capabilities that
power users and citizen data scientists want. For example, trellis charts and candlestick charts
are not supported; geographic mapping is limited, without automatic geocoding or distance
calculations. Features such as visual grouping, display as percentages, and binning would have to
be created via a formula rather than a menu or visual point-and-click approach. Drill down is
essentially a drill-anywhere option, as hierarchies are not supported. There are no advanced
analytics charts, nor is there a simple forecasting feature. It is not possible to call third-party
algorithms such as R, although this is on the 2017 roadmap. Prior to the 4.1 release, content could
be consumed via a browser only. Support for a native iOS app has now been added, but security
features and offline capabilities are not part of the near-term roadmap.

Data replication: ThoughtSpot achieves high performance by replicating data into its in-memory
engine. However, customers who have already made investments in a high-performance analytic
engine (such as SAP Hana or Amazon Redshift) would prefer not to move data, thus limiting the
range of data for which ThoughtSpot is ideally suited. This also may be why the data volumes
accessed by surveyed customers are below the survey average for data originating in Hadoop, as
well as in relational data sources. The creation of a reusable data model is not automatic or
intuitive; a data connection is first defined and tables imported into ThoughtSpot, with limited
ability to prepare the data during the ingestion process. A modeler can then create a shared
"worksheet," but this model does not support hierarchies, grouping and visual cues of dimensions
and measures, or complex joins; only left outer joins are supported. There is support for multiple
fact tables, but fan traps are not handled.

Emerging smart data discovery: ThoughtSpot recently introduced several elements of smart data
discovery. By selecting A3 analysis, ThoughtSpot will generate a number of charts that describe
outliers in the data. There is a short textual explanation. Although these are important
innovations, the statistical relevancy of the charts generated does not identify the most-important
drivers of a particular metric; the charts generated are more descriptive and do not include cluster
analysis or segmentation.
TIBCO Software
TIBCO Spotfire offers extensive capabilities for analytics dashboards, interactive visualization and
data preparation in a single design tool while offering flexible processing options either in-memory
or in-database. TIBCO has continued to expand its feature set to include advanced analytics,
streaming and location intelligence.

In recent years, TIBCO moved to agile release cycles for Spotfire where functionality is first released
to the cloud and then on-premises. It does not make its exact release cadence publicly available.
TIBCO released Spotfire 7.7 in October of 2016.
TIBCO has also sharpened its product development focus and investment in Spotfire with an
emphasis on streamlining all aspects of the analytics content development and exploration
workflow, enhancing scalability and extensibility, and expanding connectivity to a range of data
sources — including integration with Spotfire's real-time charts (with StreamBase). TIBCO rates
Good plus to Excellent across all use cases, and 75% of Spotfire reference customers said they use
it for decentralized deployments; this is the highest percentage of any vendor in this Critical
Capabilities research.
Strengths

Analytics dashboards and advanced interactive exploration: TIBCO's end-to-end user experience
centers around dashboards, where users can interact, filter and analyze information, while
building dashboards that will be shared and consumed throughout the organization. A single
interface serves both purposes — exploring information and designing dashboards — simplifying
the user experience. Over the past year, the data preparation and harmonization experience has
been embedded more deeply into the exploration experience for iterative analysis. The ability to filter
blended datasets, the option of linking different visualizations, extensive visual customization, and
automatic recommendations of visualizations further enhance the dashboard design and
consumption processes. Excellent interactive visual exploration within the dashboard
environment combines a broad range of visualizations with highly interactive exploration of data,
and point-and-click interactive features such as grouping, including top N versus rest, binning,
displaying values as percent-to-totals, and year-over-year comparisons.
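To make the "top N versus rest" feature concrete, the transformation it applies to the data can be sketched in a few lines. Spotfire exposes this as a point-and-click option; the function below only illustrates the underlying logic, not Spotfire's implementation.

```python
# Illustrative only: the "top N versus rest" grouping as a plain function.
# Spotfire provides this point-and-click; this just shows the transformation.
def top_n_vs_rest(totals, n):
    """Keep the n largest categories; collapse the remainder into 'Rest'."""
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    top = dict(ranked[:n])
    rest = sum(v for _, v in ranked[n:])
    if rest:
        top["Rest"] = rest
    return top

grouped = top_n_vs_rest({"US": 50, "DE": 30, "FR": 20, "IT": 5, "ES": 5}, n=2)
```

The appeal of building this into the dashboard environment is that the grouping stays interactive: changing N or the underlying filter re-derives the "Rest" bucket automatically.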

Embedded advanced and geospatial analytics: Within Spotfire, analysts have access to an
extensive library of embedded advanced analytic functions. One-click calculations for descriptive
statistics such as correlations, similarity analysis, clustering, regression modeling, classification,
best fit as well as forecasting will appeal to advanced users, analysts and citizen data scientists.
Reference customers say they select and use Spotfire for its ease of use in conducting advanced
and complex analysis. They also score the platform as above average for these metrics. TIBCO's
location analytics capabilities (acquired through Maporama Solutions in 2013) are a good
complement to the analytics offering, and a strong differentiator. While most other vendors can
only use maps to visualize location-based data, TIBCO provides advanced geospatial functionality
such as geocoding, multilayering, geographic clustering, geofencing and several other location-
related capabilities. This includes geospatial algorithms for best route/distance between data
points on the map, and integrated access to TERR, TIBCO's optional data science runtime engine
for the analytic language R, which also gives users access to hundreds of R functions when
more-advanced data modeling is required. Because of TERR, TIBCO has one of the strongest
embedded advanced analytics offerings in the modern BI and analytics market.
Streaming and multistructured data source connectivity: TIBCO Spotfire supports a range of
sources (relational, big data and cloud data sources) with new support for Amazon Redshift,
OData, Salesforce and Google analytics added this year. Support for streaming data through
Spotfire's bidirectional integration with TIBCO's StreamBase, which gives users insight into current
streaming data with the ability to drill into history, is a differentiator versus most of the vendors
included here. TIBCO's optional component, based on technology licensed from Attivio, supports
recommendation-driven blending of structured and unstructured data and the natural-language
query and exploration of this multistructured data from within Spotfire.

Areas of Improvement
Advanced self-service data preparation and aspects of security: Overall, TIBCO's self-service
data preparation features are rated as Good due to limited support for impact analysis,
watermarking of certified data sources, data masking, and recommendations for fixing or enriching data. In terms
of security and user administration, to implement row-level security, an administrator must define
personalized information links that leverage the source system security. As a result, row-level
security is largely administered outside of the TIBCO platform.

Following rather than leading on smart data discovery: While TIBCO has invested in some
features of smart data discovery — such as support for search-based query generation and
exploration via an optional component — and there is early API-level integration for natural-
language generation/narration with Automated Insights (a company owned by TIBCO's investor,
Vista Partners), it has not leveraged its extensive embedded advanced analytics expertise to lead
the market in next-generation machine-learning-enabled automation of the analytics workflow.
Automatic generation of new insights and crowdsourced recommendations to the user are on the
roadmap, and should close the gap with early disruptors.

Gaps in cloud and broad mobile support: TIBCO has been investing in a cloud-first strategy and
hybrid cloud-direct query access to data left in place on-premises. However, its limitations around
cloud-based packaged content, lack of a robust content marketplace, and gaps between the
desktop and cloud authoring environments have resulted in only a Good rating for Spotfire's cloud
capabilities. Regarding mobile, native support is limited to iOS devices including native iPhone
support added over the past year — native support for Android and Windows Mobile devices is not
provided. Offline exploration and support for out-of-the-box integration with mobile device
management providers, often an important enterprise feature, are not supported. Responsive
design was enhanced over the past year, but there is still only partial support. For example, a KPI
chart is responsive, but for other chart types the content author must manually adjust the sizing to render it
properly on each device. A scripted, rather than out-of-the-box, approach to integrating with the GPS on a
mobile device to automatically filter reports based on location also contributed to TIBCO's Good
mobile rating.

Yellowfin
Yellowfin delivers a single, web-based BI and analytics platform with a tightly integrated set of
components. Yellowfin supports collaboration, storytelling, mobile and dashboard creation.
Yellowfin established a marketplace for clients, which provides, for instance, geopacks,
visualizations and prebuilt content. Yellowfin represents the modern design of a central enterprise
BI platform and is well-suited for embedded analytics.

Yellowfin has a major release every nine months, with minor releases monthly. Yellowfin's platform
is currently on version 7.3. In 2016, Yellowfin continued to deliver out-of-the-box connectors for
third-party applications, and introduced new workflow capabilities. Agile centralized BI provisioning
is the most-popular use case in this survey (37% of customers), followed by OEM and embedded
analytics (29%).
Strengths

Leading in collaboration: Yellowfin achieved Excellent scores for the critical capability publish,
share and collaborate. Yellowfin's collaboration and social BI can be regarded as the gold standard
among the vendors in this Critical Capabilities research. Discussion threads, comments and annotations can be
seen in a timeline. Users can add reports, dashboards and storyboards as favorites and like or
dislike them. Users can mark comments as insightful, and users can "follow" other users in
Yellowfin's timeline and see what they are working on. They can also create and share tasks with
other users.
Single modern platform: Yellowfin received Good to Excellent scores across the business-user-driven
analytic workflow. Data inference, data lineage, data modeling and enrichment are examples of
strong capabilities on the platform for self-service data preparation. The critical capabilities for
interactive visual exploration and analytic dashboards also rated Good to Excellent, with support
for the most-popular chart types, bins, and display as percentages (added in 7.3). Yellowfin
provides good support for geospatial capabilities and also offers geopacks on its marketplace. In
addition, Yellowfin's platform got an Outstanding score for workflow support within the platform.
Mobile support and workflow features: Yellowfin supports mobile BI generically with its web-
based architecture through the mobile HTML5 browser interface. In addition, iOS and Android
smartphones and tablets are natively supported with specific apps. Offline support is provided
along with built-in device-based security if a device is lost. In addition, Yellowfin supports third-
party MDM providers such as AirWatch and MobileIron.
Areas of Improvement

Cloud BI: Yellowfin currently does not offer a public cloud solution. A cloud deployment is
available through partners — in line with its partner-centric strategy. As a consequence, some
features and aspects of the cloud BI critical capability are not available (such as security
certification for software, terms and conditions regarding data ownership, and hybrid connectivity
to on-premises data sources). Absent or weak functionality was cited as the most-significant
platform problem in the survey by 18% of reference customers.
Self-contained in-memory engine: Yellowfin connects directly to a data source or data
warehouse. There is a caching mechanism that supports appends and unions, but at a query
level, rather than within the broader dataset. In this regard, Yellowfin is appealing to customers
that have already invested in a high-performing analytic database, but less appealing when data is
coming from multiple data sources and in need of further data preparation. Poor performance
was reported as a barrier to adoption by 15% of reference customers.
Less-advanced analytics and smart capabilities: Embedded advanced analytics and smart data
discovery were two critical capabilities where Yellowfin achieved only Fair scores. Yellowfin
provides time-series analysis and a basic level of scenario modeling using Yellowfin's what-if
analysis. Other advanced algorithms — such as decision trees, neural nets and support vector machines —
are not supported. Advanced analytics visualization requires plugging in third-party visualizations.
Smart data discovery capabilities (including automated insight generation, natural-language Q&A,
and natural-language generation) are increasingly important capabilities, but not supported.

Zoomdata
Zoomdata supports fast interactive analysis, visualization and dashboards for big and streaming
data. It uses micro queries and Spark to push down query processing to underlying big data
sources, while estimating results and making them immediately available for interactive analysis as
queries process.

Zoomdata content authoring and data management are via a modern, web-based architecture.
Zoomdata includes features for administration and security, creation of dashboards and
visualizations, and includes a smart connector framework for building custom connections, and
prepackaged connectors for a number of common application data sources. With the Data Fusion
capabilities in Zoomdata, a business user can join data from multiple data sources, both structured
and unstructured, into a single virtualized data source for federated exploration and analysis.
Zoomdata's Data DVR capabilities allow users to rewind, fast forward, analyze and compare
historical data with real-time streams.

Zoomdata is well-suited to business users and data scientists that need real-time insights from
streaming data across a range of big data sources, or for developers that need to embed these
insights in applications (32% of customers deployed for this use case). It has innovative capabilities
for streaming analytics based on a modern, distributed architecture that harmonizes multiple large
data sources into real-time interactive dashboards without moving data. Zoomdata can be deployed
on-premises or purchased with a one-click deployment through Amazon Web Services, Google
Cloud Platform or Microsoft Azure. Customers looking for traditional reporting or interactive
analysis against a data warehouse, for example, would not look to Zoomdata as their best option.

Strengths
Native streaming and big data support via a hybrid data architecture: Zoomdata is best-suited to
organizations needing to perform real-time interactive analysis on large datasets or streaming
data. The platform is natively optimized for a range of big data sources, including Hadoop, NoSQL
databases, streaming data, search index data (including Apache Solr and Elastic [formerly Elastic
Search]), and cloud data sources (such as Google BigQuery, Google Cloud Dataproc, Microsoft
HDInsight, Amazon Redshift, EMR and S3, among others) with support for in-database processing
and in-flight data harmonization. Reference customers select Zoomdata for its ability to support
large data volumes, data access and integration, and ease of use for business users. They also
report among the highest data volumes for queries. Zoomdata's fast processing is achieved
through micro-querying data left in place while leveraging the processing power of underlying big
data repositories. Instead of one big SQL statement, Zoomdata executes parallelized micro-
queries and performs calculations on the data as it streams in from the data source. Data is only
pulled into Zoomdata's own Spark instance, or into an external one, as a last resort when this is
the best approach for interactive analysis.
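The micro-query idea can be illustrated in miniature: one large aggregation is split into small chunks, and a running partial result is available after each chunk completes, so a visualization can render an estimate before the full scan finishes. This sketch shows only the concept, not Zoomdata's distributed engine or its push-down to underlying data sources.

```python
# Illustrative only: one big aggregation split into micro-queries, with a
# running estimate after each chunk — the concept, not Zoomdata's engine.
def micro_query_sum(rows, chunk_size):
    """Yield a running total after each chunk, so partial results can render early."""
    total = 0
    for start in range(0, len(rows), chunk_size):
        total += sum(rows[start:start + chunk_size])
        yield total

estimates = list(micro_query_sum([1, 2, 3, 4, 5, 6, 7], chunk_size=3))
```

In the real product the chunks run in parallel against the source system and the dashboard updates as each partial result streams in, which is what makes large data feel interactive.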
Analytic dashboards supporting complex types of analysis: Zoomdata offers embedded
statistical capabilities for building custom calculations in analytic dashboards (rated Good) and can
use third-party models such as R and Python for advanced analytics. It can also be integrated with
Jupyter Notebook to fit into a data science workflow. Zoomdata's reference customers report that
an above-average percentage of data scientists/citizen data scientists use the platform for
complex types of analysis. The platform is often chosen for its advanced types of analysis and its
support for complex data types. Zoomdata is tied for first place among all the vendors in this
Critical Capabilities for complexity of analysis supported by the range of data sources and
breadth of usage.

Attractive embeddability: With roughly 40% of its 2016 revenue coming from OEMs, Zoomdata
launched a developer network to serve the needs of the analytic app market and to continue the
expansion of this part of its business. Zoomdata SDKs and extensive REST APIs support the
customization and embedding of most aspects of the platform, including administrative
functions, connectors and visualizations. Zoomdata visualizations work with any base map
supported by the Leaflet JavaScript library, including OpenStreetMap, MapQuest, Mapbox, and any
other provider that uses a compatible tile map server URL. Although out-of-the-box distance
calculations are not supported, Zoomdata's geospatial analysis has the ability to zoom to the
lowest layer available from the map provider, such as ZIP Code, city or street. Customers can
create custom map visualizations using either the embedded Zoomdata Chart Studio, or through
external applications using the Zoomdata SDK.
Areas of Improvement

Mobile, collaboration, scheduling and alerting: For mobile devices, HTML5 is supported, but
Zoomdata does not offer native applications for specific mobile operating systems. Zoomdata is
certified and supported to run in Safari browsers on Apple iOS tablets, but not on iPhones.
Similarly, because Zoomdata is a web-based platform, there is no offline support or integration
with MDM security providers. Storytelling and integration with social platforms are also not
supported. Regarding scheduling and alerting, the product does not provide Zoomdata content to
end users on a scheduled basis (user-defined or otherwise).
Self-service data preparation: Data lineage and impact analysis are not supported, nor is
inference beyond the basic automatic inference of data types. Advanced features, such as those
automatically suggesting or inferring relationships for joins or hierarchies, are not supported. In
terms of data transformations, a user can rename a data source and perform other simple transformations,
but cannot yet combine and split columns or replace values — these are on the roadmap. Most of
these gaps are considered basic and offered by most other modern BI platforms.

Advanced interactive exploration: Some aspects of advanced interactive exploration are limited.
For example, Zoomdata dashboards have a global filter feature where users can apply a common
filter to all charts from the same data source, but not across disparate data sources. Common
exploration features (such as display as a percentage) are supported via a toggle out-of-the-box,
but the creation of custom groups and bins is not available within the visualization environment —
instead they must be defined in the self-service data preparation layer via custom calculations,
which limits seamless interactive analysis. Conditional formatting beyond that which can be
defined with calculations is not yet supported — more extensive and out-of-the-box conditional
formatting features are on the vendor's roadmap.

Context
This Critical Capabilities research evaluates products included in the 2017 "Magic Quadrant for
Business Intelligence and Analytics Platforms" on 15 critical capabilities in support of the five main
use cases for BI and analytics platforms.

Product/Service Class Definition


Gartner's view is that the market for modern BI and analytics platforms will remain one of the
fastest-growing software markets. The 15 critical capabilities defined in this research represent
mainstream buying requirements for customers to modernize their BI and analytic platforms.

The modern BI and analytics market grew 64% in 2015 on a constant currency basis, with a
projected growth rate of 30% in 2016 (see "Forecast: Enterprise Software Markets, Worldwide, 4Q16
Update" and "Forecast: Enterprise Software Markets, Worldwide, 2013-2020, 4Q16 Update" ).

Much of net new BI buying is driven by the following market activity:

Traditional BI and modern BI in a single platform: As traditional BI platform vendors initially lacked agile and data discovery capabilities, customers who were early to adopt data discovery typically augmented their traditional BI platforms with what were then specialty products. In the last two years, modern BI solutions from traditional BI vendors have matured to the point that it is possible to address both Mode 1 and Mode 2 requirements in a single platform. Conversely, modern BI platforms have continued to advance their capabilities in governance and report distribution. Customers would like to support both agile centralized BI provisioning (Mode 1) and governed data discovery or decentralized analytics (Mode 2) in a single platform. The degree to which this is possible (and "best in class" for both modes) is highly dependent on the particular vendor.
Expansion of data discovery dominates new investment: Continued investment in decentralized
analytics and large, governed data discovery deployments is expected to continue. Current IT-
centric vendors will continue to shift the focus of their new-product investments and platforms to
more-modern capabilities, with more-frequent releases. Data-discovery-oriented, modern BI
platforms will increasingly marginalize IT-authored static reporting approaches. IT-authored
system-of-record reporting will not disappear, but will gradually account for a smaller percentage
of overall analytics use. At the same time, a larger percentage of data discovery deployments will expand overall user adoption, and easy-to-use, centrally deployed BI platforms based on modern architectures and broader capabilities will be key drivers of market growth.

Self-service data preparation and enrichment address a high-value data discovery challenge:
The shift toward business-user-driven data discovery has highlighted the need to address the
significant challenges of data preparation to enable broader and more-governed use. Self-service data preparation capabilities are emerging that extend beyond the data mashup features of most current data discovery tools, which help users prepare their data for analysis but can be very time-consuming. Self-service data preparation platforms enable business users to reduce the
time and complexity of preparing data for analysis in a governed and reusable way. They feature
capabilities like visual data flow building and automation, semantic autodiscovery, intelligent joins,
intelligent profiling, hierarchy generation, data lineage, and data blending on varied data sources,
including multistructured data and enrichment. Many of these platforms also feature automated
machine-learning algorithms in the background that visually highlight the structure, distribution,
anomalies and repetitive patterns in data, with guided business-user-oriented tools to suggest
how to resolve issues and enhance data. The intent of these tools is to make the data integration
process accessible to business analysts — in addition to traditional IT users — to address the
ongoing and high-value problem of data preparation.

Smart data discovery will extend data discovery to a wider range of users and enhance insights
and interpretation: These emerging capabilities facilitate discovery of hidden patterns in large,
complex and increasingly multistructured datasets, without building models or writing algorithms
or queries. This goes beyond data discovery, because business users and business analysts can
benefit from advanced analytics (to highlight and visualize important findings, correlations,
clusters, predictions, outliers, anomalies, linkages or trends in data that are relevant to the user),
with user interaction and exploration via interactive visualizations, search and natural-language
query technologies. Some tools also interpret results for the user with natural-language
generation of text to highlight patterns and explain insights. This will also reduce the time to
insight, as well as the time and expertise needed for manual data exploration and modeling.
Smart data discovery does not replace advanced analytics or the data scientist; it complements them by adding a class of citizen data scientists who can develop hypotheses that can then be explored in more detail and validated by data scientists.

Marketplaces will extend capabilities and analytic maturity: A handful of vendors in this report
provide marketplaces through which developers can publish and sell extensions, data and applications that extend the out-of-the-box capabilities provided by the vendor. Algorithms
that detect patterns, recommend associated products, and optimize prices will be a point of
differentiation for digital businesses, and will become an important content type for these
marketplaces.
Cloud BI will continue to grow as data shifts to the cloud: Adoption intentions for cloud BI and
analytics have accelerated. About 51% of respondents to Gartner's companion Magic Quadrant survey (compared with 47% in 2016 and 41% in 2015) said they either have already deployed, or plan to deploy, their BI in a private, public or hybrid cloud during the next 12 months. New
capabilities are often delivered first and sometimes only via the cloud. Amazon's entrance with
QuickSight (released in 4Q16) could further accelerate cloud BI adoption. With increasing support
for hybrid cloud connectivity — which many BI vendors now support or have on their roadmaps —
customers have greater flexibility and a glide path to cloud BI. However, larger BI platform vendors
(such as IBM, Oracle, SAP or Microsoft) often rely predominantly (or entirely) on their own cloud
data centers, while other vendors give customers greater flexibility in cloud infrastructure.
Streaming data: Much of the last decade of analytics investment has centered on analyzing
internally generated transactional data to understand customer behavior and internal processes.
The next 10 years will be driven by investments in applications that use the Internet of Things
(IoT). The fastest-growing kinds of data will come from real-time event streams, sensors and
machine data, and events generated by devices. These new use cases, combined with insights
from other new (multistructured) data types (together with new types of analysis), will generate
the next major wave of analytics investment and business transformation. This will enable
companies that have historically competed on physical assets to compete on information assets.
Multistructured data analytics: Expanded investment in new types of analysis on a variety of
structured and unstructured data will deliver new insights that drive business value and
transformation. This may include external and public datasets, combined with internally generated
data. The desire to analyze information in relational data sources has expanded to include JSON,
personal data sources, Hadoop and NoSQL.
Embedded BI: Organizations will invest in embedding BI content (reports and dashboards),
interactive analysis, predictive and prescriptive analytics in applications and business processes
— all of which will deliver optimized recommendations and courses of action to nontraditional BI
users at the point of decision or action (increasingly mobile) — to further extend the
pervasiveness and benefits of BI and analytics.

Customer-facing analytics and data monetization: Companies will increasingly invest in capabilities that transform analytics from a cost center to a profit center as they find new ways to productize the data assets they have (or can assemble) to improve customer relationships, create new business models and generate new sources of revenue.

Collaboration and social capabilities: Together with the crowdsourcing and sharing of BI content and analysis, these may drive more pervasive use of, and higher business value from, BI investments.

Critical Capabilities Definition


Vendors are assessed according to the following 15 critical capabilities. Changes, additions and
deletions from last year's Critical Capabilities are listed in Note 1. Subcriteria and detailed
functionality requirements are included in a published RFP document (see "Toolkit: BI and Analytics
Platform RFP" (https://www.gartner.com/document/3537217?ref=sendres_email) ).

Admin, Security and Architecture


Capabilities that enable platform security, administering users, usage monitoring, auditing platform
access and utilization, optimizing performance and ensuring high availability and disaster recovery.
This also includes the ability to run on multiple operating systems.

Data Source Connectivity


Capabilities that allow users to connect to structured and unstructured data contained within
various types of storage platforms, including personal data sources, relational, NoSQL and direct
HDFS. The ability to access business applications and ERP systems is included.

Cloud BI
PaaS and SaaS for building, deploying and managing analytics and analytic applications in the cloud
based on data both in the cloud and with hybrid connectivity to on-premises data sources.
Marketplaces and prebuilt content to cloud-data sources are included.

Self-Contained ETL and Data Storage


Platform capabilities for accessing, integrating, transforming and loading data into a self-contained
performance engine with the ability to index data and manage data loads and refresh scheduling of
loaded data.

Self-Service Data Preparation


Drag-and-drop cleansing, modeling, and blending of multiple data sources and creation of analytic
models.

Advanced capabilities include machine-learning-enabled semantic autodiscovery, intelligent joins, intelligent profiling, hierarchy generation, data lineage and data blending on varied data sources, including multistructured data.
Metadata Management
Tools for enabling users to leverage the same system-of-record semantic model or for creating a
semantic model automatically.
Modelers should be able to search, capture, store, reuse and publish metadata objects such as
dimensions, hierarchies and measures, as well as conduct impact analysis for changed objects.

Embedded Advanced Analytics


Enables users to easily access advanced analytics capabilities that are self-contained within the
platform itself, or through the import and integration of externally developed models.

Smart Data Discovery


Automatically visualizes the most important findings such as correlations, exceptions, clusters, links
and predictions in data that are relevant to users without requiring them to build models or write
algorithms.

Users explore data via visualizations, autogenerated voice or text narration, search, and natural-
language query technologies. Forecasting and clustering should be menu-driven. Support for
advanced visualizations such as decision trees should be out-of-the-box.

Interactive Visual Exploration


Enables the exploration of data via an array of visualization options that go beyond basic pie, bar and line charts to include trellis displays, heat maps, tree maps, scatter plots and other special-purpose visuals.

These tools enable users to analyze and manipulate the data by interacting directly with a visual representation of it, for example to display values as percentages, bins and groups.
Analytic Dashboards
The ability to create highly interactive dashboards and content with visual exploration and
embedded advanced and geospatial analytics to be consumed by others. Support for offline
dashboards should be included.

Mobile Exploration and Authoring


Enables organizations to develop and deliver content to mobile devices in a publishing and/or
interactive mode.

Takes advantage of the native capabilities of mobile devices, such as touchscreen, camera and
location awareness. Device-based security and integration with third-party MDM solutions should be
supported.

Embed Analytic Content


Capabilities including a software developer's kit with APIs and support for open standards for
creating and modifying analytic content and visualizations, and embedding them into a business
process, and/or an application or portal.

Publish, Share and Collaborate


Capabilities that allow users to publish, deploy and operationalize analytic content through various
output types and distribution methods with support for content search, scheduling and alerts.
Enables users to share and rate content via discussion threads, chat and storyboards.

Platform and Workflow Integration


This capability considers the degree to which the platform's capabilities are offered in a single, seamless product rather than across multiple products with little integration.

Ease of Use and Visual Appeal


Ease of use to administer and deploy the platform, create content, consume and interact with
content as well as the visual appeal.

Use Cases
Agile Centralized BI Provisioning
Supports an agile, IT-enabled workflow from data to centrally delivered and managed analytic content, using the self-contained data management capabilities of the platform.

Agile centralized BI provisioning enables an information consumer to access their KPIs from an
information portal — whether on a mobile device or embedded in an analytic application — to
monitor and measure the performance of the business. In a modern BI and analytics platform,
interactivity is often supported out-of-the-box and automatically. This is in contrast to traditional
reporting-based platforms in which interactivity is limited to what is designed in by the content
author and in which a data warehouse must first be built.
The highest-weighted capabilities in this use case include:

Admin, security and architecture

Self-contained ETL and data storage


Metadata management
Analytic dashboards

Mobile exploration and authoring

Publish, share and collaborate


Ease of use and visual appeal

Decentralized Analytics
Supports a workflow from data to self-service analytics for individual business units and users.
On the analytics spectrum, users of platforms that excel at the decentralized analytics use case can
explore data using highly interactive descriptive analytics ("what happened" or "what is happening")
or diagnostic analytics ("Why did something happen?" "Where are areas of opportunity or risk?" or
"What if?").
Increasingly, because of the embedded advanced analytics offered by many vendors, users can
extend their analysis to some advanced descriptive analysis (for example, clustering, segmenting
and correlations) and to a basic level of predictive analytics (for example, forecasting and trends).
They can also prepare their own data for analysis, reducing their reliance on IT and time to insight.
As decentralized analytics becomes more pervasive, the risk of multiple sources of the truth becomes a concern, and decentralized analytics may evolve into governed data discovery over time as a deployment grows.

The highest weighted capabilities in this use case include:


Data source connectivity

Self-contained ETL and data storage

Self-service data preparation


Embedded advanced analytics

Interactive visual exploration

Analytic dashboards
Ease of use and visual appeal

Governed Data Discovery


Supports a workflow from data to self-service analytics to system-of-record, IT-managed content
with governance, reusability and promotability of user-generated content.

Capabilities that govern, promote and widely share content are what most differentiate governed
data discovery from decentralized analytics. With the success of data discovery tools in driving
business value, many organizations would increasingly like to use data discovery capabilities for a
broader range of analysis and an expanded set of users than was previously addressed by IT-centric
enterprise reporting platforms. Governed data discovery enables users to access, blend and prepare
data, then visually explore, find and share patterns with minimal IT support, or technical and
statistical skills. At the same time, it must also satisfy enterprise IT requirements for business-user-
generated model promotability, data reuse and governance. In particular, users should be able to
reuse sanctioned business-user-created data or datasets, derived relationships, derived business
models, derived KPIs, and metrics that support analyses.
Governed data discovery enables pervasive deployment of data discovery in the enterprise at scale
without proliferating data discovery sprawl. The expanded adoption of data discovery also requires
BI and analytics leaders to redesign BI and analytics deployment models and practices, moving
from an IT-centric to an agile and decentralized (yet governed and managed) approach. This would
include putting in place a "prototype, pilot and production" process in which user-generated content
is created as a preliminary model. Content that proves useful for recurring analysis is promoted to a pilot phase, and some of it is then promoted to production and operationalized as part of the system of record. Alternatively, governance can be implemented after broad sharing of content, with centralized experts proactively monitoring usage.
The highest weighted features in this use case include:

Admin, security and architecture

Data source connectivity


Self-contained ETL and data storage

Metadata management
Interactive visual exploration

Ease of use and visual appeal

OEM or Embedded BI
These capabilities are used to create and modify analytic content, visualizations and applications
and embed them into a business process, and/or an application or portal.

They support a workflow from data to embedded BI content in a process or application, as well as
extending out-of-the-box capabilities. They can reside outside the application, reusing the analytic
infrastructure, but must be easily and seamlessly accessible from inside the application, without
forcing users to switch between systems. The ability to integrate BI and analytics with the
application architecture will enable users to choose where in the business process the analytics
should be embedded.
The highest-weighted capabilities in this use case include:

Admin, security and architecture

Data source connectivity


Analytic dashboards

Embed analytic content

Publish, share and collaborate


Extranet Deployment
Supports a workflow similar to agile centralized BI provisioning for the external customer or, in the
public sector, citizen access to analytic content.

In addition, capabilities for embedding and cloud deployment are typically required for extranet
deployments.

The highest-weighted capabilities in this use case include:


Admin, security and architecture

Cloud BI

Metadata management
Analytic dashboards

Embed analytic content

Vendors Added and Dropped


Added
ThoughtSpot, Datameer, Oracle and Zoomdata were added to the Magic Quadrant, and hence to this Critical Capabilities research, this year. They met all of the inclusion criteria and were ranked in the top 24 of assessed vendors, based on an evaluation of their modern BI product offerings against the current set of critical capabilities and other inclusion metrics defined for the Magic Quadrant.
Dropped
Platfora was acquired by Workday and is no longer being sold as a stand-alone BI platform.

BeyondCore was acquired by Salesforce and is included in the Salesforce assessment.


Datawatch and GoodData were excluded because they shifted their market emphasis.

Inclusion Criteria
Vendors included in this research also appear in the 2017 "Magic Quadrant for Business Intelligence
and Analytics Platforms."
The number of vendors on this year's Magic Quadrant was limited to 24. We ranked vendors that
met all the inclusion criteria below.

Modern BI and Analytics Platform Assessment


This was evaluated by Gartner analysts and was determined by the extent of IT involvement considered mandatory before the platform can be used by a business analyst or information worker to analyze data without IT assistance. Products that require significant IT involvement (internal or external to the platform) to load and model data, create a semantic layer or build data structures as a prerequisite to using the BI platform, or that are IT-developer-centric platforms focused on building analytic applications, do not meet the criteria of a modern BI and analytics platform and were not evaluated further for inclusion. Products that met the modern criteria were evaluated for inclusion in the Magic Quadrant based on a funnel methodology in which the requirements of each tier must be met in order to progress to the next tier. Tiers 1 to 3 are evaluated at the vendor level; Tiers 4 and 5 are evaluated at the product level.
Vendor-Level Criteria

Tier 1. Market Presence — A composite metric, assessing the interest of both Gartner's client base and the broader market through internet search volume, job postings and trend analysis, was evaluated for each vendor.
Tier 2. Revenue* — For those vendors meeting the market presence criterion (Tier 1), BI and
analytics revenue for each vendor was assessed and evaluated. For this assessment, two common
license models were assessed and revenue from each was combined (if applicable) and evaluated
against the three revenue inclusion levels (shown below) for qualification:

1. Perpetual License Model — Software license, maintenance and upgrade revenue (excluding
hardware and services) for calendar years 2014, 2015 and 2016 (estimated).

2. SaaS Subscription Model — Annual contract value (ACV) for year-ends 2014, 2015 and
projected ACV for year-end 2016, excluding any services included in annual contract. For
multiyear contracts, only the contract value for the first 12 months should be used for this
calculation.
Revenue inclusion levels are as follows:

$25 million 2016 (estimated) combined perpetual license revenue + 2016 (estimated) ACV, or

$15 million 2016 (estimated) combined perpetual license revenue + 2016 (estimated) ACV with
50% year-over-year growth, or
$10 million 2016 (estimated) combined perpetual license revenue + 2016 (estimated) ACV with
100% year-over-year growth

* Gartner defines total software revenue as revenue that is generated from appliances, new licenses,
updates, subscriptions and hosting, technical support, and maintenance. Professional services
revenue and hardware revenue are not included in total software revenue (see "Market Share
Analysis: Business Intelligence and Analytics Software, 2015" ).
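Read together, the three revenue inclusion levels amount to a simple threshold-and-growth check. The sketch below is illustrative only: the function name is hypothetical, and the year-over-year growth interpretation (2015 to 2016, on the combined perpetual-license-plus-ACV figure) is our assumption, not Gartner's published calculation.

```python
def meets_revenue_inclusion(revenue_2016_m: float, revenue_2015_m: float) -> bool:
    """Check the three 2016 revenue inclusion levels.

    Arguments are combined perpetual license revenue + ACV, in $ millions.
    Growth is computed year over year from 2015 to 2016 (an assumption).
    """
    if revenue_2016_m >= 25:  # Level 1: $25M, no growth requirement
        return True
    if revenue_2015_m <= 0:
        return False  # cannot establish a growth rate without a 2015 base
    growth = (revenue_2016_m - revenue_2015_m) / revenue_2015_m
    if revenue_2016_m >= 15 and growth >= 0.50:  # Level 2: $15M + 50% YoY
        return True
    if revenue_2016_m >= 10 and growth >= 1.00:  # Level 3: $10M + 100% YoY
        return True
    return False
```

For example, a vendor with $12M in 2016 qualifies only if it at least doubled from 2015 ($5M would qualify; $9M would not).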

Tier 3. Magic Quadrant Process Participation — Participation in the Magic Quadrant process
requires the following input:

Completing and providing documentation for an RFP-style questionnaire of detailed critical capabilities.
Completing an online questionnaire around market presence, growth, go-to-market strategy and
differentiation.

Submission of a video up to one hour long that demonstrates how the included products deliver on the predefined analytic scenarios defined by Gartner (we only review the first hour; anything beyond that is not considered).

Verification of final BI and analytics revenue for 2014, 2015 and 2016 (estimated).
Providing references for an online customer and OEM survey.
Providing a vendor briefing to the Magic Quadrant authors.

Providing access to evaluation software.

Providing factual review of sections in the Magic Quadrant research.


Product-Level Criteria

Tier 4. Breadth of Coverage — The vendor must demonstrate breadth across vertical industries and
geographic regions, as specified by Gartner.
Tier 5. Product Assessment — Products that progressed to this final tier were assessed by Gartner
analysts using the information provided by each vendor in the data collection exercise outlined
above. The final step involved narrowing down the field to 24 vendors for inclusion in the Magic
Quadrant.
Gartner has full discretion to include a vendor in the Magic Quadrant, regardless of its level of participation in the Magic Quadrant process, if the vendor is deemed important to the market. This discretion was not applied this year, as all vendors fully participated in the process.
Table 1. Weighting for Critical Capabilities in Use Cases

| Critical Capabilities | Agile Centralized BI Provisioning | Decentralized Analytics | Governed Data Discovery | OEM or Embedded BI | Extranet Deployment |
|---|---|---|---|---|---|
| Admin, Security and Architecture | 10% | 5% | 10% | 10% | 10% |
| Data Source Connectivity | 5% | 10% | 10% | 15% | 0% |
| Cloud BI | 0% | 5% | 5% | 0% | 25% |
| Self-Contained ETL and Data Storage | 10% | 10% | 10% | 5% | 5% |
| Self-Service Data Preparation | 0% | 13% | 8% | 5% | 0% |
| Metadata Management | 20% | 0% | 10% | 0% | 10% |
| Embedded Advanced Analytics | 0% | 10% | 5% | 0% | 0% |
| Smart Data Discovery | 0% | 5% | 5% | 0% | 0% |
| Interactive Visual Exploration | 0% | 15% | 10% | 0% | 5% |
| Analytic Dashboards | 15% | 10% | 5% | 10% | 10% |
| Mobile Exploration and Authoring | 10% | 0% | 5% | 0% | 5% |
| Embed Analytic Content | 0% | 0% | 0% | 40% | 25% |
| Publish, Share and Collaborate | 15% | 5% | 5% | 10% | 0% |
| Platform and Workflow Integration | 5% | 2% | 2% | 0% | 0% |
| Ease of Use and Visual Appeal | 10% | 10% | 10% | 5% | 5% |
| Total | 100% | 100% | 100% | 100% | 100% |

As of January 2017

Source: Gartner (March 2017)


This methodology requires analysts to identify the critical capabilities for a class of products/services. Each capability is then weighted in terms of its relative importance for specific product/service use cases.

Critical Capabilities Rating


Table 2 shows the product/service scores for each use case. The scores, which are generated by
multiplying the use-case weightings by the product/service ratings, summarize how well the critical
capabilities are met for each use case.

Table 2. Product Score in Use Cases

| Use Cases | Agile Centralized BI Provisioning | Decentralized Analytics | Governed Data Discovery | OEM or Embedded BI | Extranet Deployment |
|---|---|---|---|---|---|
| Alteryx | 2.82 | 3.11 | 2.98 | 2.69 | 2.59 |
| Birst | 3.80 | 3.31 | 3.59 | 4.15 | 4.18 |
| Board International | 3.24 | 3.22 | 3.27 | 3.05 | 2.97 |
| ClearStory Data | 3.86 | 3.62 | 3.77 | 4.04 | 3.84 |
| Datameer | 2.74 | 2.84 | 2.86 | 3.15 | 2.92 |
| Domo | 3.30 | 3.12 | 3.19 | 3.31 | 3.29 |
| IBM (Cognos Analytics) | 2.82 | 2.71 | 2.81 | 2.56 | 2.69 |
| IBM (Watson Analytics) | 2.47 | 2.69 | 2.64 | 2.32 | 2.55 |
| Information Builders | 3.55 | 3.14 | 3.36 | 3.93 | 3.58 |
| Logi Analytics | 3.42 | 3.62 | 3.42 | 4.14 | 3.51 |
| Microsoft | 3.32 | 3.46 | 3.54 | 3.54 | 3.56 |
| MicroStrategy | 3.97 | 3.65 | 3.83 | 3.59 | 3.56 |
| Oracle | 3.17 | 3.05 | 3.15 | 3.65 | 3.62 |
| Pentaho | 3.20 | 3.14 | 3.19 | 3.81 | 3.34 |
| Pyramid Analytics | 3.38 | 3.37 | 3.32 | 3.52 | 3.37 |
| Qlik | 3.36 | 3.29 | 3.34 | 3.86 | 3.63 |
| Salesforce | 3.42 | 3.27 | 3.39 | 3.87 | 3.95 |
| SAP (BusinessObjects Cloud) | 2.75 | 2.92 | 2.79 | 2.38 | 2.50 |
| SAP (BusinessObjects Lumira) | 2.96 | 2.81 | 2.94 | 2.99 | 2.94 |
| SAS | 3.38 | 3.53 | 3.48 | 2.94 | 3.09 |
| Sisense | 3.46 | 3.25 | 3.35 | 3.54 | 3.39 |
| Tableau | 3.53 | 3.55 | 3.52 | 3.32 | 3.37 |
| ThoughtSpot | 3.30 | 2.82 | 3.05 | 2.61 | 2.79 |
| TIBCO Software | 3.62 | 3.71 | 3.63 | 4.06 | 3.76 |
| Yellowfin | 3.61 | 3.17 | 3.25 | 3.59 | 3.34 |
| Zoomdata | 2.81 | 2.77 | 2.77 | 3.36 | 3.08 |

Source: Gartner (March 2017)


Acronym Key and Glossary Terms
ACV annual contract value

AWS Amazon Web Services

BI business intelligence

ETL extraction, transformation and loading

HDFS Hadoop Distributed File System

IoT Internet of Things

KPI key performance indicator

PaaS platform as a service

SaaS software as a service

Evidence
Gartner's analysis, the ratings and commentary in this report are based on a number of sources
including:

Customer perceptions of each vendor's strengths and challenges (as gleaned from their BI-related
inquiries to Gartner).

An online survey of vendors' reference customers (which was conducted during October 2016 and
yielded 1,931 responses).

A questionnaire completed by the vendors.

Vendor briefings (including product demonstrations, strategy and operations).


An extensive RFP questionnaire inquiring about how each vendor delivers the specific features
that make up our 15 critical capabilities (see "Toolkit: BI and Analytics Platform RFP" ).

A prepared video demonstration of how well vendor BI platforms address specific functionality
requirements across the critical capabilities.

Access to evaluation software from each vendor.

Rankings refer to where a product or vendor is positioned relative to other vendors based on a
combination of customer reference survey and analyst opinion. Ratings refer to capability scores in
Figure 6.

Critical Capabilities Methodology


This methodology requires analysts to identify the critical capabilities for a class of products or
services. Each capability is then weighted in terms of its relative importance for specific product or
service use cases. Next, products/services are rated in terms of how well they achieve each of the
critical capabilities. A score that summarizes how well they meet the critical capabilities for each
use case is then calculated for each product/service.

"Critical capabilities" are attributes that differentiate products/services in a class in terms of their
quality and performance. Gartner recommends that users consider the set of critical capabilities as
some of the most important criteria for acquisition decisions.

In defining the product/service category for evaluation, the analyst first identifies the leading uses
for the products/services in this market. What needs are end-users looking to fulfill, when
considering products/services in this market? Use cases should match common client deployment
scenarios. These distinct client scenarios define the Use Cases.

The analyst then identifies the critical capabilities. These capabilities are generalized groups of
features commonly required by this class of products/services. Each capability is assigned a level
of importance in fulfilling that particular need; some sets of features are more important than
others, depending on the use case being evaluated.

Each vendor’s product or service is evaluated in terms of how well it delivers each capability, on a
five-point scale. These ratings are displayed side-by-side for all vendors, allowing easy comparisons
between the different sets of features.

Ratings and summary scores range from 1.0 to 5.0:


1 = Poor or Absent: most or all defined requirements for a capability are not achieved

2 = Fair: some requirements are not achieved

3 = Good: meets requirements


4 = Excellent: meets or exceeds some requirements

5 = Outstanding: significantly exceeds requirements

To determine an overall score for each product in the use cases, the product ratings are multiplied
by the weightings to come up with the product score in use cases.
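The scoring arithmetic described above can be sketched as a weighted sum. The capability names, ratings and weights below are illustrative only, not actual Gartner data, and the function name is our own:

```python
def use_case_score(ratings, weights):
    """Compute the weighted score for one use case.

    ratings: capability name -> product rating on the 1.0-5.0 scale
    weights: capability name -> use-case weighting (fractions summing to 1.0)
    """
    # The weightings in each use-case column of Table 1 total 100%.
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weightings must total 100%"
    return round(sum(ratings[cap] * w for cap, w in weights.items()), 2)

# Illustrative two-capability example: a rating of 4.0 weighted at 25%
# plus a rating of 3.0 weighted at 75%.
ratings = {"Cloud BI": 4.0, "Embed Analytic Content": 3.0}
weights = {"Cloud BI": 0.25, "Embed Analytic Content": 0.75}
print(use_case_score(ratings, weights))  # 3.25
```

Capabilities weighted at 0% for a use case simply contribute nothing to that use case's score.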
The critical capabilities Gartner has selected do not represent all capabilities for any product and, therefore, may not represent those most important for a specific use situation or business objective.
Clients should use a critical capabilities analysis as one of several sources of input about a product
before making a product/service decision.
© 2017 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. or its
affiliates. This publication may not be reproduced or distributed in any form without Gartner's prior written
permission. If you are authorized to access this publication, your use of it is subject to the Usage Guidelines for
Gartner Services (/technology/about/policies/usage_guidelines.jsp) posted on gartner.com. The information
contained in this publication has been obtained from sources believed to be reliable. Gartner disclaims all
warranties as to the accuracy, completeness or adequacy of such information and shall have no liability for errors,
omissions or inadequacies in such information. This publication consists of the opinions of Gartner's research
organization and should not be construed as statements of fact. The opinions expressed herein are subject to
change without notice. Gartner provides information technology research and advisory services to a wide range of
technology consumers, manufacturers and sellers, and may have client relationships with, and derive revenues
from, companies discussed herein. Although Gartner research may include a discussion of related legal issues,
Gartner does not provide legal advice or services and its research should not be construed or used as such.
Gartner is a public company, and its shareholders may include firms and funds that have financial interests in
entities covered in Gartner research. Gartner's Board of Directors may include senior managers of these firms or
funds. Gartner research is produced independently by its research organization without input or influence from
these firms, funds or their managers. For further information on the independence and integrity of Gartner
research, see "Guiding Principles on Independence and Objectivity" (/technology/about/ombudsman/omb_guide2.jsp).
