
Information Management

Visualizing success
Published in Information Management Magazine, July/August 2011

Analytic User Interfaces That Drive Business


by David Steier

We hear it many times after disaster strikes: the warning signs were there, but the decision-maker didn't see them. Or, sophisticated technology had been installed, but no one had properly learned to use it. Such failings are often chalked up to user error or problems between keyboard and chair. In catastrophic failures that have involved analytics, we've come to see that sophisticated models won't do us much good unless decision-makers are able to interpret, understand and act on the results appropriately.

Bad outcomes related to analytic models often arise because designers have not considered how people interact with the information being delivered. Designers might be too enamored with the power of their model, its options and parameters. They may have forgotten that users are pressed for time and need to focus on daily duties. As a result, users might not be able to answer basic questions such as, "What information should I pay attention to?" and, "Now that I've seen this information, what should I do?" If users can't easily answer these questions, the interface and the analytics are plainly not meeting their needs. A more user-centric analytics design approach is needed.

What users need

Good analytic interfaces facilitate what the military calls situational awareness: a grasp of the events relevant to a given situation, how those events relate to each other and an ability to project the effects those events will have on the situation. A situationally aware pilot uses the information presented in the cockpit to determine what course of action might be needed to respond to a situation.

The analytics interface is the decision-maker's cockpit. Well-designed analytics cockpits are built with characteristics that promote situational awareness, including:

Role-based design. A telecommunications CEO makes decisions unlike those made in the company's network operations center, so role-based interfaces to their analytics need to support different types of decisions. Where interfaces are tailored to roles, users won't be asking, "What does that thing do?" or "Why do I care?"

Less is more. Good analytics interfaces show the information most critical to the user - not every piece of information that might be available for analysis.

Sensory cues direct attention. Good interfaces exploit people's abilities to perceive patterns based on position, size, shape, color and movement. These properties highlight important features that might otherwise be lost in a table of numbers.

Interfaces suggest actions. Analytic dashboards alert users to potential performance issues and provide actionable information. Good interfaces provide context to interpret results, suggest what the user might do next and offer mechanisms such as clickthrough to facilitate explanation and further analysis.

Designing user-centered analytics

Just as cockpits could not be designed without understanding what pilots need in order to fly an airplane, analytic interfaces should be driven by an understanding of what users will do with the results. Here are some principles for obtaining that understanding and designing interfaces accordingly.

Let the users lead

User-centric analytics follows the approach of other user-centric design: start from user needs and work backward, first to the interface that supports those needs and ultimately to the analytics that will drive that interface. Even when users cannot specify in advance what they really want, it is critical to involve them early and often as analytic interfaces are designed. Users are likely to feel about interfaces the same way Supreme Court Justice Potter Stewart described obscenity: they can't define it, but they know it when they see it. Users are even better gauges of bad interfaces; if enough users believe an interface is unsatisfactory, the designer is well-advised to accept their judgment.

Beyond consulting users, analytics designers should consider user preferences even in choosing basic analytic approaches. Users who are comfortable adopting an analytic approach are likely to be confident that they can explain and defend the analytics (that it is not a black box). Simple models have a greater chance of adoption than complex ones, and linear models have a greater chance of adoption than nonlinear ones, even at the sacrifice of some accuracy. If you are contemplating giving users the ability to set modeling parameters, determine whether they want to set those parameters and whether they know how to do so (or at least give them default values).

Other benefits arise when users are involved in the design of analytic interfaces. They can help identify early wins the designer may not have thought of and can provide useful introductions to other potential users and their communities. A user who feels a sense of ownership in the interface design can become an advocate for the technology respected by other users. Users of different abilities may point out accessibility considerations, such as how and when color is used, so that color-blind users get the same information from the intensity of the display. It is always best to avoid relying on a single user for design input. Vet judgments and suggestions with several other users to be sure that the input is representative of the intended user population.

A picture is worth a thousand numbers

The right visual display can make it much easier to understand the results of complex analytics and increase user adoption. Because of the human ability to understand relationships quickly based on size, position and other spatial attributes, the eye can summarize what might otherwise require thousands of numbers to convey. As an example, Figure 1 represents an analytics interface at a large consumer products company. It shows the effectiveness of trade promotion investments in distributors of products offered by the company. Each dot represents a distributor, with the horizontal axis showing the amount of the investment in rebates offered through that distributor and the vertical axis showing the profit or loss from that investment. The vast majority of investments are small, with correspondingly small profits or losses.

The marketing manager wants to track performance averages but is more interested in the outliers. The circles highlight several such outlier groups. The one in the upper left has a low investment but an abnormally high return. The one in the upper right has a high investment with a high return, while the cluster toward the bottom of the chart represents high investments with abnormally low returns. Each cluster represents an opportunity to improve investment effectiveness. We can consider this a use of analytics rather than a simple data plot because the outliers were selected from thousands of possibilities and data attributes (not all shown on the graph), and the circles were drawn automatically based on a sophisticated statistical clustering method. Automatic clustering saves weeks of work, but it is not important for the marketing manager to know the details. What he or she needs to know in order to figure out what to do next is the size of the outlier investments and returns. Surely, the manager could have inferred this with a spreadsheet and enough time, but this display communicates the findings at a glance. It is easy to see the number of outliers in each cluster, how the clusters relate to each other and the magnitude of the problem or opportunity.
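The article does not say which clustering method produced those circles, so the sketch below is only illustrative: it uses scikit-learn's DBSCAN (an assumption, not the firm's actual approach) with invented distributor data to flag outlying investment-versus-return points and ring them on the scatter plot, so the viewer sees the clusters rather than the algorithm.

```python
# Hedged sketch: flag outlier points on an investment-vs-return scatter.
# DBSCAN and the synthetic data are illustrative assumptions, not the
# method or figures behind the article's Figure 1.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Most distributors: small investments, small profits or losses.
invest = rng.gamma(shape=2.0, scale=10_000, size=500)
profit = 0.05 * invest + rng.normal(0, 2_000, size=500)
# A handful of outliers: high investment, abnormally low return.
invest = np.append(invest, rng.uniform(150_000, 200_000, size=8))
profit = np.append(profit, rng.uniform(-60_000, -30_000, size=8))

X = StandardScaler().fit_transform(np.column_stack([invest, profit]))
labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(X)  # -1 marks noise/outliers

fig, ax = plt.subplots()
ax.scatter(invest, profit, s=10, c="steelblue", alpha=0.5)
outliers = labels == -1
# Ring the outlying points so the eye lands on them first.
ax.scatter(invest[outliers], profit[outliers],
           s=200, facecolors="none", edgecolors="red", linewidths=1.5)
ax.set_xlabel("Trade promotion investment ($)")
ax.set_ylabel("Profit or loss from investment ($)")
ax.set_title("Distributor investments with outliers highlighted")
plt.show()
```

The point of the sketch is the division of labor the article describes: the clustering runs behind the scenes, and the manager sees only the circled groups whose size and position tell them where to act.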

From analytics to action

An analytics interface may be visually appealing, but if it doesn't stimulate action, it's not going to be very effective. Good interfaces provide the context to let the user know when action might be required. Consider the treemap displayed in Figure 2, drawn from a health care organization that operates a number of hospitals. Each rectangle represents emergency room visits to one hospital; the larger the rectangle, the higher the number of ER visits. The color represents the number of visits relative to the forecasted number of ER visits. Red means the actual number of visits was higher than forecast, while green means the actual number of visits was lower than forecast. In both red and green cases, there is a problem with the forecasts.

In the red case, poor forecasting might have caused a shortage in supplies or staff, while in the green case, poor forecasting might have caused overstaffing (and expensive idle capacity). The larger the rectangle, the bigger the potential impact of forecasting errors on the hospital chain's overall results. The interface combines color and size to draw the eye to the hospitals or regions with the biggest forecasting issues, and it invites the user to learn more about those hospitals.
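As a rough sketch of this kind of display (the hospitals, numbers and tooling are assumptions; the article does not say how Figure 2 was built), Plotly Express can size treemap tiles by ER visit volume and color them by the gap between actual and forecast visits, red above forecast and green below:

```python
# Hedged sketch of a Figure 2-style treemap: tile size = ER visits,
# tile color = actual minus forecast visits (red = above forecast,
# green = below). Hospital names and numbers are invented.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "hospital": ["Mercy", "St. Luke's", "Riverside", "Northgate", "Lakeview"],
    "actual_visits":   [12400, 8800, 15200, 4300, 6900],
    "forecast_visits": [11000, 9600, 15100, 5200, 6850],
})
df["variance"] = df["actual_visits"] - df["forecast_visits"]

fig = px.treemap(
    df,
    path=["hospital"],             # one tile per hospital
    values="actual_visits",        # bigger tile = more ER visits
    color="variance",              # color encodes forecast error
    color_continuous_scale=["green", "white", "red"],
    color_continuous_midpoint=0,   # white where actual matches forecast
)
fig.update_layout(title="ER visits vs. forecast by hospital")
fig.show()
```

A treemap built this way also shows per-tile detail on hover by default, which mirrors the drill-down behavior described next: detail appears only when the user asks for it.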

While not demonstrated here, this interface actually goes further in guiding the next action. Hovering over a rectangle brings up more detail about the types of ER visits and the forecasts for each. If this detail were provided in the initial display, including for the hospitals where the lack of red or green indicates good forecasting and no need for action, the display would be cluttered with unactionable information. A good analytics interface presents information only when the user can act in response.

Relevancy follows role

While scores of workers from the C-suite to the contact center might benefit from access to analytics, different interfaces will be required to serve different job roles. This is illustrated with a final example in Figure 3.

This dashboard was created for an internal auditor to track requisitions, invoices and payments and identify anomalies in timing and volumes. Each element in the dashboard is relevant to the task. The top bar chart shows the number of requisitions over time and uses color to emphasize the months with higher dollar amounts. The middle chart profiles requisitions across the days within a chosen month, highlighting weekends and showing the lag between requisition and purchase order. The treemap at the bottom uses size to show the number of requisitions from each department and color to indicate dollar value. Filters on the right can be used to focus on particular ranges of requisition or purchase order numbers.

The interface also illustrates the application of principles for good visual design. Displays of related information are horizontally and vertically aligned so the eye can see patterns across related variables (and they do not have unintended alignments that suggest misleading or irrelevant comparisons). Color serves to highlight exceptions, not to enliven a dull dashboard. Analytic results are not presented to 10 decimal places when the user does not need such precision to make a decision. The displays have a high data-ink ratio, following Yale professor Edward Tufte's principles for designing statistical graphics. Good interfaces avoid 3-D effects and ornate gauge designs when simple numbers, charts and graphs will do.
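To make the data-ink idea concrete, here is a minimal matplotlib sketch (the monthly requisition figures and the review threshold are invented) of the kind of plain bar chart such a dashboard favors: no 3-D effects or gauges, spines and gridlines stripped away, and color reserved for the exceptions that merit attention.

```python
# Hedged sketch of a high data-ink-ratio bar chart: flat bars, minimal
# chrome, color used only to flag exceptions. The requisition dollar
# amounts are invented for illustration.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
requisition_dollars = [310_000, 295_000, 612_000, 280_000, 540_000, 305_000]
threshold = 500_000  # hypothetical review threshold

# Gray for routine months, a single accent color for exceptions.
colors = ["tomato" if v > threshold else "lightgray" for v in requisition_dollars]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(months, requisition_dollars, color=colors)

# Strip non-data ink: no top/right/left spines, no y-axis ticks.
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)
ax.tick_params(left=False)
ax.set_yticks([])

# Label bars directly instead of forcing the eye back to an axis.
for i, v in enumerate(requisition_dollars):
    ax.text(i, v, f"${v/1000:.0f}K", ha="center", va="bottom", fontsize=8)

ax.set_title("Monthly requisition value (exceptions in color)")
plt.tight_layout()
plt.show()
```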

The payoff for getting it right

Analytics have enormous potential to improve business decision-making. They can improve operational efficiency, identify opportunities for growth, and identify and mitigate risks. They are the keys to increasing profitability and shareholder value. But to benefit from analytics, interfaces must be designed that improve situational awareness. Following the suggested design principles can facilitate the creation of interfaces that help users get the information they need, when they need it, to make faster and smarter decisions.

Dr. David Steier is a director in Deloitte LLP, where he co-leads a group in the U.S. Deloitte Analytics Institute focused on advanced analytics and visualization. Prior to joining Deloitte, Steier worked at PwC, where he served as director of the Center for Advanced Research, and previously as a managing director at Scient.

This publication contains general information only and is based on the experiences and research of Deloitte practitioners. Deloitte is not, by means of this publication, rendering business, financial, investment, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor. Deloitte, its affiliates, and related entities shall not be responsible for any loss sustained by any person who relies on this publication. Copyright 2011 Deloitte Development LLC. All rights reserved. Member of Deloitte Touche Tohmatsu Limited
