Assessing the Effectiveness of a New Dashboard’s Design
Even a brilliantly designed dashboard can be met with disapproval by those who will use it if we’re not careful to introduce it in a way that encourages them to focus on what matters. Designs that are effective for monitoring information are quite different from the designs that are usually featured by software vendors and thus emulated by those who use their products. As a result, what people expect of a dashboard’s design is often quite different from what they actually need.
I’ve been asked on several occasions to provide guidelines for dashboard designers to use when introducing a new dashboard. These requests have encouraged me to create a list of questions that the users of the proposed dashboard can be asked to help them assess the merits of its design. I’m sure that this list of questions that I’ve put together in the last few days can be improved with your help, so I’d appreciate it if you would review the following and suggest anything that comes to mind that might improve it. Please keep in mind that I define “dashboard” in a particular way. Here’s my definition:
A dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.
(Information Dashboard Design, Stephen Few, O’Reilly Media, 2006)
The key to this definition is the fact that a dashboard is used for monitoring purposes. Its effectiveness should be judged on the basis of its ability to help people monitor what’s going on; that is, to maintain situation awareness.
When asking people to assess the merits of a new dashboard, it usually works best to focus their attention first on the big picture (the dashboard as a whole) and then to drill into the details of each section.
The Dashboard as a Whole
- When you first look at the dashboard, where are your eyes drawn? Are your eyes drawn most to the items that deserve the most attention?
- Can you easily discern how information is organized on the dashboard (for instance, the different sections)?
- Can you easily spot the items that require the most attention?
- Does the dashboard draw your attention to the information rather than to other things that don’t actually convey information?
- Is the information that you consider most important featured prominently on the dashboard?
- Can you quickly scan the dashboard to get an understanding of what’s going on?
- Can you tell the date/time through which the data is effective (for example, as of the end of yesterday or as of five minutes ago)? (See the sketch after this list.)
- Can you easily compare items and see relationships between items in all cases when that is useful?
- If it works best to get the information in a particular sequence, does the design encourage you to view it in this way and make it easy to do so?
- Does the dashboard provide everything you need to maintain overall situation awareness (the big picture of what’s going on)?
- Can you see everything that you need to construct an overview of what’s going on without having to scroll or change screens?
- Is there anything on the dashboard that you don’t understand? Do you find anything confusing?
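Where the dashboard shows a “data as of” time, as asked about above, the label can be rendered very simply. A tiny sketch in Python; the timestamp and format are hypothetical, not from the post:

```python
# A minimal sketch of a "data as of" freshness label (hypothetical
# timestamp and format; not from the original post).
from datetime import datetime

last_refresh = datetime(2024, 5, 1, 9, 55)  # when the data was last loaded
print(f"Data as of {last_refresh:%b %d, %Y %I:%M %p}")
# -> Data as of May 01, 2024 09:55 AM
```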
Specific Parts of the Dashboard
- Does the way that each measure is displayed express the information in a way that directly supports your needs, without having to do conversions or calculations in your head? This could involve something as simple as graphing the variance between expenses and budget directly, rather than making you compare two lines on a single graph (see the sketch after this list).
- Can you rapidly (1) discern the value of each measure, (2) determine whether it is good, bad, or otherwise, and (3) compare it to something that allows you to judge the level of performance?
- Do you have enough information about each item to determine if you must respond in some way?
- If you need to respond to something, can you easily get to any additional information that is needed to determine how to respond?
- Can you perceive each measure as precisely as you need to without being forced to wade through more precision than you need?
- For each measure, can you tell if performance is improving, getting worse, or holding steady? For those measures that lack trend information, would the dashboard be more useful if that information were shown?
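To make the expense-variance example above concrete, here is a minimal sketch that graphs the variance directly rather than asking the viewer to compare two lines; the data is hypothetical and matplotlib is assumed:

```python
# A minimal sketch (hypothetical data; matplotlib assumed) of graphing the
# expenses-vs-budget variance directly instead of two separate lines.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
budget = [100, 100, 110, 110, 120, 120]   # budgeted expenses (thousands)
actual = [ 95, 108, 104, 118, 115, 131]   # actual expenses (thousands)

# Compute the variance once, so the viewer never has to do it mentally.
variance = [a - b for a, b in zip(actual, budget)]
colors = ["firebrick" if v > 0 else "steelblue" for v in variance]  # overspend in red

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(months, variance, color=colors)
ax.axhline(0, color="black", linewidth=0.8)   # zero line = on budget
ax.set_ylabel("Variance from budget ($K)")
ax.set_title("Expense variance (actual minus budget)")
plt.tight_layout()
plt.show()
```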
Take care,
8 Comments on “Assessing the Effectiveness of a New Dashboard’s Design”
Hi Steve,
A little trick I use quite often is the Dashboard Squint Test:
http://www.bonavistasystems.com/Articles_DashboardSquitTest.html
You squint your eyes and assess the overall layout: the elements that stand out, the visual balance, and other characteristics of an effective dashboard.
A) Which dashboard elements draw the most attention? What colors pop out?
B) Are the dashboard elements balanced? Does the dashboard have a clear organization?
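One rough way to automate the squint test is to blur a screenshot and see what remains legible; whatever still stands out is what dominates visually. A minimal sketch, assuming Pillow and a hypothetical dashboard.png:

```python
# A rough sketch of automating the "squint test": blur a dashboard
# screenshot so only the strongest visual elements remain legible.
# Assumes Pillow is installed; "dashboard.png" is a hypothetical file.
from PIL import Image, ImageFilter

img = Image.open("dashboard.png")
blurred = img.filter(ImageFilter.GaussianBlur(radius=8))  # simulate squinting
blurred.show()  # whatever still stands out is what draws attention first
```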
Andreas
This is a great post, with many points applicable not just to dashboards but to stand-alone graphics. Certainly, if graph/graphic designers followed the “Specific Parts of the Dashboard” section, we’d have much more meaningful graphs.
This post exemplifies why I bought 20 copies of Information Dashboard Design to give to engineering and pre-sales when we acquired a small analytics company in a prior job.
The only thing I would add, which I feel many software companies miss, is ensuring that the dashboard meets the needs of a specific user. While that is implied by “objectives,” I think it is worth driving this point home: a VP’s metrics, trends, and timeliness of data will of course be different from a department manager’s.
With a client a while back, I described my dashboard for one of their end users as answering: “What are the most important measures? What other measures influence any problems and trends? Where do I go to solve these issues?”
The last point I feel is important: giving people the tools to solve the problem. I’m not talking about drill-downs, but rather, for example, letting me send an email to the top X managers who have quality issues on their line, with information pre-entered into that email that helps them understand the issue.
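As a hedged sketch of that idea (the addresses, defect data, threshold, and SMTP host are all placeholders, not from this comment), the pre-populated alert email might be assembled like this:

```python
# A sketch of pre-populating an email to the managers whose lines show
# quality issues. Data, threshold, and SMTP server are hypothetical.
import smtplib
from email.message import EmailMessage

defect_rates = {"alice@example.com": 0.07, "bob@example.com": 0.02}
THRESHOLD = 0.05  # flag lines above 5% defects

with smtplib.SMTP("smtp.example.com") as server:
    for manager, rate in defect_rates.items():
        if rate <= THRESHOLD:
            continue  # no issue on this line
        msg = EmailMessage()
        msg["From"] = "dashboard@example.com"
        msg["To"] = manager
        msg["Subject"] = f"Quality alert: defect rate at {rate:.0%}"
        msg.set_content(
            f"Your line's defect rate is {rate:.0%}, above the "
            f"{THRESHOLD:.0%} target. Details are attached to help "
            "you diagnose the issue."
        )
        server.send_message(msg)
```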
Finally (an end user made me think of this), don’t always focus on the negative: if there are good results, then being aware of them and rewarding the respective parties can also be an important part of the dashboard.
I guess I’m somewhat simplistic, but I believe the first question to ask is “Are the metrics presented here the key indicators of performance for your organization?” The follow-up to that would be “Do these measures present a comprehensive view of your processes and performance?”
I continue to be amazed at what my users suggest would be helpful. Although we always have to be careful about “feature creep,” the folks who are closer to these data are much better judges of their value than those of us who have the luxury of simply presenting the metrics in a compelling way instead of actually having to act upon them.
No data system will ever be able to capture all pertinent information; our data systems will always be incomplete. However, by carefully listening to our users, we can certainly become more aware of the gaps between our representations of reality and their experiences in the real world.
Stephen, you put together very thoughtful questions. While it might be implicit in how we do our work, I figured it’s worth being explicit: my clients typically give me a list of data they want on a dashboard, and that list is kindly set aside until we have a detailed conversation about the business questions they need to answer and the actions they might take. This approach fosters collaboration and, more importantly, improves the quality of the initial prototypes and yields better “scores” on the questions you listed. Thanks again for starting the dialog…
Great conversation. I fully agree with Lee. Ultimately it is about what types of decisions need to be made and the most effective way of presenting the data to support those decisions. The issue with operationalizing dashboards today is the form-factor limitation imposed by the dashboard software products on the market: they all come with a container in which the dashboard must live. Best-in-class UI design, however, requires a lot of freedom to deliver the best results in terms of human interaction with a dashboard. That gap still needs to be closed.
I suspect the initial expectation (or even specification) of dashboard design is a fair indicator of the client’s Performance Management Maturity. Following on Stephen’s excellent discussion, I have summarised his points, along with other contributions, as follows:
- Clarity
  - Focus (non-distracting, attention to important and critical information)
  - Understandable (rapid discernment, preferably with no need for mental translation)
  - Discernable (clear demarcation between information packets or groupings)
  - Logical grouping (sequential placement, clear organisation)
  - Qualifiable (ranking, importance, criticality, etc.)
  - Quantifiable
  - Comparable (relationships, benchmarks, targets, trends, etc.)
- Meaning
  - Urgency
  - Conciseness
  - Most important and critical information
  - Completeness (necessary and sufficient, with no non-contributing embellishment)
  - Consolidated and arranged on a single screen
  - Suitable precision
  - Frugality in use of artefacts (bullet graphs, bar graphs, dials, etc.)
  - Simplicity
  - Consistency
  - Classification of dashboard types and styles (strategic, tactical, operational, individual, group, etc.)
  - Ordering of dashboard elements
- Value
  - Focus on business needs (results and outcomes)
  - Meet specific user needs
  - Maintain situation awareness
  - Enable effective use of information
  - Empower users
  - Provide a holistic view of what needs attention
- Actionability
  - Alerts
  - Diagnosis (drill-down, analysis, etc.)
  - Enterprise collaboration

(NOTE: The indentation indicates classification.)
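As a hedged sketch (my addition, not part of the summary above), this checklist could be turned into a simple scoring rubric: rate each criterion on an assumed 0-5 scale, then average within each category:

```python
# A minimal sketch of using the checklist as a scoring rubric.
# The 0-5 scale and the sample ratings are assumptions, not part
# of the original summary.

def score_dashboard(ratings: dict[str, dict[str, int]]) -> dict[str, float]:
    """Average the 0-5 ratings of the criteria within each category."""
    return {category: sum(scores.values()) / len(scores)
            for category, scores in ratings.items()}

ratings = {
    "Clarity": {"Focus": 4, "Understandable": 3, "Comparable": 5},
    "Value": {"Meet specific user needs": 2, "Maintain situation awareness": 4},
}
print(score_dashboard(ratings))
# -> {'Clarity': 4.0, 'Value': 3.0}
```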
Great advice as usual. Thanks.
Previous commenters point out that clients often have specific requirements for their dashboards even when we feel those requirements violate the principles of good design. This is why open, collaborative communication with the client is imperative. It’s also why an agile project management approach is critical. The faster the client can get hands-on experience with a dashboard, the happier both of you will be. Because both the dashboard designer and the client learn things after the project gets started, we find that it helps a lot to talk about design principles as well as functionality throughout the process.