Thanks for taking the time to read my thoughts about Visual Business
Intelligence. This blog provides me (and others on occasion) with a venue for ideas and opinions
that are either too urgent to wait for a full-blown article or too
limited in length, scope, or development to require the larger venue.
For a selection of articles, white papers, and books, please visit
my library.
January 8th, 2009
This blog entry was written by Bryan Pierce of Perceptual Edge.
In January 2006, when Steve first introduced bullet graphs as a more effective alternative to circular gauges in his book, Information Dashboard Design, they were no more than a design concept that he created using Adobe Illustrator. There were no functional bullet graphs being used in the real world and any application of them would have required custom programming. They were a useful design that hadn’t been implemented.
Now, as we start 2009, it’s been three years since bullet graphs were first introduced. Over that time, they’ve become popular as an alternative to circular gauges as people have noticed their ability to provide more information in a smaller space, which is especially useful for dashboards. Ambitious designers have found tricks to implement bullet graphs in a variety of products, and some software vendors now include bullet graphs in the graph libraries that they provide. As of today, bullet graphs are available or can be created in the following products:
Products that support bullet graphs right out of the box:
MicroCharts (Excel add-in) by Bonavista Systems
CenterView by Corda
Visual:Acuity by Visual Engineering
DXperience by Developer Express
Although not provided as a standard graph type, bullet graphs can also be constructed with the following (see the R sketch after this list):
SAS/Graph
QlikView by QlikTech
MicroStrategy
CURL
Flex by Adobe
R
HTML/CSS courtesy of Matt Grams
Google Charts courtesy of Dealer Diagnostics
SVG courtesy of Chris Gerrard
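To make the construction concrete, here is a minimal sketch of a bullet graph drawn with base R graphics (R being one of the options listed above). The revenue figure, target, and qualitative ranges are invented example values, and bullet_graph() is not a function from any package, just a small helper defined for illustration.

```r
# A minimal sketch of a bullet graph in base R. All values below are
# invented for illustration; bullet_graph() is a helper defined here,
# not part of any package.

bullet_graph <- function(label, value, target, ranges,
                         range_cols = c("grey60", "grey75", "grey90")) {
  # Empty plot spanning the full qualitative scale, with no y axis
  plot(NULL, xlim = c(0, max(ranges)), ylim = c(0, 1),
       xlab = "", ylab = "", yaxt = "n", main = label)
  # Qualitative range bands, drawn widest (best) first so the narrower
  # (poorer) bands sit on top; darker shades mark poorer performance
  for (i in rev(seq_along(ranges))) {
    rect(0, 0.25, ranges[i], 0.75, col = range_cols[i], border = NA)
  }
  # The featured measure as a narrow dark bar
  rect(0, 0.42, value, 0.58, col = "black", border = NA)
  # The comparative target as a short vertical line
  segments(target, 0.28, target, 0.72, lwd = 3)
}

# Example: revenue of 270 against a target of 250, with qualitative
# ranges of 0-200 (poor), 200-260 (satisfactory), and 260-300 (good)
bullet_graph("Revenue YTD ($K)", value = 270, target = 250,
             ranges = c(200, 260, 300))
```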
This list has grown significantly in just the last year and I expect it to continue to grow as more people discover the merits of bullet graphs. If you currently use a product that can’t create bullet graphs, be sure to tell the vendor how useful they would be, and if you know of a product that I haven’t mentioned here, please share it by posting a comment.
-Bryan Pierce
December 26th, 2008
In an article entitled “The Changing Face of Business Intelligence,” published last month, Dave Wells eloquently described how the business intelligence industry has strayed from its original vision and how it is now changing to recover what’s been lost. A longtime veteran of data warehousing and business intelligence, Wells is one of the leaders in the industry who have shaken free of the technology-centric perspective that holds the industry back.
Wells begins by reminding us of Howard Dresner’s original vision when he coined the term “business intelligence” (BI) in the early 1990s. Dresner defined BI as “a set of concepts and methodologies to improve decision making in business through use of facts and fact-based systems.” Over the years, the industry that took hold of Dresner’s visionary term (mostly data warehousing vendors at the time) buried the goal of decision making in an emphasis on technology. As Wells says: “The troubling thing is that all of the definitions are IT-centric” and “too much of today’s business analytics has little connection with real business analysis.”
He goes on to offer a new definition of business intelligence, which recaptures the essence of the original and enhances it to further clarify the goals. I don’t want to give too much away by quoting his definition here; you should read Wells’ words directly. I do want to include one more quote, however, which is central to Wells’ vision of BI’s transformation:
It is analysts – the people who perform analysis – who find meaning in the data. These are the people who explore cause-effect relationships and who guide decision-making processes. It is they who will lead the charge to reshape decision making in business.
To recover the original vision, the business intelligence industry must shift from an emphasis on technology to an emphasis on the people who use the technology. Only then will it begin to fulfill its original promise.
(Note: While I consider Wells’ argument brilliant, I believe that some of the software products that he lists as examples of “next generation of analytics” don’t belong there. In fact, I believe that some of the products on the list exemplify little understanding of and support for data analysis. This difference of opinion suggests that our common vision must become informed by clear definitions of data analysis and analytics and clear criteria for assessing products’ ability to deliver. All in good time.)
December 22nd, 2008
In September, I wrote a rather scathing review of a product called Lyza from a new business intelligence (BI) vendor named LyzaSoft. Part of my criticism was that LyzaSoft erroneously claimed that Lyza qualifies as data analysis and data visualization software. A month later, a good friend and respected colleague, Colin White, took issue with my opinion of Lyza. Thus began an email exchange between us and several other leaders in the field of BI. In this exchange, Colin noticed that we all seemed to use the terms “data analysis” and “data visualization” differently, so he asked each of us to define them. Here are the definitions that I contributed to the discussion:
Data analysis
Data sense-making. The process of discovering and understanding the meanings of data. (Not to be confused with preliminary steps taken to prepare data for the process of analysis.)
Data visualization
The use of visual representations to explore, make sense of, and communicate data. As such, data visualization is a core and usually essential means to perform data analysis, and then, once the meanings have been discovered and understood, to communicate those meanings to others.
On December 17th, Colin wrote about this in an article titled “Business Intelligence Data Analysis and Visualization: What’s in a Name?” Colin did a nice job of summarizing the discussion, but I believe that the conclusions that he reached miss the mark and are typical of most traditional BI professionals.
Here are Colin’s concluding opinions:
At a detailed level, two questions dominate the discussion:
- Are data transformation and integration different from data analysis? There are many examples of applications that retrieve data from multiple sources, restructure and aggregate it, and then load the results into a data warehouse. Similarly, data federation and data streaming technologies allow users not only to do dynamic in-motion data transformation and integration, but also data aggregation and summarization. These are all examples of processes that perform some level of data analysis. The ability to clearly delineate data transformation from data analysis is fast disappearing, and to say data transformation is completely different from data analysis makes no sense.
- Is data presented for presentation purposes only a form of data visualization? The mere fact that some of the comments got into semantic debates about what is data and what is information, and about whether a user is actually analyzing the results or not, suggests that a more pragmatic viewpoint is required. From my perspective, if data or information is presented to a user in a format that aids decision making, then that constitutes data visualization.
At a more macro level, it is important to define the role of a so-called expert or specialist. Our job is to help people understand and use new and evolving technologies and products for business benefit. As such, we need to use clear definitions and terminology that aids in this understanding. However, it is important that we accept that other people may have different definitions, and we need to find common ground. Defending our positions at all costs does not aid the industry. We also have to accept that business users may employ technology and use some terms in a completely different way, and it is important to adjust our positions and explanations accordingly. Unless we do that, business intelligence will continue to be usable only by the small subset of users that employ it today.
I’ll come back to Colin’s position in a moment, but first, I’d like to provide some context for what I’m going to argue. The BI industry has done a wonderful job of providing technologies that enable us to collect, cleanse, and store huge warehouses of data. We now have enormous reservoirs of data available to us, but most people are drowning in them, unable to do the only thing that really matters: actually use the information to achieve the understanding that’s needed to make good decisions. This is predominantly a human task.
The technologies that are needed to help us make sense of data must be built on a clear understanding of what people must do to understand data and the perceptual and cognitive processes involved in the effort. In other words, the solutions that are needed require a human focus, not the technology focus that has produced the tools that we use to collect, cleanse, and store data. I believe most of the people who have done great work to enable the BI achievements in building a solid data infrastructure are locked in a technology mindset from which they can’t escape and rarely even recognize that they should escape. Almost every vendor that is currently offering real solutions for data sense-making—a rather small group—has emerged from outside the BI industry. Some have been working for years as statistical analysis vendors and most others are spin-offs of information visualization research at universities. None of the major BI vendors seem to understand data analytics at all. I don’t think this is for lack of interest or effort, but because they are focused on technology, an engineering focus, rather than the human beings who use technology, a social science and design focus. I believe that the discussion that Colin, I, and others in the industry had about data analysis and data visualization illustrates this situation.
Contrary to LyzaSoft’s claim that businesspeople use the term data analysis for the entire end-to-end process of working with data (you can read their position in Colin’s article, which he refers to as “The Vendor’s Position”), I’ve found that the people who actually work in business and elsewhere to make sense of data know that the tasks of collecting, cleansing, aggregating, and storing data are different from data analysis. The former tasks precede and support the process of data analysis by making data accessible and reliable, but they aren’t data analysis itself. These folks would much rather have the IT department build a good data warehouse for them so they aren’t bothered by having to prepare the data and can spend their time actually analyzing it. This distinction between data preparation and data analysis is not just a matter of semantics. Until vendors understand this difference, they will continue to produce so-called data analysis products that don’t work. In contrast, vendors such as Tableau, Spotfire, Advizor Solutions, Panopticon, Visual I|O, and SAS—examples of those who haven’t emerged from within the BI industry—already get this.
Now that buyers of BI software are turning their focus to the actual use of data—to data sense-making and communication—it’s tempting and all too convenient for BI vendors such as LyzaSoft to call what they do “data analysis.” This murky use of the term not only renders it vague, confusing, and for all practical purposes useless, it also prolongs the state of affairs that has given rise to our current desire for data analytics: the fact that BI vendors have failed to provide useful tools for data sense-making and communication. These tools, which we desperately need to make better decisions, have always been the central, but failed, promise of business intelligence.
The opinion that Colin expresses in response to the second issue concerns me: “From my perspective, if data or information is presented to a user in a format that aids decision making, then that constitutes data visualization.” I certainly agree that the goal is to achieve understanding and support decision making, but not every way of doing this is data visualization, and not everything that would like to call itself data visualization deserves the name. Information can be presented in various ways, just as it can be verbally communicated in various languages; each medium of data presentation (the spoken word, the written word, and visual representations of various types) has its strengths and weaknesses, its appropriate applications, and its rules for effective use. Saying that every presentation that aids decision making is data visualization is not a useful definition. In fact, it’s an example of what I warned against in our email discussion. Here’s what I said, as quoted in Colin’s article:
Confusion regarding terms such as data analysis and data visualization exists in the BI community because little effort has been made to sufficiently define them. Our industry tolerates a freewheeling, define-it-as-you-wish attitude toward these and other terms to the detriment of our customers. In the academic world, which I keep one foot in, a greater effort is made to define the terms to provide the shared meanings that are required to communicate, yet even in academia it gets a bit murky at times. I believe that terms are inadequately defined in the BI community in part because ours is an industry that has largely been defined for marketing purposes, rather than as a rational discipline. It serves the interests of software vendors to keep the terms vague.
I agree that we must be open to one another’s ideas and definitions, but I believe the goal of this openness, after thinking long and hard, is to narrow, not expand, our use of these terms. As it is today, these terms are barely useful because they are defined too loosely, broadly and inconsistently. Expanding the definitions will only add to the problem.
I’ll conclude this blog post as Colin ended his article, with the following question and invitation: “What do you think?”
November 25th, 2008
Visualizations of various types are used to support thinking and communication. I focus on their use for analyzing and presenting quantitative information, but they can also be used for other purposes, such as teaching concepts and procedures, and helping people understand processes and complex systems. With the publication of Visual Language: Global Communication for the 21st Century in 1998, Robert Horn made a compelling case that visualization is a language, which is different from but often collaborates with verbal language. It is definitely true that, when trying to communicate certain information, “a picture is worth a thousand words.” As technologies such as television, video games, and the Internet fill our lives with increasing amounts of visual content, the potential of visualization is now taken for granted. The question remains, however, “Are we using this visual language effectively?”
I decided to address this topic today while looking at an “infographic” about the costs of the war in Iraq shown below, which was created by Good Magazine, based on the book The Three Trillion Dollar War: The True Cost of the Iraq Conflict by Nobel Prize laureate Joseph E. Stiglitz and Linda J. Bilmes.
In Visual Language, Horn defined “infographics” (short for “information graphic”) as:
Moderately sized, meaningful combinations of words, images, and shapes that together constitute a complete communication unit. Visual and verbal elements are tightly integrated. Is as self-contained as possible on 1 or 2 pages or on a large screen. Usually contains more information than a concept diagram, although an information graphic may use any of the types of concept diagrams as its central visual element. Usually contains several blocks of text.
(Visual Language, Robert E. Horn, MacroVU, Inc., Bainbridge Island, Washington, 1998, p. 61)
This form and use of visualization has become popular in the last few years. We now see frequent examples of infographics in major news publications. I’ve seen examples that work to communicate effectively, but more that, in my opinion, do not. What accounts for these differences in the effectiveness of infographics?
I believe that the Three Trillion Dollar War visualization, which tells a story that I care about and consider important, fails as an infographic. Aspects of its visual design discourage me from examining it; it’s hard to look at. Even if the aesthetics were more pleasing to the eye, I don’t think the graphics achieve their communication objectives. The story is adequately told by the text: the ten points that are described verbally to the right of the graphics. The graphics add no value or meaning that isn’t contained in the text; the pictures themselves don’t reveal anything we can’t learn more clearly from the words. Graphics should only be used when they communicate something more effectively than words alone. Visual displays can do a great job of revealing relationships that might be difficult to communicate with words, but the relationships between the various costs that appear in this infographic are buried in visual clutter.
Until yesterday, I had never heard of Good Magazine. According to their website:
GOOD is a collaboration of individuals, businesses, and nonprofits pushing the world forward. Since 2006 we’ve been making a magazine, videos, and events for people who give a damn. This website is an ongoing exploration of what GOOD is and what it can be.
Based on what I’ve read, I like these guys and support what they’re trying to do. I want their work to succeed, but as an information visualization professional, I’m concerned that, in this case at least, their good intentions have been undermined by ineffective graphics.
My purpose here is not to critique this particular infographic, and certainly not to criticize the work of Good Magazine. Rather, I’m writing to raise concerns once again about the quality of infographics in general and the fact that it doesn’t seem to be improving. I believe infographics have great potential, but their effectiveness must be honed through empirical study. Infographics practitioners must become more introspective, more critical of their work, if they wish to give something useful to the world. Most of the infographics that I’ve seen are filled with what Tufte calls “chartjunk.”
Why are we still producing chartjunk? Jacques Bertin put us on the road to effective uses of visualization by introducing the basic vocabulary of visual communication. Tufte refined and extended this work, especially in regard to quantitative communication. Robert Horn synthesized much of what’s being done and demonstrated the existence of visual language. But today, rather than continuing in this critical scientific tradition, infographic design reminds me of Web design in its early days: free expression with little regard for practices that have been proven to produce the desired outcomes. No one seems to be doing the work to determine what works, what doesn’t, and why, or, if they are, I’m not aware of it and rarely see the results.
In Visual Language, Horn wrote:
Basic scientific research is beginning to bear out the thesis of this book, that people find it easier and more effective to communicate by using combinations of words and images. Although visual language has yet to be subjected to a full battery of cognitive science or pragmatic tests, the few available studies support that conclusion… Because visual language is so effective, it is important that standards and criteria develop for its use. These criteria need to be based on principles that come from both cognitive science and design. Criteria for good practice will evolve both from the evidence of careful empirical studies that compare different visual methods of expressing a similar message and from the reflective judgments of practitioners. Out of such studies and judgments come the models, the criteria, and the aesthetic factors that together make a message effective, efficient, and attractive. We have clearly entered a period of exciting dialogue and development of these ideas. (ibid., pp. 233 and 235)
I share Horn’s vision, but I’m not sure that the hope and enthusiasm he expressed in that final sentence has been borne out for infographics during the 10 years since he wrote these words. Just as statistical graphics have been subjected to empirical study, and continue to be, resulting in guiding principles that can be found in the works of Tufte, Cleveland, and more recently my own, infographics must be subjected to the same scrutiny if we wish to apply them effectively.
I’m interested in your thoughts, especially if you’re an infographics practitioner. Are you aware of work that’s being done to put infographics on the path to effectiveness that they need to mature and certainly deserve?
October 28th, 2008
I spent most of last week at InfoVis 2008 in Columbus, Ohio. You might remember that I delivered the capstone presentation last year at InfoVis 2007, which also served as the keynote presentation for VAST 2007 (Visual Analytics Science and Technology). Last week the 2008 edition of this presentation was delivered by Christian Chabot, cofounder and CEO of Tableau Software. Chabot and I share the belief that visual analysis software is needed by a broad audience of people, not just those who have the term “analyst” in their titles. We also share the belief that with well-designed visual analysis tools like Tableau, visual analytics is poised to explode.
Participants in the conference consisted primarily of academics—professors and graduate students who spend their days inventing and refining visualization tools and techniques for making better sense of data. Chabot clearly wanted to challenge this audience to direct more of their efforts toward the practical needs of a broad audience of potential users.
Chabot identified four conditions that have set the stage for the current readiness of visual analytics to take off:
- Data explosion
- Technological advances
- General awareness
- Industry consolidation
The overwhelming amount of information that people now face has created a desperate need for tools that will help them make sense of it. Modern computer hardware and the Web have provided the infrastructure that is needed for people to interact with and share information effectively. Awareness of visualization’s potential has reached critical mass. Traditional business intelligence vendors, along with their tired, low-yield approaches to data analysis, have been bought up by large software corporations where they will languish, which has opened the door for better approaches to capture the attention of the market. By rejecting the sins of traditional business intelligence vendors and refusing to compete for the hearts and wallets of customers through a litany of useless and ineffective pseudo-analytical features, software companies such as Tableau that are thoughtful, agile, design-oriented, and well informed have differentiated themselves from the pack and are now reaping the rewards of their commitment to give people analysis tools that really work. One result that we’re beginning to see is the gradual spread of data analysis tools to organizations of all sizes (from Google to the local bakery), and their proliferation throughout all parts of those organizations.
When the founders of Tableau Software were initially crafting their vision, they identified five core principles of visual analytics’ adoption:
- People adopt visual analytics primarily to help them see and understand complex data.
- People adopt visual analytics primarily to help them see and understand massive data.
- People adopt visual analytics primarily to help them see and understand new visual paradigms.
- People adopt visual analytics primarily to help them see and understand hidden insights.
- People adopt visual analytics primarily to help analysts save time.
Chabot is a Stanford MBA who worked for years after graduation as a high-end analyst—one of those guys that spend their days tackling complex analytical problems using complex analytical techniques. The other founders of Tableau, Chris Stolte, who earned his doctorate in computer science at Stanford by developing the prototype for Tableau’s eventual product, and Pat Hanrahan, the Stanford professor who supervised Stolte’s work, were immersed in the world of academic information visualization research. Their assumptions about what it would take to get people to adopt visual analytics made perfect sense, given their perspective at the time. As time passed, however, they kept their eyes open and learned that each of their assumptions turned out to be flawed.
Flawed Principle #1: People adopt visual analytics primarily to help them see and understand complex data.
Although sometimes complex, the data sets that people analyze are usually fairly simple. Chabot advised those of us in the information visualization community to start simple. Rather than focusing most of our attention on solving the complex, highly specialized needs of a few, we can solve much more widespread problems that are just as important by making it easier for people to do the simple tasks that they must do over and over again each day, which are now unnecessarily onerous and time-consuming.
Flawed Principle #2: People adopt visual analytics primarily to help them see and understand massive data.
Although sometimes massive, the data sets that most people analyze are not particularly large. Chabot recommended that we start small, making it easy for people to work not just with huge corporate databases, but also with small files stored in Access and Excel.
Flawed Principle #3: People adopt visual analytics primarily to help them see and understand new visual paradigms.
Although there are times when new visual paradigms must be invented to solve peoples’ needs, most problems can be solved with proven visualizations, such as bar charts, line graphs, and scatterplots. Chabot suggested that we start proven by making it easier for people to use what we already know to work well in a seamless fashion.
Flawed Principle #4: People adopt visual analytics primarily to help them see and understand hidden insights.
While it is true that one of the great benefits of visual analytics is the discovery of previously hidden insights—those “Aha!” moments that we all crave—the primary reason, by far, that people want good visual analytics tools is more mundane, though no less useful: to save time. Chabot pointed out that we can design great tools that get out of the way, allowing people to become engaged in the act of thinking about data, rather than distracted by the mechanics of using the software.
Flawed Principle #5: People adopt visual analytics primarily to help analysts save time.
While analysts desperately need better tools to help them do their jobs, even greater benefit can be gained by providing tools that anyone can use, enabling everyone who must make sense of information to do their jobs and, as a consequence, freeing up analysts to spend their time solving the more complicated problems. With religious zeal, Chabot warned that we can no longer serve only small groups with specialized needs, but should invite everyone to the table.
At the end of his presentation, Chabot reviewed his message and challenged us with these final facts:
- Millions of people need visual analytics technologies to help them understand information.
- The current state-of-the-art in business analytics (what most people rely on to do their jobs) is tragic.
- The primary barriers to visual analytics’ adoption are (1) awareness, (2) misperception, (3) ease of trial, (4) ease of deployment, (5) ease of use, and (6) ease of price.
What business intelligence vendors have still failed to do, a new breed of software company with roots in information visualization research is poised to finally deliver. The world needs what we have to offer. To get it into the hands of those who need it, we must bridge the chasm that divides academic research and commercial software. Tableau and a few other ventures have done that. They’re inviting others to join them—not tomorrow, but now—because the time is ripe.
Take care,
