Thanks for taking the time to read my thoughts about Visual Business
Intelligence. This blog provides me (and others on occasion) with a venue for ideas and opinions
that are either too urgent to wait for a full-blown article or too
limited in length, scope, or development to require the larger venue.
For a selection of articles, white papers, and books, please visit
my library.
April 11th, 2006
If you subscribe to the Business Intelligence Network’s business intelligence newsletter, you have already noticed that in this month’s (April 2006) data visualization issue I’ve featured the work of University of Maryland professor Ben Shneiderman. No one has done more to promote information visualization research than Ben. The Human-Computer Interaction Lab (HCIL) that he founded at the University of Maryland over 20 years ago continues to produce some of the finest work in this field. Every year they showcase this work at their annual symposium and the next one is scheduled for June 1 and 2 at the Clarice Smith Performing Arts Center in College Park, Maryland. The first day will be filled with presentations and demonstrations and the second with tutorials and workshops. If you’re interested in seeing some of the cutting-edge work in information visualization, this is an event that you should consider. For more information, check out the HCIL symposium announcement directly.

Comments Off on University of Maryland’s Annual HCIL Symposium
March 28th, 2006
Yesterday, I had a chance to wander the demo floor at this year’s SUGI (SAS Users Group International) conference in San Francisco. Eleanor Taylor of SAS’ marketing strategy group showed me around, making sure I had a chance to see some of the progress they’ve made since my visit to their campus in Cary, North Carolina last November. The highlight of the day was my first opportunity to see in action their statistical discovery product JMP, which was demonstrated by the man who created the product and continues to manage its development, John Sall, one of the co-founders of SAS. JMP (pronounced “jump”) has been around since the late 1980s and is fairly well known in academic, scientific, and statistical circles, but not so much in the world of business intelligence. For those of you data analysts who are not intimidated by sophisticated statistics, this is a product that’s worth a look. For the rest of us, I would love to see a simplified version of JMP.
People tend to think of SAS as a statistics software company. I rarely hear BI professionals mentioning SAS in the same sentence as companies like Business Objects, Cognos, and Hyperion. You can find SAS software everywhere, but so far it has appealed primarily to particular industries such as government, academia, and science, and to enclaves of the statistically inclined in the broader business world. There is such great functionality built into their software for making sense of data that I’d love to see it reach a broader audience, but to do so I believe SAS will need to consolidate its product offerings to reduce the confusion that buyers experience when faced with an entire demo floor of separate products that overlap more than they differ. In many cases, the same basic functionality is offered in unnecessarily different ways using different interfaces, when a consistent approach would better serve SAS and its customers. Quoting Thoreau, I would like to see SAS “Simplify, simplify, simplify.”
I hope to spend some time in the next few weeks taking a closer look at JMP, which is jam-packed with useful functionality. Perhaps I’ll be able to make a few suggestions for simple changes that could open this product up to a broader audience of business people–that is, to those without advanced training in statistics. Perhaps a version that doesn’t JMP (jump) quite so far, but gives business people the steady springboard they need to dive comfortably into their data. In the meantime, you might want to take a look at JMP and let me know what you think.

Comments Off on SAS can increase your reach by helping you jump (JMP)
March 6th, 2006
I received an email today from my friend and colleague Claudia Imhoff. She included a link to an article that appeared in the February 27, 2006 edition of CIO magazine, entitled “BI Versus BA: What’s the Difference?” This article opens with the following warning: “Don’t be fooled into thinking business intelligence necessarily includes analytics. Requirements and benefits for each are not the same.”
This statement is diametrically opposed to the message that I’ve worked long and hard to communicate. As I was reading the article, I was already composing a scathing response in my head, but was suddenly knocked off my high horse when I reached the end of the article and discovered that its author, Rock Gnatovich, is the president of Spotfire. Although I’ve never met Rock, Spotfire is a software company that I admire; I correspond from time to time with Rock’s boss, the man who founded the company, Christopher Ahlberg. Perhaps Rock, Christopher, and I should have a little chat.
I’ve worked in the business intelligence industry for many years; in fact, since long before the term business intelligence was coined by Gartner in the mid-1990s. Before the term business intelligence came along, we called this industry data warehousing, a term that remains prominent but these days refers primarily to the back-end aspects of the industry, such as the databases and the data models. Before the term data warehousing came into vogue in the 1980s, this work was often called decision support. Despite the merry-go-round of terms, the work that they describe remains the same. It’s all about collecting, storing, accessing, analyzing, and reporting business information in an attempt to make sense of it and communicate its meaning to support informed business decisions. Even though the terminology has changed about once every ten years, I’ll be damned if I’m going to stand idly by while an entirely new wave of confusion is introduced to sidetrack the good folks who work hard to make sense of business data.
What’s really interesting is that I just wrote a white paper and did a webcast for Spotfire entitled “Visual and Interactive Analytics: Fulfilling the Promise of Business Intelligence.” In it I made the argument that the type of visual analysis software that Spotfire and a few other innovative companies have introduced in recent years is not only part of BI, it addresses the very heart of BI. Here’s an excerpt from the white paper, which can be downloaded from Spotfire’s web site:
It is BI’s mission to help businesses harness the power of information to work smarter. Intelligence—“the faculty of understanding” (according to the Oxford English Dictionary)—is the solid ground on which businesses must build to succeed. Information is the stuff with which intelligence works to produce the understanding needed to effect change, but more data delivered faster can actually lead to less understanding and even bad decisions if we lack the skills and tools needed to tame and make sense of it. The BI industry has helped us build huge warehouses of data that we can now access at lightning speeds, but most of us look on with mouths agape, feeling more overwhelmed than enlightened. The Gartner Group coined the term business intelligence in the mid-1990s and defined it as follows:
“An interactive process for exploring and analyzing structured and domain-specific information to discern trends or patterns, thereby deriving insights and drawing conclusions. The business intelligence process includes communicating findings and effecting change.”
(Source: A glossary on the web site www.gartner.com)
The BI industry often loses sight of this clear vision. In many ways, BI is still a fledgling industry, awkwardly struggling with good intentions to mature beyond adolescence, past the flexing and preening of raging hormones, to the responsible solution provider that it has always strived to become. The time is right for BI’s rite of passage into adulthood. Some software companies, like Spotfire, are showing the way. Some companies (I’ll resist the temptation to name names) are still trying to get by on their good looks, flirting with the sad possibility of never growing up.
The “I” of BI—intelligence—can only be achieved by fully engaging the half of human-computer interaction that possesses intelligence: the human half. BI is only as effective as its ability to support human intelligence. This requires software that seamlessly interacts with our brains to support and extend our cognitive abilities. Unfortunately, BI software too often gets in the way, interrupting and undermining the thinking process rather than complementing and extending it. When BI software does its job, however, you find yourself submerged in thoughts about the data, not about the software and the hoops you must jump through to reach insight…
Information visualization—technologies that support the analysis and communication of data using visual media and techniques—should not be seen as separate from BI. As the Gartner Group’s definition of BI made clear, when data visualizations are used to support an “interactive process for exploring and analyzing structured and domain-specific information to discern trends or patterns”, they are doing precisely what BI is meant to do. When used effectively, visualization software extends the reach of traditional BI to new realms of understanding—not as one means among many, but often as the only effective means available. Information visualization will enable the next leap in BI’s evolution.
In his CIO magazine article, Rock asserts that “BI reporting ends with the dashboard, which is sufficient only for some business planning, and BA picks up the rest.” I don’t agree. BI doesn’t end there. By its very definition it doesn’t end there.
May God save us from the confusion of yet one more addition to the alphabet soup that plagues the software industry and incites CIOs to demand that their business intelligence teams spend time researching what they assume is yet one more new industry that they can’t live without, when in fact it is just part and parcel of what they’ve been doing all along. To the extent that most BI vendors have failed to provide powerful tools for analysis, they deserve to be criticized, shamed, and called to task, but let’s not make the users suffer by claiming that business analysis (BA) is something new and different. It is not only a part of BI, it resides at the very center of BI.
Have mercy on the poor folks who spend their days struggling to make sense out of business data. Don’t distract them with a whole new set of terms. Instead, let’s work together to help BI live up to its promise of true business intelligence.

Comments Off on BI versus BA? Here we go again
February 27th, 2006
Panopticon recently gave me a preview of a new edition of their treemapping software, which was officially released today (see the press release). A learning edition of Panopticon Explorer .NET can be downloaded for free, which provides a useful means to get acquainted with treemaps for visualizing hierarchically arranged quantitative information. You have probably seen examples of this approach, such as SmartMoney.com’s Map of the Market. Treemaps were originally invented by Dr. Ben Shneiderman of the University of Maryland as a way to maximize screen real estate when simultaneously displaying two related quantitative variables for hierarchically or categorically arranged data: one encoded as the size of the rectangles (for example, a stock’s volume) and the other as each rectangle’s color (for example, a stock’s price or its change in price since yesterday). When used effectively, treemaps can provide a quick overview of the data in a way that makes exceptional conditions pop out, such as big changes in stock prices. If this topic interests you, I will feature an article about treemaps by Ben Shneiderman in the April data visualization edition of the Business Intelligence Newsletter from the Business Intelligence Network.
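To make the basic idea concrete, here is a minimal sketch of the original slice-and-dice treemap layout. It is purely illustrative: this is not Panopticon’s algorithm (commercial products use more refined layouts, such as squarified treemaps), the market data is made up, and a real tool would map the second variable to a color scale rather than simply printing it.

```python
# A minimal slice-and-dice treemap layout, the scheme Shneiderman originally described.
# Illustrative sketch only: not Panopticon's algorithm, and the data below is made up.
# Rectangle AREA encodes the first variable (here, trading volume); the second variable
# ("change", percent price change) is what a real tool would encode as rectangle COLOR.

market = {
    "name": "Market",
    "children": [
        {"name": "Tech", "children": [
            {"name": "Stock A", "size": 40, "change": +2.1},
            {"name": "Stock B", "size": 25, "change": -0.8},
        ]},
        {"name": "Energy", "children": [
            {"name": "Stock C", "size": 20, "change": +0.3},
            {"name": "Stock D", "size": 15, "change": -3.5},
        ]},
    ],
}

def total_size(node):
    """A branch's size is the sum of the sizes of its leaves."""
    children = node.get("children")
    return node["size"] if not children else sum(total_size(c) for c in children)

def layout(node, x, y, w, h, vertical=True, out=None):
    """Divide the rectangle (x, y, w, h) among a node's children in proportion to
    their sizes, alternating the slicing direction at each level of the hierarchy."""
    if out is None:
        out = []
    children = node.get("children")
    if not children:                      # a leaf gets its own rectangle
        out.append((node["name"], node["change"], x, y, w, h))
        return out
    total = total_size(node)
    offset = 0.0
    for child in children:
        share = total_size(child) / total
        if vertical:                      # slice across the width
            layout(child, x + offset * w, y, share * w, h, False, out)
        else:                             # slice down the height
            layout(child, x, y + offset * h, w, share * h, True, out)
        offset += share
    return out

# Lay the whole market out in a 100 x 60 display area and list each rectangle.
for name, change, x, y, w, h in layout(market, 0, 0, 100, 60):
    print(f"{name}: x={x:.1f}, y={y:.1f}, width={w:.1f}, height={h:.1f}, change={change:+.1f}%")
```

Running the sketch lists one rectangle per stock, with areas proportional to volume within each sector; products like Panopticon add the color encoding, labeling, and interactivity that make the technique genuinely useful.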
Panopticon has incorporated some nice features into their software for viewing and interacting with the data, so you might want to take a look.

Comments Off on Panopticon Explorer .NET — Try it out for free
February 23rd, 2006
In one of my regular attempts to scan news articles about data visualization, on February 21 I ran across an interview with John Kopcke, the CTO of Hyperion Solutions, written by James Murray in IT Week. In response to the question “What…innovations are encouraging enterprise wide deployment of reporting tools?” Kopcke responded: “Role based dashboards that only present information the user needs to perform their role are a key development.” I couldn’t agree more. In fact, in my opinion, anything that calls itself a dashboard but doesn’t specifically support the information needs of a particular person, group, or role doesn’t deserve the name. Too many organizations are getting into trouble by trying to develop “the corporate dashboard”, as if a single dashboard could support the information needs of an entire corporation. A dashboard is only as useful as its ability to support the real information monitoring needs of one or more real people doing real jobs. To do so, a dashboard must be customized to support individual roles.
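To illustrate what “role-based” means in practice, here is a small, purely hypothetical sketch; the role names, metrics, and display types are invented for illustration and are not drawn from Hyperion’s product or anyone else’s. The point is simply that the unit of dashboard design is the role, not the enterprise.

```python
# Purely hypothetical sketch of role-based dashboard definitions. Every name below
# is invented for illustration; none of it comes from Hyperion's (or any vendor's)
# product. Each role sees only the displays it needs, not one corporate dashboard.

DASHBOARDS_BY_ROLE = {
    "sales manager": [
        {"metric": "bookings vs. quota", "display": "bullet graph"},
        {"metric": "pipeline by stage", "display": "bar graph"},
        {"metric": "top deals at risk", "display": "table"},
    ],
    "support manager": [
        {"metric": "open cases by severity", "display": "bar graph"},
        {"metric": "median time to resolution", "display": "sparkline"},
    ],
}

def dashboard_for(role):
    """Return only the displays that a given role needs; unknown roles get nothing."""
    return DASHBOARDS_BY_ROLE.get(role, [])

for panel in dashboard_for("sales manager"):
    print(f'{panel["metric"]} -> {panel["display"]}')
```

Whether definitions like these should be assembled by end users themselves or by someone with real design expertise is the question taken up below.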
I was disappointed, however, when I read on and found Kopcke advocating an approach to the development of role-based dashboards that I find untenable:
Traditionally, if you wanted to deploy role based dashboards you may need 2,000 different dashboards which would take forever to put together. But the way we do it now is to have a library of dashboard components and you pull those components together to create the dashboard. We could extend that model to an on demand basis so we provide the components to the customer and allow them to assemble the dashboard so they get the uniqueness required and we keep our costs down.
I doubt that any company has ever tried to deploy hundreds or thousands of dashboards without using some method that involved reusable components. Dashboards are still in their youth and as such lack a legacy of primitive methods. This is beside the point, however, for my primary objection to Kopcke’s vision is that information consumers–the people who use dashboards to do their jobs–rarely know how to design dashboards. Based on what I’ve gleaned from frequent visits to dashboard vendor Web sites, even the so-called experts don’t know how to design dashboards that result in clear and efficient communication.
Providing people with a library of dashboard widgets, one for every possible data display that they might need, would present them with an overwhelming set of choices. Even if end users could navigate these choices, they wouldn’t know how to assemble them on a computer screen in a way that didn’t end up looking like a disorganized, cluttered mess. The visual design of a dashboard requires a set of design skills that must be learned. They certainly aren’t intuitive. Someone must be part of the process who has the expertise to design an aesthetically pleasing dashboard that is visually arranged to support varying levels of importance, meaningful comparisons, and a sequence that matches the way the data will be used. Think about this the next time you board a commercial jet, grateful that the pilot didn’t design the cockpit based on a library of dials and gauges that he had to arrange himself.
For years BI vendors have been promoting the self-service nature of BI. Those of us who have actually had to implement these systems, however, know what a pipe-dream this will remain as long as the tools are clumsy to use and the users lack the data analysis skills needed to use them knowledgeably. The plug-and-play approach to dashboard development seems a lot like the familiar self-service dream of BI–a nice idea with good intentions, but simply not practical.

Comments Off on Role-based dashboards