Thanks for taking the time to read my thoughts about Visual Business Intelligence. This blog provides me (and others on occasion) with a venue for ideas and opinions that are either too urgent to wait for a full-blown article or too limited in length, scope, or development to require the larger venue. For a selection of articles, white papers, and books, please visit my library.


News Flash: BI Discovers the Obvious

April 8th, 2009

I was surprised to find the obvious reported as news today when I read Ted Cuzzillo’s latest article for TDWI’s newsletter BI This Week, titled “The New Breed of BI Analyst.” I suppose I shouldn’t have been surprised, because vendors and other organizations that support the business intelligence industry sometimes claim to have discovered something that has existed and been obvious all along. The huge population of data analysts who are, according to Scott Davis of Lyzasoft (as quoted in the article), “determined, resourceful, and distrustful of data managers who presume to think for them” has been with us since the advent of information technology (IT) and is probably the oldest and largest group of data analysts around. Anyone who has ever worked in decision support (data warehousing, business intelligence, business performance management, etc.) has been aware of these analysts all along. Some have opposed them as “annoying users who make our lives difficult by working around our controls” (my paraphrase of a sentiment often expressed by folks in IT), and others have worked hard to support them by providing access to data and useful tools, but all of them know that this is not a “new breed of BI analyst” that was, as the article suggests, recently discovered through “research by Lyzasoft” and work by “Microsoft’s Gemini program.”

When a vendor or a consultant has nothing new to offer but is desperate to win clients, it is tempting to claim a new discovery by shrouding the obvious and familiar in new terminology. In this case, however, no attempt has been made to disguise it; it’s just old news. Come on, Lyzasoft and Microsoft, don’t insult the intelligence of the business intelligence community by gluing a carrot on the head of a goat and calling it a unicorn. That only works at carnivals for children and drunks.

The fact that two of the BI industry’s longtime leaders—Wayne Eckerson and David Wells—were cited in the article should not be taken as confirmation of its claims. I respect their work and know firsthand that they view neither the type of analyst described in this article nor the importance of these analysts’ needs as a new discovery. I also respect the work of the article’s author, Ted Cuzzillo, and can’t imagine why he considered this news. Has the BI industry really become this desperate for something worthwhile to write about? Are we allowing vendors to tell us what’s news rather than finding it on our own? Is any BI organization supporting the work of journalists like Cuzzillo in a way that makes it possible for them to really investigate what’s going on, or are they supporting them just enough to pick the low-hanging fruit?

Take care,

I do not like ill-reasoned spam; I do not like it, Sam-I-am.

April 6th, 2009

If my experience is at all typical, you probably have a friend or relative who occasionally sends you emails like the one below, which I recently received. In reading it, you’ll find several logical errors, along with errors of other types, that render it absurd and unreliable (and perhaps dishonest), regardless of your political affiliation. The friend who forwarded this to me and the person who forwarded it to him are both bright people, but they accepted this argument without question. The fact that the argument supports their political beliefs no doubt played a role in their willingness to embrace it without qualm, but would they have noticed the logical fallacies if its conclusions had conflicted with their own? It’s likely that they would have. Sadly, relatively few people in America have been trained in the basic skills of critical thinking, including even people with a college education. I’ve seen several examples of arguments made on the floor of Congress by our elected officials that were no more reasonable than the one that appears below. And before you accuse me of targeting politicians—the low-hanging fruit of flawed and often dishonest argument—I’ll add that I’ve known CEOs and other executives of large corporations who reasoned on this level as well. How can government function and businesses thrive when decisions are based on such flawed reasoning?

Rather than sharing the email that I sent as a reply to the argument below, I think it would be more fun for you to critique it from scratch. No matter what your political leanings, what are the flaws in this argument? How would you counter it in a debate?

[The forwarded email appeared here as an image.]

Do We Really Need More Data?

March 30th, 2009

The notion that “we need more data” seems to have always served as a fundamental assumption and driver of the data warehousing and business intelligence industries. It is true that a missing piece of information can at times make the difference between a good and a bad decision, but there is another truth that we must take more seriously today: most poor decisions are caused by a lack of understanding, not a lack of data. The way that data warehousing and business intelligence resources are typically allocated fails to reflect this fact. The emphasis of these efforts must shift from “more and faster” to “smarter and more effective.” Although current efforts to build bigger and faster data repositories and better production reporting systems should continue, they should take a back seat to efforts to increase the data sense-making skills of workers and to improve the tools that support those skills.

Even in the sensitive arena of intelligence analysis, where decisions can preserve or end lives and information is often spotty, it is much more important to teach analysts effective skills and give them the best sense-making tools than it is to give them more data. Former CIA analyst Richards J. Heuer, Jr., argues the following in his book Psychology of Intelligence Analysis (1999):

The difficulties associated with intelligence analysis are often attributed to the inadequacy of available information. Thus the US Intelligence Community invests heavily in improved intelligence collection systems while managers of analysis lament the comparatively small sums devoted to enhancing analytical resources, improving analytical methods, or gaining better understanding of the cognitive processes involved in making analytical judgments. (p. 51)

This lack of appropriate funding exists no less, and probably a great deal more, in the corporate world. Heuer cites research findings that additional information rarely improves the accuracy of analysts’ judgments. What really matters is the quality of the mental models that analysts use—the conceptual frameworks that they bring to the process of data sense-making. Additional information improves the accuracy of analytical judgments only when it helps analysts correct and improve their mental models. Heuer writes:

The accuracy of an analyst’s judgment depends upon both the accuracy of our mental model…and the accuracy of the values attributed to key variables in the model…Additional detail on variables already in the analyst’s mental model and information on other variables that do not in fact have a significant influence on our judgment…have negligible impact on accuracy, but form the bulk of the raw material analysts work with. (p. 59)

Unfortunately, even the most expert among us rarely understand their own mental models.

Experts overestimate the importance of factors that have only a minor impact on their judgments and underestimate the extent to which their decisions are based on a few variables. In short, people’s mental models are simpler than they think, and the analyst is typically unaware not only of which variables should have the greatest influence, but also which variables actually are having the greatest influence. (p. 56)

Researchers, especially those who work in the cognitive sciences, have learned a great deal about the way people process information and make decisions, including the flaws in the process that often trip us up. Proper training based on these insights is needed to make us better analysts; good tools are needed to help us work around analytical limitations that are built right into our brains. It is toward these ends that the bulk of our data warehousing and business intelligence investments should be directed. Is this where you’re focusing your efforts? Is this even on your radar?

Take care,

Failures in Analysis Are Failures in Thinking

March 17th, 2009

Psychology of Intelligence Analysis, Richards J. Heuer, Jr.,
Center for the Study of Intelligence, CIA, 1999.

In any field of study, among the many written works that inform it, there are a few that stand out as pillars of wisdom. In the field of data analysis, one of those pillars is the book Psychology of Intelligence Analysis by Richards J. Heuer, Jr., who spent 45 years supporting the work of the CIA. Even though the book focuses on intelligence analysis, the principles and practices that it teaches apply to data analysis of all types, including the analysis of quantitative business data. Heuer believes, as I do, that failures of analysis are due less to insufficient data than to flawed thinking. To succeed analytically, we must invest a great deal more of our resources in training people to think analytically and in equipping them with tools that effectively support the process.

According to Douglas MacEachin, former CIA Deputy Director of Intelligence:

Dick Heuer makes clear that the pitfalls the human mental process sets for analysts cannot be eliminated; they are part of us. What can be done is to train people how to look for and recognize these mental obstacles, and how to develop procedures designed to offset them.

Throughout the book, Heuer identifies these pitfalls in thinking and suggests guidelines and procedures for overcoming them, and he does so in clear prose that anyone interested in the topic will find accessible. Here, for example, is how he describes the problem of perception:

People construct their own version of “reality” on the basis of information provided by the senses, but this sensory input is mediated by complex mental processes that determine which information is attended to, how it is organized, and the meaning attributed to it. What people perceive, how readily they perceive it, and how they process this information after receiving it are all strongly influenced by past experience, education, cultural values, role requirements, and organizational norms, as well as by the specifics of the information received.

Unlike some who would shroud the analytical process in a mystique of lofty terms and obscure references, Heuer extends a friendly, helping hand. He does so because he believes that these skills can be learned.

Thinking analytically is a skill like carpentry or driving a car. It can be taught, it can be learned, and it can improve with practice.

To give you a better idea of the book’s contents, here’s the outline:

Part 1—Our Mental Machinery

Chapter 1: Thinking About Thinking
Chapter 2: Perception: Why Can’t We See What Is There To Be Seen?
Chapter 3: Memory: How Do We Remember What We Know?

Part 2—Tools for Thinking

Chapter 4: Strategies for Analytical Judgment: Transcending the Limits of Incomplete Information
Chapter 5: Do You Really Need More Information?
Chapter 6: Keeping an Open Mind
Chapter 7: Structuring Analytical Problems
Chapter 8: Analysis of Competing Hypotheses

Part 3—Cognitive Biases

Chapter 9: What Are Cognitive Biases?
Chapter 10: Biases in Evaluation of Evidence
Chapter 11: Biases in Perception of Cause and Effect
Chapter 12: Biases in Estimating Probabilities
Chapter 13: Hindsight Biases in Evaluation of Intelligence Reporting

Part 4—Conclusions

Chapter 14: Improving Intelligence Analysis

If your job involves making sense of data to support decision making, you owe it to yourself and your employer to read this book. It won’t take long to read and it needn’t cost you anything, because the book can be downloaded as a PDF for free from the CIA’s website. If you like it as much as I do, you can also purchase a bound version.

Take care,

Looking at Data with R

March 10th, 2009

One of the tools that statisticians often use to visually explore, analyze, and present data is the free, open-source product called R. Like several of the tools that statisticians rely on, R requires a fair amount of training to learn, but once you’ve learned it, you have a great deal of power and flexibility at your fingertips.

A friend and infovis colleague, Hadley Wickham of Rice University, contacted me yesterday with the news that he and two colleagues from Iowa State University—Di Cook and Heike Hofmann—will be teaching a two-day, hands-on workshop on this topic this summer in Washington, DC, titled Looking at Data: Learning to Explore Data with Graphics. Anyone wanting to develop good visual analysis and presentation practices using R could save themselves a great deal of time and effort by attending. You can read all about it at the registration site.
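
To give you a taste of what visual data exploration in R looks like, here’s a minimal sketch of my own (an illustration only, not material from the workshop), using R’s built-in mtcars dataset and base graphics:

# Explore the built-in mtcars dataset with base R graphics.
data(mtcars)

# Start with the distribution of fuel efficiency.
hist(mtcars$mpg, main = "Distribution of Fuel Efficiency",
     xlab = "Miles per gallon")

# Then examine how fuel efficiency relates to vehicle weight.
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1,000 lbs)", ylab = "Miles per gallon",
     main = "Fuel Efficiency vs. Weight")

# Overlay a simple linear fit to summarize the trend.
abline(lm(mpg ~ wt, data = mtcars), col = "red")

A few lines like these take you from a question about the data to a picture of the answer. Developing that kind of fluency is precisely what a workshop like this aims for.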