Thanks for taking the time to read my thoughts about Visual Business Intelligence. This blog provides me (and others on occasion) with a venue for ideas and opinions that are either too urgent to wait for a full-blown article or too limited in length, scope, or development to require the larger venue. For a selection of articles, white papers, and books, please visit my library.


Scientific Thinking

September 30th, 2015

In my recent newsletter article titled “A Course of Study in Analytical Thinking” I included “scientific thinking” as a specific type of thinking that we should understand and practice as data sensemakers. For this particular topic, I recommended the book A Beginner’s Guide to Scientific Method, Fourth Edition, by Stephen S. Carey as a useful introduction, but admitted that I had not yet read the book. I had read others on the topic that didn’t suit the need, and Carey’s book seemed to be the best bet based on the author’s description and the comments of several satisfied readers. Within a day or two of the article’s publication, my copy of the book finally arrived, and I’m relieved to say that it’s a perfect fit.

A Beginner's Guide to Scientific Method

It’s a short book of only 146 pages (including the index), but it covers the topic beautifully. It even includes quizzes and exercises for the dedicated learner. I especially appreciate its thoughtful focus on the essence of science and scientific method, never venturing into territory that non-scientists would find esoteric or intimidating. If you’re like me, you probably assumed that there were many good books of this type available, but this is surprisingly not the case. Given the importance of science and the fact that everyone should understand what it is and how it is essentially performed, this is a tragic void. Thankfully, Carey must have recognized this two decades ago when he wrote the first edition and has continued to serve the ongoing need by updating it every few years with current examples.

Carey breaks the content into six chapters:

  1. Science (This chapter defines science, describes the methods that are common across all branches of science, and argues for its importance.)
  2. Observation (This chapter describes the process of effective observation.)
  3. Explanation (This chapter focuses on the goal of explaining “why things happen as they do in the natural world,” including the special role of hypotheses and theories.)
  4. Experimentation (This chapter describes the role of experimentation, various types of experiments, and the ways experiments should be designed and conducted to produce reliable findings.)
  5. Establishing Causal Links (This chapter extends the topic of experimentation by addressing the special techniques, including statistics, that must be used to establish causation; a toy sketch of this idea appears after the list.)
  6. Fallacies in the Name of Science (This chapter draws a clear distinction between science and pseudo-science, including basic tests for distinguishing science from its imitation.)
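
To make that last idea concrete, here is a toy sketch, assuming Python with SciPy and not drawn from Carey’s book, of how random assignment and a simple statistical test work together to support a causal claim: randomization rules out confounding explanations, and the test tells us whether the observed difference is larger than chance alone would plausibly produce. The outcome scores are simulated placeholders.

    # A toy illustration (not from Carey's book) of causal reasoning:
    # random assignment rules out confounding, and a statistical test
    # judges whether the observed difference exceeds what chance alone
    # would plausibly produce.
    import random

    from scipy import stats

    random.seed(42)

    # Simulated outcome scores for two groups assigned at random, so the
    # only systematic difference between them is the treatment itself.
    treatment = [random.gauss(5.5, 1.0) for _ in range(50)]
    control = [random.gauss(5.0, 1.0) for _ in range(50)]

    t_stat, p_value = stats.ttest_ind(treatment, control)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    if p_value < 0.05:
        print("Unlikely to be chance; with random assignment, "
              "a causal link is the best available explanation.")
    else:
        print("Could easily be chance; no causal claim is warranted.")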

Unless you’re already trained in the ways of science, you’ll find this book enlightening and enjoyable. It’s quite possible that you’ve already published a research paper in your field of study but somehow never learned what this little book teaches. I’ve read many research papers, especially in my field of information visualization, that had the appearance of science, with technical jargon and lots of statistics (often misapplied), but were in fact pseudo-science because the researchers and their professors did not understand the basic methods of science. So many time-consuming but ultimately worthless projects might have been salvaged had the researchers read this simple little book.

Take care,


P.S. When I wrote this blog post, I’d forgotten how horribly expensive this book is. It lists for almost $100. Even discounted, it will still cost you nearly $80. This is unconscionable. I doubt that it was the author’s decision to price it out of reach. I suspect that this is an example of Wadsworth Publishing’s shortsightedness. They see it as a textbook that only students will purchase – students who will have no choice in the matter. In fact, this book would have a broad audience if it were reasonably priced; so much so that the publisher and author would earn a great deal more money. What a shame! Until this changes, try to find yourself a used copy.

Graphical Journalists Should, First and Foremost, Be Journalists

August 31st, 2015

It annoys me when I see poor journalistic infographics, in part, because I value journalism and I hate to see it done ineffectively. Good news organizations take the quality of their journalists’ writing seriously. Journalists and their editors work hard to produce news stories that are accurate, clear, and compelling. Those who can’t write effectively lose their jobs. So, why is it that some of the same publications that take great pains to produce well-written articles don’t bat an eye when they produce infographics that are inaccurate or unnecessarily difficult to understand?

Take the following infographic recently published by Time as an example. The topic is important, “Why We Still Need Women’s Equality Day,” but notice how unnecessarily hard you must work to get the information and how difficult it is to compare the values that it displays and combine them into a sense of the whole.

Women's Equality Day Infographic

This infographic provides eight measures of women’s participation in government. Each measure is expressed as a percentage of female vs. male participation. So why is each measure presented graphically in a different way? A single graphical form that makes the percentages of female vs. male participation easy to read and understand for all of the eight measures would work so much better. Also, given the fact that there is value in comparing the eight measures, why does the infographic arrange them vertically in a way that no computer or tablet screen could contain without scrolling? And even if all eight measures could fit on a single screen, because every one is expressed in a different manner, they still couldn’t be quickly and easily compared.

Has anything been gained by displaying the eight measures in these various ways? Some infographic designers would argue that by displaying the measures differently, visual interest has been increased, resulting in greater reader engagement. I suppose that there are people who might actually find this variety of expression engaging, but only in a way that draws them into the pictures independent of their meanings. Is someone who reads this article merely to enjoy the pictures, with little concern for the content and its meaning, an appropriate audience? Only if the journalist is trying to win a popularity contest among the uninterested.

Here’s the same story told primarily in graphical form, but this time it is clear, simple to read, makes comparisons easy, and brings the measures together in a way that makes the whole available at a glance.

Women's Equality Day Infographic - Redesigned

A great deal has been gained through this redesign, but has anything been lost? Nothing other than meaningless, complicating, and distracting variety.
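
For readers who build such displays with code, the following is a minimal sketch of the redesign principle, assuming Python with matplotlib: one consistent graphical form repeated for every measure, so that all values can be read the same way and compared at a glance. The measure names and percentages are illustrative placeholders, not the actual figures from the Time infographic or from my redesign.

    # A minimal sketch of the redesign principle: the same simple form for
    # every measure so the values invite comparison. All labels and
    # percentages are illustrative placeholders, not the actual figures.
    import matplotlib.pyplot as plt

    measures = [
        "Measure 1", "Measure 2", "Measure 3", "Measure 4",
        "Measure 5", "Measure 6", "Measure 7", "Measure 8",
    ]
    female_pct = [20, 19, 12, 24, 18, 33, 26, 10]  # placeholder values

    fig, ax = plt.subplots(figsize=(7, 4))
    y = list(range(len(measures)))

    # One consistent form: a horizontal bar showing the female percentage
    # of each measure, with the male remainder in muted gray.
    ax.barh(y, female_pct, color="#4a7ebb", label="Female")
    ax.barh(y, [100 - f for f in female_pct], left=female_pct,
            color="#d9d9d9", label="Male")

    ax.set_yticks(y)
    ax.set_yticklabels(measures)
    ax.invert_yaxis()  # list the first measure at the top
    ax.set_xlim(0, 100)
    ax.set_xlabel("Percentage of positions held")
    ax.legend(loc="lower right")
    fig.tight_layout()
    plt.show()

The stacked horizontal bar is just one reasonable choice; the point is that a single form repeated across all eight measures makes the comparisons, and the sense of the whole, available at a glance.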

Isn’t it time that we demand of graphical journalists the same standards of effectiveness that we demand of their traditional counterparts? Journalism is journalism. Whether the story is told in words, pictures, or both should be determined by the nature of the information, and the integrity of that information should always be respected.

Take care,


Mind the Gap or Die

August 18th, 2015

Many books have been written in recent years about our brains—how they evolved, how they work, how they often fail us, how they sometimes serve us in a blink with little effort, and how they differ from the brains of other animals. Neuroscientists and psychologists in particular have fascinated us with advances in our understanding of the human brain. This emerging understanding reveals a challenge that we must face and overcome to preserve our species.

Our brains evolved primarily to survive as hunter-gatherers on the African savannah, which was straightforward and simple. Not simple in that it was easy, but simple in that the problems that we faced were not difficult to understand. The automatic, intuitive, fast, and emotional types of thinking that made us successful in that world—sometimes called System 1 or Type 1 thinking—served us well for millennia. Something began to happen, however, when changes in our brains led to tool making and social cooperation, which gave birth to a cultural revolution. We went from living in small bands on the African savannah to living in larger groups throughout the world. That revolution, which coincided with expanding cognitive abilities that are deliberate rather than automatic, cognitively demanding rather than intuitive, slow rather than fast, and rational rather than emotional—sometimes called System 2 or Type 2 thinking—not only gave us greater ability to adapt to the world but also an ability to shape the world. The result is a world, unique to humans, that is complex.

This complexity of our own making has made our lives challenging. And, because we usually try to deal with this complexity using the System 1 thinking that isn’t equipped to deal with it, we have made mistakes along the way that threaten our very existence, such as the damage to our environment that is a by-product of industrialization and the potential for mass destruction that is made possible through science and the technologies that it has produced. Learning to use System 2 thinking to solve complex problems is the fundamental challenge of our age. In his book, Closing the Mind Gap: Making Smarter Decisions in a Hypercomplex World (2014), Ted Cadsby ties together the threads of understanding that are needed to meet this challenge.

Closing the Mind Gap

Ted Cadsby is not a neuroscientist or a psychologist. He is a former banking executive with an MBA, seasoned with training in philosophy, who now works as a researcher, writer, and speaker on complexity and decision making. As such, his interest and work in decision making are not just theoretical. He is intimately familiar with the types of problems that we face day to day in our lives and especially in our work. In Closing the Mind Gap, by drawing on the work of many researchers from various disciplines, he’s managed to weave together into a coherent whole an understanding of the gaps that exist between our System 1 thinking abilities and the complex problems that we face, along with the steps that we must take to develop and apply our System 2 thinking abilities to solve these problems.

  • In Part I, “Brains Coming into the World,” he describes how our brains evolved and the ways in which our cognitive abilities are bounded.
  • In Part II, “Brains Sorting out the World,” he describes how our brains work—how System 1 thinking simplifies and satisfices in ways that served our ancient ancestors well and continue to serve us well when facing easy, day-to-day decisions, but how simplifying and satisficing fail when faced with complexity.
  • In Part III, “The Brain-World Problem,” he explains that we live in two worlds: World 1, which is simple and easy to navigate using intuitive System 1 thinking, and World 2, which is complex and therefore beyond the reach of intuition.
  • In Part IV, “The Brain-World Solution,” he explains that Systems Theory, the scientific method, and statistical thinking enable “metacognitive thinking”—a form of System 2 thinking—to deal with complexity.
  • In Part V, “Brains and People,” he explains how complexity in our selves (psychology) and complexity in others (sociology) complicate the struggles that we face.

This book consolidates and articulates a great deal of content that you could find elsewhere in greater detail, but with much more time and effort. As I was reading it, it felt a little repetitive at times, but that might have been because I’m well read in this subject matter. I recommend this book to anyone who engages in complex decision making or supports those who do. This should be required reading for anyone who is embarking on a career in analytics. Even if you’ve been doing data analysis for years, this book will help you rebuild a foundation for analytical thinking that will make you a better analyst. It’s never too late to strengthen your skills in fundamental ways.

Take care,


Registration open for 2016 Advanced U.S. Workshops

August 14th, 2015

This blog entry was written by Bryan Pierce of Perceptual Edge.

Registration is now open for Stephen Few’s two advanced U.S. workshops. If you’ve already read Stephen’s introductory books or attended his introductory courses and are looking for a way to improve your dashboard design or visual data analysis skills further, these workshops, Signal and Advanced Dashboard Design, provide a great way to do so.

Each of these workshops will be taught only once in the U.S. in 2016. Signal will also be taught in London on Mar. 9-10, Stockholm on Oct. 10-11, and Melbourne on Nov. 7-8, and Advanced Dashboard Design will also be offered in the Netherlands on May 23-25 and Melbourne on Nov. 9-11, though registration for these international workshops is not yet open.


It Takes More than a Good Name (or including ‘Good’ in your name)

August 8th, 2015

Let me begin by explaining the title of this piece. I recently ran across a blog article about something called the “Data Visualization Competency Center (DVCC).” Given the fact that I spend my time helping people develop data visualization competency, this caught my interest. As I read the article, I found myself saying “Yes” to a few of the author’s points, but I had to take a detour when I read that the DVCC was sponsored by a cloud-based business intelligence (BI) service provider named “GoodData.”

This company was not on my radar, so I quickly accessed its website to get a sense of its work. It took no more than a minute to realize that GoodData knows little about data visualization and certainly doesn’t support data visualization competency. Many of the example visualizations on its site were poorly designed. In other words, its visualizations are typical of BI software—typically bad. There should be a rule that you can only include “good” in your name if you do good work. Then again, marketing is all about making claims that are seldom true.

The author of the article, Lindy Ryan, does not work for GoodData, but is instead the “Research Director of Data Discovery and Visualization” for a BI consultancy named Radiant Advisors. Ms. Ryan recently wrote a report titled “The Data Visualization Competency Center: Balancing Art, Science and Culture,” which advocates an extension of the “Business Intelligence Competency Center (BICC)” to improve the use of data visualization within businesses. With respect to data visualization, Radiant Advisors fits its name only if by “radiant” you mean “shiny,” as in, “The light reflecting off of that street sign is so shiny I can’t read it.”

Snarky comments aside, let me get to the point of this blog piece: most of the people and organizations who proclaim expertise in data visualization are not experts. Even worse, many of them sell snake oil. Don’t be deceived. If you want to learn about data visualization and put it to good use, take care in choosing your advisors.

A little more background will add texture to this story. When I read Ryan’s article and saw that she associated GoodData with the Data Visualization Competency Center, I immediately responded by commenting that GoodData does not support data visualization best practices. After some brief back and forth between us in the blog, she suggested that we take the discussion offline. After a few days, rather than hearing from Ms. Ryan, I received an email out of the blue from the PR agency that represents GoodData, requesting that I work with them on a research project. Hmmm…I’ve had this experience before. On more than one occasion, soon after pointing out problems with a product, I’ve received an offer of work from the offending vendor. If a vendor came to me in response to a negative review and genuinely asked for help in improving their products, I would be happy to oblige. I am not, however, open to bribes.

After turning down GoodData’s suspicious offer, I finally heard from Ms. Ryan. She seemed quite nice, but she knows little about data visualization, which she readily admitted. Why then is she a Data Visualization Research Director and why does she advise organizations about their use of data visualization? And, even more fundamentally, how is she qualified to develop a Data Visualization Competency Center? She isn’t. This concerns me because Ms. Ryan isn’t unique. Most of the people who set themselves up as data visualization leaders and advisors are not qualified. It has become business as usual to make fraudulent claims. This won’t change if we fail to expose them.

Take care,