Thanks for taking the time to read my thoughts about Visual Business
Intelligence. This blog provides me (and others on occasion) with a venue for ideas and opinions
that are either too urgent to wait for a full-blown article or too
limited in length, scope, or development to warrant that larger venue.
For a selection of articles, white papers, and books, please visit
June 22nd, 2015
When you dedicate your life to the creation of useful knowledge and tools, as Ben Shneiderman has, it is painful to see your lessons or inventions perverted in some way. It happens, however, no matter how hard you work to keep your creations on a path of proper use. Besides the original design of hyperlinks, the treemap is Ben’s best-known invention. Once you put an invention out there in the world, you lose control of it and sometimes cringe at the ways it is misused or defaced. I’ve seen many poor implementations of treemaps over the years, but this one takes the prize:
What an eyesore. According to this, the current marketing technology landscape is a mess. Rather than drawing us into useful information, this perversion of a treemap painfully screams “Avert your eyes, for I am a hideous thing.” May God have mercy and grant it a swift and painless death.
June 18th, 2015
I recently read an article by Stuart Frankel that appeared on HBR.org titled “Data Scientists Don’t Scale” (May 2, 2015). It began by pointing out that the recent “reverence” for Big Data and Data Science “has disillusioned many of us” and is “about to get a reality check.” I share this perspective, but not for the author’s reasons, and I definitely don’t agree with the author’s solution. Frankel is the CEO of Narrative Science, “a company working on advanced natural language generation for the enterprise.” Given his role, you won’t be surprised to learn that he promotes natural language processing (NLP) and artificial intelligence (AI) as scalable alternatives to the skilled data analysts that are needed. (Why does the Harvard Business Review provide a platform for biased, self-serving vendor advertising that poses as informative articles?) Frankel argues that, unlike computers, data scientists don’t scale, but his premise is flawed. Not only do humans scale, they do so in much the same way as computers.
When you need more computing power, you have three potential choices:
- Replace the computer that you have with one that’s more powerful
- Add more computers
- Upgrade the computer that you have to make it more powerful
When you need more human power, what are your choices?
- Replace the employee that you have with one who’s more productive
- Add more people
- Help your employee upgrade his skills to make him more productive
We humans are scalable. In fact, although we are scalable in somewhat different ways than computers, humans are more scalable than computers in some fundamental and important ways.
Many organizations—perhaps most—find it easier to invest in technologies than to invest in people. Investing in technologies is a good investment, however, only when the technologies are good and they can do the job better and less expensively than people. Data sensemaking (analytics, business intelligence, data science, statistics, etc.) is not one of those cases. This is because of the fundamental human ability that makes us much more scalable than computers: we can think. When the cognitive revolution began around 70,000 years ago with the extension of language into the realm of abstract thinking, Homo sapiens became the most scalable creature on the planet. If computers could think and feel, they would envy us. Despite our many flaws, which frequently get us into trouble and might eventually lead to our demise, we are gifted in ways and to a degree that our inventions cannot duplicate—not even close. This is easy to forget at a time when the flawed technologies that we’ve created are revered without question. Those of us who know technologies well understand their limitations and therefore seldom succumb to this absurdity. Technologists who promote this reverence usually have something to gain.
If we fail to make wise use of data to create a better future, it will not be the fault of our technologies. We’ll have no one to blame but ourselves. If we allow technologies to do our thinking for us (i.e., execute programs as an imitation of thinking), we’ll lose the ability to think for ourselves. If this happens, not much else will matter.
June 8th, 2015
Data sensemaking requires skill augmented by good technologies. Even though data sensemaking skills can be developed by almost anyone, they can only be acquired through sustained effort to learn the relevant concepts, principles, and practices, which takes time. This hard fact isn’t as appealing as the fantasy that technologies can turn us into data analysts overnight. It is largely because technology vendors have been selling us this myth that we have been trapped in the “data age” but have not yet managed to enter the “information age.” For 35 years I’ve been involved in decision support, data warehousing, business intelligence, analytics, data science—call it what you will—and I find it appalling that, despite all the hype about the information age, we’ve made little progress in deriving value from data. Until we accept the fact that data sensemaking and the decisions that it informs cannot succeed without skill, we’ll remain stuck. Besides skill augmented by good technologies, there are two other prerequisites for successful data sensemaking that are usually ignored but deserve mention: time and attention.
Only those data analysts who are given time to explore and analyze data thoughtfully and thoroughly are consistently successful. Most of the great data sensemaking discoveries that we hear about are the result of work by those rare data analysts whose organizations have given them time to do their jobs well. In contrast, most data analysts are either churning out responses—usually in the form of reports—to a long list of “urgent” requests at a breakneck pace or have other jobs that take most of their time, so they squeeze their data sensemaking activities in here and there whenever they can. This is the norm because few organizations have realized that getting real value from data doesn’t just happen “techno-magically,” as they’ve been led by vendors to expect. It takes time to learn and develop data sensemaking skills and it continues to take time to apply those skills each day. This is because data sensemaking involves analytical thinking and analytical thinking takes time. Technologies can assist by doing fast calculations and other forms of data processing, but the thinking that’s required is slow.
Effective data sensemaking also involves attention. This is one of the requirements for rich thinking that has become more difficult to achieve in recent years. Our lives have become increasingly disrupted by the constant demands of the “persistently connected” technologies that we’ve adopted. For how many minutes can you become lost in concentration without being pulled out of your intellectual reverie by a beep, ding, ringtone, vibration, or the sudden appearance of an alert on your screen? This is not what’s usually meant by disruptive technologies and it certainly isn’t a desired effect. Whenever our attention is pulled away from a data sensemaking task, it takes time and effort to get it back, and much can be lost in the meantime.
Data analysts need an environment that supports concentration without interruption for extended periods of time. The open floor plans that many organizations have been experimenting with to promote collaboration (and, let’s face it, to also save money) are perhaps appropriate for some jobs, but not for data sensemaking. Most of us need to shut the door, turn off all possible sources of interruption, sit in a comfortable chair, and think attentively for long periods of time. I never had a chance to meet the celebrated Princeton statistician John Tukey before he died, but I’ve heard from friends who did that when he took on a consulting job for a client, he would begin with a meeting to discuss the problem and would then cloister himself in his hotel room for a couple of days before emerging with a solution. Imagine Tukey trying to navigate the disruptive environment of the modern workplace. If he had, perhaps we would never have heard of him. Perhaps the box plot and his many other analytical inventions would not have graced our world. He knew that data sensemaking required attention and he structured his environment to provide it. We must do the same, which means that our employers must recognize the need and support it.
I recently gave a keynote presentation at the University of Cincinnati’s Analytics Summit, and when I mentioned this rarely addressed need for time and attention, you should have seen the heads nodding appreciatively throughout the ballroom. Their eyes brimmed with gratitude for my recognition that the conditions of their work do not match the professed commitment of their organizations to analytics. Claiming to embrace analytics is a far cry from the investment that must be made to fulfill this claim.
Is your organization analytically savvy? Does it exhibit a culture of analysis? If your answer is “Yes” but your organization doesn’t give you adequate time and a work environment fit for focused attention, you’re setting the bar too low. Imagine how much better a data sensemaker you could be with more time and attention.
June 1st, 2015
My new book Signal is finally available for purchase from booksellers, including Amazon. Let me help you decide if it’s a book that you’ll find helpful.
Two years ago, after writing the second edition of Information Dashboard Design, I began asking “What next?” I spent considerable time trying to figure out what people who have already read my existing books most need as the next stage of their data sensemaking development. I noticed that many people spent their time chasing data that didn’t matter, including random variation that meant nothing and couldn’t be changed no matter how great the effort. They spent their days exploring, dissecting, and publishing noise.
This problem is actually growing. In this day of so-called Big Data, organizations are scrambling to implement new software and hardware to increase the amount of data that they collect and store. In so doing they are unwittingly making it harder to find the needles of useful information in the rapidly growing mounds of hay. If you don’t know how to differentiate signals from noise, adding more noise only makes matters worse.
When we rely on data for decision making, how do we tell what qualifies as a signal and what is merely noise? In and of itself, data is neither. It is merely a collection of facts. When a fact is true, useful, and deserves a response, only then is it a signal. When it isn’t, it’s noise. It’s that simple.
In Signal, I provide straightforward and practical instruction in everyday signal detection. Using data visualization methods, I teach how you can apply statistics to gain a comprehensive understanding of your data, which will serve as the context for signal detection. I then adapt the techniques of Statistical Process Control in new ways to detect not just changes in the measures but also significant changes in the patterns that characterize your data.
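To give a flavor of what signal detection with Statistical Process Control looks like in practice (this is a generic sketch of a standard XmR, or individuals and moving range, chart — not the specific adaptations described in the book — and the data and variable names are purely illustrative):

```python
# A minimal XmR-style signal check: values that fall outside the
# natural process limits are flagged as potential signals; values
# inside the limits are treated as routine variation (noise).
# 2.66 is the standard XmR scaling factor applied to the average
# moving range to derive the limits.

def xmr_limits(values):
    """Return (mean, lower_limit, upper_limit) for an XmR chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

def find_signals(values):
    """Indexes of values outside the natural process limits."""
    _, lower, upper = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical monthly measurements: eleven routine values and one spike.
monthly_sales = [52, 48, 55, 50, 49, 53, 51, 47, 54, 50, 83, 52]
print(find_signals(monthly_sales))  # -> [10]: only the spike stands out
```

In a real analysis one would also examine runs and shifts in the patterns, not just individual out-of-limit points, which is closer to the approach the book teaches.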
Only the signals matter.
April 11th, 2015
Sometimes in life the fates attack from all sides and leave us heaving for breath, overwhelmed. The last few months have been such a period for me. So much that’s been happening in my personal life has been nuts—the product of idiocy, incompetence, and at times pure meanness. A few days ago, in utter frustration, I exclaimed, “I want to live in a world that makes sense.” Wouldn’t that be wonderful? To live in a world that works according to thoughtful and compassionate principles. Brainless bureaucracies would be a thing of the past. People who complicate our lives through incompetence or pettiness would suddenly grow up and give a damn. Systems that have developed to promote the interests of some to the detriment of others would be torn asunder. To the degree that natural inequities still exist, we could level the playing field by identifying the causes and addressing them. The world wouldn’t be perfect, but we could address life’s problems thoughtfully and compassionately.
As I was thinking this, it occurred to me that this wish might be common among people who, like me, work to make sense of data. Perhaps we’re drawn to data sensemaking because we long for a sensible world, and this is our attempt to create a bit more order in the midst of chaos. I meet many fellow data sensemakers in my work and, based on the fine and dedicated people who attend my courses and read my books, I suspect that this correlation is real. If we pair this desire with the right skills and tools to make better sense of the world, we can use that knowledge to make the world a more sensible place. This dream is too precious to fritter away. The signals that live in our data are too precious to miss in the midst of deafening noise. Let’s focus our vision and double our effort. Let’s turn down the noise.