Malcolm Gladwell, modern problems, and the analytics age

I had the great pleasure last Thursday of hearing Malcolm Gladwell, journalist and author of the books Outliers, Blink, and The Tipping Point, speak at SAS Institute’s Innovators’ Summit in Chicago. I gave one of two keynote presentations in the morning and Gladwell gave the keynote in the afternoon. I believe that Gladwell is one of the great thinkers and communicators of our time, and his words on Thursday afternoon led me to believe this even more fervently.

Gladwell’s topic was well-chosen for a group of people who spend their time making sense of data (mostly statisticians) using SAS’ visual analysis product JMP. He spoke about problem solving and the fact that our problems today are different from those of the recent past. Our former problems were usually solved by digging up and revealing the right information. He used Watergate as an example, pointing out that key information was hidden, and the problem was solved when Washington Post journalists Woodward and Bernstein were finally able to uncover this information that had been concealed.

Modern problems, on the other hand, are not the result of missing or hidden information, Gladwell argued, but the result, in a sense, of too much information and the complicated challenge of understanding it. Enron was his primary example. The information about Enron’s practices was not kept secret. In fact, it was published in several years’ worth of financial reports to the SEC, totaling millions of pages. The facts that led to Enron’s rapid implosion were there for anyone who was interested to see, freely available on the Internet, but weren’t understood until a journalist spent two months reading and struggling to make sense of Enron’s earnings, which led him to discover that they existed only as contracts to buy energy at a particular price in the future, not as actual cash in the bank.

The problems that we face today, both big ones in society like the current health care debate and smaller ones like strategic business decisions, do not exist because we lack information, but because we don’t understand it. They can be solved only by developing skills and tools to make sense of information that is often complex. In other words, the major obstacle to solving modern problems isn’t the lack of information, solved by acquiring it, but the lack of understanding, solved by analytics.

Gladwell’s insights were music to my ears, because he elegantly articulated something that I and a few others have been arguing for years, but he did so in a way that was better packaged conceptually. Several months ago I wrote in this blog about Richards J. Heuer’s wonderful book Psychology of Intelligence Analysis, and featured his assertion that we don’t need more data, we need the means to make sense of what we have. More directly related to my work in BI, I’ve stated countless times that this industry has done a wonderful job of giving us technologies for collecting and storing enormous quantities of data, but has largely failed to provide the data sense-making tools that are needed to put data to use for decision-making.

The title of my keynote at the Innovators’ Summit this year was “The Analytics Age.” I argued that the pieces have finally come together that are needed to cross the threshold from the Information Age, which has produced great mounds of mostly unused information, to the Analytics Age, when we’ll finally learn how to understand it and use it to make better informed, evidence-based decisions. The pieces that have come together include:

  • plenty of information
  • proven analytical methods (especially statistics enhanced through visualization)
  • effective analytical tools (only a few good ones so far, but this will change)
  • a growing awareness in society that analytics are needed to replace failed decision-making methods, based on whim and bias, that have led us to so much trouble

Although many software vendors claim to sell analytics tools, seeking to exploit the growing awareness that analytics are powerful and necessary, few actually understand this domain. Their products demonstrate this fact, offering little more than silly imitations of analytical techniques. This is true of every traditional BI software vendor in the market today. As Gladwell pointed out, the paradigm has shifted; the skills and methods that worked in the past can’t solve the problems of today. Only a few software vendors that play in the BI space (none of which represent traditional BI) have the perspective and knowledge that are required to build tools that can help us solve modern problems. Most of these have either evolved from a long-term focus on statistics, such as SAS Institute, or have emerged as spin-offs of academic research in information visualization, such as Tableau and Spotfire. If traditional BI vendors want to support the dawning analytics age, they must retool. They must switch from an engineering-centric worldview, focused primarily on technology, to a design-centric perspective, focused primarily on the human beings who actually work with data. Only then will they be able to build effective analytical tools that take advantage of human visual and cognitive strengths and augment human weaknesses.

Borrowing another insight from Gladwell, I believe we are approaching the “tipping point” when people will no longer be fooled by analytical imitations and will begin to develop the skills and demand the tools that are needed to get the job done. Business intelligence vendors that fail to catch on or to turn their unwieldy ships in time will be left behind. The times are changing and so must they.

If you’re among the minority in the workforce today who understand analytics and are willing to tie your talents to good tools that utilize them fully, you are in for the ride of your life. As Hal Varian, University of California, Berkeley professor and current Chief Economist at Google, recently stated in an interview, “statistician” will become the sexy job of coming years, just as software engineers enjoyed that position for years, beginning in the 1980s. Evidence of this can already be discerned. Even in today’s depressed job market, graduates with degrees in statistics are in extremely high demand and are being rewarded with high salaries. You don’t need a Ph.D. in statistics to be a good data analyst, of course. You must, however, have the soul of an investigator and a flexible, analytical mind. You must be able to think critically. Daniel Pink made this case brilliantly in his book A Whole New Mind (2005). What I’m calling the “analytics age,” he called the “conceptual age.”

Our schools are not geared up to produce this kind of workforce, so if you’ve somehow managed to develop these skills, there’s a place of honor for you in the world that’s emerging. You’ll be appreciated in ways that were rare during those years when I worked in the corporate world. Won’t it be refreshing to actually be thanked when you reveal, through painstaking analysis, faults in your organization’s policies, practices, or assumptions, rather than being ignored or punished? Won’t it be nice to be rewarded when you save your organization millions of dollars by warning against a doomed decision rather than being demoted for speaking a politically unpopular truth? Won’t it feel good to prepare a well-reasoned case for a specific course of action and not have your hard work discarded in the blink of an eye by a manager who says, “No, we’ll do it my way, because I’m the boss”? If your heart sings at these prospects, hold your head up and stay true; your day is coming.

Am I dreaming? Can a society and a workplace in which reason and evidence trump whim and bias really emerge with enough strength to shift the balance? I hope so, but there’s no guarantee. I’m going to do everything I can to help usher it in. The opportunity is now. I don’t want to live in the sad, dumb, unjust society that is our future if this opportunity is missed.

Take care,

27 Comments on “Malcolm Gladwell, modern problems, and the analytics age”


By Snehal Desai. September 21st, 2009 at 1:22 pm

Excellent blog article. Couldn’t agree more. The techniques that got us from the Industrial Age to the Information Age will not help us make the transition to the Analytics Age.

By Mark Dykeman. September 21st, 2009 at 4:03 pm

It’s a very good and optimistic piece. However, the point about “a growing awareness in society that analytics are needed to replace failed decision-making methods, based on whim and bias, that have led us to so much trouble” is hogwash. Much of society is yet governed by religious and magical thinking. I deeply hope you’re right, but this final point, I think, has been expressed since the Enlightenment without ever actually coming true.

By Stephen Few. September 21st, 2009 at 4:55 pm

Mark,

Your statement that “much of society is yet governed by religious and magical thinking,” which I agree is the case, does not negate the truth of my statement that there is a growing awareness that analytics are needed. If you believe my statement is “hogwash”–a rather strong criticism–you should provide a better argument against it. Read my statement more carefully. I do not claim that society as a whole has become enlightened, nor do I believe it ever will. We have an opportunity, however, to swing the pendulum toward greater rationality.

By Tim van Gelder. September 22nd, 2009 at 12:37 am

Agree. One company secretary said: “There are two equally effective ways of keeping a board in the dark. One is to provide them with too little information. The other, ironically, is to provide too much.” However, analytics is not the whole story. Analytics transforms data into insights, which may (or may not) constitute good evidence for one path of action or another. Arguments bridge the gap between evidence and conclusions. Argument structures can be complex and can also be visualised.

By Jonah Feld. September 22nd, 2009 at 9:25 am

Gladwell wrote a similar piece that appeared in The New Yorker. With the same back story, it touches on the differences between a puzzle like Watergate and a mystery like Enron:
http://www.gladwell.com/2007/2007_01_08_a_secrets.html

By Doug Okamoto. September 22nd, 2009 at 11:07 am

Prof. Few,

I also enjoyed hearing Malcolm Gladwell speak on September 18, 2009, at the Discovery 2009 and Innovators Summit meetings sponsored by JMP in Chicago. Towards the end of his talk, Gladwell stressed the need for those of us in analytics to be persuasive and convincing when making our arguments even when blogging.

During your equally enjoyable talk on Friday morning, you played a recent Gapminder video of Prof. Hans Rosling showing US State Department officials emergent global trends in health and income over the last 50 years. On second viewing, I found Prof. Rosling’s use of bubble charts persuasive, but his emotionally charged vocal delivery even more compelling.

In his current best-seller, “Outliers: The Story of Success,” Malcolm Gladwell refers to this emotional intelligence as “practical intelligence.” Perhaps we as analytics experts need to do more than just visualize data to fully engage our audience, whether it is in the boardroom or society at large.

By Chuck Pirrello. September 24th, 2009 at 10:47 am

Stephen, excellent recap of Malcolm’s presentation. His message definitely supports your preachings about the importance of good visualization techniques supported by analytics that help the viewer better understand what the data represents.
However, the opening comment of your speech was a bigger hit, by far, with the attendees: You recapped a conversation with a Berkeley grad student who called JMP “Statistical Porn.” It certainly caught the attention and elicited the delight of the statistician-heavy audience!

By Doug Okamoto. September 25th, 2009 at 10:08 am

As a fellow statistician, I found the video clip of Prof. Hal Varian equally delightful:

http://www.mckinseyquarterly.com/Hal_Varian_on_how_the_Web_challenges_managers_2286

Unfortunately, the following statement was inaudible (censored by a non-statistician?) at the beginning of the video clip:

“I keep saying the sexy job in the next ten years will be statisticians.”

Hal Varian on how the Web challenges managers, McKinsey Quarterly,
January 2009.

By Rick. September 26th, 2009 at 11:33 am

I wonder if we perhaps fail to appreciate the challenges the analyst faces. In my experience, the primary challenge in moving organizations to evidence-based decision making is not a want of evidence or even of analysts capable of presenting data well.

Rather, it’s a managerial class that is often unable to distinguish between good and bad analysis and not humble enough to act on information that doesn’t necessarily align with their existing experience-based biases. “Old school” managers feel, perhaps justly, threatened by a system of decision making which they do not understand.

The continued advancement of analytical tools will hopefully speed the transition. However, the tipping point will truly come when the criteria for a good manager universally include the ability to use advanced analytical tools for him/herself and a commitment to make decisions accordingly.

By Zak. September 30th, 2009 at 5:26 am

Fascinating read, one that I’ll be sure to forward to a few statistician friends of mine.

By Keith Fortowsky. September 30th, 2009 at 8:08 am

Excellent posting, Stephen.

But in response to Rick’s wondering “if we perhaps fail to appreciate the challenges the analyst faces”, I wonder if we also fail to appreciate the challenge the *manager* faces. The problem of analysts falling in love with their decision models is as old as analysis itself. But the manager is faced with actually making the decision, and reaping the consequences, which inevitably involve many more factors than were foreseen by the model.

I think the key is ultimately neither the manager nor the analyst using “advanced analytical tools for him/herself”, but rather using them jointly, together with staff with the widest possible range of perspectives, to advance what Jeanne Liedtka calls a “strategic conversation”. Fortunately, we are getting dramatically better tools, and great examples from people like you, Stephen, to support this.

Also, Stephen (or Doug), I’m wondering if you have a URL or other source information for the item mentioned by Doug Okamoto: “a recent Gapminder video of Prof. Hans Rosling”.

By Joe Oviedo. September 30th, 2009 at 1:27 pm

Wow. Excellent article, I couldn’t agree more. It’s as if everything throws huge amounts of data at us, without reason, as if we were just another computer crunching numbers. We are people, human beings, working with other human beings to cover the needs of another human or group of people. I also see a great opportunity not only in my field, but everywhere. LOOK AT IBM AND THEIR “SMART PLANET”. Come on! I want in! By the way, I’m buying your new book as soon as I can! I’m subscribing to this blog now!

By Jerome Pineau. October 1st, 2009 at 2:36 pm

One thing I would point out is that, often enough, it’s not about information underload or overload, but about simply not processing the information in front of your face. People want to believe what they’re comfortable with :)

By Jim Goodell. October 2nd, 2009 at 7:02 am

In general, your observation that “the major obstacle to solving modern problems isn’t the lack of information, solved by acquiring it, but the lack of understanding, solved by analytics” is right on. However, there are cases where the problem still lies in acquiring the right information. There are still some rare cases where facts are not being measured and captured in the way needed to inform analytics and decision-making. The problem is not the quantity of information, but data quality. For example, in education an enormous amount of data is being captured that can support some higher-level problem solving. However, consistent daily measurement of student learning, the core business of schools, is generally not done in a way that can inform analytics.

By Robert. October 9th, 2009 at 10:35 am

Of course, let’s hope that the statisticians and analysts (and the people making decisions with the insight they provide) will use that information for “good rather than evil”.

For example, it was probably a statistician/analyst that crunched the numbers and told their CEO that a fortune could be made … doing corporate-raiding, giving credit cards to people who would likely get into debt over their heads, giving subprime & interest-only mortgages to people who shouldn’t be living in such an expensive house in the first place, day-trading, moving jobs overseas, manipulating markets, sacrificing long-term viability for short-term profits, etc, etc.

As the old saying goes – “With great power, comes great responsibility”.

By Kenny. October 26th, 2009 at 3:21 pm

For me this debate rings parallel to the debate on whether knowledge comes from an inductive or a deductive process. When staring at a wealth of information, you can “play with it” until patterns begin to “emerge” from the data. Other times, you will be presented with a wealth of information, but you will have some question or test that you apply to the data, which enables you to identify some pattern or answer.

Ultimately, induction and deduction both contribute to the development of knowledge; indeed, both are required! To this, analytics — which I interpret here as a deductive process, given wording like “the Analytics Age … evidence-based decisions” — is only one step; and to think that it’s somehow replacing induction-based processes is to ignore how knowledge really develops historically and culturally.

Nice blog entry, though

By Stephen Few. October 26th, 2009 at 4:10 pm

Kenny,

By analytics, I mean “data sense-making,” which includes both deductive and inductive reasoning. Effective data analysis flows freely between these two modes of reasoning. I’m curious–what is it about “evidence based decisions” that implies deductive and not inductive reasoning?

Steve

By Chris Gerrard. October 29th, 2009 at 1:45 pm

Anecdotally, there has been an increase in the use of the phrase “evidence-based” in discussions about public policy.
This is heartening, as are the receptions to Gladwell’s books and Stephen’s work.

One huge barrier to the emergence of the better world that Gladwell imagines, and readers of Stephen’s work hope for, is the inability of many people to engage in the type of thinking that’s amenable to persuasion by rational, evidence-based argument.

I come across plenty of people who are climate change doubters.
One of the barriers in discussing the situation with such people is the frequent inability to distinguish between weather and climate.
Sometimes they’re not so much skeptical of the reality of changes in climate as highly dismissive of any suggestion that human activity has any contribution to whatever increase in global temperatures may exist. This is usually expressed in the form: climates have always changed (due to sunspots, regular orbital variance, etc.) and ours might well be changing now, but human activity can’t be the cause. The great volume of readily available, well-presented, persuasive scientific evidence has little effect.

The problems with cultural innumeracy are well documented. The culturally widespread inability to think critically is not so well publicized.

Maybe this is changing. I hope so.

But the frequency with which I encounter the phrase “very unique” in the popular media leads me to think that we might have further to go than we would like.

By Stephen Few. October 29th, 2009 at 1:57 pm

Chris,

Well put, my friend. Despite the idiocy that so often prevails, we can hope for better and each do what we can to bring it about.

Steve

By Joe Openshaw. October 29th, 2009 at 5:40 pm

Excellent conversation! So how does one start to develop the above skills? How do you start developing these capacities in your analysts and managers so that they can more effectively turn information and analysis into insight?

Joe

By Judson Bliss. October 30th, 2009 at 9:39 am

All,

I agree with much of the above thoughts about the importance of good data analysis to challenge mental models. However, are any of us immune to overlooking system boundary problems and other contextually relevant information about our data? I really like being a system analyst, but I don’t think I want to simply be a slave to collected data.

judson

By Stephen Few. October 30th, 2009 at 10:57 am

Judson,

Please explain what you mean by “system boundary problems and other contextually relevant information about our data.”

Steve

By John Maslen. November 2nd, 2009 at 6:10 am

Steve – really enjoyed your blog entry and the discussion above, which I agree with on almost all fronts.

I am slightly uneasy about how the argument treats the role and impact that tools like SPSS, which statisticians have been using for years, have had. These have been used extremely widely to analyse data effectively. They are mature and, while not necessarily built around data viz principles (which you can legitimately argue is a big weakness), they have been the foundation of a great deal of statistical analysis. More generally, the argument makes me a bit uneasy in terms of how it downplays the whole discipline of statistics (a science for undertaking data analytics) that has been around for several hundred years.

For me the argument is more about delivering effective analytical tools to a new non-expert audience of non-statisticians – the so-called “democratisation of data analysis”. That’s where many of the holes are.

In relation to this, as a company we continue to hear arguments from those in government saying they are uncomfortable about the idea of delivering self-service data tools to non-experts (like managers) but want to keep this area as something only ‘trained analysts’ can support with expert insight – this just seems to be an attitude of ‘denial’ about how things are moving. What they should be doing, in my view, is rolling out training programmes to these staff so they are more confident in their analytical abilities.

This raises the whole challenge of how we raise the bar amongst our citizens for understanding data so they can challenge, scrutinise and make government services more accountable.

There’s a hot debate right now in the news here in the UK – the Government Scientific Advisor on drugs policy has just been fired for daring to tell the media he thinks the government is wrong in not taking his panel’s advice based on sound scientific evidence (http://www.guardian.co.uk/politics/2009/nov/02/drug-policy-alan-johnson-nutt). The argument goes that “scientific advisors are there to advise, government ministers are there to make the tough decisions based on the wider political context”. It is true there are many factors impacting on decision-making – scientific evidence is unlikely to be followed if it is deemed politically unpalatable. But I find the whole argument is symptomatic of the lack of trust those in government have for ‘evidence’ – I still feel they would much rather base decisions on their political philosophy and (often mis-informed) understanding of reality. I’m wondering if this is more down to psychological factors in that decision-makers need to base a decision on issues that they feel they understand and have some control over (perhaps in order to be able to justify them) rather than base them on scientific evidence which they probably don’t really understand and don’t feel they control. One (personal) theory – no doubt there is lots of academic research on this!

Thanks again for initiating this discussion.
John

By Judson Bliss. November 2nd, 2009 at 9:16 am

Steve,

What I mean about the system boundaries, etc. is that all models have boundaries (see Forrester, 1968, chapter 4: “Structure of Systems”). For instance, in one of the above posts, Chris talks about pollution causing global warming. Climate data show a gradual increase in average global temperature over the past several decades. These changes coincide with the growing use of fossil fuels. The boundary problem some people create is that they interpret lower local temperature changes to mean that global warming is not occurring. However, another boundary problem is created when others view the earth as a closed system by overlooking changes in solar output (e.g., sunspots and solar flares) which are causing polar ice cap reduction on Mars. As G.E. Box said, “all models are wrong, but some are useful.”

Judson

By Stephen Few. November 5th, 2009 at 2:58 pm

John,

I don’t intend to under-appreciate or undermine the role of statisticians. Actually, by working to make analytics available to a wider audience, including folks with little or no formal training in statistics, I hope to free statisticians to spend more of their time working on the hard stuff, which is a whole lot more fun, rather than the easy stuff that makes up most of the analysis that organizations typically need. Most of the questions that we seek to answer through data analysis are not complicated. With good tools and basic skills, the easy work could be distributed among many people, including some of the decision makers themselves.

I don’t downplay the fact that skills are needed. I believe, however, that the analytical skills that are needed for most of the work are relatively simple and easy to learn–but yes, they must be learned. Software vendors that sell analytical tools can contribute to this effort by offering courses in data analysis, not just in how to use their software but in the basic analytical concepts and skills that must be understood to use their tools well. Equipped with basic skills, people in most organizations can do a great deal of analysis on their own for their own purposes.

One of the skills that they should develop is the ability to recognize when an analytical problem requires expertise that they lack. When this happens, they can turn to those with greater skills for help–either to do the analysis for them or to show them how to do it for themselves. Statisticians have an opportunity to come out of the back room, where they practice their esoteric art, into the realm where the rest of us dwell. They can serve as mentors, sharing their love of statistics with others to enlarge the community. Most people who work with data will never develop a high level of analytical expertise; they don’t have the time or the need. Statisticians should never worry that by helping others to take small steps toward greater analytical expertise they are working to put themselves out of a job. By helping to spread a deeper layer of analytical expertise more broadly through the organization, statisticians will in fact be able to off-load the routine work that they’d rather not do so they can attend to the greater challenges that tap into and expand their skills.

Steve

By Stephen Few. November 6th, 2009 at 11:37 am

Judson,

Part of the reason that I questioned your use of the phrase “system boundary problems,” other than the fact that I didn’t want to guess your meaning, was to point out that we too often throw around terms that mean nothing to our audience, which results in a failure of communication. As analysts, our work is worthless if we can’t clearly communicate our findings to those who need the information.

It seems that the main point you were hoping to make in your comments was contained in your last sentence, but I don’t understand your concern. In what sense would you be a “slave to collected data”?

By Stephen Few. November 6th, 2009 at 11:47 am

Joe Openshaw,

One way you can help people become better analysts is to encourage them to take advantage of the resources that are available for developing these skills. My books, courses, and website provide useful resources. Several other excellent books are also available, a few of which I’ve written about in my blog (for example, “Head First Data Analysis”). I also recommend Jonathan Koomey’s book “Turning Numbers into Knowledge” as a good source for some of the fundamentals.

In addition to this, you should do whatever you can to create a culture of intelligence: one that encourages people to ask questions, to think and explore, and to base their judgments on evidence rather than whim.

Oh yeah, and you can make good analytical tools available to them. Great analysts can’t do a great job if they use poor tools.