Direct the Course of Evolution, or Perish

When evolution was purely biological, there were no reins to direct it, for evolution followed the course of nature. With Homo sapiens, however, another form of evolution emerged that is exponentially faster—cultural evolution—which we can direct to some degree through deliberate choices. We haven’t yet taken those reins, and seldom even recognize that they exist, but we must if we wish to survive.

In the early days of our species, when our brains initially evolved the means to think abstractly, resulting in language and the invention of tools, we were not aware of our opportunity to direct our evolution. We are no longer naïve, or certainly shouldn’t be. We recognize and celebrate the power of our technologies, but seldom take responsibility for the effects of that power. Cultural evolution has placed within our reach not only the means of progress but also the means of regress. The potential consequences of our technologies have grown. Though we can choose to ignore these consequences and often work hard to do so, they’re now right up in our faces, screaming for attention.

Some of our technologies, from the industrial revolution to the present, have contained seeds of destruction. Technologies that rely on fossil fuels, which contribute to global warming, are a prominent example. We can work to undo their harm either by (1) abstaining from their use, (2) developing new technologies to counter their effects, or (3) developing alternative technologies to replace them. When we create technologies, we should first consider their effects and proceed responsibly. We’re not doing this with information technologies. Instead, we embrace them without question, naively assuming that they are good, or at worst benign. Most information technologies provide potential benefits, but also potential harms.

Technologies that support data sensemaking and communication should be designed and used with care. We should automate only those activities that a computer can perform well and humans cannot. Effective data sensemaking relies on reflective, rational thought, resulting in understanding moderated by ethics. Computers can manipulate data in various ways but they cannot understand data and they have no concept of ethics. Computers should only assist in the data sensemaking process by augmenting our abilities without diminishing them.

You might think that I’m fighting to defend and preserve the dignity and value of humanity against the threat of potentially superior technologies. I care deeply about human dignity and the value of human lives, but these aren’t my primary motives. If we could produce a better world for our own and other species by granting information technologies free rein, I would heartily embrace the effort, but we can’t. By shifting more data sensemaking work to information technologies, as we are currently doing, we are inviting inferior results and a decline in human abilities.

Despite our many flaws, as living, sentient creatures we humans are able to make sense of the world and attempt to act in its best interests in ways that our information technologies cannot. We don’t always do this, but we can. Computers can be programmed to identify some of the analytical findings that we once believed only humans could discover, but they cannot perform these tasks as we do, with awareness, understanding, and care. Their algorithms lack the richness of our perceptions, thoughts, values, and feelings. We dare not entrust independent decisions about our lives and the world to computer algorithms.

We must understand our strengths and limitations and resist the temptation to create and rely on technologies to do what we can do better. We should not sit idly by while those who benefit as the creators and owners of information technologies promote them without forethought, simply because it is in their interests to do so. No matter how well-intentioned technology companies and their leaders believe themselves to be, their judgments are deeply biased.

Technologies—especially information technologies—change who we are and how we relate to one another and the world. We are capable of thinking deeply about data when we develop the requisite skills, but we lose this capability when we allow computers to remove us from the loop of data sensemaking. The less we engage in deep thinking, the less we’re able to do it. So, we’re facing more than the problem that computers cannot fully reproduce the results of our brains; we’re also facing the forfeiture of these abilities if we cease to use them. By sacrificing these abilities, we would lose much that makes us human. We would devolve.

At any point in history, one question is always fundamental: “What are we going to do now?” We can’t change the past, but we must take the reins of the future. Among a host of useful actions, we must resist anyone who claims that their data sensemaking tools will do our thinking for us. They have their own interests in mind, not ours. Resistance isn’t futile; at least not yet.

Take care,


5 Comments on “Direct the Course of Evolution, or Perish”

By Dale Lehman. September 3rd, 2016 at 8:53 am

I believe you have touched on a critically important and under-appreciated subject. For example, last week’s Economist had a column on the effects of technology on trust which included the following tidbit:

“Historically, however, technology has done more to open up society than to segregate it. New technologies make it easier to trust unfamiliar groups. Public ratings, for instance, can undercut discrimination.”

There is much that can be said both in support of this statement (and others like it) and against it. But there is precious little serious research into how rapidly communication is changing – and little appreciation of how it is affecting our relationships with each other, with our planet, and with our own spiritual well-being.

I find it a bit hard to believe that trust has become more widespread when I look at how civil discourse is eroding, how respect is disappearing, and how violence is becoming acceptable. I do see how much we are willing to trust distant and unidentified commercial entities in our commercial relationships. But I don’t find that particularly encouraging – in fact, it can arguably be said that this serves large-scale financial interests without any basis in fact. My “trust” of software vendors, telecom providers, hotel chains, airlines, etc. is based on the fact that it is often easier for them to settle complaints than to engage in discussion and investigation. This is hardly the edifying type of personal trust that old-style commercial relationships used to entail. It is true that those were subject to personal biases and blatant discrimination, but there was still a human connection involved. I think that is why violence of the type shown in “To Kill a Mockingbird” is so much more disturbing than the typical violence portrayed in modern movies (and modern technologically mediated environments).

So, while much can and should be said (and debated) about the benefits and costs associated with information technologies, I think it is important to note how little these effects figure in public discussion. We are truly addicted to our devices and communication is rapidly becoming an “app” when it used to be a “telephone call” (how many people still know what that is?).

By Nate. September 6th, 2016 at 6:56 pm


To be fair, history isn’t exactly full of trust, civil discourse, and nonviolence. There’s a reason that, historically, family businesses were kept in the family. As to civil discourse… the 1800 election featured attacks by Thomas Jefferson’s supporters on John Adams: “a blind, bald, crippled, toothless man” with “a hideous hermaphroditical character, which has neither the force and firmness of a man, nor the gentleness and sensibility of a woman.” As to violence, where to even start?!

By Mike. October 13th, 2016 at 2:50 pm

Just curious… What prompted you to write this post?

By Mike. October 14th, 2016 at 9:20 am

To explain a little more, as I read the post, it seemed that there was some underlying frustration. I’m a BI developer who grew up as a marketing guy, and I find that most folks never really ask for what they want. They ask for what they think they can get or what they think you can do. I like to ask them a bunch of questions to get to that base need/want. I guess that’s what prompted my question to you.

By Stephen Few. October 14th, 2016 at 9:45 am


Let there be no doubt that I wrote this blog piece in response to frustration, but mostly because of significant concerns. The concerns that I’ve addressed go far beyond frustration with the general ignorance that prevents organizations from getting real value from data. While it is definitely true that people don’t know what questions to ask of their data, the problem is much deeper. By increasingly relying on technologies for answers to our questions, we are actually going backwards, withdrawing ever farther from skilled data sensemaking. By over-relying on machines to do our thinking for us, we will cease to think for ourselves. I don’t want to live in that world. What’s been going on in the current U.S. presidential campaign is a frightening wake-up call that our ability to think critically is in sad need of repair.
