Confessions of a Bipolar Technologist

Last June I celebrated my 62nd birthday. As I look back on my life, my early years seem like distant memories of a different age, yet the years also seem to have flown by in an instant. Our lives are brief when superimposed on history, but they can be rich if we find a way to contribute to it. I feel that my work in the field of data visualization has provided that opportunity, and I’m incredibly grateful.

I have worked as an information technologist for 33 years. Like many other thoughtful IT professionals, I have a love-hate relationship with technologies. My feelings about them range from ecstasy to depression and disgust. I love technologies that are useful and work well; I dislike the rest, which includes most of the IT products on the market.

We humans are distinguished from other species in part by our creation and use of tools (a.k.a., technologies). Our relationship with these technologies has changed considerably since the hunter-gatherer days, especially since the advent of the computer. The human condition is increasingly affected for both good and ill by our technologies. We need to evaluate them with increasing awareness and moral judgment. We need to invite them into our lives and the lives of our children with greater care.

In the early days of my IT career, I spent a decade working in the world of finance. I was employed by one of the financial institutions that later contributed to the meltdown of 2007 and 2008. In fact, if I’m not mistaken, my employer invented the infamous reverse-interest mortgage loan. I was a manager in the loan service department at a time when a large group of employees had the job of explaining to customers why their loan balances were increasing. Fortunately, I never had to answer those questions myself, which I would have found intolerable.

During those years, I remember learning about the famous 80/20 rule (a.k.a., the Pareto Principle), but what I learned at the time was a perversion of the principle that says a lot about the culture in which I worked. I was told that the 80/20 rule meant that we should only work to satisfy 80% of the need, for the remaining 20% wasn’t worth the effort. When we built IT systems, we attempted to address only 80% of what was needed with tools that worked only 80% of the time. Excellence was never the goal; we sought “good enough.” But good enough for what? For most technology companies, the answer is “good enough to maximize revenues for the next few quarters.” A product that is only 80% good or less can be camouflaged for a while by deceitful marketing. By the time customers discover the truth, it will be too late: their investment will have already been made and those who made it will never admit their error, lest they be held responsible.
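For contrast with that corporate reading, the actual Pareto Principle observes that a small share of causes typically accounts for most of the effects, which argues for focusing effort where it pays off most, not for stopping at 80%. Below is a minimal sketch of that idea; the module names and defect counts are invented purely for illustration and are not drawn from any real system.

```python
# Hypothetical illustration of the Pareto Principle (80/20 rule):
# a small share of causes accounts for most of the effects.
# The module names and defect counts are made up for illustration only.

defects_by_module = {
    "module_a": 420, "module_b": 310, "module_c": 95, "module_d": 60,
    "module_e": 40, "module_f": 30, "module_g": 20, "module_h": 15,
    "module_i": 6, "module_j": 4,
}

total_defects = sum(defects_by_module.values())
modules_by_defects = sorted(
    defects_by_module.items(), key=lambda item: item[1], reverse=True
)

running_total = 0
for rank, (module, count) in enumerate(modules_by_defects, start=1):
    running_total += count
    share_of_modules = rank / len(defects_by_module)
    share_of_defects = running_total / total_defects
    print(f"top {share_of_modules:.0%} of modules -> "
          f"{share_of_defects:.0%} of defects")

# With these made-up numbers, the top 20% of modules (2 of 10) account for
# about 73% of the defects. The principle describes where effort matters
# most; it is not a license to build products that are only 80% good.
```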

Traditional theories of economics assume rational behavior. A relatively recent newcomer, Behavioral Economics, has shown, however, that human economic behavior is often far from rational. The same can be said of the human production and use of technologies. When our progenitors became tool users and eventually tool creators, their tools arose from real needs, and for eons a tool rarely caught on unless it worked. This is no longer true, especially of information technologies. Much that we do with computers today did not emerge in response to real needs, is often misapplied in ways that produce little or no benefit, and far too often works poorly, if at all. This suggests that a new scientific discipline may be needed to study these technologies, to improve their usefulness, and to diminish their waste and harmful effects. I propose that we call this new field of study Itology (i.e., IT-ology, pronounced eye-tology). Its focus would be on the responsible creation and use of information technologies. Whether the name “Itology” is adopted doesn’t matter, but making this area of study integral to IT certainly does.

Take care,


8 Comments on “Confessions of a Bipolar Technologist”


By Marty Gierke. January 6th, 2017 at 5:48 am

On the one hand, technology has enabled more people to create stuff and fail faster, which many will argue is progress. Chin up, the next failure will probably be less bad, and it’s right around the corner. On the other hand, more people are creating more bad stuff that fails, forcing us to adapt to the least bad stuff we have to choose from. Good stuff is rare, and is too often replaced by a “new and improved” version of itself, which often fails. Permission to fail is all the rage in the workplace, as long as we “fail forward”. Unfortunately, we’re getting really good at it.

Oh, and FYI: On Friday, May 06, 2016, a U.S. federal trademark registration was filed for IT-OLOGY by IT-OLOGY FOUNDATION, COLUMBIA, SC 29201. The USPTO has given the IT-OLOGY trademark the serial number 87027268. The current federal status of this trademark filing is PUBLISHED FOR OPPOSITION. The correspondent listed for IT-OLOGY is MICHAEL A MANN of NEXSEN PRUET, LLC, 1230 MAIN STREET, 7TH FLOOR, COLUMBIA, SC 29201. The IT-OLOGY trademark is filed in the category of Advertising, Business & Retail Services. The description provided to the USPTO for IT-OLOGY is “Promoting public awareness of careers in information technology.”

By Dale Lehman. January 6th, 2017 at 7:40 am

Steve – you have captured very well my own feelings. I alternate between being an early adopter of technologies and being a Luddite. I used email, lecture capture, simulation software, etc. before most people knew these existed – but to this day, I still have a Blackberry (on which I almost never make phone calls and never have the ringer on), don’t stream or surf from my mobile device, find apps totally unappealing, etc.

I share your perception of the need, but perhaps disagree a bit about the diagnosis. The problem is not human irrationality but more the opposite. People are rational enough that it is good business practice to provide inferior products (the smoke detector with a 10-year life is but one example) and pricing confusing enough to lead people to buy things they don’t want (try any entertainment or communications service pricing). The only real protection consumers have is competition – and competition is rapidly disappearing, despite the regulatory misperception that it is rampant.

In a larger sense, I think I am agreeing with you. I view the economy as a technology (I was trained as an economist and the market system is a wondrous thing – much like any of the more traditional “technologies” you are writing about). This is a very powerful technology that we don’t understand well enough to manage responsibly. Until we do, I suspect that the more tangible “technologies” will continue to follow the paths that disturb you (and me). I fear that looking to the technology community to provide the revolutionary antidote to our economic and ethical defects is unrealistic and naive.

By Jason Bradfield. January 6th, 2017 at 11:38 am

I couldn’t agree more. Dale is right that the producers of this technology are generally behaving rationally – they’re maximizing their personal profits. However, the consumers of this technology are afflicted by numerous cognitive biases. Supposedly “smart” enterprise software purchasers are no better than the “average Joe” when it comes to making poor technology choices. In fact, they may be a good deal worse since the decision-makers are quite often not the actual users.

I think the incentive to “induce” demand through marketing and sales is a major contributor to this. If you examine the financial statements of a lot of enterprise software vendors you’ll see that Sales & Marketing spend is almost twice R&D spend. That should be a red flag as to where their priorities are. When you’re purchasing software you’re basically paying in part for being marketed to, not necessarily for technology that actually fills a real need.

The tools that sell are not necessarily the tools you need.

By Stephen Few. January 6th, 2017 at 1:43 pm

Marty — I’m not invested in the term “itology,” but if I choose to pursue it, a trademark for “IT-OLOGY,” with a hyphen, shouldn’t pose a conflict.

Dale — Please elaborate on your point that the problem is due to rationality rather than irrationality. I don’t follow.

By Dale Lehman. January 7th, 2017 at 6:30 am

Steve
It could be due to my training as an economist (something I cannot rid myself of, just like my NY accent despite not living in NY for 40 years now). I recognize there are many cognitive biases and find the field of behavioral economics rewarding – but I believe you usually get further by assuming that people’s behavior is rational – and in this case, I think that is mostly true.

Many people live month to month, paycheck to paycheck, can’t pay off their credit cards and pay 18% interest on their balance – yet many of these people have cell phones and unlimited data plans and pay hundreds of dollars each month for these. Is that irrational? Unfortunately, I don’t think it is. It represents their values and how short their time horizon is. The apps they put on these phones don’t make them smarter, don’t help them get better jobs (or even better loans) and often feed them fake news, inflammatory social media, etc. Still I don’t view that as irrational, unfortunate though it is.

The difference is that if you view it as irrationality then you may believe that more education will solve the problem. But I don’t think that is true. The reason why apps have become so popular is that they fulfill some kind of need – a need that is certainly cultivated by the technology firms and their marketers. Perhaps we end up with a form of addiction – and, in that way, I guess I am somewhat hedging on whether our behavior is rational or not. Either way, however, I don’t think the technology firms are irrational – it is very profitable to sell to addicts and that is what they are doing.

So, the important point you are making then becomes: how do we responsibly develop technologies when it is more profitable to cultivate addiction? You can ask that question about the development of visualization software, statistical software, news broadcasts, consumer products, or virtually anything we surround ourselves with. Whether or not our behavior is rational seems to me to be a red herring. Unfortunately, I have no solutions, but I don’t think that people can be convinced that their behavior is bad for them (my ultra-cynical side).

By Stephen Few. January 7th, 2017 at 10:58 am

Dale,

If the CEO of an IT company intends to make a fortune personally, it is not irrational for him to produce unneeded and poorly designed software if those products are perceived by consumers as desirable. He will make money in the short term. If that same CEO instead wishes to build a company that lasts or to provide consumers with useful products, producing unneeded and poorly designed software would then be irrational. Unfortunately, corporate culture today tends to focus only on profits and only on the next few quarters. Selfish corporate leaders take advantage of this. Apart from the selfish interests of a few individuals, producing unneeded and/or poorly designed software is irrational.

You said that apps have become popular because they “fulfill some kind of need.” This is true of good apps. Most, however, are popular because they “fulfill a desire.” There is an important and fundamental difference between need and desire. Consumer culture is the result of manufactured desire. Creation of demand is a central activity of marketing. We would be better off if we focused on actual needs rather than manufactured desires. Stripping away unnecessary desires is a rational activity. It requires a great deal of attention and intention, the very things that consumer culture strives to eliminate.

Unless you think only in the short term, which is the luxury of the few who can make their money and run, it is more profitable to produce technologies that are both needed and well designed. Can people be shown that irrational behavior does not serve their interests? They can, but it isn’t easy. People can learn. The challenge is getting people to pause and consider their true interests for a moment. If I didn’t believe this, I would find a nice retreat in the mountains of Tibet where I would live the rest of my life in solitude. Actually, that would be the most rational path if I cared only for myself, but a rationality that considers only one’s own interests is an anemic one.

By Dale Lehman. January 7th, 2017 at 11:49 am

Tibet it is, then. I agree with everything you have said, but I am not optimistic. I work in a business analytics program with a strong emphasis on ethical use of data. I believe in that and many students do as well. But the CEOs you are referring to are more likely to pay for need creation/addiction/satisfaction than for building long-term relationships that are mutually (and beyond) beneficial. That doesn’t stop me from advocating, discussing, and exploring the issues associated with “should we?” I am late enough in my career to refuse to compromise my principles – but that does not make me optimistic. Tibet is always an option and I may someday take it (though it might be Hawaii for me).

By Stephen Few. January 7th, 2017 at 12:51 pm

Dale,

The climate in Hawaii is indeed much better, but the tourism is ghastly. I’ll keep my eyes open for a tropical island that doesn’t appear on any maps.
