The Myth of Self-Service Analytics

Exploring and analyzing data is not at all like pumping your own gas. We should all be grateful that when gas stations made the transition from full service to self service many years ago, they did not relegate auto repair to the realm of self service as well. Pumping your own gas involves a simple procedure that requires little skill.

[Image: pumping gas]

Repairing a car, however, requires a great deal of skill and the right tools.

[Image: car repair]

The same is true of data exploration and analysis (i.e., data sensemaking).

Self service has become one of the most lucrative marketing campaigns of the last few years in the realms of business intelligence (BI) and analytics, second only to Big Data. Every vendor in the BI and analytics space makes this claim, perhaps without exception. Self-service data sensemaking, however, is an example of false advertising that’s producing a great deal of harm. How many bad decisions are being made based on specious analytical findings by unskilled people in organizations that accept the self-service myth? More bad decisions than good, I fear.

Describing analytics as “self service” suggests that it doesn’t require skill; rather, it suggests that the work can be done merely by knowing how to use the software tool that supports “self-service analytics.” Data sensemaking, however, is not something that tools can do for us. Computers are not sentient; they do not possess understanding. Tools can at best assist us by augmenting our thinking skills, if they’re well designed, but most of the so-called self-service BI and analytics tools are not well designed. These dysfunctional tools provide, at most, a dangerous illusion of understanding, not the basis on which good decisions can be made.

Some software vendors frame their products as self service out of ignorance: they don’t understand data sensemaking and therefore don’t understand that self service doesn’t apply. To them, data sensemaking really is like pumping your own gas. The few software vendors that understand data sensemaking frame their products as self service because the deceit produces sales. They don’t like to think of it as deceit, however, but merely as marketing, the realm in which anything goes.

How did it become acceptable for companies that support data sensemaking—the process of exploring and analyzing data to find and understand the truth—to promote their products with lies? Why would we ever put our trust in companies that disrespect the goal of data sensemaking—the truth—to this degree? Honest vendors would admit that their products, no matter how well designed, can only be used effectively by people who have developed analytical skills, and only to the degree that they’ve developed them. This shouldn’t be a difficult admission, but vendors lack the courage and integrity required to make it.

Some vendors take the self-service lie to an extreme, arguing that their tools take the human out of the loop of data sensemaking entirely. You simply connect their tool to a data set and then sit back and watch in amazement as it explores and analyzes the data at blinding speed, producing a simple and complete report of useful findings. At least one vendor of this ilk—BeyondCore—is being hailed as a visionary by Gartner. This is the antithesis of vision. No skilled data analyst would fall for this ruse, but skilled analysts, unfortunately, are not usually the folks involved in software purchase decisions.

Let’s be thankful that we can save a little money and time by pumping our own gas, but let’s not extend this self-service model to data sensemaking. Making sense of data requires skills. Anyone of reasonable intelligence who wishes can develop these skills, just as they develop all other skills, through study and deliberate practice. That’s how I did it, and these skills have been richly rewarding. The people and organizations who recognize self-service analytics for the absurd lie that it is and take the time to develop analytical skills will emerge as tomorrow’s analytical leaders.

Take care,


33 Comments on “The Myth of Self-Service Analytics”


By Jason Mack. August 17th, 2016 at 2:09 pm

I don’t think that ‘self-service’ implies no skills required. Rather, what we have seen is that all knowledge workers have had to become more data-literate to do their jobs.

Tools like Tableau and Alteryx have abstracted away the need to be skilled in coding or programming, but they still require the skill of understanding data.

Certainly agree with your assessment on vaporware like BeyondCore that promises to magically connect to everything and just give you results.

By Stephen Few. August 17th, 2016 at 2:21 pm

Jason,

If you read the claims of most vendors, you’ll see that “self service” does indeed imply “no skills required” other than “skills in using our software, which in fact requires almost no skill at all because our software is so user friendly.” When Tableau and Alteryx promote their tools as self service, are they careful to point out that their tools “still require the skill of understanding data”?

By Andy Cotgreave. August 17th, 2016 at 2:31 pm

Hi Steve
I totally agree about the fallacy of taking the human out of the loop. There’s a fantastic professor of AI in the UK called Maggie Boden who said on BBC’s Inside Science, “If you want more human intelligence… get more humans.”

http://www.bbc.co.uk/programmes/b04lpzyr

It’s a great interview with a very wise woman. One of the few AI professionals who aren’t predicting the AIpocalypse.

I’ve heard ONE “take the human out of the loop” story that made me stop and think, “Hmmm, maybe it could happen.” IBM Watson is used at Wimbledon to respond to each point with interesting anecdotes based on a player, which are fed to the commentators. For example, let’s say Andy Murray serves the 2nd fastest tournament ace. Apparently Watson would search the entire body of tennis scores for interesting related stats the commentators could use immediately. That struck me as a great case for replacing the human. If implemented correctly, the AI would be so much faster than a human. A commentator needs those facts instantly. If Watson does do it as IBM claimed, I take that as an example of the humans being effectively taken out of a data-related loop. It’s not quite data-sensemaking, more data-mining, but I think it’s an interesting story. I can’t find a link to verify the story. Whether it was a system that actually worked or not, I don’t know.

By Stephen Few. August 17th, 2016 at 2:43 pm

Andy,

Have you ever noticed the number of times that Google News pairs irrelevant photos with news articles? The algorithm generates some hilarious pairings. I don’t share your optimism about Watson’s ability to replace sports commentators. I don’t doubt that this could be done, only that it could be done well.

Have you really found that most AI professionals predict the AIpocalypse, by which I assume you mean disaster? Some do, but I have the impression that most are exceedingly optimistic.

By Andy Cotgreave. August 17th, 2016 at 3:00 pm

Let me rephrase the AI disaster!

Within AI, as far as my reading has taken me, you’re right: optimism and pragmatism.

The MEDIA likes to focus on those who are proclaiming that AI will spell doom. So we see “Bill Gates says AI might be the biggest threat to humanity” style headlines all the time. And the authors who profess doom get the most coverage in the media, too.

Also, I see headline grabs at conferences. AI is becoming the hot topic, and there’s no better way to pique interest than a keynote entitled “Nick Bostrom on the End of Humanity” http://www.ipexpoeurope.com/

So, yes, I agree. Most AI professionals are realists. The media loves to focus on the disaster side of things.

By Stephen Few. August 17th, 2016 at 3:07 pm

Andy,

Bill Gates is not an AI professional. The media has mostly featured the concerns of technologists who are not AI professionals, perhaps in part because few AI professionals are fully considering the potential harmful effects of AI. Like Gates, I too am concerned about AI. Not nearly enough thinking about its implications is being done by those who are promoting it.

By Benjamin. August 18th, 2016 at 5:30 am

I totally agree that Self Service BI is a huge marketing campaign, and quite a dangerous one where good decision making is concerned. Unfortunately I do not agree with the core of the article: in my opinion SSBI tools are the new spreadsheets.

In the past (the recent past, let’s say) decision makers used spreadsheets, with quite poor visual capabilities, as a key part of their own Decision Support Systems. Now it’s time for SSBI tools to fill that role.

By Benoit Bernard. August 18th, 2016 at 7:30 am

Hey Stephen,

You know what I find just as deceitful as self-service BI solutions? It’s when they claim that they “can connect to any data source.” And then, when you browse through their website, you realize that they support only a limited number of data sources, or that they provide some kind of SDK so you can develop your own connector. Personally, I find it profoundly misleading. As a programmer myself, I would not be fooled, but I guess that for an executive with no technical background, it would be easy to think: “Cool, I will be able to connect to any data source. No programmer needed!”

By Stephen Few. August 18th, 2016 at 7:43 am

Benjamin,

The core of my argument is that data sensemaking requires skills and that no tool eliminates the need for these skills. Are you saying that so-called self-service BI (SSBI) tools eliminate the need for data sensemaking skills or rather that it’s now time for data-based decisions to be made poorly with SSBI tools rather than spreadsheets?

By Benjamin. August 18th, 2016 at 10:35 am

Stephen,

Your question made me laugh! (In a good way, of course). I do believe that now is the time for data-based decisions to be made with SSBI tools rather than spreadsheets and I do not think that additional skills are required.

Let me say that I am focusing entirely on the end part of a BI process, with an SSBI tool like MicroStrategy Visual Insights where you just need to drag and drop objects to get a somewhat fancy visualization. Most SSBI tools also offer data digesting capabilities, but I believe that is a different issue, where in most cases additional skills are required.

I may be wrong, of course, but I have worked in a few different industries and I have always noticed that decision-makers are the ideal people to make sense of the data. They are the people who used spreadsheets before, and they are the right audience for SSBI tools.

By Stephen Few. August 18th, 2016 at 10:49 am

Benjamin,

I appreciate your reply. It vividly demonstrates my contention that people believe the self-service BI/analytics claim that, with the right tools, data sensemaking can be done without developing analytical skills. I contend that this claim is a lie that is quite harmful. Decision makers who rely on spreadsheets to make sense of data without first developing the requisite analytical skills cannot acquire the understanding that is needed for good decisions. Self-service BI tools do not eliminate the need for analytical skills. Assuming that they do is a grave and costly error.

If you’re inclined to discuss this further, I’m interested in knowing which self-service BI tool you feel can be used for data sensemaking without first developing analytical skills.

By Benjamin. August 18th, 2016 at 1:03 pm

Stephen,

Thank you for your reply. To better understand your position, could you please tell me what you understand by Analytical Skills? What would be an example of a person without them?

I may have expressed myself poorly, but of course I believe that Analytical Skills are needed in order to make sense of data. They are needed to interpret a report or dashboard, and they are needed to make one.

I see a use for SSBI tools as they allow companies to move a bit faster, and I do believe that with the right Data Governance, risks can be minimized.

By Stephen Few. August 18th, 2016 at 2:20 pm

Benjamin,

Now we’re getting somewhere. By data sensemaking or analytical skills, I mean the following:

1) A basic understanding of statistics
2) Knowledge of the data domain (e.g., if you’re analyzing sales, you must understand the sales process and the data associated with it)
3) Data visualization skills (e.g., how to visualize data for various purposes; how to avoid mistakes in graph design that produce misleading representations of the data)
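
As a quick sketch of that third skill (a toy example in Python with matplotlib, using made-up numbers purely for illustration), the same three values can look wildly different depending on one design choice: whether the bar graph’s quantitative scale starts at zero.

    import matplotlib.pyplot as plt

    # Hypothetical monthly sales figures; the real differences are small (about 4%).
    months = ["Jan", "Feb", "Mar"]
    sales = [980, 1000, 1020]

    fig, (honest, misleading) = plt.subplots(1, 2, figsize=(8, 3))

    # Honest version: bars encode values by their lengths, so the scale must start at zero.
    honest.bar(months, sales)
    honest.set_ylim(0, 1100)
    honest.set_title("Baseline at zero: modest growth")

    # Misleading version: truncating the scale exaggerates the differences.
    misleading.bar(months, sales)
    misleading.set_ylim(950, 1030)
    misleading.set_title("Truncated baseline: 'dramatic' growth")

    plt.tight_layout()
    plt.show()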

How many of the people who use so-called self-service BI tools have developed these skills? Learning to use the tool is not the same as developing these skills. How many self-service BI vendors make it clear to people that these skills must be developed?

My point is that a good data analysis tool can definitely “allow companies to move a bit faster” by assisting skilled data analysts, but it cannot eliminate the need for data sensemaking skills.

By Jimmy Shen. August 20th, 2016 at 3:16 pm

Stephen,

This is a great article about self-service. I’ve worked with two companies to essentially start a business intelligence department from scratch. This means working with somewhat rudimentary IT departments. I would like to give a generalized view of my experience and my thoughts about self-service software.

I believe the problem is not with the software companies, but with the decision makers. Established companies that don’t have a data department at this point have a business model that doesn’t rely on technology. The decision makers who are involved in purchasing software tend to be business people who have no technology background, or people who worked in technology before game changers like cloud computing and analytics software. This leads to decisions based either on what other people in the industry bought or on flashy advertising in Forbes or CIO magazine.

What industry people don’t talk about, or don’t even know, is that their data is a mess that requires architecture, a validation system, and constant maintenance of the originating systems.

This is absolutely positively against the rules of industry talk and will not win you customers because they won’t understand it or like to think about it.

However, getting them to purchase software like Tableau is better than nothing, because eventually, when they see that their numbers are not consistent and that their reports can’t be accessed, it gives one an opportunity to talk about building views, upgrading servers, and possibly getting a reporting environment.

I would definitely love your thoughts.

Jimmy

By Stephen Few. August 20th, 2016 at 3:33 pm

Jimmy,

Although our tools for accessing and analyzing data have always been lacking, they have always been able to reveal problems with our data. More recent tools such as Tableau have not suddenly revealed this to us. Excusing tool vendors for false advertising doesn’t benefit anyone except the vendors. Let’s hold them responsible for telling the truth. Even the vendors will benefit from telling the truth in the long run.

By Bill Droogendyk. August 22nd, 2016 at 9:46 am

Re: data sensemaking or analytical skills.

Stephen, your definition is right on!

In my efforts to help quite a number of people do a better job of data visualization, I’ve often found that they cannot make a (good) picture of the data because they don’t understand the story that’s in the data. That story comes out when one understands the data (knowledge of the data domain) and knows how to use the power of simple charts (distributions, scatter plots, and trend charts). Is the story significant? Basic statistical knowledge (especially statistical process control chart thinking) is required to separate real change from normal variation. Higher or lower than yesterday, last week, last month, or last year does not necessarily indicate change.

“Two points is not a big enough sample to determine if there’s a change” – Stacey Barr
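
To make that concrete, here is a minimal sketch of the control-chart idea (in Python, with made-up numbers, and simplified to plain 3-sigma limits rather than a proper XmR chart): compute limits from a baseline period and treat only the points that fall outside them as likely real change.

    import statistics

    # Hypothetical weekly call counts: a baseline period followed by new observations.
    baseline = [102, 98, 105, 97, 101, 99, 104, 100, 96, 103]
    new_values = [107, 94, 126, 101]  # 126 should stand out; the rest are routine variation

    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    upper, lower = mean + 3 * sd, mean - 3 * sd  # conventional 3-sigma control limits

    for week, value in enumerate(new_values, start=1):
        if value > upper or value < lower:
            print(f"Week {week}: {value} falls outside [{lower:.1f}, {upper:.1f}] -- likely a real change")
        else:
            print(f"Week {week}: {value} is within normal variation")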

By Nate. August 22nd, 2016 at 12:55 pm

Too true. Every time I hear “self service” come up, I know it will be a disaster. I don’t want to sound like a pessimist here… but here we go anyway.

In many business departments, ‘self service’ is already done, every day, by asking the IT department “Where is the Export to Excel button?” The end users may not be analysts, but they seem to have an instinct for their data (they fit #2 on your list of skills). What they often cannot do (or will not, depending on the political optics of the situation) is the legwork to figure out why the data looks the way it does, what it might look like if they change process X or Y, etc.

To be honest, I am convinced that these skills are rare, and will always be rare, because they are not all teachable skills. Being able to make intuitive leaps is a uniquely human attribute, and one that not all humans are very good at in all areas.

An example: I worked with someone once who had a lot of medical problems. He went to many, many doctors for various things, and finally, while he was being treated for vertigo (of all things), a doctor walked into the room, took one look at him, and said “hyperthyroidism.” All of his other doctors, with all of their medical training, had missed it.

If the medical profession, which employs some of the smartest, most talented, and hardest-working individuals in our society, has this much trouble, can we expect to ever get enough of the folks who can make these intuitive leaps in our businesses?

By Stephen Few. August 22nd, 2016 at 1:09 pm

Nate,

Although it might very well be true that data sensemaking skills “are rare, and will always be rare,” they are all teachable skills. Intuition is the result of deep learning. Anyone who takes the time to learn data sensemaking skills, and to do so deeply, will develop the kind of intuition that allows them, on occasion, to instantly spot patterns in data that are invisible to others.

By jlbriggs. August 23rd, 2016 at 9:42 am

The problem that I see all too often is that the people with the “knowledge of the data domain” in a management role tend to assume that they either have, or do not need, the understanding of statistics and data visualization.

I have to assume that this is a very common version of this problem.

This has very often led to 1) a lot of very bad charts, and very bad explanations of the data, by people in such roles, and 2) a lot of time spent explaining things like basic statistics (why did you use the monthly average for that, and not just the month-end number?) and basic statistical charts (what does the y axis on the histogram measure?).

Seems like a simple problem to solve (teach people basic statistics and data visualization skills). But the people who don’t think they need to know such things also tend to be the people who would have to make the decision to have people learn such things…

So we remain stuck with the idea that Data Analysis means “I made a table, and sorted it!”, or, for the adventurous, “I made a table, filtered it, *then* sorted it!”

By Stephen Few. August 24th, 2016 at 8:18 am

I received an email today that relates to this discussion. A reader mentioned that the software vendor Qlik is attempting to incorporate natural language processing (NLP) into their software. Several vendors are flirting with NLP to support their “self-service BI/analytics” ventures. As I see it, NLP doesn’t have a useful role in data sensemaking. Using words as an interface for input is much less efficient than using other means, and using words as an interface for output is based on a misunderstanding. Here’s an excerpt from a Qlik blog on the topic:

As we know, data visualization is a powerful medium for both revealing and communicating data facts. Unfortunately it relies heavily on the viewers ability to read, process and understand what they are looking at. Much of the problem with the lagging adoption of analytic practices is that the tools and data artifacts are rarely designed to help people in that endeavor. At best it’s designed for the trained analyst, which is great, if you’re a trained analyst. But data-driven decisions need to be made by a wide variety of people with varying skills. For ‘self-service’ BI to work it needs to be accessible to the many, not just the few. That’s where artificial intelligence and natural language generation (NLG) can help out. NLG in this scenario, is the ability to automatically communicate analytical observations in clear, understandable language.

Data visualization is useful because some information is best presented visually, not in words or numbers. Attempting to translate into words what is shown in a data visualization won’t work, even if you can write software that is smart enough to figure out everything useful in the data visualization, which is doubtful. The interest of BI/analytical vendors in NLP is misplaced. It’s part of their naive belief in “self-service BI/analytics.” People cannot make sense of data without developing analytical skills. Skilled humans cannot be removed from this process.

By David. August 24th, 2016 at 3:36 pm

I see a lot of misleading examples of SSBI out there. Tableau has been circulating one about an ED physician who did some after-hours analysis and discovered 100M+ in revenue leakage. That’s nice, but I can also tell you a story about an ED physician at Mayo Clinic who wrote his own ED patient management dashboard that was ultimately deployed and used for a long time. The former doesn’t provide evidence that self-service BI is useful any more than the latter suggests that self-service computer programming will lead to better electronic health records.

Let’s not even ask Tableau to provide an accounting of all the bad decisions that leaders have made after being misled by data they visualized on Tableau without the training necessary to vet it properly and understand it fully.

Another example from Tableau describes “How UMHS Saved Millions and Won an Award with Self-Service Analytics.” However, the actual story describes self-service “analytics” only in passing, and really focuses on the work of a dedicated analytics team that quite effectively used Tableau.

The UMHS story is definitely a success story; it’s just not a self-service analytics success story.

Steve, do you know of any literature that tests the thoroughness of understanding or quality of insight gleaned by lay users of self-service BI tools versus trained analysts using the same?

By Stephen Few. August 24th, 2016 at 4:58 pm

David,

I can say with a fair amount of certainty that no research study has ever compared the “thoroughness of understanding or quality of insight gleaned by lay users of self-service BI tools versus trained analysts using the same.”

By G. August 24th, 2016 at 5:48 pm

True.
I tried Microsoft’s BI and, while I got some sense that this was good, a part of me still kept thinking “this can’t show any particular actionable insight; this won’t show me something that I could only find by diving into the data.”

By Robert. August 24th, 2016 at 8:35 pm

Anecdotal evidence in support of your point of view, Steve. For 14 years, I was a technical writer in a Fortune 100 company producing enterprise software. Because customers always call in due to usability problems and confusing or incorrect documentation, the head of the usability/documentation org that I was in put together a working group to look at the data gathered from customer calls, assess how many calls were due to usability problems and/or documentation, fix the problems, and then look at the subsequent data to see whether calls for those problems dropped. I was selected to be in that group. As with every other member (also pulled from the ranks of writers and designers), I had virtually no knowledge of stats and absolutely no experience in data analysis.

In my initial enthusiasm for being part of this high-profile project, I bought a personal license for Tableau, imported data via CSV files, and started slicing and dicing to find insights. After playing at this for a week, I noticed many inconsistent results. So, I looked at the data and found it to be very dirty and inconsistent. I talked to the support people who entered the data for every call and found — surprise — that they entered data in idiosyncratic ways. I found that interpretations of problems by support people and customers could vary tremendously. Moreover, I studied the rudiments of stats and discovered that, even if we had perfect data, there was no way for us to validate the efforts to reduce calls for problem types by simply looking at subsequent data. Think of all of the work it would take to tease out the consequences of release cycles, seasonal fluctuation, IT trends, and more.

Nobody else on the working group cared about these problems or understood them. They simply did what they were mandated to do, using the company’s own internal (and inferior) analysis tools. Every month, we had to give presentations on the progress that each of us was making in reducing calls in the product areas we represented. And every month the other members confidently claimed knowledge of what the problems were and showed time-series charts to demonstrate that the calls for those problems were dropping. The head of our org listened, gave praise, and passed the news up to her superiors to help justify funding for the organization. Apparently, they never asked questions, either.

Again, this was in a long-lived Fortune 100 corporation. I’m sure we weren’t the only unqualified org working on “data analysis” to gather “actionable insights” into how we could improve how we did things.

We’d probably have done better and saved much time and money by just calling some of the customers who complained and finding out directly what had led them to complain. However, doing that doesn’t let you pretend to yourself and others that you’re smarter than you are. Essentially, the problem is the Dunning-Kruger effect fouling the realm of data analysis, as it does every realm of human endeavour that I can think of.

By G. August 25th, 2016 at 6:48 am

Ah, I also remembered talking with a professor years ago: I explained an odd result I had found while exploring some data I had collected, and his eyes lit up! He had the answer, and then MY eyes lit up! lol.
I still remember his words to this day. That kind of exchange will be impossible with current BI.

By Matt Langeman. August 25th, 2016 at 8:40 am

I see some similarity here with the “ad hoc reporting” projects I was involved with, which always seemed to end poorly. Those projects seemed to fail for two reasons:
1. The tools were either not good enough, or too expensive
2. The clients didn’t actually have the analytical skills required

The goal was to eliminate the bottleneck of IT and allow others in the organization (typically business analysts) direct access to the data and tools for analysis. It is a worthy goal.

“Self service analytics” seems to have a similar goal, but problems occur when you push the job of analytics to those who do not have the analytical skills.

I do think tools have improved tremendously. Yes, marketers hype and promote their products in misleading ways. It is unfortunate and they should not be given a pass. However, I also agree with Jimmy that organizational decision makers have to take some responsibility as well. They need to have some understanding of the skills needed for good analytics.

Previously there were more technical and usability barriers to using analytical tools. As these are removed, more people are able to use them, sometimes before they actually have the analytical skills required to use them well. While this will result in many poor decisions, hopefully it will also lead more people to learn the required skills.

By Nate. August 26th, 2016 at 10:35 am

Matt,

Just one comment – People have been making poor decisions with the analytical tools “Microsoft Excel” and “Lotus 1-2-3” since they were invented.

By Matt Langeman. August 29th, 2016 at 7:35 am

Nate, I totally agree. While I don’t remember the early days of Lotus 1-2-3 and Excel, I’m thinking my last two paragraphs could apply to that time as well.

You’ve indicated some pessimism regarding the ability to teach the skills needed for data sensemaking. I think there are scenarios that are complex and require high-level skills and specific past experience. However, there are many more scenarios that would greatly benefit from people having basic to intermediate skills.

By Robert Rouse. August 30th, 2016 at 10:50 am

Steve,

I mostly agree with your assessment of tools that try to do the thinking for you in a “black box” way. But, I define the term “self-service” more broadly. In a previous role as an engineer at a private utility company, I needed to analyze business data. I had training in statistics, knowledge of writing SQL queries, and was rapidly developing visualization skills. But, I couldn’t do that job effectively because I couldn’t get the data I needed. That was left up to a specialized group who had zero domain knowledge of the business problems we faced. The “experts” resisting self-service analytics had sysadmin and DBA expertise but nothing else.

I pushed against this common organizational philosophy and championed Tableau internally. I felt that people like me with domain knowledge and analytical expertise ought to be allowed to do the analysis. For me, that’s what self-service means: getting and analyzing the data myself instead of leaving it up to people who lacked appropriate knowledge. I’ve moved on from that and now help both IT and business experts as a consultant.

What are your thoughts on this definition of self-service as a way to overcome organizational barriers? Do you see value in opening up data to business domain experts with appropriate skills?

By Stephen Few. August 30th, 2016 at 1:19 pm

Robert,

The problem that you described is an old one. I faced it 30 years ago. At that time I helped to establish small analytics groups within business departments, positioned between business users and IT. Essentially, analytics must be handled by people who both possess the necessary analytical skills and understand the data domain, which includes understanding the business domain. Even though this challenge has existed for a long time, we’ve made little progress. Part of the reason for this is the misplaced notion that technologies eliminate the need for skills and knowledge, which is the basic message of “self-service analytics” as it is usually promoted.

So, I absolutely can empathize with your situation, and I very much believe that people who understand the data domain can develop the analytical skills that are necessary, but this rarely happens. Instead, organizations purchase software, such as Tableau, and expect instant data analysts. Most of the analytical work that is needed in organizations can be done with a relatively simple set of skills, empowered by good tools. Most of my work focuses on teaching these basic skills rather than the more advanced skills that are sometimes needed. The problem that we face is not a lack of resources; there are many resources like me out there helping people develop these skills, yet few people seek them out, because there is little awareness that skills beyond learning a particular software product are needed.

By Raj. September 19th, 2016 at 7:55 am

Stephen:

I do agree with you that the skill of a human, combined with technology, maximizes smart decision making. However, for well-defined actions and data sets, the need for a human becomes irrelevant if the system is designed properly (by humans, with the aid of technology). As a business intelligence analyst working on building self-driving cars, we have a well-defined goal (much more complex than a self-service gas station). With a well-defined goal and a finite set of data parameters that are used in driving a car, the need for a human driver will be eliminated.

Similarly, for problems with well-defined data sets and resulting actions, the human touch can be removed if, and it’s a big if, the design is done right.

But for most (if not all) open-ended analysis, the tools are just a means to an end, and self-service BI is a scam.

By D. October 21st, 2016 at 6:25 am

Robert’s sage commentary on the prevalence of inappropriate bias from unskilled precincts is also well represented in medical care services. In that field, the result of the Dunning-Kruger effect is often bodily harm. Thank you for writing your story, Robert… and thank you for this forum (and, of course, your work), Steve.

By Laszlo. December 8th, 2016 at 8:46 am

Great Article!
I see “Self Service Analytics” claims from almost all BI vendors but never actually see any implemented solutions. These always end up with a tech team creating reports for the business.
I have only seen one tool that I would call self-service reporting (not analytics): SAP Spend Performance Management (SPM). It is being phased out by SAP, and Lumira will be the new reporting solution. SPM is basically a web-based pivot table, connected to an SAP data warehouse. It’s very easy to use; however, users still need a minimal technical skill (they can be trained in 30 minutes).
Understanding the data is a must; it’s the first step, no matter what.
The business users already know what they want to see; they just select the right dimensions and measures and click “run report.” No additional software, no charting, no extra and complex functions.
As a business user, this is what I call self-service.
But imagine an average business user opening a tool like Tableau, Spotfire, or Lumira (and the list goes on…). Can you imagine a sales manager starting to code in R? A business user building Knime workflows from BEx queries?
No, a business user wants to click 3-8 times and see the figures. Let’s face it: most of the time these reports end up in Excel, and the users do some additional work on them, due to the current processes and state of the business, and incorporate their own exceptions.

In my opinion, the old-school Excel pivot table is still the cornerstone of self-service analytics.
The best measure of a self-service tool would be this: the less time the user has to spend in the tool, the better.
