Data Visualization and the Blind
Recently, I received an email from a fellow named Mark Ostroff who has written a guide to designing “accessible” content using the Oracle Business Intelligence Suite (OBIEE). In particular, the guide addresses issues regarding impaired vision, such as colorblindness and total blindness. Despite the fact that Mark began by saying that he and I “could be ‘twins separated at birth’ in our orientation about business intelligence,” by the second email in our conversation it became clear that he had a bone to pick. He accused me of shirking my responsibility by not teaching people to design information displays in ways that are accessible to the blind—dashboards in particular. Actually, his accusation was a bit harsher. He suggested that, by failing to teach people to design dashboards in ways that were accessible to the blind, I was encouraging my clients to break the law. Mark’s bold accusation prompted me to write about this issue.
I’ll begin by stating my fundamental position: a dashboard that is accessible to the blind is a contradiction in terms. “A dashboard is a visual display of the most important information needed to achieve one or more objectives, consolidated and arranged on a single screen so the information can be monitored at a glance” (Few, 2005). No form of data visualization, not just a dashboard jam-packed with graphics, can be made fully accessible to someone who is blind. I am not insensitive to the needs of people who are visually or otherwise impaired. I am merely pointing out what anyone who understands data visualization knows: no channel of perception other than vision can fully duplicate the contents of graphs. Similarly, what someone can communicate through the audio channel in the form of music cannot be fully expressed visually. If it could, why bother performing or recording music? Why not just distribute the written score? Vision is unique in its abilities to inform and enable thinking. Those who lack vision can develop their other senses to compensate to an amazing degree, but never in a way that fully duplicates the visual experience.
The information that is displayed in a dashboard can and should be presented to people who are blind in a different form when needed. Despite Mark’s bold challenge, current laws regarding accessibility require some organizations—mostly government—to provide the information contained in something like a dashboard in a way that is accessible to the blind, not necessarily to make the dashboard itself accessible. Unfortunately, an alternative form of presentation will not convey all of the information contained in a well-designed dashboard and it won’t communicate the information as efficiently, but if someone who is blind needs the information, it behooves us to provide a reasonable, even if imperfect, alternative. The alternative, however, will not be a dashboard. By definition, a dashboard is a visual display, because the visual channel provides the richest and most efficient means of presenting information for monitoring purposes, which no other channel can match—not even close. If airlines were required by law to provide flight-phobic customers with an earthbound form of transportation, that alternative would be called a train or a bus, not an airplane. In like manner, a means of monitoring that uses braille or a screen reader as its medium should not be called a dashboard. There’s enough confusion about the term already. Let’s not muddy it further.
When quantitative information is presented graphically, it offers the following advantages over written or spoken words and numbers:
- Patterns in the values are revealed
- Series of values (e.g., 12 months’ worth of sales revenues) are chunked together into visual objects (e.g., a line in a line graph), which makes it possible for us to see the entire series at once and compare it to other entire series of values, thus augmenting the capacity of working memory
- Much more information can be presented in the limited space that’s available on the page or screen
- The visual cortex processes the graphical information in parallel and more efficiently than the slower, sequential process that’s required for language processing
Data visualization is not only useful, it is finally being recognized as essential. It’s hard to imagine how any other channel of perception will ever be able to provide viable alternatives for these advantages of vision. It certainly isn’t possible to come close to doing this now.
I support the Americans with Disabilities Act (ADA). The ADA became law to prevent discrimination against people with disabilities. It does not, however, heal disabilities. It cannot give sight to the blind. It can require that organizations remove roadblocks to equal rights for those with disabilities and accommodate them in reasonable ways, but it should never try to equalize the playing field between those with sight and those without by forcing those with sight to wear blindfolds. Unfortunately, some efforts to expand accessibility venture into this territory, and I find that intolerable.
Mark seems to believe that all dashboards should be designed so that every bit of information is accessible to a screen reader to accommodate the needs of those without sight. To do this, a great deal of information would have to be added to dashboards and much of it would have to be expressed in inferior ways to make the contents of a dashboard accessible to a screen reader. Despite Mark’s good intention, this would result in dashboards unworthy of the name. The experience of those with sight would be unnecessarily compromised to a costly degree. I say unnecessarily, because the needs of the blind would be better served by a separate display that was designed specifically for a screen reader without compromising the design of the original dashboard. This approach, rather than the one that Mark advocates, would require less time, effort, and cost. We should approach accessibility intelligently. What might work for a general purpose website might not work for a dashboard. One size definitely does not fit all.
It was hard for me to imagine what Mark had in mind as an accessible dashboard, so I downloaded his guide to take a look. I quickly learned that his idea of a dashboard is quite different from anything that I would qualify as such. Here’s an illustration from the guide:
![Oracle Dashboard](http://www.perceptualedge.com/blog/wp-content/uploads/2013/09/oracle-dashboard.png)
What he calls a dashboard looks a lot like an online report with a couple of tables on it. A few graphs do appear in the guide, and Mark suggests that they should be made accessible to those who are colorblind in the following manner:
![Oracle Dashboard Graph](http://www.perceptualedge.com/blog/wp-content/uploads/2013/09/oracle-dashboard-graph.png)
That’s right—according to the guide, crosshatching should be used in addition to colors. Crosshatching can create an annoying shimmering effect known as moiré vibration. This affects people who are colorblind as much as anyone. What this recommendation fails to take into account is the fact that people who are colorblind can see color (except for extremely rare cases of complete color blindness), they just can’t discriminate particular colors, primarily red and green. Avoiding combinations of colors that those who are colorblind cannot discriminate solves the problem without resorting to the scourge of crosshatching.
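The principle here, choosing palettes whose members remain distinguishable even when red and green collapse into one another, can be checked programmatically. Below is a minimal, hypothetical sketch (the `confusable_for_deuteranopia` helper and its distance threshold are my own illustration, not anything from the Oracle guide): it crudely simulates red-green deficiency by merging the red and green channels and then measures whether two colors still differ.

```python
def confusable_for_deuteranopia(color1, color2, threshold=60):
    """Rough heuristic: collapse the red-green axis (as red-green
    colorblindness roughly does) and check whether two RGB colors
    remain distinguishable by lightness and blue content alone."""
    def collapse(rgb):
        r, g, b = rgb
        rg = (r + g) / 2  # red and green become indistinguishable
        return (rg, rg, b)

    a, b = collapse(color1), collapse(color2)
    distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return distance < threshold

# Pure red vs. pure green collapse to the same color: confusable.
print(confusable_for_deuteranopia((255, 0, 0), (0, 255, 0)))      # True
# Blue vs. orange differ strongly even after collapsing: safe.
print(confusable_for_deuteranopia((0, 114, 178), (230, 159, 0)))  # False
```

A real implementation would use a proper color-appearance model, but even this crude check captures the point: the fix is choosing discriminable colors, not overlaying crosshatching.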
Despite a search, I failed to find anything in the accessibility guide that explained how information contained in graphs (i.e., images) and thus inaccessible to screen readers could be communicated to those without sight. Text descriptions can be attached to a graph that can be accessed by screen readers, but those descriptions would not contain any information about the values in the graph. Apparently, a dashboard that is accessible to the blind would need to eliminate graphics altogether. As I said before, the result would not be a dashboard. When accessibility to information in dashboards is needed by those who are blind, it currently works best to give them an alternative that displays text and tables of values formatted for easy accessibility by screen readers. A table would fail to convey some of the information, such as patterns of change and the means of comparing entire series of values, but no automated presentation of the data that isn’t visual could achieve that. At best, someone could write a description of the patterns and summarize the story contained in the graph with words, but that would require human intervention, which cannot be automated—at least not yet.
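As a sketch of what such a table-based alternative might look like, the following hypothetical helper (my own illustration, not an OBIEE feature) emits an HTML table with the two structural cues screen readers rely on most: a caption announcing the content and column headers marked with `scope` attributes so each cell is announced with its column label.

```python
def accessible_table(caption, headers, rows):
    """Emit a minimal HTML table structured for screen readers:
    a <caption> announcing the content and <th scope="col"> headers
    so each data cell is read together with its column label."""
    lines = ['<table>', f'  <caption>{caption}</caption>', '  <tr>']
    lines += [f'    <th scope="col">{h}</th>' for h in headers]
    lines.append('  </tr>')
    for row in rows:
        lines.append('  <tr>')
        lines += [f'    <td>{cell}</td>' for cell in row]
        lines.append('  </tr>')
    lines.append('</table>')
    return '\n'.join(lines)

html = accessible_table(
    'Monthly sales revenue (USD)',
    ['Month', 'Revenue'],
    [('Jan', '98,300'), ('Feb', '101,750')],
)
print(html)
```

The structure, not the styling, is what matters here: a screen reader can navigate such a table cell by cell and always announce the context of each value.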
We should be concerned about accessibility to information, not only for those with disabilities. Good design makes information accessible. It is a sad fact of life, however, that everything cannot be made equally accessible to everyone. People differ in ability and experience. Accessibility is achieved by understanding these differences and designing communications in a way that takes them into account. Accessibility is not achieved by slighting one audience in an attempt to meet the needs of another. So far, the business intelligence (BI) industry in general has not taken even the shared needs of humans into account, let alone the unique needs of particular groups. I’m not surprised that Oracle’s attempt to accommodate the needs of the visually impaired fails to exhibit thoughtful design. Oracle’s approach to accessibility so far is simpleminded, and certainly is not worthy of the name “business intelligence.”
Take care,
30 Comments on “Data Visualization and the Blind”
I’m so sorry that you had to take the time to address Mark’s correspondence(s).
Unfortunately, some people’s passions blind them (yes, pun intended) to the realities and associated implications of a course of action they feel will rectify inequalities, be they physical disabilities, economic, or other.
I personally don’t have the scientific background to know how the eye and brain process everything but as a Usability Architect I have some basic understanding of this from reading research done by folks in the usability realm…and everything you said makes total sense.
And, I’m glad you made clear that the accessibility features that can help the blind, while worthy, do not provide dashboards. Hopefully Mark will get that distinction since it was lost on him from the start.
Stephen,
As I just posted to you in our private email, my intention was to ask for ideas about how the information concepts you so aptly discuss in your books could be applied to designs for users with disabilities. It seems that you dont think they can be, but it would be nice to hear of any ideas that you think COULD be applied.
It is unfortunate that in this blog posting, you took an Oracle whitepaper I had written, intended solely for a different purpose, and assumed it was meant to be a discussion of how to effectively design dashboards. It was not. It was meant to show how OBIEE designs needed to be adjusted to meet the legal requirements of the published VPAT document found on Section508.gov. The screen shot you reference in your blog was intended NOT as a design guide, but rather as an identification of the names of the various screen areas that would be discussed in the document.
And the cross-hatching example was an illustration of one method for meeting the legal requirement that “information not be conveyed by color alone”.
Mark,
If you’d like, I’ll gladly share everything that you wrote to me privately. As you know, it was only after I pointed out your false statements and rude accusations that you backed off and said that you were merely looking for advice.
In your document “A Guide to Accessible Content for Oracle Business Intelligence Suite 11g” (2012), you recommend that dashboards be designed such that they can be read by screen readers and thus made accessible to the blind. At no time did you mention that in doing this, the dashboard for those with sight will be reduced to something ineffective and unworthy of the name. As I wrote in my blog article, you should have encouraged readers to create a separate display for access by screen readers rather than altering the design of the dashboard itself. Solving the problem in this manner would give the information contained in the dashboard to everyone in the best manner possible. Even though a text-based version cannot come close to conveying all of the information contained in a dashboard (i.e., a well-designed visual display), it could be structured specifically for ease of use by screen readers.
Regarding the screen shot that I showed, it supposedly labeled the sections of a dashboard, but what it displayed was in fact not a dashboard. That’s a bit confusing, isn’t it? The use of crosshatching in graphs to solve the problem of colorblindness is never a good solution. You should not have recommended it.
As a BI software vendor, Oracle is responsible for providing products that can be used to explore, analyze, and communicate information as effectively as possible. As the writer of a guide for using Oracle’s BI products, you share in this responsibility. What you have provided, however, will undermine the usefulness of dashboards. I’m confident that this was not your intention. You now have an opportunity to fix the problem by revising the guide.
Stephen,
You are correct. The whitepaper was never intended to undermine the effectiveness of dashboards as a general communication medium. And it was not intended as a guide to designing effective dashboards. It was designed to document the accessibility feature set of the specific product itself, as required by Section 508 of the U.S. Rehabilitation Act.
And as the Oracle product evolves, I do intend to update the guide. Given that opportunity, I would welcome any design ideas that you think COULD be applied to displays intended for accessibility needs users to convey at least SOME of the information that a visual display provides.
Mark,
I appreciate your humility and willingness to find better ways to support accessibility. I’ll be in touch. In the meantime, let’s see what others have to say. A good starting point would be to read Colin Ware’s book “Visual Thinking for Design” to better understand visual perception and what’s lost when we design a visual monitoring display such as a dashboard for accessibility to screen readers.
It seems to me that the real problem isn’t the design of the dashboards but the inability of the screen reading applications to process the information.
I can envision a program written specifically to process data visualisations. It could give you exact or approximate values for chart series and detect the percentage change. Then using interactive techniques you could allow the user to see different aspects of the data.
Does this series trend upwards or down?
How many data points appear to be outliers? What are they?
Within what value ranges are the points on this scatter plot clustered?
Etc…
Of course this wouldn’t give all of the functionality and certainly not the efficiency of actually visually processing the information on the dashboard but it may give extra accessibility to certain visualisation elements. It could also give accessibility to data visualisations which have no alternate form that can be processed by a traditional screen reader.
Now I’m not blind and I have no experience using screen reading software. So perhaps this product already exists or wouldn’t be as useful as I’m imagining. Just the thoughts of an up & coming BI dev. Dashboards aren’t the problem, and removing their most useful elements is certainly not the answer.
As I was reading some articles about dealing with color-blindness of some users I was wondering about this issue. Thanks for answering some of my questions and some that I had not even thought of yet.
~Tricia
@Mark:
I guess I’m confused as to why a blind user needs to access information on a dashboard specifically. Dashboards are meant to be taken in all at once (as Stephen has been trying to convey), but the only way a blind user seems to be able to digest the type of information on a dashboard is piece by piece. Wouldn’t traditional reports (screen-reader-enabled, of course) better meet their needs?
Mike D,
While it is true that screen readers are limited in what they recognize (text only), adding the ability to read values in a graph and perhaps even describe the patterns won’t solve the problem. People with vision can look at a graph and often detect what’s significant and therefore worth examining or comparing in an instant, but software wouldn’t be able to do this because it cannot perceive or think visually. The time of blind users would be wasted listening to long series of values and descriptions of patterns, most of which wouldn’t be useful. The fundamental problem resides in the fact that our senses are not interchangeable. They perceive differently and therefore the information contained in one channel cannot be fully expressed using a different channel. We will serve the needs of the blind better by finding the best ways to communicate the essential information that’s contained in a dashboard using sound or touch as well as possible rather than trying to get technology to translate visual information into verbal information. Perhaps an appropriate analogy would be an attempt by a computer programmer to read the pixels in a painting and translate them into sound, which would result in noise, when he should instead seek to find a musician to express the essence of the painting musically.
Stephen,
I mostly do agree with what you’re saying. I don’t think that anything could literally replace visually processing a dashboard. However, I do think you’re perhaps lacking a bit of vision here.
Our senses are not interchangeable but there have been great advances in computer vision and image processing such that it could allow quick analysis of visual elements that are then translated into figures and values. Simply reading out a long list of values in a series doesn’t sound terribly useful either. But what would a normal screen reader do with a table of values converted from a bar chart?
What I feel is the most important part of my idea is that this fictional application should be able to answer the common questions that people get answers for from visually processing the information. This functionality would be the key to going from “screen reader” to “data visualisation interpreter”. It would allow quicker information retrieval from visual presentations of data.
While you could write a separate report to do this for the screen reader, and that would probably be the ideal solution, you aren’t always in control of the reports that you want to get information from. This could allow blind users to get data from any chart or graph they find and want to get information from.
Then there is the fact that to some extent blind people do replace their sense of sight with other senses and methods of processing information. (This is my understanding based on a very limited knowledge from certain documentaries and anecdotes.) I’ve seen a blind man who can sense his surroundings with echo-location. You can’t echo-locate a chart, but perhaps they have other ways of processing information. I feel like this could mean there is an innovative way to “display” information to blind people and allow them to interact with the visualisations rather than have a screen reading application rotely go through the text on the screen.
You are correct to say that without sight you cannot hope to get the exact same functionality or efficiency from a dashboard or data visualisation. Still, what I get from Mark’s complaint isn’t that dashboards need to be changed it is that screen readers have limitations that need to be overcome in order to better allow the blind to get more benefit from data visualisations (even if it isn’t the full benefit that a sighted person would get).
Mike D,
Perhaps it is not that I lack vision but that your vision is naïve. The advances in computer vision and image processing that would be needed to describe the graphical contents of dashboards in ways that would enable those without sight to use the data as intended do not exist and perhaps never will. Human visual perception and visual thinking (e.g., pattern detection and interpretation) are unique and powerful. Work in computer vision is primitive compared to human vision and might remain so. It would actually be fairly easy for a computer program to read the values in a graph, because those values are encoded geometrically (position, size, length, etc.) in ways that a computer could easily decode. Actually, if I’m not mistaken, programs of this sort already exist. The problem isn’t in decoding the values in a graph, but in understanding the meanings that are encoded in patterns and through comparisons of patterns that a computer cannot decipher, and even if it could, it could not communicate in words or sounds.
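To illustrate just how mechanical that decoding step is, here is a hypothetical sketch (the pixel coordinates and calibration points are invented for illustration): given two known points on an axis, recovering the value that a bar's pixel height encodes is nothing more than a linear mapping.

```python
def calibrate(p0, v0, p1, v1):
    """Return a function mapping a pixel coordinate to a data value,
    given two known (pixel, value) calibration points on the axis."""
    scale = (v1 - v0) / (p1 - p0)
    return lambda pixel: v0 + (pixel - p0) * scale

# Hypothetical bar chart: y = 400 px is the baseline (value 0) and
# y = 100 px is the gridline for 300 units; pixel y grows downward.
to_value = calibrate(400, 0, 100, 300)

bar_tops = [250, 175, 310]              # measured pixel y of each bar top
print([to_value(y) for y in bar_tops])  # [150.0, 225.0, 90.0]
```

Decoding the values is the easy part; perceiving which of those values matter, and why, is what no such program provides.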
Many people who are blind have developed extraordinary abilities, especially using their senses of hearing and touch, sometimes in ways that compensate for aspects of vision (e.g., echo-location for sensing the environment) and sometimes to do things that can only be done with hearing or touch. There is a blind fellow who does close-up magic with playing cards that is unprecedented, because he has learned to do things with hearing and touch that could never be done with vision; things that allow him to fool the eyes of those with sight. As talented as he is, however, he will never be able to analyze data as I do visually, because many aspects of vision cannot be duplicated by his other senses.
Your hope that screen readers could be improved to provide better accessibility for the blind has merit, and I share your hope, but it did not stem from anything that Mark said. At no point during my interaction with Mark did he express concern about the limitations of screen readers or suggest improvements. Had he done so, I might have addressed the possibility. Improved screen readers would provide a benefit in general, but would have no bearing on the points that I’ve made about dashboard design.
This may be way off topic, but this exchange triggered a concern. Another form of blindness I worry about (and suffer from all too often) is the ever-shrinking scale of digitally-reproduced information. With increasing rarity can we see the entirety of what is being presented to us for the sake of convenience and mobility. A dashboard on a large screen in a conference room is not the same on a smartphone. Even a single bar chart loses clarity as it’s squeezed to fit a palm-sized screen. Colors or patterns used on letter-sized printouts change dramatically when reduced in size. When I transitioned from hand-drawn layouts at full scale in orthographic projection for the product designs I was working on (years ago) to computer screens showing the same designs at a smaller scale requiring constant scrolling and zooming to keep track of where I needed to look, something important was lost. With all due respect to the visually impaired, I for one am suffering from the less-accessible nature of information that once was presented at a comfortable, human scale.
Perhaps my idea is naive, but from belief in the impossible comes innovation. I don’t really think my idea is that far fetched though. I never tried to say that it would be a replacement for how people visually process a dashboard.
I was never much taking a side in your argument with Mark either. If anything I thought I was agreeing with you that dashboard design shouldn’t be changed.
Mostly I thought it was an interesting idea that was a different perspective on the issue being discussed. Perhaps I’ve just veered a bit too far off topic.
Mike D,
Isn’t belief in the impossible a description of insanity? Sorry, I couldn’t help myself. It is absolutely true that innovation can sometimes emerge from the recognition that what everyone assumes is impossible is in fact attainable. I agree, as I said previously, that some benefit could be gained from better screen readers. The part of your vision that strikes me as naive is the notion that developments in computer vision and image processing can currently contribute to the way that the information in a dashboard could be made more accessible to people without sight. My point is that these developments could decipher the values in a graph but not interpret and then communicate the meanings in graphs or support useful interactions such as pattern comparisons, etc. There is a limit to the degree that we can fully express visual information in ways that are available to the other senses. It is certainly possible that I am the one who is naive and that what I perceive as limitations built into our senses can be overcome. If that is so and you find a way to overcome these perceived limitations, I will owe you an apology and those who are blind will owe you their gratitude.
Marty,
Your observations are perhaps a bit off topic, but they’re important nonetheless. Mobile devices have driven a frenzy of business intelligence development that in many cases is absurd. The “human scale” is a reality to which technology must adapt. Unfortunately, too many technological developments are ass-backwards in that they force humans to adapt to them, which doesn’t work because it isn’t possible. The evolution of fundamental human ability occurs over eons, not years. Small mobile devices are wonderful, but they are only useful for tasks that can be done with tiny displays. Even though I show an iPhone-sized dashboard of sorts on the cover of the new edition of Information Dashboard Design, I am not suggesting that mobile devices can replace devices with larger screens. They can’t. Only a simple and limited dashboard can be viewed on an iPhone. Exploratory data analysis cannot be done on a small device, because it requires a lot more space to display multiple views of data simultaneously. As with any new technological development, in its infancy everyone goes crazy and vendors enjoy the feeding frenzy and the resulting revenues, but in time the marketplace finally recovers its sanity as people begin to recognize the limitations and proper uses of these new technologies. This will happen with mobile devices, but we’re still in the midst of the feeding frenzy.
Long time reader, first time responder.
(in the near future, say roughly, 5-20 years) I could conceive of a physical device which translates a dashboard or any visual encoding into a portable, digital braille system – including filters, interactivity, endless archetypes of visuals, etc. As for color – which appears central to both the original post as well as the comments, and appears to be the missing link – this could in fact be translated to sound, something blind people might happily get used to. Considering that best practice for dimensions is to limit color to 20 or so values, this could be a three octave pentatonic scale (or whatever you want). For measurements/quantitative, a simple sine wave with a range of non-annoying values would do. In both cases you could have a nearby “audio legend/key” to reference against. Some people like certain sounds, melodies, chords. Maybe this device would allow them to encode their desired sounds to the colors of the dimensions or measurements (via some type of admin console with spoken text) as desired. Problem solved.
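The pentatonic mapping described above is easy to sketch. Assuming a major pentatonic scale and an arbitrary 220 Hz root (both my choices, not part of the original proposal), three octaves yield 15 distinct pitches, one per color slot:

```python
# Major pentatonic intervals, in semitones above the root.
PENTATONIC = [0, 2, 4, 7, 9]

def pentatonic_freq(index, root_hz=220.0):
    """Frequency (Hz) for the index-th note of a pentatonic scale that
    spans successive octaves (index 0..14 covers three octaves)."""
    octave, degree = divmod(index, len(PENTATONIC))
    semitones = 12 * octave + PENTATONIC[degree]
    # Equal temperament: each semitone multiplies frequency by 2^(1/12).
    return root_hz * 2 ** (semitones / 12)

# Map up to 15 categorical color slots onto three octaves of pitches.
tones = [round(pentatonic_freq(i), 1) for i in range(15)]
print(tones[0], tones[5], tones[14])  # 220.0 440.0 1480.0
```

Generating the pitches is trivial; as the discussion below notes, the hard part is that a listener must hear them one at a time.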
Since there are approx 6 million people in the USA who have some type of blindness, this magical device turns out to be a billion dollar device in the making. You’re welcome for my having thought it through for you, Mr. and Mrs. Inventor! :)
Daydreaming aside and put in simplest terms: there’s nothing inherently “wrong” or “bad” with a belief in the impossible as described by a hypothetical device that solves the problem (the future), simultaneous with solid fundamentals – dare I say, doctrine? – for the existing world (the now).
Hello Ty,
The solution that you’ve imagined has been considered but rejected, because it won’t work. The problem resides in the difference between audio/verbal/tactile perception and visual perception. Braille must be read serially, one letter at a time. Imagine a series of 12 monthly sales values encoded as a sparkline. Visually, we can perceive that sparkline all at once, as a single pattern, in parallel. We can understand it as a whole. Reading the same 12 values encoded in Braille would not only be slow, it would also never make it possible for the reader to construct a sense of the whole from the individual values. Let’s take it up a notch. Imagine trying to compare two lines in a line graph by reading the values in Braille. It can’t be done because you cannot construct a clear sense of the whole from the parts and hold the two sets of values in memory in a way that would allow comparisons. Information encoded as sounds has the same problem. The tones that might be used to encode a series of values must be heard one tone at a time, serially.
You see, I don’t dismiss the possibility of fully translating the information in a dashboard into something that’s accessible to the blind due to a lack of vision or imagination. My objections are rooted in an understanding of human perception. As long as human perception continues to work as it does now, this translation of information from graphics to words, Braille, or sounds simply won’t work in a way that duplicates or even comes close to the visual experience.
What I miss in this whole discussion is blind persons giving their opinion. I would like to hear from someone who indeed needs non-visual information, what they would like to get. Will a description of the data be sufficient? Will a table work better? It’s pointless to discuss something all of us have not experienced ourselves.
Ute S.,
Hearing from people who are blind would indeed be useful, but it is not true that discussions without them are pointless. On the contrary, many worthwhile points have been made in this discussion. We have not yet heard from anyone who is blind (at least I assume this is the case) because people without vision do not frequent blogs about data visualization, for obvious reasons.
The two questions that you’ve posed on behalf of the blind have already been answered. A description of the data in words cannot duplicate the information that’s contained in a well-designed graphic. If by “sufficient” you mean that the verbal description would provide all of the content of the graphic, then it will never be sufficient. The point that I’ve made about the advantage of tables as an alternative display for those who are blind is that the essential information that’s provided could be better organized for screen readers if presented in tables. Structuring the data in this way for this specific purpose will produce a better result. Hearing from those who are blind might clue us into questions that we haven’t thought to ask yet, which would be useful.
Stephen,
In this article (http://www.livescience.com/23709-blind-people-picture-reality.html) it says that “When blind people read Braille using touch, the sensory data is being sent to and processed in the visual cortex.” Could touch and temperature be used to allow the blind a parallelism that would allow the visual cortex to receive and perceive two different stimuli?
Kris,
The fact that the visual cortex is used by those who are blind to process information sensed through touch when reading Braille does not mean that they perceive and process that information in the way that visual information is processed. Our brains are plastic (malleable and adaptable) in that areas ordinarily used for one thing can be repurposed to do other things when not being used in the usual way. Think of the brain as a mass of circuits that can be wired in a number of different ways depending on experience and sensory deficits (e.g., lack of vision).
Our eyes take in information at a much greater bandwidth than touch or any of the other senses. We know that reading Braille through touch is a slow, serial process. This sensory process, which takes place before the information is processed by the brain, is not sped up by the fact that the visual cortex processes it. You’re asking a reasonable question, however, which I can’t answer definitively: “Once tactile sensory data gets to the visual cortex, is it processed in parallel then?” Perhaps, but I doubt it, because the bandwidth at which Braille is sensed and the one-dimensional nature of the information would not benefit from parallel processing. By “one-dimensional” information, I simply mean that it consists of words alone. Unlike vision, which includes multidimensional information (2-D position, size, length, width, hue, saturation, texture, etc.), there is no need for parallel processing to decode a multidimensional stream of information from Braille and then weave it into a picture. Is it possible that information encoded as Braille could be used by the visual cortex to form multiple patterns that could be simultaneously seen and compared, such as lines in a line graph? Perhaps, but I’m not aware of any evidence that this is the case.
The article that you cited is interesting in that it uses terms for sight (“picture” and “image”) several times to describe non-visual perception. Expressed in this way, the article suggests that those who are blind construct a visual representation of the information that they sense through touch or sound. Unfortunately, however, we don’t know how this information is represented in a blind person’s brain and how that representation compares to that of someone with sight. A statement in the article by Paul Gabias, a psychology professor at the University of British Columbia who is blind, seems to draw an unwarranted conclusion:
“If you know that blind people know where to put their plates on their table, and you know that blind people deal with tables in exactly the same way you do, then you presume that they imagine them in the same way you do. You have got to presume that what’s inside their head is like yours.”
In fact, we cannot and should not presume that what’s inside the heads of blind people is the same as what’s in the heads of people with sight. Knowing where objects are located on a table as well as the shapes and textures of those objects, which enables people who are blind to use those objects in a way that is not noticeably different from people with sight, actually tells us little about the way that information is represented in their heads. A conceptual construct exists, but we don’t know how it compares to one that is built from vision.
It seems sensible that a straightforward translation of a visually designed dashboard would not be accessible, as, of course, the tricks of the trade are all designed to be seen. However, as the purpose of the ‘design’ bit is to convey useful information, I suspect completely different tools would need to be developed and until they were it would be impossible to tell whether visual perception is intrinsically better for interpreting information.
For example, a radio play will be perceived differently from the book, from the film adaptation, to the song, to the critics’ reviews, to the painting inspired by the book of the film of the play of the song. Dance a tango blind to the sights and deaf to the melody, listen to the music, watch the musicians, etc. All will convey something different — a different perspective of the ‘truth’ at the heart of it.
By analogy, I can imagine something (futuristic — think VR suits!) that played to the other senses. Mass, force, temperature, vibration, pitch, tempo, texture, taste/smell. Imagine, someone scans their hands over the smooth surface of the KPIs and finds one that feels hot. They pick it up and pull it apart — north region is much heavier than south and the texture of sales is rougher and its pitch over time has descended into the lower registers. Has this person experienced less than the person who observes a sparkline with a red dot beside it? Or potentially more for the intimacy of the experience?
To clarify, I wasn’t suggesting a hypothetical device that would translate graphics to braille. Rather, I was recommending a device that simply embosses the graphics entirely, as-is. The burden at that point would be on the blind to learn how to “read graphics” or simply to “Experience The New”. If I were blind, I’d probably figure it out. (who knows… that’s a bold statement.. :)
Carry on. Interesting conversation to be sure.
Ty,
Representing graphics through an embossing technique that could be read by touch would transform something that’s perceived holistically and in parallel into something that is perceived serially, never resulting in a clear sense of the whole and certainly never retained in a way that would allow comparisons. The limitation isn’t our inability to translate the data into something that can be read, but rather the inability of senses other than vision to perceive whole patterns made up of many data points simultaneously. No encoding method will overcome this limitation that’s built into the brain.
Neil,
We don’t need to wait for “completely different tools” to be developed to determine the benefits of visual perception compared to our other senses. Although we’re learning more about perception every day, we already know a great deal. I’m not arguing that visual perception is a superior channel for all information. Each of our senses has its unique strengths. I am arguing, however, that for pattern perception and interpretation, as well as pattern comparison of the types of information that we display in dashboards, vision trumps all. This is not an opinion; this is well established by science.
Your radio play analogy is not appropriate. Yes, someone can listen to a play on the radio and have an entertaining experience that is worthwhile, even though it lacks the visual elements of the same play presented on stage. Words can be used to paint pictures in the imagination of listeners. The goal of a radio play, however, is not to convey a specific collection of information for particular purposes, but this is the goal of a dashboard. When we display information on a dashboard, it isn’t enough for someone to get something—anything—worthwhile from it. We want people to get a clear and accurate understanding of particular information.
A VR suit that communicates information to our senses of touch, hearing, smell, and taste, but not sight, will remain limited by the abilities of the human brain. The primary limitation that would defeat us in this venture is the fact that only vision processes information in parallel rather than sequentially, making it possible to simultaneously perceive patterns made up of a great deal of data that could never be stitched together in our brains from a sequential stream of facts. It is wonderful to dream about the great things that we might achieve through technology, but dreams that don’t take into account what is and isn’t possible will never be anything but fantasy. If and when we discover perceptual abilities that are presently unknown, then we can build on them. That’s where our efforts should be focused, not on constructing systems from parts that we already know don’t work.
From the perspective of someone who is legally blind, I think Stephen’s points are spot on. I am a dashboard developer, so I have the unique perspective of looking at this from a visual impairment point of view. I am legally blind in both eyes, with the main focus of my impairment being near-sightedness. From my perspective, dashboarding best practices have helped a great deal and have been all that I have needed for my understanding and interpretation of dashboards.
I do not believe there will ever be any screen reader software that will be able to accurately translate a dashboard. There may be some advancements in the future, but right now I agree with Stephen that the only viable solution is to create a separate version for the visually impaired.
I have spent many years (and been recognized for) working to identify ways that visually focused electronic end products (such as dashboards, graphs, and presentations) can be made more accessible, not only for persons using assistive technology but for all persons. None of us is exempt from aging, and we’ll all face physical challenges at some point.
I mostly agree with Stephen. Providing an accessible alternative to the dashboard is currently the best option for persons using a screen reader. It’s not that I think it’s impossible to make a dashboard accessible; technically it is possible. I just don’t think the technical answer is the best answer.
It’s not a case of whether you can, it’s whether you should.
Providing an easily navigated alternative with a clear summary (of the most important information needed to achieve one or more objectives) and the data behind the graphs laid out in an accessible table format is one example of a more effective alternative. It allows the blind user to absorb the point of the dashboard as well as the opportunity to review the details if they so desire. Even if the way they absorb the information is different, I think it’s still important to allow them the opportunity as much as possible.
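As a minimal sketch of what such an accessible alternative might look like in code (the function name, caption, and data here are hypothetical, not tied to any particular BI tool), the data behind a graph can be rendered as an HTML table with a `<caption>` stating its point and `scope` attributes tying each cell to its row and column headers, which is the structure screen readers rely on to announce context as the user navigates:

```python
from html import escape

def accessible_table(caption, columns, rows):
    """Render chart data as an HTML table for screen-reader navigation.

    The <caption> states the point of the data up front; scope="col"
    and scope="row" let assistive technology announce which header
    applies to each cell. Illustrative sketch only.
    """
    parts = [f"<table><caption>{escape(caption)}</caption>"]
    parts.append("<thead><tr><td></td>")
    for col in columns:
        parts.append(f'<th scope="col">{escape(col)}</th>')
    parts.append("</tr></thead><tbody>")
    for label, values in rows:
        parts.append(f'<tr><th scope="row">{escape(label)}</th>')
        for v in values:
            parts.append(f"<td>{escape(str(v))}</td>")
        parts.append("</tr>")
    parts.append("</tbody></table>")
    return "".join(parts)

# Hypothetical example data for one graph on a dashboard:
html = accessible_table(
    "Q3 sales by region (units)",
    ["Jul", "Aug", "Sep"],
    [("North", [120, 135, 150]), ("South", [80, 78, 95])],
)
```

The design point is the same one made above: the structure is chosen for serial consumption by a screen reader, not as a translation of the visual layout.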
I’ve learned that blind users can absorb much more information than we might think. Several years ago a friend of mine wrote an add-in for PowerPoint that automatically added the data behind graphs (series, category, value) in a linear fashion to the alt-text of the graph. While I couldn’t read that and absorb it, we heard back from many blind users who were grateful they now had access to the data. Sadly, the add-in broke when Excel became the graphing engine for PowerPoint, but it does still work for version 2003.
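The linearization that add-in performed can be sketched in a few lines. This is not the add-in’s actual code (that was a PowerPoint add-in; the function and data below are hypothetical), but it shows the idea of flattening (series, category, value) triples into a single string suitable for a graphic’s alt-text:

```python
def chart_alt_text(title, categories, series):
    """Flatten chart data into a linear (series, category, value)
    description suitable for a graphic's alt-text.

    `series` maps each series name to its list of values, one per
    category. Names and structure are illustrative assumptions.
    """
    lines = [title]
    for name, values in series.items():
        for category, value in zip(categories, values):
            lines.append(f"{name}, {category}: {value}")
    return "; ".join(lines)

alt = chart_alt_text(
    "Monthly revenue",
    ["Jan", "Feb"],
    {"East": [10, 12], "West": [7, 9]},
)
# alt == "Monthly revenue; East, Jan: 10; East, Feb: 12; West, Jan: 7; West, Feb: 9"
```

As the comment above notes, sighted readers would find this string tedious, but read serially by a screen reader it gives blind users genuine access to the underlying data.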
The mind is an amazing thing. I suggest reading See What I’m Saying: The Extraordinary Powers of Our Five Senses by Lawrence D. Rosenblum if you truly have a commitment to creating end products that are accessible to all persons and want to be inspired to put forth the extra effort needed to create them.
Thanks Glenna. I greatly appreciate your expert perspective on this.
This discussion is insightful but also full of misconceptions.
Go on Google books and browse Polly Edman’s book on Tactile Graphics. You will see thematic maps and line graphs designed for blind people. Also search for Susan Osterhaus’s video series on youtube, and see how blind students learn mathematics and geometry by building 2D graphs with pins and rubber bands.
On the topic of sonification, there has been a conference (ICAD) dedicated to this and related topics since 1992. That’s more than 20 years of research. I don’t know much about this research, but I would certainly feel the need to skim a few papers before I can start discussing this topic seriously.
Vision is not only for spatial representations, and non-visual senses are not only for text or speech. Sight, hearing and touch can all be used to convey either spatial information (data visualizations), numbers (data tables) or prose (the data “message”).
Vision is not purely parallel, and non-visual senses are not purely serial. We can certainly quickly spot motion and pre-attentive visual features within a wide viewing angle, but visual inspection requires eye movements. As observed in a later blog post, no one can perceive and interpret a visualization in half a second. Conversely, touch is not purely serial: no one uses his right index finger tip to search for a light switch in the dark. Audition has some degree of parallelism too: we can hear and interpret sounds coming from multiple sources at the same time.
Perception is both parallel and serial, and to paraphrase Gibson and many others, effective perception requires exploratory action: vision is naturally coupled with eye movements, head movements and occasionally locomotion. Touch is naturally coupled with motor movements. Audition is mostly passive in our everyday experience but some work in HCI has shown how it can be effectively coupled with motor movements like touch input.
Perhaps the power of visual data representations does not stem from their visual nature, but from their spatial nature. Space exists independently from vision. Vision certainly has a much higher bandwidth than all our other senses, but we might be able to achieve a lot by finding effective ways of coupling non-visual (auditory and tactile) displays with motor actions.
Pierre,
Thanks for weighing in. I have a few responses to your comments.
Regarding so-called “tactile graphics,” it is of course inaccurate and somewhat misleading to refer to non-visual forms of expression as graphics. As I’ve stated throughout this discussion, we can attempt to provide information that is visual through non-visual channels, but we cannot duplicate or match the visual experience.
I also specifically mentioned attempts to express through sound the information that’s contained in a visualization. As someone trained in human-computer interaction, you probably know that these attempts are limited in that the information is perceived more slowly and that it cannot produce the same understanding or support the same tasks as the original visualization.
I am indeed aware of the fact that vision is not only for spatial representations and that non-visual senses are not only for text and speech. Did I say something that led you to believe otherwise?
It is true that “vision is not purely parallel, and non-visual senses are not purely serial.” With regard to data visualization, however, dashboards in particular, attempts to express the information contained in a dashboard using sound or touch would be limited to serial communication.
While it is true that vision can perceive 2-D and, to a lesser degree, 3-D space, it is not the spatial attributes of visualization alone that make it uniquely powerful.
Like you, I hope and in fact fully expect that we will find richer ways of expressing information that involve senses other than vision, collaborating among multiple senses for greater efficiency and richer understanding.