I was pleased and frankly surprised to receive 91 submissions to my dashboard design competition. Surprised, because designing a student performance dashboard from scratch based on the data that I provided was not a trivial task. I was especially pleased to find a dramatic improvement in the general quality of entries since the last competition that I judged back in 2006. Almost every entry exhibited qualities that far surpass the dashboards that are typically produced and used today. I’m grateful to everyone who took the time to participate.
This competition served many purposes:
- To give dashboard designers an opportunity to test and further hone their skills.
- To provide me with many fresh examples of dashboards that were all designed to serve the same audience and purpose—a teacher who needs to regularly monitor the performance and behavior of her students—that I could include in the second edition of my book Information Dashboard Design. I now have a rich and varied set of dashboards that I’ll use to demonstrate effective design and to illustrate common problems that still show up, even in dashboards that are created by experienced designers who take dashboard design seriously.
- To showcase exemplary dashboards that could actually be used for an important purpose: to improve educational outcomes.
All three purposes were well served by this rich and varied collection of entries.
Now, let’s get to the winners. Out of the 91 entries, I narrowed the list to the top eight and scored them using a set of criteria, each weighted according to importance, producing a total possible score of 100.
At no time during the judging process was I aware of the competitors’ identities. After scoring the top eight dashboards, as a final reality check I sent them, along with a record of the scores, to a couple of friends who both support better uses of data in schools. They both concurred with my judgment.
Having finalized and double-checked the selection, I asked for the identities of the competitors. And the winner is Jason Lockwood. His dashboard received the highest score of 90.4 out of 100. This morning when I sent an email to Jason to congratulate him, I learned that he currently works as a usability and design consultant for IMS Health and is based in Switzerland. Although I didn’t recognize Jason’s name, he reminded me that he attended a data visualization course that I taught at IMS Health in London about two years ago. Jason originally studied art in Canada. Here’s his winning dashboard.
One of the first things you probably notice is its fine aesthetics. Its use of color, layout, and reduction of non-data ink make it pleasing to the eye in a way that enhances usability. Because color has been used sparingly, the red alert icons make it easy to spot the students who are most in need of immediate attention (although the icons could be a little bigger to make them pop more). The tabular (rows and columns) arrangement of student information (one student per row) makes it easy to see everything about a particular student with a quick horizontal scan and easy to compare a particular metric across all students with a quick vertical scan. All of the most important metrics were consistently represented using the same dark shade of blue, which set them apart from other items nicely (although the dark blue horizontal bars in the bullet graphs would have been easier to see and compare if they were thicker). This design is scalable in that the addition of more students could be easily accommodated by simply expanding the dashboard vertically. Meaningful patterns in individual student attendance information (days absent and tardy) can be easily seen. Rather than going on with my own description, which I’ll elaborate in the new edition of Information Dashboard Design, I’ll let Jason describe the work in his own words:
In the course of my work as a UX engineer, I have the chance to try to bring good data visualization practices to my clients. However, many of the “dashboards” requested by those clients are closer to reporting solutions. Seeing this competition, I was delighted to be able to try my hand at a real dashboard. It was a very challenging and satisfying exercise, during which I learned a lot. I have designed this purely as a visual mock-up in Photoshop. I have the great luck of working with some very talented programmers who are incredibly adept at translating my mock-ups into pixel-perfect, working solutions, which gives me great freedom. Designing this way usually leads to small inaccuracies in the data portrayal, but I have taken extra care this time to ensure all the representations are accurate.
Overall design strategy
There is a lot of information contained within the data sheet, so one of the major challenges was displaying all of it clearly on a single screen. I felt that all the information was pertinent to the goal of the dashboard, so I did not want to exclude anything. That led to the compromise of designing to a slightly wider-than-standard screen resolution of 1400px. That said, I have designed it in such a way that on an SXGA monitor the entirety of the student information would be visible, and only the less important class-comparison information would be off screen.
I usually base the overall colour palette on the visual identity of the client. As this was not provided, I invented the idea that the school colours were blue and grey. I therefore used monochromatic shades of blue for data representation and grey for text and labels. For the background, I am using an off-white with a slight orange tint. This creates a subtle complement to the blue, making the data stand out a little bit more.
I chose Avenir as the font because it provides a good contrast between upper- and lowercase letters for good legibility, as well as very clear numerals. With only a few exceptions (title and legends), I kept a 12pt font size for consistency.
Student data
Breaking down all the data in the Excel sheet was an interesting exercise. The first step was to prioritize the information: what would the teacher want or need to see first? I decided that the grades were crucial (they are, after all, the overall measurement of the student’s performance). With the grades I grouped together the other pure assessment information: the last 5 year assessments and the last 5 assignments. The assignments-completed-late information provides a nice segue (and visual break) from scores to the more behavioural information: absences/tardies, disciplinary referrals, and detentions.
I sorted the students by their current grade, from worst to best, so the teacher can view the problem cases first. The secondary sort is on the difference between the current grade and the target.
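[This two-level sort is simple to express in code. A minimal sketch in Python, with hypothetical records and field names invented for illustration (the actual data sheet’s column names may differ):]

```python
# Hypothetical student records; the field names are illustrative only.
students = [
    {"name": "Avery", "current": 72, "target": 80},
    {"name": "Blake", "current": 65, "target": 70},
    {"name": "Casey", "current": 72, "target": 90},
]

# Primary key: current grade, ascending (worst cases first).
# Secondary key: shortfall from target, with the largest shortfall first
# (negating the shortfall makes an ascending sort put it first).
ranked = sorted(
    students,
    key=lambda s: (s["current"], -(s["target"] - s["current"])),
)
```

Python’s `sorted` compares the key tuples element by element, so ties on the current grade fall through to the target shortfall automatically.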
Having ordered the information, the next step was to visualize it. The grades lent themselves very well to a bullet chart, efficiently portraying the target, current, and previous scores. I used sparklines for the last 5 year assessment scores (since the years form an interval axis), and micro-columns for the last 5 assignments. For the assignment late count (and, later, detentions and referrals) I used dots to represent the counts, as I find these clearer to read than plain numbers.
I chose to represent not only the number of tardies and absences but also their temporal occurrence. Hopefully this allows the teacher to identify patterns not just for each student but for the entire class. The result ends up looking almost like a scatter chart.
Last up for the behavioural data are the detentions and referrals, which again I represent as dots, with the past term’s information in a lighter shade and to the left of the implied axis for comparison.
Once all the student information was portrayed, I decided that some sort of aid was needed to help the user read the information in rows. I decided on zebra striping because, while it is technically more non-data ink than row lines, I believe it is clearer and subtler at the same time (a line has two edges, top and bottom, as does a solid stripe, but only half as many stripes are required).
To compare the overall class performance to other classes, the school, and the district, I combined the information from the summary tabs to create two graphs: a dot graph showing the latest median assessment scores, and a graph of the percentage of students’ assessment scores falling into each percentage group. I chose a dot graph to emphasise the variation between the groups, but also to line up with the percentage groups of the second graph.
On the second graph, I unfortunately had to rotate the category labels. I would normally not do this, but I did not want to reduce the font any further (even at 10pt, it would still be too crowded) or expand the screen any more.
I finally added indicators on the student name to show English proficiency and special ed status, with the legend in the footer, along with the data qualification note.
Overall, I am quite pleased with the outcome of this design exercise. I believe I have managed to represent all the information in a clear and well-structured way that fulfills its user’s needs. I have shown it to a couple of teachers and received positive feedback (and requests to produce it).
My only concern is the colours: I design on a Mac, where the colour fidelity is very good; however, the subtleties sometimes disappear when viewed on less well-calibrated screens. This is usually something we would fix during implementation, so hopefully it is not too bad here.
Just like Jason, overall I am also quite pleased with this design. The primary improvement that comes to mind is the addition of more information in the right-hand section about the class as a whole, such as a frequency distribution of student achievement on course assignments.
Congratulations to Jason Lockwood for exceptional dashboard design.
In addition to our winner, I’d like to showcase the runner up as well. The entry below was created by Shamik Sharma using Excel 2010.
Once again, notice the fine visual aesthetics of this design. Also notice the additional class-level information on the right that doesn’t appear in the winning dashboard, especially the two distribution graphs at the top, which are quite useful. And finally, notice that the frequency distribution graph of assessment scores in the bottom right corner, which uses lines (called a frequency polygon), is easier to read than the one in the winning solution that uses bars (called a histogram). A few features in this solution don’t work as well, however, as those in the winning solution. For example, it isn’t as easy to spot the students in need of attention, and per-student attendance information is aggregated in a way that hides patterns of change through time. Overall, however, this is excellent work.
I’ll show many more examples of the dashboards that were submitted in the new edition of Information Dashboard Design, both to illustrate useful design ideas and to point out a few that didn’t work.
I invite all competitors who are interested in specific feedback about their designs to post them in my discussion forum on this site where I and others may appreciate them and offer suggestions.