2012 Perceptual Edge Dashboard Design Competition: We Have a Winner!

I was pleased and frankly surprised to receive 91 submissions to my dashboard design competition. Surprised, because designing a student performance dashboard from scratch based on the data that I provided was not a trivial task. I was especially pleased to find a dramatic improvement in the general quality of entries since the last competition that I judged back in 2006. Almost every entry exhibited qualities that far surpass the dashboards that are typically produced and used today. I’m grateful to everyone who took the time to participate.

This competition served many purposes:

  1. To give dashboard designers an opportunity to test and further hone their skills.
  2. To provide me with many fresh examples of dashboards that were all designed to serve the same audience and purpose—a teacher who needs to regularly monitor the performance and behavior of her students—that I could include in the second edition of my book Information Dashboard Design. I now have a rich and varied set of dashboards that I’ll use to demonstrate effective design and to illustrate common problems that still show up, even in dashboards that are created by experienced designers who take dashboard design seriously.
  3. To showcase exemplary dashboards that could actually be used for an important purpose: to improve educational outcomes.

All three purposes were well served by this rich and varied collection of entries.

Now, let’s get to the winners. Out of the 91 entries, I narrowed the list to the top 8 and scored them using a set of criteria, each weighted according to importance, for a total possible score of 100.

At no time during the judging process was I aware of the competitors’ identities. After scoring the top eight dashboards, as a final reality check I sent them, along with a record of the scores, to a couple of friends who both support better uses of data in schools. They both concurred with my judgment.

Having finalized and double-checked the selection, I asked for the identities of the competitors. And the winner is Jason Lockwood. His dashboard received the highest score of 90.4 out of 100. This morning when I sent an email to Jason to congratulate him, I learned that he currently works as a usability and design consultant for IMS Health and is based in Switzerland. Although I didn’t recognize Jason’s name, he reminded me that he attended a data visualization course that I taught at IMS Health in London about two years ago. Jason originally studied art in Canada. Here’s his winning dashboard.

One of the first things you probably notice is its fine aesthetics. Its use of color, layout, and reduction of non-data ink make it pleasing to the eye in a way that enhances usability. Because color has been used sparingly, the red alert icons make it easy to spot the students who are most in need of immediate attention (although the icons could be a little bigger to make them pop more). The tabular (rows and columns) arrangement of student information (one student per row) makes it easy to see everything about a particular student with a quick horizontal scan, and easy to compare a particular metric across all students with a quick vertical scan. All of the most important metrics were consistently represented using the same dark shade of blue, which made them stand out nicely above other items (although the dark blue horizontal bars in the bullet graphs would have been easier to see and compare if they were thicker). This design is scalable in that more students could be accommodated by simply expanding the dashboard vertically. Meaningful patterns in individual student attendance information (days absent and tardy) can be easily seen. Rather than going on with my own description, which I’ll elaborate in the new edition of Information Dashboard Design, I’ll let Jason describe the work in his own words:

1 Introduction


In the course of my work as a UX engineer, I have the chance to try to bring good data visualization practices to my clients. However, many of the “dashboards” requested by those clients are closer to reporting solutions. Seeing this competition, I was delighted to be able to try my hand at a real dashboard. It was a very challenging and satisfying exercise, during which I learned a lot. I have designed this purely as a visual mock-up in Photoshop. I have the great luck of working with some very talented programmers who are incredibly adept at translating my mock-ups into pixel-perfect, working solutions, which provides great freedom for me. Designing in Photoshop usually leads to small inaccuracies in the data portrayal, but I have taken extra care this time to ensure all the representations are accurate.

2 Overall design strategy


There is a lot of information contained within the data sheet, so one of the major challenges was displaying all of it clearly on a single screen. I felt that all the information was pertinent to the goal of the dashboard, so I did not want to exclude anything. That led to the compromise of designing for a slightly wider screen resolution (1400px) than what may be standard. That said, I have designed it so that on an SXGA monitor the entirety of the student information would be visible, with only the less important class-comparison information off screen.

I usually base the overall colour palette on the visual identity of the client. As this was not provided, I invented the idea that the school colours were blue and grey. I therefore used monochromatic shades of blue for data representation and grey for text and labels. For the background, I am using an off-white with a slight orange tint. This creates a subtle complement to the blue, making the data stand out a little bit more.

I chose Avenir as a font as it provides a good contrast between upper and lowercase letters for good legibility, as well as very clear numerals. With only a few exceptions (title and legends), I kept a 12pt font size to provide consistency.

3 Student data


Breaking down all the data in the Excel sheet was an interesting exercise. The first step was to prioritize the information: what would the teacher want or need to see first? I decided that the grades were crucial (that is, after all, the overall measurement of the student’s performance). With the grades I grouped the other pure assessment information: the last 5 yearly assessments and the last 5 assignments. The assignments-completed-late information provides a nice segue (and visual break) from scores to the more behavioural information: absences/tardies, disciplinary referrals, and detentions.

I sorted the students by their current grade, from worst to best, so the teacher can view the problem cases first. The secondary sort is on the difference between the current grade and the target.
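That two-level ordering can be sketched in a few lines of Python (the roster, field names, and grade mapping below are made up for illustration, not taken from the competition data):

```python
# Ordinal mapping so letter grades sort numerically, F lowest.
GRADE_ORDER = {"F": 0, "D": 1, "C": 2, "B": 3, "A": 4}

def sort_students(students):
    """students: list of dicts with 'name', 'current', and 'target' letter grades."""
    return sorted(
        students,
        key=lambda s: (
            GRADE_ORDER[s["current"]],  # primary: worst current grade first
            # secondary: among equal grades, furthest below target first
            GRADE_ORDER[s["current"]] - GRADE_ORDER[s["target"]],
        ),
    )

roster = [
    {"name": "Avery", "current": "C", "target": "A"},
    {"name": "Blake", "current": "F", "target": "C"},
    {"name": "Casey", "current": "C", "target": "B"},
    {"name": "Drew",  "current": "F", "target": "A"},
]

print([s["name"] for s in sort_students(roster)])
# → ['Drew', 'Blake', 'Avery', 'Casey']
```

Both failing students surface first, and within each grade the student furthest below his or her own target rises to the top.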

Having ordered the information, the next step was to visualize it. The grades lent themselves very well to a bullet chart, efficiently portraying the target, current, and previous scores. I used sparklines for the last 5 yearly assessment scores (since they fall on an interval axis) and micro-columns for the last 5 assignments. For the late-assignment count (and later for detentions and referrals) I used dots to represent the counts, as I find these clearer to view than plain numbers.

I chose to represent not only the number of tardies and absences but also their temporal occurrence. Hopefully this allows the teacher to identify patterns not just for each student, but for the entire class. This ends up looking almost like a scatter chart.

Last up for the behavioural data are the detentions and referrals, which again I represent as dots, with past term information in a lighter shade and to the left of the implied axis for comparison.

Once all the student information was portrayed, I decided that some sort of aid was needed to help the user read the information in rows. I chose zebra striping because I believe that, while it is technically more non-data ink than row lines, it is both clearer and subtler (a line has two edges, top and bottom, as does a solid box, but only half as many boxes as lines are required).

To compare the overall class performance to other classes, the school, and the district, I combined the information from the summary tabs to create two graphs: a dot graph showing the latest median assessment scores, and a graph showing the percentage of students’ assessment scores falling in each percentage group. I chose a dot graph in order to emphasise the variation between the groups, but also to line up with the percentage groups of the second graph.

On the second graph, I have unfortunately had to rotate the category labels. I would normally not do this, but I did not want to reduce the font any more (even reduced to 10pt, it would still be too crowded) or expand the screen any further.

I finally added indicators on the student name to show English proficiency and special ed status, with the legend in the footer, along with the data qualification note.

4 Conclusion


Overall, I am quite pleased with the outcome of this design exercise. I believe I have managed to represent all the information in a clear and well-structured way that fulfills its user’s needs. I have shown this to a couple of teachers and received positive feedback (and requests to produce it).

My only concern may be the colours: I design on a Mac, where colour fidelity is very good, but the subtleties sometimes disappear on less well-calibrated screens. This is usually something we would fix during implementation, so hopefully it is not too bad here.

Just like Jason, overall I am also quite pleased with this design. The primary improvement that comes to mind is the addition of more information in the right-hand section about the class as a whole, such as a frequency distribution of student achievement on course assignments.

Congratulations to Jason Lockwood for exceptional dashboard design.

In addition to our winner, I’d like to showcase the runner up as well. The entry below was created by Shamik Sharma using Excel 2010.

Once again, notice the fine visual aesthetics of this design. Also notice the additional class-level information on the right that doesn’t appear in the winning dashboard, especially the two distribution graphs on top, which are quite useful. And finally, notice that the frequency distribution graph of assessment scores in the bottom right corner, which uses lines (called a frequency polygon), is easier to read than the one that uses bars (called a histogram) in the winning solution. A few features in this solution don’t work as well, however, as those in the winning solution. For example, it isn’t as easy to spot the students in need of attention, and per-student attendance information is aggregated in a way that hides patterns of change through time. Overall, however, this is excellent work.

I’ll show many more examples of submitted dashboards in the new edition of Information Dashboard Design, both to illustrate useful design ideas and to point out a few that didn’t work.

I invite all competitors who are interested in specific feedback about their designs to post them in my discussion forum on this site where I and others may appreciate them and offer suggestions.

Take care,

49 Comments on “2012 Perceptual Edge Dashboard Design Competition: We Have a Winner!”

By John. October 15th, 2012 at 11:30 am

Congratulations and great work on your student performance dashboards, Jason and Shamik! Thanks also to Stephen (and Bryan) for having this competition. What a great learning experience! I will definitely post the dashboard I created for feedback.


By Robert Allison. October 15th, 2012 at 11:34 am

Congratulations to the winners – those are some nice looking dashboards!

Not sure which discussion forum to post it to (Dashboards? Graph Design? Articles?), so I figured I’d post a link to my entry here, if anybody would like to see it…


Rather than having the class-level summaries as part of the main dashboard, they’re “drilldown graphs” in my design. There’s also lots of mouse-over info on the text & graphic elements. (Unfortunately, neither of these features fits into the judging criteria, since it was just for the layout/design of the static dashboard main page.)

By John. October 15th, 2012 at 11:40 am


I wasn’t 100% sure either, but Dashboards made the most sense to me, so I posted mine there.


By John. October 15th, 2012 at 11:43 am

Correction, the forum I posted to is actually named “Dashboard design”


By Nelson. October 15th, 2012 at 12:10 pm

Congratulations to the winners; it’s heartening to see such fantastic examples of how it is possible to achieve high data density without compromising aesthetics, clarity, and ease of use.

What I particularly like about Jason’s submission is how the dates of the absences and tardies were represented for each student.

Also of interest is his decision to use data bars to show each of the students’ assignment scores. It does not appear that the y-axes of the bars start at 0. I imagine this was done intentionally to make it easier to make comparisons across the assignments by maximizing the differences in height between the bars. The side-effect is that this introduces a bit of a ‘lie factor’ into the cost/benefit equation.
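As a back-of-the-envelope illustration of that ‘lie factor’ (using Tufte’s definition and made-up scores, not values taken from the actual dashboard):

```python
def lie_factor(v1, v2, baseline):
    """Tufte's lie factor for two bars drawn from a nonzero baseline.

    Effect in the data: relative change between the values themselves.
    Effect in the graphic: relative change between the drawn bar lengths,
    which are measured from `baseline` rather than from zero.
    """
    data_effect = (v2 - v1) / v1
    shown_effect = ((v2 - baseline) - (v1 - baseline)) / (v1 - baseline)
    return shown_effect / data_effect

# Two hypothetical assignment scores of 80 and 90 out of 100:
print(lie_factor(80, 90, baseline=0))   # → 1.0: faithful when bars start at zero
print(lie_factor(80, 90, baseline=75))  # → 16.0: a 12.5% difference drawn as a 200% difference
```

A baseline of zero keeps the factor at 1; the closer the baseline creeps to the smaller value, the more the visual difference is exaggerated.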

By Daniel Zvinca. October 15th, 2012 at 12:36 pm

Congratulations to the winner, the runner-up, and all the participants. What we see here are very well designed dashboards. It would be nice to receive by mail the scores Stephen gave to our solutions, perhaps with the top 8 scored dashboards attached. That way we could better understand the level of our solutions in this contest.

By Andew Fox. October 15th, 2012 at 12:38 pm

Well done Jason.

Out of interest, what technology did you use, Jason, to develop your dashboard? Is it something that could become a reality on a user’s desktop or mobile device?


By Stephen Few. October 15th, 2012 at 12:41 pm


You’ll find the answer to your question in Jason’s description of his dashboard above.

By Andew Fox. October 15th, 2012 at 1:05 pm

Apologies, scan read before asking the question.

By Jason. October 15th, 2012 at 1:08 pm

Thanks guys, and thanks Stephen, it was a challenging and satisfying exercise.

@Nelson: this was one of the data points (the assignments) I struggled with. I tried a standardised axis starting from 0 for each student, but then the differential was so small that you couldn’t easily identify the difference between assignments. I figured that it was more important for the teacher to compare results within each student’s data than between students, so I used a variable scale to show the difference.

@Andrew: as I mentioned, I am really lucky that I work with some super talented programmers who do the real heavy lifting in taking my designs and making them work. This can be in a variety of technologies, from .net to apex to sharepoint to straight html5/jquery. I’m just happy I don’t have to do it, and am amazed that they can!

By John. October 15th, 2012 at 1:14 pm


If I’m reading it correctly, Stephen picked the top 8 dashboards and then assigned scores, based on the criteria listed, to determine the winning dashboard. So I think only those top 8 actually have scores. With that said, it would be nice to know (and see) what the other 6 top dashboards were.


By Stephen Few. October 15th, 2012 at 1:33 pm


Your understanding is correct. I only assigned scores to the top eight dashboards. Scoring all 91 in this manner would have taken many days. To keep things simple, I’m not going to identify the other six dashboards that made up my list of the top eight. I certainly understand your interest in knowing, but I’d rather not open this can of worms, which could lead to questions such as “Why was my dashboard left out of the top eight?” Anyone who wants feedback may post their entry in the discussion forum. That seems more useful than finding out one’s ranking in the competition.

By Andew Fox. October 15th, 2012 at 1:47 pm


It’s a real shame you are not going to display the top eight irrespective of their individual positions, as this would certainly give future designers inspiration for improving. Designing dashboards is a journey for both the designer and the end user, and a map and compass really help in finding your way.

By Stephen Few. October 15th, 2012 at 1:55 pm


I want to encourage all of the competitors to feel as I do, that we all have much to learn and are nowhere near the level we can potentially attain. I’m sure that Jason, who won the competition, feels this way. One’s position in the competition should have no bearing on this perception.

By Andew Fox. October 15th, 2012 at 2:01 pm

Absolutely agree with your sentiment, the position you attained in the competition should have no bearing.

I was not seeking a leader board to measure myself against but a gallery of submissions for inspiration.

By Stephen Few. October 15th, 2012 at 2:07 pm


I’ll include several examples in the second edition of “Information Dashboard Design” and comment on them extensively for training purposes, including effective and ineffective features.

By Jason. October 15th, 2012 at 2:10 pm

@andrew: people are posting their submissions in the discussion forum, and I would encourage everyone to do the same. I am really enjoying seeing the different (and similar) paths people have taken to display the data. As Stephen said, there is so much to learn, and I see lots of things I wish I would’ve done differently, and probably will shamelessly steal in the future! I don’t think there is ever a perfect dashboard, but seeing everyone’s submissions is providing a lot of inspiration to keep trying and learning.

By Joey. October 15th, 2012 at 4:21 pm

Nice work Jason, congratulations. Though I must say I’m truly shocked that Stephen Few picked a dashboard that featured bar charts where the axis does not start at zero.

By Stephen Few. October 15th, 2012 at 4:39 pm


Had this actually been a bar graph, I would have objected, but it isn’t. It’s a series of graphs that look like bullet graphs but are in fact a variation on the theme. Unlike a true bullet graph, its scale is not quantitative. Notice that the scale is ordinal, ranging from a grade of F on the left through A on the right. In this case, the length of the bar actually works as a way of comparing the letter grades, not the underlying scores. In other words, a bar that represents a D grade is twice the length of one that represents an F grade, and so on. Make sense?
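To make that ordinal encoding concrete, here is a hypothetical grade-to-length mapping (not Jason’s actual pixel values):

```python
# Each step up one letter grade adds one unit of length, so the lengths
# compare the ordinal positions of the grades, not the underlying scores.
ORDINAL_LENGTH = {"F": 1, "D": 2, "C": 3, "B": 4, "A": 5}

def bar_length(grade, unit=20):
    """Bar length in pixels for a letter grade (unit = pixels per grade step)."""
    return ORDINAL_LENGTH[grade] * unit

print(bar_length("F"))  # → 20
print(bar_length("D"))  # → 40: twice the F bar, as described above
print(bar_length("A"))  # → 100
```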

By Nitin. October 15th, 2012 at 8:14 pm


First of all, I must thank you for organizing such a competition. I learnt a lot by designing, and now by seeing what the best entries look like. I would like to ask for your input on the dashboard I submitted. Also, at times there are limitations in visualization tools (for example: a legend can be shown only on the right or left side of a visualization); how can we ensure visualization best practices in such cases? If you can share some thoughts on this, it would be really helpful.


By Shamik S.. October 15th, 2012 at 9:15 pm

Congratulations Jason, superb dashboard. Apart from all the design features, the clean font is great indeed. I too struggled with whether I should use the full width of a widescreen monitor and examined the dashboard at standard resolution to see what would be cut off.

Thank you Stephen and Bryan for organizing this contest. Thanks also to others for their comments with kind words. I am quite keen to see more entries in the forum and look forward to checking out different ideas.

By tc. October 15th, 2012 at 11:28 pm

Regarding the bar chart discussion, is Stephen referring to the horizontal bar/bullet chart for prior/goal/current letter grades, while Joey’s referring to the vertical bar chart for the numeric scores for assignments 1-5?

By Ulrik. October 16th, 2012 at 1:35 am

Congratulations to the winners – it is obvious that you put a lot of thought into the design (love the colors and fonts) and overall structure, which makes for a pleasant and very informative viewing experience!

It is also nice to see some of the other submissions in the discussion forum – clearly a lot of effort has gone into them, and they are a fantastic source of inspiration once you have worked with the same data yourself.

Thanks to Perceptual Edge for running the contest – it was a great way to practise, learn, and get inspired. Really enjoyed it!


By Stephen Few. October 16th, 2012 at 8:54 am


Did I misunderstand which chart you were referring to? I was indeed referring to the column of horizontal bars immediately to the right of the students’ names. If you’re referring to the vertical bars for the last five assignments, I assumed that they did start at zero, but I see now that this isn’t the case. Jason’s dashboard scored far enough above the runner-up’s, however, to keep him firmly in first place despite this error. Thanks for pointing this out. I’ll be sure to mention the bar graph problem as something in need of correction when I describe Jason’s dashboard in my book.

By tc. October 16th, 2012 at 9:45 am

Jason mentioned he used Excel, which unfortunately defaults to an “automatic” minimum value for the vertical axis of column sparklines. The good news is that the fix is literally one click away – the default for a “custom value” is 0. Glad to hear he had enough of a lead that he didn’t lose his deserved win over one missed click to fix the ill-advised Microsoft default. That would have been the Data Viz equivalent of a golfer losing a tournament over a scorecard signing error! :-)

By Pieter Hendrikx. October 16th, 2012 at 11:07 am

Congratulations to both Jason and Shamik!
It’s very interesting to see how many different designs can arise from one case.

I’m looking forward to the new version of the book and hope to see not only the bottom-N examples but also, or even preferably, the top-N designs. It would be of great interest for designers who already have some experience to see the impact that subtle differences in design can have.

Looking at the submitted designs in the forum, many people are really investing in their data visualization skills and experience and that’s just great to see!

By Daniel Zvinca. October 16th, 2012 at 11:31 am

Regarding the winning solution, I made several remarks in http://sfew.websitetoolbox.com/post/A-bit-of-criticism-we-were-polite-enough-6050222
The bar graph issue was one of the remarks I made there. I think there are some others as well.

By Jamie. October 17th, 2012 at 6:18 am

Stephen –

The bar charts in question are the ‘last 5 assignment’ bars.
Personally, I would have preferred that be a line chart or dot plot.

A lot of nice work all around with this competition.

By Michael S.. October 24th, 2012 at 8:12 am

Congratulations to all participants. Working on performance dashboards for higher education, Steve’s challenge provided me with a lot of great ideas. Thanks Steve for doing this and hopefully this can be a yearly thing.

By AKChandarana. October 24th, 2012 at 9:22 am

I like the winning dashboard quite a bit and was heading in a similar direction with my own (unsubmitted) effort. I think the contest delivers on purposes 1 and 2 – but falls short on point 3. Comments from a few elementary and high-school teachers I know, after seeing it:

Teacher #1. We have similar software with comparable capabilities, it is very helpful for data analysis…My thoughts…All brains, no heart. Good thing we will always need teachers for the latter…
Teacher #2. I’m drowning in data!!!! Can I just focus on what I started teaching for…? The kids. Dump grades, and just let them learn!
Teacher #3. Totally agree.
Teacher #4. Just saw this. What I can tell you is that the more time you spend analyzing endless data, the less time you have to spend with the students and creating a meaningful learning experience. I don’t need three different modalities to know a student is struggling!

By Anir. October 24th, 2012 at 10:30 am

I have to disagree with this selection as the top entry; here’s why:

Column 2: No order to the student listing (I would have ordered by first name, last name, or metric value). Also, the red alert icon tells us which students need the most attention – but not why they need it. (Correction: I have been told that the order is actually based on their current grade, but unless pointed out, I would not have guessed or understood this.) Also, the grades are listed well below the graph in a small font, and nobody would even notice that the first 2 students have an “F” as the current grade, which is why they need attention.
Column 3: I don’t see the value of this metric going back 5 years – also, what is the number next to the sparkline? (I’m assuming it is the last value of the sparkline, but this is not obvious – it could just as well be the average of the last 5 years.)
Column 6: The “+” symbol is extremely confusing. Basically, the number of vertical lines tells us the number of days absent, but the legend shows a “+” sign, and I would have expected “+++” for 3 days absent. (Maybe it is “+++” and the horizontal line blends them together, but I would have selected a different symbol.)
Column 7: We cannot compare “This term” and “Last term” when both are in a straight line; they should be arranged one below the other if you want to compare one to the other.
Last column (vertical bar chart): The breakdown gives no value or insight.
– The colors are not very attractive – although this is very subjective.
– Student-level and class-level information are mixed up (the right-hand charts are at a class level) – this makes it impossible to compare a student’s performance to the rest of the class.

By Sreedhar. October 25th, 2012 at 11:12 am


Greetings! Thanks for the post, and congratulations to the winners and participants.

Would you mind posting all the submissions or links to them (so that we can visit, look, and learn)? Please :-)

The reason is that in real life we may make mistakes due to various factors. Given the full list and its range of approaches, we can apply what fits our needs and correct our own work. Appreciate your help in advance …



By mdev. October 31st, 2012 at 4:17 pm

Congratulations to Jason Lockwood and thanks to Stephen Few for stimulating such interesting work.

Personally, I’d lean towards Shamik Sharma’s entry for its presentation of the aggregate class data and its overall use of space. I’d rate that criterion a little higher and consider ‘overall visibility’, ‘scalable design’, and ‘legibility’ as functions of the ‘use of space’. I probably need to read more about how each is conceived in this context, in the forum threads the competition prompted.

I look forward to the book in 2013.

By Jason. November 2nd, 2012 at 3:31 am

@Joey, tc, Steve,

Ah, those #@%“&ing bar charts. I am actually aware that the scale is incorrect in not starting at 0. If you use the full 0–100 scale, the differences between the bars in the space available are so minute as to provide no useful comparison. I figured that it was more important to see the variation across each individual student’s assignments than to make student-to-student comparisons. I still believe this, but I should have come up with a better way of showing it, perhaps as a dot plot as Jamie suggested, or sparklines as Shamik used. Again, this illustrates how useful this competition is – in real life I rarely get such well-informed feedback, and having these errors pointed out will make me a better designer.


You also make some great points. I also struggled with the labels for the grades. However, I assumed that since this is a tool that would be used every day, the need for legends and labels to be blatant is lessened, as they will be learned. But I agree that the grade levels could be indicated better – I toyed with sectioning the dashboard according to grade level, but ultimately found this too cluttered and distracting. I think Shamik nailed it; his solution worked much better.

Here’s a question to the group about grades. I’ve noticed some people start the grade scale at F; some, like me, start it before. I did this because I felt an F should be represented by something, rather than just the absence of data. That being said, looking again at Shamik’s design, the absence is highly visible, and is perhaps a better way to highlight a failing grade. I’m really not sure which is better – what do you think?

About the issue of the “+” symbol: this was also a tough decision, but I needed two symbols, and they had to contrast strongly with each other, so I went for the circle and the plus. I actually feel that the “+” works really well at showing the precise time of an absence and giving a quick impression of frequency, and it even clusters very well – better than the circle, which is why I used it for absences, as I believe they are more important than the tardies.

I’d disagree that you cannot compare items in a straight line; we do it all the time. I do agree that the comparison would be easier if they were stacked. However, with two rows of information per student, the sheer complexity would also make comparison very hard. So I compromised on the less optimal comparison strategy, but believe it works better in the context of the overall design. Again, this is possibly something I could have done better, and that’s why getting this feedback is so valuable.

…and you don’t like my colours?!? Haha, just kidding – you are right, that is pretty subjective.


By Michael Moretti. November 3rd, 2012 at 8:26 am

Great work by your contestants. I’m not only impressed, but come away with some new ideas for my own work.

With consideration of class size and administrative interest in performance metrics (school- or district-wide), I would add some filtering (a drop list) and dynamic text search. Also, I would include column sorting, where appropriate. Finally, some time-based data visualization would be very useful for trend analytics – motion scatterplots or tree-maps, for example.

Oh, and have to add, I love sparklines!

By Steve Wexler. November 5th, 2012 at 6:21 pm

I’ve composed a blog post with some thoughts on the competition. I love the idea, but think the data set (and the need to display everything) is flawed. I also think the two cited entries missed the most important story.

By Stephen Few. November 5th, 2012 at 9:38 pm


The purpose of this dashboard is not storytelling. It is used daily by a teacher to monitor her students’ performance. Storytelling and performance monitoring require different information and different designs. For performance monitoring, the status of each student should be visible every day.

The data set was constructed in part by people who are experts in the field of student performance analytics and monitoring. In their opinion and mine, all of the data that I provided is needed for effective performance monitoring. If you believe otherwise, please identify the items that you feel are unnecessary and explain why a teacher would not need them for the purpose of performance monitoring. It is entirely possible that the two winning entries could have included other information that might have been useful for performance monitoring. If so, please describe what they missed and if the experts agree that this information should be included on the dashboard, I’ll find a way to incorporate it in my version.

By Steve Wexler. November 6th, 2012 at 6:14 am


To me the biggest story is that 43% of Susan’s students are below goal and the course is 80% completed. Can one easily discern this from either dashboard?

Also, having thought about this some more, I think the dashboard would be considerably more useful for the school’s principal who might need to see what’s happening on a monthly basis. In that instance I can see including information that I don’t think the teacher needs.

By Stephen Few. November 9th, 2012 at 3:19 pm


The fact that many students are at a level of achievement that is below the targets that they set for themselves is actually quite easy to see in both dashboards by scanning the first column of graphics. The fact that this represents 43% is not obvious, but for performance monitoring purposes with a primary goal of helping individual students in need, a precise estimate isn’t necessary. This particular story is certainly meaningful and something that the teacher could use to motivate her students and to coach them in future classes during the goal-setting exercise, but the primary purpose of this dashboard is to inform action that benefits individual students who are in need, not to improve the overall percentage of achievement in relation to the goals that students determined for themselves.

By Steve Wexler. November 12th, 2012 at 4:33 am


If the primary purpose of this dashboard is to inform action that benefits students who are in need, not to improve the overall achievement in relation to the goals that students determined for themselves, why include a graphic showing the class's overall standardized test scores with respect to the other classes, the school, and the district? That is a graphic that stays the same day in and day out and has nothing to do with the individual student.

By Stephen Few. November 12th, 2012 at 12:04 pm


Seeing how the students in this class are doing in relation to other students is of secondary interest. Given the fact that the summary assessment graph does not change, however, you make a good point that it could be provided through a link for easy lookup when needed rather than included on the dashboard. Given that it does not rob space from or complicate the reading of the primary information, however, including it on the dashboard seems to work fine.

By MichaelS. November 21st, 2012 at 12:54 pm

Hi Stephen:

I wanted to ask if any of the 91 submitted entries to the contest were created using MicroStrategy? I was going to work on an example using MicroStrategy v9.3 (the latest version) and share it in this forum. I work with both Tableau and MicroStrategy. I feel that this is something Tableau should be able to handle fairly well; it is MicroStrategy that has my curiosity up for this challenge.

I also wanted to let you know I am a real fan of your work and am awaiting the release of the second edition of your Information Dashboard Design book.



By Stephen Few. November 21st, 2012 at 3:26 pm


I don’t know if any of the entries were created using software from MicroStrategy. No one who participated identified themselves as a MicroStrategy employee. It’s likely that at least one was created using MicroStrategy, but I don’t have a way of knowing. Perhaps someone who did so will read this and respond.

By MichaelS. November 28th, 2012 at 11:07 am

Thanks Steve.

I am working on recreating the winning entry in MicroStrategy. I will share how I did it (and examples) with your discussion group when I finish. My goal is to have it done by mid-December.

Once I get that one done, I plan to create it next in Tableau. I will share that one too. My goal is to have that one done by early February 2013.

Thanks for all you do for the data visualization community.


By Jacob. December 6th, 2012 at 1:06 pm

Beautiful work and thanks Steve for sharing. I think the more we empower learners to own and track their own progress, the more motivated they’ll be. Teacher dashboards are important, but they’re only one half of the equation.

Has anyone seen good *student-facing* dashboards?

By Stephen Few. December 7th, 2012 at 9:22 am


You’re absolutely right that students could benefit a great deal from performance monitoring dashboards of their own. Unfortunately, I haven’t seen any that are well designed. I designed something similar for an organization that provides Web-based self-paced courses for students. It was designed for students to manage their courses (select and schedule) and monitor their progress, but I can’t share it publicly.

By Barry. December 13th, 2012 at 4:34 am


I have read the discussion with great interest. Seeing the depth of ideas and ability here, I would be very keen to invite input on a small project.
Being involved with fundraising for a large schooling network, I think a dashboard form of reporting fundraising progress to the school community would be extremely powerful. Just as an example of the type of figures that would need presenting, let's say a school costs $3.7 million to run annually. Let's say it has been funded 38% by government funding, 16% by school fees, 34% by large corporate donations, and 12% by various fundraising drives.
Then further break the fundraising drives down into six categories, listing Category 1 at 35%, Category 2 at 34%, Category 3 at 17%, Category 4 at 11%, Categories 5 and 6, and so on.
I'd be interested in comments or ideas. Even some drafts, if anyone is willing to play around with it a bit. I just don't seem to be able to nail it.

By Dani Long. December 17th, 2012 at 11:27 pm

I’m writing as a teacher in a district that is heavily into data and “data boards”. The amount of staff development and team time spent on the formation and maintenance of these great, big tri-fold poster creations is disheartening to me, because all this labeling and sorting is what computers are best at. And in the end, the data boards sit in a storage area, hardly ever used.

A dashboard like this, which could import spreadsheets of data from standardized testing results, classroom grading software, and administrative reports, would be excellent, especially if it were easily updatable.

By RMax. January 14th, 2013 at 4:10 am

Hi, MichaelS, how is your progress on recreating the dashboards in MicroStrategy and Tableau? Have you got something to show?