There’s a special assignment I designed for my visualization course that I have been using successfully for a number of years. I call them “mini-projects.” A mini-project is a project students develop over one or two weeks. It’s big enough to simulate the nontrivial decisions designers have to make in real projects, and small enough to be completed in a short amount of time. That is, it’s efficient and effective.
→ How do mini-projects work?
Students receive two main “ingredients”: a data set and a set of “data questions.” The second ingredient is crucial from a pedagogical standpoint. We do not ask students to “visualize data”; we ask them to answer questions using data visualization. There are two advantages to doing that: (1) the effectiveness of a solution can be verified more easily, and (2) the students’ results can be compared on equal ground. To understand why, it suffices to imagine what would happen if the assignment were vaguer: “You students, visualize these data.” How do you actually compare the results? How do you give feedback? How do students know how well they have done?
To give you a sense of what the questions look like, here is an example. Our mini-projects are all based on AidData, a dataset containing information about aid donations between countries over time.
How do the countries compare in terms of how much they receive and how much they donate? Are there countries that donate much more than they receive or receive much more than they donate?
Students are expected to create one visualization answering this set of questions. As you can imagine, since everything is based on questions, it is much easier to verify whether a solution is effective or not. Does the visualization allow you to answer the question? If not, you clearly have a problem. And, if the answer is yes, are there solutions that make producing the answer less effortful and lead to a lower level of uncertainty?
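To make this concrete, here is a minimal d3.js (v7) sketch of one way a student might answer this particular question: compute each country’s net balance (total donated minus total received) and show it as a diverging bar chart. This is only one of many possible designs, and the file name and column names (donor, recipient, commitment_amount) are placeholders I made up for illustration, not the actual AidData schema.

```js
// Sketch: diverging bar chart of net aid (donated minus received) per country.
// Assumes a CSV with columns: donor, recipient, commitment_amount (hypothetical names).
const width = 800, height = 600;
const margin = {top: 20, right: 40, bottom: 30, left: 120};

d3.csv("aiddata.csv", d3.autoType).then(rows => {
  // Total donated and received per country.
  const donated = d3.rollup(rows, v => d3.sum(v, d => d.commitment_amount), d => d.donor);
  const received = d3.rollup(rows, v => d3.sum(v, d => d.commitment_amount), d => d.recipient);

  // Net balance: positive = net donor, negative = net recipient.
  const countries = new Set([...donated.keys(), ...received.keys()]);
  const net = [...countries]
    .map(c => ({country: c, net: (donated.get(c) ?? 0) - (received.get(c) ?? 0)}))
    .sort((a, b) => d3.descending(a.net, b.net));

  // Make sure zero is inside the x domain so bars can diverge from it.
  const [min, max] = d3.extent(net, d => d.net);
  const x = d3.scaleLinear()
    .domain([Math.min(0, min), Math.max(0, max)]).nice()
    .range([margin.left, width - margin.right]);
  const y = d3.scaleBand()
    .domain(net.map(d => d.country))
    .range([margin.top, height - margin.bottom])
    .padding(0.1);

  const svg = d3.select("body").append("svg")
    .attr("width", width).attr("height", height);

  // Bars start at zero and extend right (net donors) or left (net recipients).
  svg.selectAll("rect")
    .data(net)
    .join("rect")
    .attr("x", d => x(Math.min(0, d.net)))
    .attr("y", d => y(d.country))
    .attr("width", d => Math.abs(x(d.net) - x(0)))
    .attr("height", y.bandwidth())
    .attr("fill", d => d.net >= 0 ? "steelblue" : "darkorange");

  svg.append("g")
    .attr("transform", `translate(0,${height - margin.bottom})`)
    .call(d3.axisBottom(x));
  svg.append("g")
    .attr("transform", `translate(${x(0)},0)`)
    .call(d3.axisLeft(y).tickSize(0));
});
```

Even a rough sketch like this makes the evaluation criteria tangible: can a reader scan it and immediately name the countries that donate much more than they receive, and vice versa?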
(On a side note: questions could be substituted with “communication intents.” I have never done that, but in place of questions, one could produce a set of communication intents such as: “I want the reader to be able to extract this set of facts” and then verify whether the readers do extract those facts.)
→ So, students receive the data set and a set of questions, and they are asked to submit their solutions. What happens next?
Once we receive all the submissions, we collect them in a shared slide deck and go through them one by one when we meet in class (or on Zoom if we meet remotely). This is another crucial ingredient of mini-projects: students are exposed to many different solutions and see our comments in the context of all of them. These sessions are normally very lively, and students ask lots of questions of the type: “How about this solution? How about that solution?” or “Why do you think this is a better solution?” or “What happens if we change it this or that way?” I guess you can easily imagine why this has a lot of value from a pedagogical standpoint. This image gives you a sense of what students produce when you ask them to solve this problem.
On a side note, by doing this type of collective critique over and over again, I discovered how incredibly limited data visualization theory is. It kind of crashes at every turn! But this is a topic for another day.
→ So, mini-projects are based on data+questions and are reviewed collectively in class. What else?
There is a third ingredient I use that I think is equally crucial: I allow students to resubmit their solutions and recover a large proportion of the points lost if they fix the original problems.
There are two reasons why this is an effective strategy. First, students feel less pressure to be extremely conservative in their first submission. Since they know they can always regain the lost points, they feel freer to experiment; they are more adventurous. Second, they learn a lot from redoing the same exercise.
As a side note, I have always found it surprising that assignments are graded and yet students are not expected to redo the work and fix its problems. It seems like such a missed opportunity! So, my students redo the work and have an opportunity to compare before and after. They can literally see the improvements with their own eyes! Interestingly, this also leads students to be more critical of my recommendations: many push back if they are not convinced that what I suggest makes sense, and sometimes for good reasons! If you ask students to do more work, they expect that additional work to be worth it.
There are many other aspects and subtleties of mini-projects I did not mention. But the gist of it is this:
Assign data + questions
Review students’ solutions together (collectively)
Allow students to redo their work
One interesting variation I have tried multiple times is to have students work on their solutions in two phases: one where they submit sketches (drawn by hand or with a digital drawing program) and one where they submit a coded version using d3.js. When doing that, we sometimes reviewed only the sketches together, or the sketches one week and the final solutions the following week. Over time, I came to the conclusion that it is a bit of overkill, but the idea can be adapted in many different ways.
If you want access to the instructions for our mini-projects, you can find them in this Google folder I created to share the methodology with everyone. The material is still a bit rough, but it is good enough to get started. My plan is to write a guide for instructors who want to use it in their own courses, and maybe a nice website. In the meantime, if you are interested in using it and want more guidance, feel free to contact me by commenting on this post. I would definitely like to see this adopted by many more people, and also to learn what works and what does not.
Let me know what you think!
Great idea and I really appreciate your work on setting this up and 'testing' it! I plan to use it in my undergrad Viz course this Fall. Should be a great learning tool!
Thanks for the post. When I was a TA for a programming lab back in the day, I did a variant of this: I asked students to resubmit their assignments for extra credit if they could think of a better way of solving the problem or if they wanted to try a different programming language. I also gave an optional, challenging addendum to the original problem, just to get more engagement from students who were more advanced with their programming.
I felt the students were more appreciative and engaged when they were incentivized to try different methods or experiment with different ideas.