I am deeply interested in how designers work, both individually and in teams. Typically, researchers who study these aspects observe designers by recording and analyzing sequences of events. An “event” can be as specific as “picking up a pencil” or as general as “evaluating a design”, depending on the research question being addressed. These events are logged with the help of video and transcript information, and there are several tools to aid this process of qualitative data analysis.
Apart from video and transcripts, we now have more ways of collecting information about design activity. Since designers now sketch on tablets, we can extract logs of sketching activity. Using wearable sensors, we can get a sense of physical movement around the room, or of instances of speech and silence. To integrate data collected through such means, we need a tool that is not only flexible enough to accommodate different kinds of timeline and text data, but also extensible to newer forms of data that we have not yet anticipated.
The solution I developed was a framework called VizScribe. It runs in the browser, and provides an integrated, interactive view of designers' activity from multiple perspectives.
Designing this interface involved decisions about both the overall layout and the choice of individual visualizations. My initial focus was on visualizing the text information in the transcript:
My first mockup of the interface included a word cloud, adapted from Jason Davies' word cloud generator. I made my version of the word cloud interactive, so hovering over or clicking on any word in the cloud would show a word distribution timeline: an overview of that word's occurrences over the video timeline. Finally, I used Mike Bostock's collapsible indented tree to create a mockup of user-defined code hierarchies.
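The data preparation behind such an interactive word cloud can be sketched roughly as follows: count word frequencies across transcript segments, then, for a selected word, bin its occurrences over the session timeline. This is an illustrative sketch, not VizScribe's actual code; the function names, segment fields, and stopword list are my own assumptions.

```javascript
// Illustrative sketch of word-cloud data prep (not VizScribe's actual code).
// A transcript is assumed to be an array of { time, text } segments,
// with `time` in seconds from the start of the session.

const STOPWORDS = new Set(["the", "a", "and", "to", "of", "in", "is", "it"]);

// Count how often each non-stopword appears across all segments.
function wordFrequencies(segments) {
  const counts = new Map();
  for (const seg of segments) {
    for (const raw of seg.text.toLowerCase().split(/\W+/)) {
      if (raw && !STOPWORDS.has(raw)) {
        counts.set(raw, (counts.get(raw) || 0) + 1);
      }
    }
  }
  return counts;
}

// For one word, return occurrence counts per timeline bin: the
// "word distribution timeline" shown on hover or click.
function wordTimeline(segments, word, sessionLength, numBins) {
  const bins = new Array(numBins).fill(0);
  for (const seg of segments) {
    const hits = seg.text.toLowerCase().split(/\W+/)
      .filter(w => w === word).length;
    const bin = Math.min(
      numBins - 1,
      Math.floor((seg.time / sessionLength) * numBins));
    bins[bin] += hits;
  }
  return bins;
}
```

The frequency map would drive word sizing in the cloud, while the binned counts would drive the small timeline rendered when a word is selected.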
The next iteration followed the “overview + detail” visualization principle. Scrolling through the transcript highlights your position in the transcript overview, and vice versa. Both the transcript and timeline views can be used to seek the video or scroll the transcript to specific timestamps. The layout also now featured a distinct “timeline view” section under the video to show events such as sketch logs and coded segments.
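The linking between these views boils down to two small mappings: a click on the timeline must translate to a timestamp, and a timestamp must translate to the transcript segment to scroll to. A minimal sketch, assuming segments sorted by start time (the function names are illustrative, not VizScribe's API):

```javascript
// Illustrative sketch of linked-view logic (not VizScribe's actual code).

// Binary search for the index of the last segment starting at or
// before time t, i.e. the segment the playhead is currently inside.
function segmentAtTime(segments, t) {
  let lo = 0, hi = segments.length - 1, ans = 0;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (segments[mid].time <= t) { ans = mid; lo = mid + 1; }
    else { hi = mid - 1; }
  }
  return ans;
}

// Proportionally map a click at x pixels on a timeline of `width`
// pixels, representing `duration` seconds, to a video timestamp.
function timelineToSeconds(x, width, duration) {
  return (x / width) * duration;
}
```

Wiring these together, a timeline click would call `timelineToSeconds`, seek the video, and then use `segmentAtTime` to scroll the transcript to the matching row; the reverse direction (transcript scroll updating the overview) is the same mapping run backwards.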
The final version of the design, shown earlier on this page, further separated the timeline visualizations by speaker, and color-coded the data to identify speakers across speech and activity timelines. The image below shows a detail of the timeline view.
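Keeping speaker colors consistent across the speech and activity timelines amounts to assigning each speaker a fixed color the first time they appear, and reusing it in every view. A minimal sketch of this, with an assumed palette and function names of my own choosing:

```javascript
// Illustrative sketch of consistent speaker color-coding
// (not VizScribe's actual code). Palette is a placeholder.
const PALETTE = ["#1b9e77", "#d95f02", "#7570b3", "#e7298a"];

// Returns a lookup function that assigns each new speaker the next
// palette color and always returns the same color for the same speaker.
function makeSpeakerColors() {
  const assigned = new Map();
  return function colorFor(speaker) {
    if (!assigned.has(speaker)) {
      assigned.set(speaker, PALETTE[assigned.size % PALETTE.length]);
    }
    return assigned.get(speaker);
  };
}
```

Every timeline row for a given speaker would then call the same `colorFor` lookup, so speech segments and activity marks stay visually linked.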
The video below demonstrates interactions in VizScribe, and how it is used to analyze a design session:
International Journal of Human-Computer Studies, 100, pp.66–80, 2017.
PDF | VIDEO | GITHUB | WIKI