TY - GEN
T1 - Evaluating visual analytics systems for investigative analysis
T2 - VAST 09 - IEEE Symposium on Visual Analytics Science and Technology
AU - Kang, Youn Ah
AU - Görg, Carsten
AU - Stasko, John
PY - 2009
Y1 - 2009
AB - Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and we compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations for metrics and techniques for evaluating other visual analytics investigative analysis tools.
UR - http://www.scopus.com/inward/record.url?scp=72849135069&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=72849135069&partnerID=8YFLogxK
U2 - 10.1109/VAST.2009.5333878
DO - 10.1109/VAST.2009.5333878
M3 - Conference contribution
AN - SCOPUS:72849135069
SN - 9781424452835
T3 - VAST 09 - IEEE Symposium on Visual Analytics Science and Technology, Proceedings
SP - 139
EP - 146
BT - VAST 09 - IEEE Symposium on Visual Analytics Science and Technology, Proceedings
Y2 - 12 October 2009 through 13 October 2009
ER -