When creating a visualization to understand and communicate data, we face numerous design choices. Although past empirical research provides foundational knowledge for visualization design, practitioners still rely on hunches to navigate intricate trade-offs in the wild. Meanwhile, researchers lack the time and resources to rigorously explore the growing design space through controlled experiments. In this work, we address this two-fold problem by crowdsourcing visualization experiments. We developed VisLab, an online platform on which anyone can design and deploy experiments to evaluate their visualizations. To reduce the complexity of experiment design and analysis, the platform provides scaffold templates and analytic dashboards. To motivate broad participation in the experiments, the platform enables anonymous participation and provides personalized performance feedback. We present use-case scenarios that demonstrate the usability and usefulness of the platform in addressing the distinct needs of practitioners, researchers, and educators.
Title of host publication: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (CHI EA 2021)
Publisher: Association for Computing Machinery
Publication status: Published - 8 May 2021
Series: Conference on Human Factors in Computing Systems - Proceedings
Event: 2021 CHI Conference on Human Factors in Computing Systems: Making Waves, Combining Strengths (CHI EA 2021) - Virtual, Online, Japan
Duration: 8 May 2021 → 13 May 2021
Bibliographical note: Publisher Copyright © 2021 ACM.
All Science Journal Classification (ASJC) codes
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design