Evaluating five years of system design and development - how hard could it be?

Authors: Alexander Savelyev*, Texas State University
Topics: Geographic Information Science and Systems, Hazards, Risks, and Disasters
Keywords: geovisual analytics, evaluation, user study
Session Type: Paper
Day: 4/11/2018
Start / End Time: 10:00 AM / 11:40 AM
Room: Southdown, Sheraton, 4th Floor

This study evaluates an advanced geovisual analytics tool built for geospatial social media data analysis. Instead of focusing on technical challenges or software implementation details, we emphasize the evaluation issues associated with testing large and complex (geo)visual analytics systems. We start by presenting a comprehensive synthesis of existing best practices in software evaluation, including high-level conceptualizations of similar user studies, common pitfalls encountered in comparable work, and relevant low-level study design decisions. We then describe the user study that we designed and administered following this synthetic methodology. Our study has two outcomes. First, we successfully evaluate SensePlace3, a web-based geovisual analytics system for geospatial social media data analysis, demonstrating a high level of participant performance and low-to-moderate workload in realistic analytical scenarios involving real data and events. Second, we reflect upon our methodology design and document its details, as well as our suggestions for its improvement, for those looking to design a study of a comparable kind.
