Agile UX Research for Designing ‘Everyday Analytics’

February 6, 2013

A screenshot of a heatmap display.

A rollover heatmap allows therapists to quickly see where patients are experiencing challenges.

Electronic record systems are being adopted in the mental health field, but how big data will empower therapists and their patients is still an open question. More and bigger data aren’t helpful until we can render meaning from them. To make the data meaningful, we need “everyday analytics”: ways of making sense of new streams of data that don’t require us to know a big data analyst or to become one ourselves. “Data Visualization for Psychotherapy Progress Tracking,” published in the SIGDOC ’12 proceedings, chronicles how a design team I was part of approached the challenge of rendering new analytical tools for everyday use in evidence-based psychotherapy.

A snapshot of a single-client dashboard showing a patient’s progress.

Our dashboards are designed to reduce interpretation time while also reducing interpretation errors.

Led by psychologist and product owner Kelly Koerner of the Evidence Based Practice Institute, our team of one visual designer (Heidi Connor), two programmers (Rohit Sharma & Mike Lipp), and me used a process of continuous iteration with stakeholders to design several analytics dashboards for therapists. As the user experience researcher, I designed and led the product evaluations and interviews. Our product owner and our designer sat in on the evaluations, which turned out to be efficient for team learning. Sometimes Connor would have a new design element sketched by the end of an evaluation. Instead of batch testing our designs, we scheduled evaluations every few days, and I made sure the full team got nearly instant feedback on each one. In this way, learning across the team happened continuously: no big reports to read, just short emails of key findings to breeze through about twice a week. In a domain-heavy design context, where there are many details about the user experience to recall, this approach worked well.

The project was mainly a pragmatic redesign of current evidence-based practices for the era of electronic, networked records. Many therapists already track patient progress, and those who do tend to have better patient outcomes. Our goal was to simplify and speed up the analysis of patient trends without compromising accuracy of interpretation. The key to accomplishing this goal was to build the design around therapists’ analytical tasks and their workflow. Among the constraints we had to design for: most therapists aren’t experts at interpreting trend data, and they have only a minute or so to review a patient’s progress between sessions. Our biggest challenge in visually displaying progress trends is that therapists need to compare several trends at once. These data come from multiple psychological tests, but the tests can’t be normalized into a single view; those who designed each test didn’t foresee a day when their scores might be rendered into one visual display. Considering how our user research panned out, it seems worth the effort for psychological statisticians to rework the math behind the most commonly used metrics so they could be visualized together. That would unleash the full value of a visual display, reducing interpretation time as well as interpretation errors.
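To give a feel for what a common scale might look like, here is a minimal sketch of one textbook approach: converting each test’s raw scores to z-scores against that test’s norms. The test names and norm values below are entirely made up for illustration, and, as noted above, real instruments would need statistical rework before this kind of standardization could be trusted; this only shows the shape of the goal.

```python
# Hypothetical sketch: putting scores from different psychological tests
# on one axis by standardizing each against its own (made-up) norms.
# z = (raw_score - mean) / sd

NORMS = {
    "test_a": {"mean": 10.0, "sd": 4.0},   # hypothetical norms
    "test_b": {"mean": 50.0, "sd": 10.0},  # hypothetical norms
}

def to_z(test: str, raw_score: float) -> float:
    """Convert a raw score to a z-score using the test's norms."""
    n = NORMS[test]
    return (raw_score - n["mean"]) / n["sd"]

# A session-by-session series from each test; after standardization,
# both series sit on one comparable axis for a single trend display.
test_a_scores = [18, 16, 12, 10]
test_b_scores = [70, 65, 58, 52]
z_a = [to_z("test_a", s) for s in test_a_scores]
z_b = [to_z("test_b", s) for s in test_b_scores]
```

With both series expressed in standard-deviation units, a dashboard could overlay them on one chart; the design question the paper raises is whether such a transformation is psychometrically defensible for the metrics therapists actually use.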
A Multi-Client Dashboard for Patients’ Progress

Our Multi-Client Dashboard imagined novel uses for data that hadn’t been possible before.

Many of the projects I’ve worked on have involved summative evaluation: identifying what worked and what didn’t, past tense. It’s important work, but on a personal level it can feel like arriving at the party after most of the guests have gone home. I found the formative evaluation process used on this project, where challenges are identified and accommodated as part of the design process, very satisfying. I enjoyed the team dynamic and will look for other projects where a continuous development strategy makes sense. Designing a visual analytics tool works every part of the noggin; it’s like doing a 3D crossword puzzle. That was fun too. More details of this design adventure are available in the paper. If you’re interested in reading it but don’t have access to the ACM library, ping me and I’ll send you a copy.


Methods Madness, Projects, Social Impact, Systems, User Experience