Robinhood Design Evaluation: A comparison of data visualization evaluation methods
A review of three design evaluation methods used to assess the data dashboard of Robinhood, a micro-savings and investing app.
Project Overview
In this project, I conducted a review of three design evaluation methods used for evaluating data visualization and interface design. I applied these practices to assess the data dashboard of Robinhood, a micro-savings and investing app.
Robinhood uses a variety of information visualizations on its investing pages, making it an interesting subject for evaluation. I rated each identified flaw by severity to highlight urgency and proposed two fixes. Finally, I conducted a comparative analysis of the three evaluation methods themselves.
Background
Design evaluation is essential in user experience design and research. We conduct evaluations to identify design and usability flaws so they can be fixed. Heuristic methods in particular are "discount" techniques: cheap and rapid. However, not every method effectively identifies issues in every domain.
In information visualization, evaluation methods often rely on theories that predate modern mobile devices. On mobile phones, traditional evaluation methods sometimes fail to address interaction flaws unique to the form factor. Evaluating mobile investing dashboards requires looking at how they employ both InfoVis and mobile app features.
A Review of Evaluation Methods
Usability and design evaluation has become a standard practice for professionals. While beneficial, it can be detrimental if not implemented in the right context (Greenberg & Buxton, 2008). Popular heuristic sets like Nielsen's or Shneiderman's offer benchmarks, but they face challenges around heuristic-product fit, context of use, and evaluator bias.
Strengths
- Cheap and easy to implement with limited resources.
- Can be implemented alongside empirical usability testing.
- Rapid format allows for early issue identification.
Weaknesses
- Potential mismatch between heuristics and the specific product.
- Often fails to address the context of use.
- Evaluator bias and background can skew results.
Methodology
Evaluation Goal
Evaluate the Robinhood data dashboard (Homepage & Stock Detail) using three popular evaluation methods.
Strategy
Select three notable heuristic sets (Forsell & Johansson; Shneiderman; Cuttone et al.) and focus on the strengths and weaknesses of each method rather than just the evaluation itself.
Scenario & Task
Scenario: A user checks their phone to see overall daily gains. They then want to drill down into a specific stock.
Task: View homepage, check portfolio trends, review stock details, and check performance over time.
Selected Methods
1. Ten Information Visualization Heuristics (Forsell & Johansson)
A broad set of high-level heuristics covering usability issues like:
- Information coding
- Consistency
- Minimal actions
- Recognition rather than recall
- Flexibility
- Prompting
- Orientation and help
- Remove the extraneous
- Spatial organization
- Data set reduction
2. Four Data Visualization Heuristics for Personal Informatics (Cuttone et al.)
Focused on mobile and wearable devices:
- Make Data Interpretable at a Glance
- Enable Exploration of Patterns in Time Series Data
- Enable Discovery of Trends in Multiple Data Streams
- Turn Key Metrics into Affordances for Action
3. Shneiderman's “Visual Information-Seeking Mantra”
The foundational guideline for screen-based visualization (the three-part mantra plus two of Shneiderman's accompanying tasks):
- Overview first
- Zoom and filter
- Details on demand
- Relate
- Extract
Key Findings & Recommendations
Inability to check stock trends beyond 5 years
Violation: Zoom and Filter
Robinhood simplifies trading for novices but restricts historical context. The time-series filter is capped at 5 years, limiting insights into long-term performance.
Recommendation
Add a '10 Year' or 'All-Time' filter to provide necessary historical context.
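To make the recommendation concrete, here is a minimal sketch of how a time-range filter could map each option to a chart window. The option labels and the `window_start` helper are hypothetical assumptions for illustration, not Robinhood's actual implementation.

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical time-range options, including the proposed '10Y' and 'ALL'.
RANGES = {
    "1W": timedelta(weeks=1),
    "1M": timedelta(days=30),
    "1Y": timedelta(days=365),
    "5Y": timedelta(days=5 * 365),
    "10Y": timedelta(days=10 * 365),  # proposed addition
    "ALL": None,                      # proposed addition: no lower bound
}

def window_start(option: str, today: date) -> Optional[date]:
    """Return the earliest date the chart should show, or None for full history."""
    delta = RANGES[option]
    return None if delta is None else today - delta
```

An 'ALL' option simply removes the lower bound, so the chart falls back to the stock's full trading history.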
Limited Data Exploration
Violation: Enable Exploration of Patterns in Time Series Data
Limiting data exploration prevents users from understanding the full "story" of a stock's past performance.
Recommendation
As with the previous finding, adding a '10 Year' or 'All-Time' filter would let users explore longer-term patterns and the fuller story of a stock's performance.
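The severity-based triage used in this evaluation can be sketched as a simple structure pairing each violated heuristic with a Nielsen-style 0-4 severity rating. The `Finding` class and the ratings shown are illustrative assumptions, not values from the actual evaluation.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str  # which heuristic is violated
    screen: str     # where the issue was observed
    severity: int   # Nielsen-style scale: 0 (not a problem) to 4 (catastrophe)

findings = [
    Finding("Zoom and filter", "Stock Detail", 3),
    Finding("Enable Exploration of Patterns in Time Series Data", "Stock Detail", 2),
]

# Sort the severest issues to the top so fixes can be prioritized.
ranked = sorted(findings, key=lambda f: f.severity, reverse=True)
for f in ranked:
    print(f"[sev {f.severity}] {f.heuristic} ({f.screen})")
```

Ranking findings this way makes the "level of severity" framing from the project overview operational: the most urgent flaws surface first.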
Data Visualization Through the 3 Waves of HCI
Evaluating data visualization through the lens of HCI history reveals how our approaches to user interaction have evolved.
First Wave (1970s–80s)
Engineering & Human Factors
Background: Influenced by engineering and ergonomics. Focused on "man-machine coupling" and strict guidelines for human information processing.
Evaluation: Quantitative metrics (time-on-task, GOMS). Evaluating Robinhood here would compare task completion times vs. desktop. We'd use the "Gulf of Execution" to see if users understand how to act on the visualization.
Limitation: Explains what happens, but not why. Ignores the user's experience level or context.
Second Wave (Mid 1980s)
Cognitive Science & Context
Background: The pivotal era. Influenced by psychology and the rise of personal computing. Focused on user mental models and context.
Evaluation: Heuristic Evaluation (Nielsen) emerged here. We would interview users to understand their mental models and observe them in natural settings (Situated Cognition).
Limitation: Heuristic evaluation is often done by experts, not real users, missing the actual "messy" context of use.
Third Wave (2000s–Present)
Culture, Ethics & Values
Background: Influenced by critical theory. Focuses on culture, ethics, and human values.
Evaluation: Asking critical questions: Does this design empower the user? Is it ethical? Does gamification manipulate behavior? Methods include Participatory Design and Value Sensitive Design.
Limitation: Tackles complex social issues that are hard to quantify and often neglected by profit-driven organizations.
Conclusion
The three methods used in this evaluation successfully revealed both the design and usability successes and the flaws of the Robinhood app. Given the features covered by the task and scenario, Shneiderman's “Visual Information-Seeking Mantra” and Cuttone et al.'s Four Data Visualization Heuristics best identified the most severe issues. Shneiderman's mantra, which predates the other two methods and is arguably the most fundamental data visualization evaluation guideline, was also the most convenient to apply.
After the evaluation, I view Forsell & Johansson's ten heuristics more as a general product interface evaluation method than a data-visualization-specific one. Overall, these three methods are an excellent combination for evaluating a fintech product like Robinhood.
References
- Cuttone, A., Petersen, M. K., & Larsen, J. E. (2014). Four data visualization heuristics to facilitate reflection in personal informatics. International Conference on Universal Access in Human-Computer Interaction, 541–552.
- Forsell, C., & Johansson, J. (2010). An heuristic set for evaluation in information visualization. Proceedings of the International Conference on Advanced Visual Interfaces, 199–206.
- Greenberg, S., & Buxton, B. (2008). Usability evaluation considered harmful (some of the time). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 111–120.
- Luzzardi, P., Freitas, C., Cava, R., Duarte, G., & Vasconcelos, M. (2004). An extended set of ergonomic criteria for information visualization techniques. Proceedings of the Seventh IASTED International Conference on Computer Graphics and Imaging (CGIM 2004), Kauai, 236–241.
- Nielsen, J. (2005). Ten usability heuristics. Nielsen Norman Group. https://www.nngroup.com/articles/ten-usability-heuristics/
- North, C., Shneiderman, B., & Plaisant, C. (1997). Visual Information Seeking in Digital Image Libraries: The Visible Human Explorer. Information in Images.
- Scapin, D. L., & Bastien, J. C. (1997). Ergonomic criteria for evaluating the ergonomic quality of interactive systems. Behaviour & Information Technology, 16(4–5), 220–231.
- Webel, B. (2020). Fintech: Overview of innovative financial technology and selected policy issues (CRS Report). Washington, DC: Congressional Research Service. https://purl.fdlp.gov/GPO/gpo141846