As the use of A/B testing and other analysis methods grows, the need for effective metrics becomes more and more important. After all, you don't want to get stuck measuring the wrong thing, or being flooded with too much information that could lead you to paralysis.
Traffic metrics like page views or unique users are a good baseline for websites; however, they often don't help as much when evaluating specific interfaces.
The UX Researchers at Google (found via Kerry Rodden) have come up with a simple framework that adds some rigour to evaluating user experience changes, measuring both the quality of the experience and its effectiveness against overall goals.
The framework is divided into two areas:
- The quality of the user experience (the HEART framework)
- The goals of your product or project (the Goals-Signals-Metrics process)
Let's take a look.
The HEART Framework
When defining UX metrics for various product teams, the team at Google noticed they tended to fall into five specific categories, which became the HEART framework. These are as follows (note that many of the examples are drawn from standard Google products):
Happiness
Measures of user attitudes, often collected via survey.
Example: satisfaction, perceived ease of use, and Net Promoter Score.
Engagement
Level of user involvement, typically measured via behavioral proxies such as frequency, intensity, or depth of interaction over some time period.
Example: the number of visits per user per week or the number of photos uploaded per user per day.
Adoption
New users of a product or feature.
Example: the number of accounts created in the last seven days or the percentage of users who use labels.
Retention
The rate at which existing users are returning.
Example: how many of the active users from a given time period are still present in some later time period? (e.g. failure to retain, or “churn”).
Task success
Traditional behavioral metrics of user experience, such as efficiency (e.g. time to complete a task), effectiveness (e.g. percentage of tasks completed), and error rate. This category is most applicable to areas of your product that are very task-focused, such as search or an upload flow.
Example: how many people signed up for a newsletter.
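The Task success metrics above (efficiency, effectiveness, error rate) are straightforward to compute once you log task attempts. Here is a minimal sketch, assuming hypothetical task-log records of the form (user, completed, seconds taken, errors) - the data and field names are illustrative, not from any real analytics API:

```python
# Hypothetical task-attempt log: (user, completed, seconds_taken, errors)
tasks = [
    ("u1", True, 42, 0),
    ("u2", True, 75, 1),
    ("u3", False, 120, 3),
    ("u4", True, 38, 0),
]

completed = [t for t in tasks if t[1]]

# Effectiveness: what fraction of attempts were completed?
completion_rate = len(completed) / len(tasks)

# Efficiency: average time to complete (successful attempts only).
avg_time = sum(t[2] for t in completed) / len(completed)

# Error rate: average number of errors per attempt.
error_rate = sum(t[3] for t in tasks) / len(tasks)

print(f"completion rate: {completion_rate:.0%}")   # 75%
print(f"avg time to complete: {avg_time:.1f}s")
print(f"errors per attempt: {error_rate:.2f}")
```

In practice the raw events would come from your analytics pipeline, but the aggregation is the same.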
As an example, you could just measure unique users over a time period when analysing an interface; however, measuring Adoption and Retention lets you distinguish new from returning users - you can tell how quickly the user base is growing, and whether it is stabilising.
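To make that new-versus-returning split concrete, here is a hedged sketch using two sets of weekly active user IDs (the data is made up; a real implementation would also need a record of each user's first-seen date to separate truly new users from lapsed ones returning):

```python
# Active user IDs for two consecutive weeks (illustrative data).
week_1 = {"ana", "ben", "cho", "dev"}
week_2 = {"ben", "cho", "eli", "fay", "gus"}

adopted = week_2 - week_1    # Adoption: active this week, not last week
retained = week_1 & week_2   # Retention: active in both weeks
churned = week_1 - week_2    # failed to retain ("churn")

retention_rate = len(retained) / len(week_1)
print(f"adopted: {sorted(adopted)}")
print(f"retained: {sorted(retained)}, rate: {retention_rate:.0%}")
print(f"churned: {sorted(churned)}")
```

Tracked week over week, the retention rate tells you whether the existing user base is stabilising while Adoption tells you how fast it is growing.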
With the HEART framework, you don't need to apply all metrics to each test or analysis, and each item can be applied at a number of levels. Pick the ones that are most important for your particular project.
The Goals-Signals-Metrics Process
Now that you have defined the categories you need from the HEART framework, you will need specific metrics you can implement and track. These will likely differ from project to project, so they need to be defined using the Goals-Signals-Metrics process.
Start at the highest level by identifying your goals for each category. There may be multiple levels here - an interface may have different goals for different features.
An example given is YouTube. At the highest level, the most important goal is engagement - Google wants users to consume videos, and to keep discovering more videos and channels. However, YouTube Search has a feature goal of task success - when users enter a search, Google wants them to quickly find the most relevant video.
Also, try not to define things in terms of existing metrics. A common pitfall the Google team encounters is people defining increased traffic as a goal - we all want increased traffic, but how will user-experience improvements help achieve it?
Next to each goal, map out the lower-level Signals - a representation of how success or failure may manifest itself in user behaviour or attitudes.
Going back to the YouTube example, engagement may have a success Signal of time spent watching videos - higher is better, as it means users are not clicking a video and immediately leaving. For YouTube Search's task-success goal, a failure Signal may be searching but not clicking on any results.
This step may be hard as there could be a heap of useful signals for each goal. This may require research, benchmarking, or existing data analysis to determine your best predictors.
After defining Goals and Signals, it is time to define the exact Metrics for each. These are the items you will track over time for comparison in your A/B tests.
In our YouTube example, the engagement Goal, with its time-spent-watching-videos Signal, may end up with a Metric of 'the average number of minutes spent watching videos per user per day'.
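As a minimal sketch of how that Metric might be computed - assuming made-up (user, day, minutes watched) events rather than any real YouTube data - you would sum minutes per user per day and then average across those user-days:

```python
from collections import defaultdict

# Hypothetical watch events: (user, day, minutes_watched).
events = [
    ("u1", "2024-05-01", 12), ("u1", "2024-05-01", 8),
    ("u2", "2024-05-01", 30),
    ("u1", "2024-05-02", 5),  ("u3", "2024-05-02", 45),
]

# Sum minutes for each (user, day) pair...
per_user_day = defaultdict(int)
for user, day, minutes in events:
    per_user_day[(user, day)] += minutes

# ...then average across all user-days.
metric = sum(per_user_day.values()) / len(per_user_day)
print(f"avg minutes watched per user per day: {metric:.1f}")  # 25.0
```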
Where possible, express your Metrics as averages or percentages rather than raw counts.
One last tip from Google once you have crafted your framework: try to track only metrics related to your top goals. It may be tempting to add "interesting stats"; however, ask yourself whether these will help your ultimate decision-making, and whether they really need to be tracked over time (or whether a single snapshot will suffice).
Try to avoid extra effort and dashboard clutter wherever possible.
This post continues my series on mental models.