Quantitative vs. Qualitative
The best user research blends both quantitative and qualitative data. Quantitative data can tell you WHAT happened, while qualitative research helps answer WHY something happened.
Quantitative data is based on metrics, stats, and analytics. This could be anything from information about the people who visit your site (who they are and how many there are) to scores and ratings. Quantitative data is a record of what is actually going on behind the scenes, in numbers. You can also pull from quantitative data to validate or refute your assumptions.
Clients and stakeholders typically respond best to numbers. Being able to draw from quantitative data can be highly valuable if you need to get them on board to take a project in a certain direction.
Qualitative research is a key driver in user experience research. It involves observing users in their natural environments to better understand their motivations, feelings, and behaviors. It tells you what users actually do, rather than what they say they do. Usability tests, where you observe users interacting with a product, are also highly informative and can unlock a lot of qualitative insights.
Whenever possible, you want to triangulate your approach through mixed methodologies, implementing various types of research and collecting different kinds of data, in order to get a full picture of the user experience.
Look beyond vanity metrics
The book Lean Analytics: Use Data to Build a Better Startup Faster by Alistair Croll and Benjamin Yoskovitz defines a vanity metric as a "piece of data on which you cannot act." In other words, vanity metrics present information that is not very useful because it does not inform the changes you should make to improve your design.
Vanity metrics can seem like a good thing at first. Knowing how many users you have on your platform can make you feel good. But if they sign up and never touch it again, that's not an ideal scenario. UX designers and product managers need to be more concerned with active users. By learning about the behaviors and actions of active users, you can then use that information to better inform your approach to other parts of a product or site.
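As a minimal sketch of the signup-vs-active distinction (the event log and the 30-day activity window here are illustrative assumptions, not a standard), you might compare total signed-up users against those recently active:

```python
from datetime import date, timedelta

# Hypothetical record: user id -> date of last activity
last_active = {
    "u1": date(2024, 5, 1),
    "u2": date(2023, 11, 20),  # signed up, then never returned
    "u3": date(2024, 5, 10),
}

def active_share(last_active, today, window_days=30):
    """Fraction of signed-up users active within the window."""
    cutoff = today - timedelta(days=window_days)
    active = sum(1 for d in last_active.values() if d >= cutoff)
    return active / len(last_active)

# 2 of the 3 signed-up users were active in the last 30 days
print(active_share(last_active, today=date(2024, 5, 15)))
```

The raw signup count (3) looks fine either way; the active share is what tells you whether the product is actually being used.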
You'll likely hear a lot of businesses bragging about numbers which are actually vanity metrics. The following are some vanity metrics you should be aware of:
Number of hits to a webpage (you'd rather know how many distinct users those hits represent)
Number of page views (page views are not directly linked to business models)
Number of visits to a site (it matters whether those visits come from one visitor or many)
Number of unique visitors (you don't know why they came or left)
Time spent on site or pages viewed (you want to know how it connects to user engagement)
How many followers you have, or how many people are on your email list (you'd rather know how many people take action)
Number of downloads (you want to know how many people are active and opened the app more than once; note, however: the number of downloads can help you move up in ranking in the app store)
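To make the downloads example concrete, a hedged sketch (the open-event log is a hypothetical stand-in for whatever your analytics tool records) contrasting raw downloads with the more actionable count of users who opened the app more than once:

```python
from collections import Counter

# Hypothetical app-open events: one user id per open
opens = ["u1", "u2", "u1", "u3", "u1", "u2"]
downloads = 10  # total installs, including users who never opened the app

open_counts = Counter(opens)
repeat_users = sum(1 for c in open_counts.values() if c > 1)

print(f"{downloads} downloads, but only {repeat_users} users opened the app more than once")
```

Ten downloads sounds good in a pitch, but two repeat users is the number that tells you whether the product is working.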
Look for measurable behaviors
Metrics provide a feedback loop to help you better design for users of your product. Every user behavior is measurable in some way, and you need to keep that in mind when you design. One question designers often ask is, how do we know a user has successfully completed a task?
In e-commerce, landing on a confirmation or thank-you page signals that a purchase has been completed. It's a simple "flag" or indicator that someone made a purchase and took the step beyond just putting items in their cart.
In online learning, there is an exercise or quiz at the end of each section. These exercises not only solidify the student's comprehension but also mark task completion.
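The "flag" idea above can be sketched in a few lines. This is a toy illustration (the event name, user ids, and in-memory list are all assumptions; in practice events would go to an analytics backend):

```python
completed_events = []  # stand-in for an analytics backend

def track(event, user_id):
    """Record a behavioral event (event names here are illustrative)."""
    completed_events.append({"event": event, "user": user_id})

def show_confirmation_page(user_id, order_id):
    # Reaching the confirmation page IS the completion signal,
    # so this is where the purchase-completed flag gets recorded.
    track("purchase_completed", user_id)
    return f"Thank you! Order {order_id} confirmed."

show_confirmation_page("u42", "A-1001")
print(len([e for e in completed_events if e["event"] == "purchase_completed"]))
```

The design choice worth noting: the event fires from the page users only reach after paying, so the count reflects completed tasks rather than intent.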
You can use behavior-specific data to better understand who your users are and what works for them. The data can help show which parts of the experience can be improved, and you can take what you learn from one aspect of a product and apply it to another.
The best metrics are those that are:
Clear (you know exactly what you're looking at)
Specific (well defined so you're not looking at too much at one time)
Actionable (tells you something about user behavior, so you can make improvements)
Comparable (across time periods or groups of users)
Behavior modifying (this is the hardest to design for, but it means that users are making it through each step in the experience flow)
The Nielsen Norman Group defines conversion rate as "the percentage of users who take a desired action." Conversion rates help measure what happens once users are on your site. Marketing and advertising may be what gets users to the site, but as a designer, you want to consider how to use design to encourage users to become members, to make a purchase, or to perform whatever action you desire of them—in other words, to convert them. How can you design an experience so enticing that first-time visitors don't think twice before taking action?
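Following the Nielsen Norman Group definition above, the arithmetic is simple division; the visitor and conversion numbers below are illustrative:

```python
def conversion_rate(converted, visitors):
    """Percentage of visitors who took the desired action."""
    return 100 * converted / visitors

# Illustrative numbers: 2,000 visitors, 50 completed a signup
print(conversion_rate(50, 2000))  # 2.5 (percent)
```

The hard part isn't the formula; it's defining which "desired action" to count, which is a design decision.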
Retention is also essential to the success of a product. Rather than looking at whether someone downloaded your product, retention considers users who not only tried it once but liked it enough to keep coming back. Facebook VP of Product Design Julie Zhuo describes retention as essential for product-market fit.
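One common way to measure retention is by cohort: take everyone who signed up in a given week and ask what share of them is still active in each following week. A hedged sketch with made-up data (the week-based cohort and user sets are assumptions for illustration):

```python
# Hypothetical cohort: users who signed up in week 0,
# and the set of those users active in each later week
cohort = {"u1", "u2", "u3", "u4"}
active_by_week = {
    1: {"u1", "u2", "u3"},
    2: {"u1", "u3"},
    3: {"u1"},
}

def retention(cohort, active_by_week, week):
    """Share of the original cohort still active in a given week."""
    return len(cohort & active_by_week[week]) / len(cohort)

for week in active_by_week:
    print(f"week {week}: {retention(cohort, active_by_week, week):.0%}")
```

A curve that flattens above zero suggests a core of users who keep coming back; one that decays to zero suggests downloads were a vanity metric.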
Think critically about your tools and how you collect metrics
There are countless tools that can be used when trying to understand and design experiences. Many in the industry approach surveys, focus groups, and NPS scores with skepticism and therefore ignore them. As with any tool, there can be a time and place for each, but it's also important to recognize their limitations. Surveys, focus groups, and NPS scores can bridge the quantitative and qualitative sides of metrics.
Surveys are a good way to reach a wider audience, but the challenge is writing questions that will get the necessary information without bias or speculation. Framing the right questions is an art, and should include discussion between team members. Erika Hall warns that surveys are easy to run, and that's why they are so dangerous. If you are going to use them, check out this article from Sarah Doody on how to rethink the questions you ask, and Chris Thelwell's introduction to the Lean Survey Canvas.
Focus groups pull a group of people together and ask for their thoughts on a topic. They became a popular tool in the field of market research but, like surveys, can be difficult to do well. Oftentimes members are influenced by others in the group and don't offer their own opinions. In that case, you're not going to get honest feedback. It's also more difficult to observe how customers interact with your product in a contextual setting when they are in a group. You're more likely to get accurate feedback through ethnographic studies or field research.
NPS, or Net Promoter Score, is a measure that was created to help gauge customer satisfaction. It's also a number that UX expert Jared Spool is highly skeptical of, arguing that just because a customer says they would recommend a product or service doesn't mean they actually will. It is more important to focus on actual user behavior than on hypothetical situations. (You can also hear Jared Spool talk about NPS on the UX Podcast.)
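For context on what the number actually is: NPS is computed from a 0–10 "How likely are you to recommend us?" question, as the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). The survey responses below are made up for illustration:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative responses: 4 promoters, 2 passives (7-8), 2 detractors
print(nps([10, 9, 9, 8, 7, 6, 3, 10]))  # 25.0
```

Note that the formula discards the passives entirely and collapses an 11-point scale into three buckets, which is part of why critics like Spool consider it a blunt instrument.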
Be aware of bias
Biases are something to be conscious of in all phases of the design process. Confirmation bias happens when you look for data that confirms or upholds a hypothesis or desired outcome while overlooking what is actually in front of you. One of the risks of any kind of research or product development is that you only see what you want to see.
The benefit of numbers is that they don't lie, but you must still be aware of any biases you bring to each project. Working in teams is a good way to minimize bias.
Minimize bias by writing down any hypotheses or expected outcomes and referring back to them when you have the results. Remember, there is continuous learning in the design process. Some experiments may not be successful, but you can still learn from failure and use that to inform future direction and decisions. By documenting your work along the way (we covered this in Manage creative projects), you can carry these learnings throughout the process.
Throughout this course, we'll look beyond vanity metrics and dig deeper to help us think about how metrics can inform our design decisions. We'll also consider how we can combine both quantitative and qualitative research methods to benefit user experiences.
Quantitative data can tell you what happened, and qualitative data can help tell you why something happened.
Vanity metrics look and sound good, but they are not actionable.
Confirmation bias occurs when you see the information you want to see rather than being open to what the data says.