
We live in the age of big data and analytics and it seems like this discipline has appeared nearly everywhere, including sales, marketing, product management, and, of course, UX design.

There’s a good reason data analytics has become an essential part of the design workflow nowadays. By analyzing the behavior of your users, you are able to significantly improve the quality of your designs.

So, let’s understand how design teams can take advantage of data-driven design and make sure that your users are in a state of absolute bliss when interacting with your product.

What Is Data-Driven Design (and Why Use Data in Your UX Design Process?)

Data-driven design is the process of incorporating data tracking and analytics into your workflow so you can make better design decisions based on the insights you obtain from your analytics tools.

Designers use a variety of user research and analytics tools to track critical user journeys and KPIs that represent the overall user-friendliness of their product. For this, they use processes such as A/B testing, analysis of different types of data (e.g., quantitative and qualitative data), user testing, interviewing focus groups, building trackable landing pages and prototypes, and others.

As a result, they can quickly identify usability issues, find the core problems causing them, and fix them.

Moreover, design teams actively use data analytics to measure the impact of their latest designs on user experience and use the findings to create new and improved iterations.

Examples of Data-Informed Design

The discipline of implementing UX design solutions based on empirical data is much more popular than you might have thought. The majority of successful product companies actively use this process to ensure that the design choices they make are likely to have a positive impact on their product.

Here are three examples of these products.

YouTube’s Autoplay

Do you remember the older versions of YouTube when you had to click on a new video after the current one finished? Wasn’t that annoying?

Well, this “feature” led to a significant drop in the engagement rate of YouTube users (number of videos played per day), and the design team decided to do something about it.

They looked at the data and obtained two interesting insights:

  • People tend to binge-watch multiple videos in a row. Some of them like to turn to music on YouTube as background noise while they cook or clean. Others (like me) like to fall asleep with some video playing in the background.
  • Many users would not click on another video thumbnail after the playback was complete. It was inconvenient for a lot of users to constantly click on the video thumbnails.

Based on these findings, they decided to add the famous “autoplay” feature that would automatically switch to a related video suggested by the platform or the next one in the playlist.

youtube's auto play

As a result, YouTube saw a massive surge in its engagement metrics.

Airbnb’s User-Centric Review Flow

If you have ever stayed at an Airbnb property, you have most likely reviewed the place by going through the platform's feedback flow.

If I asked you whether leaving a review was cumbersome, most of you would probably say that while the form was not short, it wasn't painful to fill out.

Well, that's because you didn't have a chance to experience the agony of filling in the million-question-long form that Airbnb used for its reviews in the past.

The folks at Airbnb had two problems:

  • Few people would fill in their review forms.
  • The responses in their forms were very generic and unhelpful.

To find a solution, the design team started analyzing the form's performance data and found that the vast majority of users were simply abandoning it somewhere in the middle.

Based on these learnings, the team conducted usability tests and learned that:

  • The size of the form was overwhelming for users. Nobody wanted to answer that many questions.
  • People left generic feedback because they did not want to spend their precious time writing detailed reviews.

Now that the source of the problem was clear, the team at Airbnb started a major redesign of their review experience. Instead of a single long form, they created several pages of shorter forms to avoid overwhelming their users.

They also started using buttons with predefined answers instead of text fields, which helped them avoid generic answers. Moreover, the team could run data analytics on user responses and optimize them even further since there were structured answers instead of text.
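This is the key payoff of structured answers: they can be aggregated with trivial code, whereas free text needs manual reading or NLP. Here is a minimal sketch with hypothetical answer options (not Airbnb's actual ones) showing how predefined responses roll up into counts you can chart and track:

```python
from collections import Counter

# Hypothetical structured review responses collected via predefined buttons.
# With free-text answers this aggregation would need NLP; with a fixed
# answer set, a simple count is enough.
responses = [
    "Sparkling clean", "Sparkling clean", "Needs a deep clean",
    "Sparkling clean", "Needs a deep clean", "Spotless",
]

answer_counts = Counter(responses)
print(answer_counts.most_common(2))
# -> [('Sparkling clean', 3), ('Needs a deep clean', 2)]
```

Each answer becomes a category you can trend over time, which is exactly what lets a team "optimize even further" on structured responses.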

airbnb screenshot
Source: Airbnb

As you might have already guessed, the redesigned form was a massive success! 

Reddit’s Comment Navigation

Reddit is not simply another forum. It has become a cultural phenomenon and the place where the entire internet gathers to discuss things.

There was an interesting downside to the popularity of Reddit, though. People were adding so many comments under the topics posted on the site that it was becoming impossible to read the thread. It was especially annoying to navigate when there were dozens of replies under a single comment, and you had to scroll for a while to reach the next comment.

The folks at Reddit knew about this problem. They even had several concepts for redesigning the comment section to make it easier to navigate.

To pick the best concept, they started running usability tests and taking advantage of other research methods to gather user interaction data, extract key metrics, and compare their concepts using these metrics.

There was one design solution that clearly outperformed the rest—letting users expand and collapse responses.

reddit screenshot

This way, if you did not want to read the endless thread of responses under a certain comment, you could simply collapse it and move on to the next comment in the list.

As you can see, the data-driven design process sits at the core of the workflows of many successful products.

And I know that you want to make your design decisions based on data, too (otherwise, why are you here?).

So, here's a step-by-step breakdown on how to incorporate data-driven design into your product development process.

6 Steps for Enabling Data-Driven Decisions for Your Product

Let’s take a more practical approach here. Instead of simply talking about what should be done in theory, we will also look at examples for each step using real-life tools. The main weapon of choice for me today will be Amplitude since it is the event-driven product analytics tool that I use at work every day.

Now, let’s begin with the first step—strategizing.

Step 1: Build A Data Analysis Strategy With Your Stakeholders

You can forget about getting valuable results out of your data analytics efforts if you don’t have a sound strategy for it.

A typical data analysis strategy includes the following two sections:

Section 1: Objectives

You want to clearly define the reason you are implementing analytics in the first place. Without clear objectives, you risk tracking and measuring user data that does not contain valuable insights.

Now, let’s imagine that you are in charge of Spotify and see what a typical set of objectives would look like for you.

objectives and goals

I know, these goals look kind of obvious. But, believe me, without clearly stating them, you will end up in an analytics mess.

Section 2: KPIs

Once your objectives are ready, you will use them to derive the key metrics that you want to measure. The keyword here is “derive.” It means that improving your metrics should bring you closer to achieving your goals.

For instance, if your goal is sustainable user growth, you should consider measuring retention, activation rate, and signup rate. Picking MRR as your key metric, on the other hand, would be a bad idea, as you can increase your revenue in the short term but lose most of it in the long run. In this case, your growth will not be sustainable, and you will fail to achieve your goal.

Here’s what a list of KPIs will look like for the first goal in our previous example, “Enhance User Experience”:

example of KPIs

These are all indicators of a positive user experience. If the rates above are increasing, we can confidently state that users are enjoying our product.
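To make "derive" concrete, here is a minimal sketch of turning raw user records into KPI values. The field names and numbers are hypothetical, for illustration only; in practice these would come out of your analytics tool rather than hand-built records:

```python
# Hypothetical user records; in reality these come from your analytics tool.
users = [
    {"id": 1, "signed_up": True, "activated": True,  "active_week_2": True},
    {"id": 2, "signed_up": True, "activated": True,  "active_week_2": False},
    {"id": 3, "signed_up": True, "activated": False, "active_week_2": False},
    {"id": 4, "signed_up": True, "activated": True,  "active_week_2": True},
]

signed_up = [u for u in users if u["signed_up"]]
activated = [u for u in signed_up if u["activated"]]
retained = [u for u in activated if u["active_week_2"]]

# KPIs derived from the goal "sustainable user growth"
activation_rate = len(activated) / len(signed_up)   # 3 / 4 = 75%
week_2_retention = len(retained) / len(activated)   # 2 / 3 ~ 67%

print(f"Activation: {activation_rate:.0%}, Week-2 retention: {week_2_retention:.0%}")
```

The point is the direction of derivation: the goal defines which ratios matter, and each KPI is just a ratio of two clearly defined user populations.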

Step 2: Integrate a Data Analytics Tool

Now that you have a clear understanding of what you want to measure, it’s time to think about how you will measure it. In particular, it is about your selection of the analytics tool that you will end up using to find interesting insights in your data.

There’s a wide range of tools that you can consider for this, depending on the category of analytics you are interested in (e.g. heatmaps, demographics, user behavior, etc.), the feature set (cohorts, funnels, etc.), and your budget, of course.

Here are a couple of options to consider for each type:

  • Heatmaps (including A/B and multivariate testing): Hotjar, Mouseflow, Clarity.
  • Demographics and Site Analytics (e.g. bounce rate): Google Analytics, Matomo, Yandex Metrica.
  • Event-driven analytics (for conversion rates, funnels, etc.): Amplitude, Mixpanel.
  • Business Intelligence (for financial impact metrics, etc.): Looker Studio, Tableau, PowerBI.

My current setup consists of a combination of GA4, Hotjar, Amplitude, and Looker Studio.

Today, however, I want to focus on Amplitude and show you how to integrate it.

We will use Google Tag Manager (GTM) to keep things simple and avoid involving your engineering team in setting it up.

We will begin by opening GTM, clicking on the “Tags” in the left sidebar to open our list of tags, and clicking on “New” to open the tag creation sidebar.

screenshot of tag manager

To spare us from configuring the tag manually, Amplitude has created a template for us to use. To use it, we will click on the “Tag Configuration” card to open the template gallery.

screenshot

Here, we need to click on “Discover more tag types in the Community Template Gallery”, search for “Amplitude” and click on the “Amplitude Analytics Browser SDK”.

screenshot

We’re almost there. First, we need to add our Amplitude API Key to the corresponding field.

tag configuration screenshot

Then, we add a trigger by clicking on the trigger card on the screen and selecting “Initialization—All Pages.”

Finally, we give our tag a name (e.g., “Amplitude Initialize”), save it, and publish a new version in GTM. Voila! The next time someone opens your product, Amplitude’s SDK will initialize and start running in the background.

Step 3: Document Your Quantitative Data Collection Process

Running Amplitude is only the beginning of the setup. As it is an event-driven analytics tool, you will need to manually set up events in your product and fire them using Amplitude’s SDK, which we have just added to your product using GTM.

But before adding events, we will first need to document them. A data collection document is usually a spreadsheet where you list the events that the SDK will fire, the user actions that will serve as a trigger for them, and the different parameters that you want to send to Amplitude along with the event.

Here’s what a simplified version of this spreadsheet looks like for tracking user interactions in the Airbnb review form.

spreadsheet example

After we have documented the events we need to track, we will then ask our engineering team to add triggers for each one in our frontend code using Amplitude’s SDK. When they complete this task, our product will automatically fire these events and submit the necessary data to Amplitude for us to analyze.
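A data collection document doesn't have to live only in a spreadsheet; some teams also encode it as a machine-readable tracking plan so fired events can be validated before they reach the analytics tool. Here is a minimal sketch under that assumption; the event names, triggers, and properties are hypothetical examples, not Airbnb's real taxonomy:

```python
# A hypothetical tracking plan: event name -> trigger description + expected
# properties. This mirrors the columns of the data collection spreadsheet.
TRACKING_PLAN = {
    "Review Started": {"trigger": "User opens the review form",
                       "properties": {"listing_id", "platform"}},
    "Review Step Completed": {"trigger": "User finishes one page of the form",
                              "properties": {"listing_id", "step_number"}},
    "Review Submitted": {"trigger": "User submits the final page",
                         "properties": {"listing_id", "rating"}},
}

def validate_event(name: str, properties: dict) -> bool:
    """Check a fired event against the plan before sending it to analytics."""
    spec = TRACKING_PLAN.get(name)
    return spec is not None and set(properties) == spec["properties"]

print(validate_event("Review Submitted", {"listing_id": "ab-42", "rating": 5}))
# -> True
print(validate_event("Review Submitted", {"listing_id": "ab-42"}))
# -> False (missing the "rating" property)
```

A guard like this catches schema drift early: if the frontend fires an event that doesn't match the documented plan, you find out before weeks of malformed data pile up.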

Step 4: Create Reports and Dashboards for Your Key UX Metrics

At this point, you have all you need to start conducting the actual analysis to determine whether your user experience is meeting your expectations.

Your path to data visualization will begin by building graphs for the key usability metrics you identified and documented earlier using the analytics tool of your choice. Let me show you how to do it using Amplitude as an example. The playlist adoption rate seems to be a great metric to start with.

We’ll create a new segmentation chart (a.k.a. an ordinary chart) in Amplitude by clicking on the “New” button and picking “Segmentation” from the sidebar that opens.

example screenshot

Then, we need to select the events that will help us measure adoption. In our case, it will be “Follow Playlist”.

events screenshot

Next, we need to ensure that the numbers on the chart represent the percentage of active users who perform this action. So, we will select the “Active %” option in the “Measured as” menu.

example screenshot

Here’s what our chart will look like. It shows that around 70% of our daily active users follow playlists every day. To be honest, this seems to be quite a good result.
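Under the hood, "Active %" is a simple ratio: the share of active users who fired the chosen event in the period. Here is a minimal sketch of that computation over a hypothetical one-day event log (the 75% figure below is from the made-up data, not the chart above):

```python
# Hypothetical event log for a single day.
events = [
    {"user": "a", "event": "Open App"},
    {"user": "a", "event": "Follow Playlist"},
    {"user": "b", "event": "Open App"},
    {"user": "b", "event": "Follow Playlist"},
    {"user": "c", "event": "Open App"},       # active, but never followed
    {"user": "d", "event": "Open App"},
    {"user": "d", "event": "Follow Playlist"},
]

# Anyone who fired any event that day counts as a daily active user.
active_users = {e["user"] for e in events}
followers = {e["user"] for e in events if e["event"] == "Follow Playlist"}

active_pct = len(followers) / len(active_users)
print(f"{active_pct:.0%} of daily active users follow playlists")  # -> 75%
```

Defining the denominator (who counts as "active") is the important design choice here; the tool does the rest.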

example screenshot

Finally, we will save our chart and work on the remaining metrics.

You can create a dashboard to have a single view of all your key metrics. For this, we will click on the “New” button again, pick “Report,” then “Dashboard.”

As a result, we will see an empty dashboard that we need to name and fill with the charts we had created earlier.

dashboard screenshot

Let’s name it “Key Usability Metrics” and add our charts by clicking on “Add Content” then “Existing Chart” and picking the charts we made.

example key metrics screenshot


Congrats! Now you have an overview of all your key metrics!

Step 5: Constantly Review and Analyze Your Reports

The results you see on your Amplitude dashboard are “alive.” They will increase or decrease depending on the changes you have made to your UX. So, you should constantly review these metrics to:

  • See if your efforts have made a positive impact on the usability of your product.
  • Identify poorly performing metrics and try to find the reason behind them.

I suggest you organize weekly or bi-weekly review meetings with your stakeholders to discuss the KPI changes and brainstorm possible reasons for the ones that do not perform well.

The outcome of these meetings should be a list of hypotheses about improving your metrics and candidate problems that cause poor performance.

You will then run a UX design research process to validate or dismiss these hypotheses and start building solutions that can move the needle.

Here’s a real-world case study from my experience.

One of the products I worked on was a marketing platform that let website owners send push notifications to their audiences. After signing up, owners had to install our SDK on their websites to activate push notifications.

We built a funnel chart that showed the main steps of the website owners' setup journey. When we first reviewed it, we noticed something odd: a massive 70% drop between the SDK installation step and the first push notification sent.

After a brainstorming session, we hypothesized that the reason behind this drop was the lack of a “test mode,” where users could check whether the push notifications were working correctly.

We ran several usability tests and user interviews to validate this hypothesis. It turned out that website owners were not comfortable using any new technology unless they had tested it on themselves first.

So, we designed and built a testing journey that the users would go through right after installing the SDK. The result was staggering. We were able to cover the user's needs and pain points quite well. Within a couple of weeks, the 70% drop turned into a mere 15%!
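The funnel math behind a case like this is straightforward: each step-to-step drop is one minus the conversion between adjacent steps. A minimal sketch, with hypothetical step names and counts that loosely mirror the case study:

```python
# Hypothetical funnel: (step name, number of users who reached it).
funnel = [
    ("Signed Up", 1000),
    ("SDK Installed", 400),
    ("First Push Sent", 120),
]

# Drop-off between each pair of adjacent steps.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    drop = 1 - next_count / count
    print(f"{step} -> {next_step}: {drop:.0%} drop")
# -> Signed Up -> SDK Installed: 60% drop
# -> SDK Installed -> First Push Sent: 70% drop
```

Scanning these drops is how you find the step to attack first: the biggest percentage loss on a critical journey is usually where a hypothesis-and-test cycle pays off most.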

Step 6: Integrate Learnings into Your Designs

The final step in your data-driven approach is also the most important one. Your analyses and hypotheses will not impact your product unless you act upon them.

The way we handled this process for one of the products I was leading was quite simple. The moment we validated a hypothesis, we would create a product design task in our task management tool, prioritize it, and assign it to one of our designers.

As soon as the design was complete, we would run moderated or unmoderated usability tests on them, iterate based on user feedback, and prepare the new design for backlog refinement.

No matter whether you follow the process I just walked you through, or you build your own—make sure that you act on the learnings!

Level Up Your Design Game with Data

Data analytics is a fantastic tool in the hands of a design team. By constantly monitoring and troubleshooting your key usability metrics, you can significantly improve your decision-making and build user experiences that will leave your users in awe.

Don't forget to subscribe to our newsletter for more product management resources and guides, plus the latest podcasts, interviews, and other insights from industry leaders and experts.

Suren Karapetyan

Suren Karapetyan, MBA, is a senior product manager focused on AI-driven SaaS products. He thrives in the fast-paced world of early stage startups and finds the product-market fit for them. His portfolio is quite diverse, ranging from background noise cancellation tools for work-from-home folks to customs clearance software for government agencies.