A/B Tests, User eXperience Analytics Tools and Methods
A/B testing plays a big role in user experience design processes, especially at the stage of validating the solutions you've created to make sure the project works the way you thought it would.
Testing the designs you create and implement is a good strategy. Otherwise, it's hard to understand what's going on with your business and why some things don't work.
Most A/B tests are conducted with just two versions of a design, but sometimes people want to test more; that's usually called multivariate or multipage testing. I'll cover all of it in this article because I'm more focused on the results for end users than on the exact technique behind one method or another.
I've already created seven versions of my studio website, and I always test both whole updated pages and small tweaks. I usually do it manually and without any dedicated testing tools, but I've found analytics software useful. Even a free one can help a lot.
The most common tool I use is Google Analytics. One reason is that it's free, and it's good for a start because you never know how long you're planning to run your startup. Maybe you just want to try things out and run a couple of versions of your landing page.
Of course, it can be hard at times to understand which metrics to look at. Still, if you select two or three of them, you can get all the data you need to understand how different variations perform, and then manually compare those variations.
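To make the "pick two or three metrics" idea concrete, here's a minimal sketch of such a manual comparison in Python. The variant names and all the numbers are invented for illustration:

```python
# Raw counts you might copy out of an analytics report (invented figures).
variants = {
    "landing-v1": {"sessions": 1200, "signups": 84, "bounces": 540},
    "landing-v2": {"sessions": 1150, "signups": 121, "bounces": 460},
}

def rates(m):
    """Return (conversion rate, bounce rate) for one variant's raw counts."""
    return m["signups"] / m["sessions"], m["bounces"] / m["sessions"]

for name, m in variants.items():
    conv, bounce = rates(m)
    print(f"{name}: conversion {conv:.1%}, bounce rate {bounce:.1%}")
```

Two or three ratios like these, computed per variant, are usually enough to see which version pulls ahead.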
Another tool I've found useful is Hotjar. The best thing about it is that you can record your visitors' sessions and watch how people perform tasks like logging in, creating an account, etc.
I've learned different metrics and parameters to understand better what is happening with visitors to my website and what I can do to make things even better for new visitors and regular users.
You can learn basic things like the most visited areas of your project and the actions users take most often. A/B tests let you focus on a specific page, a blog post, or an element like a button. So you can create a couple of variants, say one with a green button and another with a red button, and see which works better in your case.
The best way to improve something is to get feedback from real people or, even better, from regular users and customers.
A/B testing is important for managing a digital project like a website or an app, and it works for physical products too. Yes, you can actually test different packaging, delivery services, how you organize your subscription, etc.
But if we focus on internet projects, it's mostly useful for shops, web apps, or even regular websites that play a role in your business.
It makes sense to run one or several tests before deciding on design changes. You can spot how user behavior differs between versions, which might be a turning point for your next plans for the project.
I'll share more details on analytics methods and A/B testing tools so you can better understand the basics and immediately start using them to improve your design iterations.
First, a little disclaimer here. I'm not an expert in Google Analytics, and I just want to share my experience and what has worked for me so far.
Also, they change their interface from time to time, so some things I describe below won't look the same in the latest version. But what stays the same are the tricks you can use to perform A/B tests and generally see what's going on when people visit your website or product.
Before you start analyzing your traffic, you need to add the Google Analytics tracking code to your website or web app if you haven't done it yet. The whole process is pretty straightforward and takes only a couple of minutes.
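For reference, this is the standard Google Analytics 4 snippet that goes into the head of every page; `G-XXXXXXXXXX` is a placeholder you replace with your own measurement ID from the Analytics admin panel:

```html
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```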
To make sure everything works, open the "Realtime" section, reload a page of your project, and you'll see your session appear on the chart and even on the map.
Let's imagine that you've installed your tracking code and already have traffic, so you can start testing any changes you might want to make.
Usually, when I open Google Analytics, the first section I go to is the Behavior area. As a UX designer, I need to see where people start using my website or app, how many take a specific action, and how they move from one area to another. I don't know if it's still relevant, but they've also had some sort of heat map showing how far users scroll, which in most cases tells you what they actually see on their screens. Using A/B tests, you can change the content and see how those changes impact reports like this.
User flows, or path analysis views, help me understand how people interact with my design. They also show me the spots where visitors close my site or app and the common patterns of user behavior.
Alternatives to Analytics
I've also tried Yandex Metrika as an alternative to Google Analytics to see a broader picture of how my users perform certain actions. You can watch videos of visitors' sessions, and even though it takes time to go through them, you can see a lot of details: clicks, the elements that draw the most attention, how users read content and messages, etc.
Another alternative that works similarly to Yandex Metrika is Hotjar. You can record visitors' sessions and see heatmaps and click maps.
As I mentioned before, it's good to have users' feedback before making any changes. Analytics tools help gather this kind of feedback and even automate the process by laying out the dry statistics in charts, tables, maps, and schemes. It's just easier to view and understand the data with such visual materials, and you can filter it, by dates for example. So you can run one version of a web page for two or three days and then try another one, and analytics tools with filtering capabilities will show you which variant performs best, and sometimes you can even figure out why.
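The "run one version for a few days, then the other" approach can be sketched in code. This is a hypothetical example with an invented event log and switch date, not the exact shape any analytics tool exports:

```python
from datetime import date

# (day, converted) pageview log -- invented sample data.
events = [
    (date(2021, 3, 1), False), (date(2021, 3, 2), True),
    (date(2021, 3, 3), False), (date(2021, 3, 4), True),
    (date(2021, 3, 5), True),  (date(2021, 3, 6), True),
]
switch_day = date(2021, 3, 4)  # the day version B replaced version A

def conversion(rows):
    """Share of events in this date window that converted."""
    return sum(converted for _, converted in rows) / len(rows)

version_a = [e for e in events if e[0] < switch_day]
version_b = [e for e in events if e[0] >= switch_day]
print(f"A: {conversion(version_a):.0%}, B: {conversion(version_b):.0%}")
```

The date filter plays the same role as the date-range picker in an analytics dashboard: it separates traffic that saw version A from traffic that saw version B.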
There is another way to gather such feedback: conducting interviews with one person or a group of people, where you ask them to perform specific tasks and fill out a questionnaire while they view a specific area or page of a website.
Although it can be time-consuming and expensive, it might give you a more direct response and more specific data.
More Precise Tools
Some people highly recommend Visual Website Optimizer and Kissmetrics as alternatives to Google Analytics. These two have better interfaces and a lot of integrations.
You might want to try them if you need more accurate tests. For instance, Visual Website Optimizer can show that a new version B is better and point to the specific reason. It also lets you split traffic between versions in a more automatic way.
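For illustration, here's one common way such tools split traffic: hash a stable visitor ID so the same visitor always lands in the same bucket. This is only a sketch of the general technique with a made-up experiment name and a 50/50 split, not how Visual Website Optimizer actually implements it:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "homepage-test") -> str:
    """Deterministically map a visitor to variant A or B (50/50 split)."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a stable number in 0..99
    return "A" if bucket < 50 else "B"

# The assignment is sticky: the same id always gets the same variant.
print(assign_variant("visitor-42"))
```

Sticky assignment matters because a visitor who sees variant A on Monday and variant B on Tuesday would muddy both samples.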
Google Analytics works on the idea of a session. A person bounces around a website and leaves; that's one "session," and if they return, that's a new session. Tools like Visual Website Optimizer use unique visitors instead, so you can see stats tied to a specific visitor. This is critically important: 10 visits by 10 people and 10 visits by 1 person are not the same.
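The difference is easy to see in code. In this sketch the visitor IDs are invented; counting every visit gives you sessions, while deduplicating gives you unique visitors:

```python
# Each entry is one visit, labeled with a stable visitor id (invented data).
visits = ["anna", "anna", "bob", "carol", "anna", "bob"]

sessions = len(visits)              # every visit counts, session-style
unique_visitors = len(set(visits))  # each person counts once

print(sessions, unique_visitors)
```

Six sessions here come from only three people, which is exactly why per-visitor stats can tell a different story than session counts.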
Obviously, Google Analytics is neither the only nor the best tool. Often the data there looks like a mess, and you have to do a lot of custom work to make it work for you.
Custom Analytics Tools
You can always build your own software for your specific needs or build an integration inside your app.
Actually, I've built one inside Spry (a blogging system). You're welcome to check it out as an example of a minimalist analytics tool built into a website.
This approach is good, but it can be expensive, and it might take some time before you get it running and start conducting tests.
Some time ago, Eric Ries wrote an article about Vanity Metrics and Actionable Metrics. While Vanity Metrics usually don't make much sense in practice, here are some Actionable Metrics you can start using right away:
Split-tests - based on comparing two versions of anything, from minor copy tweaks to major changes in the product or its positioning.
Per-customer metrics - focus our attention on real customers. For instance, instead of looking at the total number of page views in a given month, consider the number of page views per new and returning customer.
Funnel metrics and cohort analysis - helpful for projects with a few key customer lifecycle events: registering for the product, signing up for the free trial, using the product, and becoming a paying customer. I'll describe this type with an example a little later in this article.
Keyword (SEM/SEO) metrics - let you treat users acquired through a given keyword as a segment and then track their metrics over time.
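The split-test item above can be made concrete with a quick significance check, a standard two-proportion z-test. The counts below are invented, and the 0.05 cutoff is just the usual convention, not a hard rule:

```python
from math import erf, sqrt

def split_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))  # standard normal CDF
    return 2 * (1 - phi(abs(z)))

# 84 conversions out of 1200 visits vs. 121 out of 1150 (invented numbers).
p = split_test(conv_a=84, n_a=1200, conv_b=121, n_b=1150)
print(f"p-value: {p:.4f}")  # a value below 0.05 usually reads as a real difference
```

A check like this helps you avoid declaring a winner from a difference that's just noise.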
A good place to start with split-testing is to try moving UI elements around. For instance, you can rearrange the steps of your registration process for new customers and compare this change to the previous flow.
Try building funnels that analyze user interactions in the two versions you've tested. The result is a table with two columns, where each column is a variant of the page and each row is a step of the interaction.
So, for example, you can have 100 registered users across both versions of the page: 50 of them signed up using variant one, and the other 50 using variant two. Variant one then leads to 20 sales, while variant two shows only 5. That gives you a picture of how certain changes impact your business and maybe even which funnel stages affect your results the most.
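The funnel above can be sketched as code, using the same example numbers (50 sign-ups per variant, 20 versus 5 sales); the step names are my own labels:

```python
# Two-column funnel: one row per lifecycle step, one column per variant.
funnel = {
    "signed up": {"variant one": 50, "variant two": 50},
    "made a purchase": {"variant one": 20, "variant two": 5},
}

print(f"{'step':<16}{'variant one':>13}{'variant two':>13}")
for step, counts in funnel.items():
    print(f"{step:<16}{counts['variant one']:>13}{counts['variant two']:>13}")

# Conversion from sign-up to sale, per variant.
for v in ("variant one", "variant two"):
    rate = funnel["made a purchase"][v] / funnel["signed up"][v]
    print(f"{v}: {rate:.0%} of sign-ups convert to a sale")
```

Comparing the step-to-step drop-off between the columns is what points you at the stage where one variant loses people.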
Whichever tools and metrics you decide to use, it's still good to conduct even the most basic tests to see what's going on.
You don't have to put all your bets on analytics data or A/B test results. Sometimes it's wise to listen to your intuition or to insightful tips you get elsewhere.
Often, analytics tools might give you highlights of what people want to do and what they search for. That usually helps to shape better experiences for your customers. It's a crucial part of any user experience design to find gaps and inconsistencies, fix the issues, and improve your product or website.
A/B tests and similar methods can be handy and insightful in design processes.
I recommend using the analytics tools and methods I described above, along with many more you can find elsewhere, as an additional layer of your work on a product or website. But please don't overwhelm yourself by testing things too much and creating an endless number of variants.
It's best to find a balance and act with your resources and time constraints in mind.