What is A/B Testing?
A/B testing, or split testing, is a method used by UX designers and marketers to compare two versions of something, be it a product, an app, a website, or an ad. The goal is to determine which of the two versions is more successful, according to your business, product, and user needs.
Two versions of the product, A and B, are created and shown randomly, in equal numbers, to users. The version in which more users take the desired action is the winner. This implies that each user’s response is recorded with a testing tool or analytics.
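In practice, the random-but-stable split described above is often implemented by hashing a user identifier, so each visitor keeps seeing the same version on every visit. A minimal sketch in Python (the function name, experiment name, and user ids are illustrative, not from any specific testing tool):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-cta") -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing the user id (salted with the experiment name) gives a
    roughly 50/50 split that stays stable across visits, so a user
    always sees the same version of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Record which variant each visitor was exposed to, so their later
# actions (clicks, sign-ups) can be attributed to the right version.
exposures = {uid: assign_variant(uid) for uid in ["u1", "u2", "u3", "u4"]}
```

Salting with the experiment name means the same user can land in different buckets across different experiments, which keeps tests independent of each other.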
How can designers use A/B testing?
A/B testing comes in handy for confirming a design is going in the right direction, optimizing conversion rates, improving the overall user experience, and deciding which approach or tone to implement.
So let’s take a look at all of the elements on which you can carry out A/B testing:
- CTA buttons (their placement, size, colour, or copy)
- Headings and subheaders
- Images on landing pages
- Contact Forms
- Copy (length, placement, and content)
- Videos (presence or absence)
1. CTA buttons (Placement, size, colour, or copy)
Calls to action, commonly known as CTAs, are strategic pieces of content and among the most important elements of any campaign. A CTA heavily influences your conversion rate: everything on your landing page or homepage should lead the user to tap it.
The CTA can take many forms, depending on what constitutes a conversion in your campaign: it can be a download, registration, purchase or any action that benefits your brand. Any aspect of your CTA button could have an impact on your conversion rate, from placement to wording to colour.
Your A/B tests can focus on:
- Testing if the CTA stands out from the rest of the creative elements: Check which variation of colour or size is more clickable.
- Testing the CTA copy: Test whether the text on a CTA button makes a difference, and whether the paragraph leading up to your CTA affects click-through rates.
MarketingExperiments ran a study changing the text of the CTA button, and the results are an eye-opener. They ran a multivariate test in which they compared five versions of CTA buttons and measured the click-through rate.
The researchers found that subtle changes in words can make a significant difference in performance. By split testing, you can discover the words that will work best for your unique audience and your unique products. More importantly, you can mitigate negative results, like a 26% drop in click-through, from these subtle changes by understanding their effect in a controlled environment.
2. Headlines and Subheaders
The headline is your website’s first impression, as it is the first thing your visitors read. It is essential that it hooks them into scrolling through the rest of the page. A generic, vague, or irrelevant headline will put many visitors off reading on, leading to a high bounce rate. That not only damages your conversion rate but also drags down your SEO rankings.
While testing headlines you can try using different title lengths, wording, and emphasize the main benefit for the visitor. For example, instead of a generic headline like “Grab this cheat sheet now!”, try something more specific and evocative like “Learn these proven success strategies that will make you a better UI/UX designer”.
You can try the same with the text that supports your headline, referred to as sub-headers. These are also strategic pieces of content that bolster visitors’ confidence and provide that small but necessary push to take the desired action.
3. Images or Graphics
Images and graphics enrich the user’s experience. Images evoke different memories and associations than written words do, and studies consistently find that they are one of the best ways to convey the feelings that drive conversions.
Different variations of images, graphics, or illustrations can also be tested to see which version the majority of users prefer. Split testing of images is especially prevalent on e-commerce websites.
4. Contact Forms
Contact forms are an important touchpoint for your website’s visitors. Depending on your business, you might need more than just a first name and an email. You can A/B test the following elements of the form:
Length – Long forms deter visitors, who often abandon them halfway. Keep only the fields essential to converting the prospective visitor into a customer.
Submit Button Copy – Just like the CTA button, small changes to the submit button can lead to surprising results.
Field Bottlenecks – People are reluctant to share sensitive information such as a phone number. Test whether such a field is acting as a bottleneck and consider removing it. If the phone number is essential to closing your sale, try re-phrasing the field or moving it later in the form, after certain other questions.
Design – Form design is part of UI design, but it is often overlooked. Done right, it can significantly lift your conversions.
5. Copy (Length, placement, and content)
Copy is the content of your website, and just like the other elements, it can be tested on different parameters. The length of your copy matters: if visitors only want to know what products you offer, keep it to a minimum; if they like to read about all your services and how things work, the content can be text-heavy.
You can try different variations of the paragraphs, written with different styles and language difficulty, to see which copy makes more sense to them.
6. Videos
Not everyone likes to read. Including videos on your website can entice visitors to stay and watch a video about your service offerings or USP. Try placing videos above the fold or further down to see which placement yields the best results, and test whether the presence or absence of videos makes the website more engaging to your target audience.
So now you know which elements to test on your website. Next, we will look at how you, as a designer, should conduct A/B testing.
Steps to Carry Out A/B Testing
- Research and Collect Data
- Set the goals
- Build a hypothesis
- Create A and B versions
- Test A vs. B versions
- Results analysis
1. Research and Collect Data
The first step in improving anything is research. To carry out A/B testing, you must identify which elements, i.e. the ‘problem areas’, are to be tested. You can find these problem areas in your existing data: pages with low conversion or high drop-off rates, for example. Use your own data to decide which features to test.
In some cases, such as testing image or colour variations, you can base A/B testing on your own or someone else’s gut instinct. But tests based on objective data and a well-grounded hypothesis are more likely to yield valid results.
2. Set the goals
Now that you have identified the problem areas, align your A/B testing with your business goals. Say, for example, you want to achieve a conversion rate of 20%, but due to some problem areas it is currently stuck at 7.5%. Setting goals clearly will help you understand what you want to achieve through A/B testing.
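A concrete way to connect a goal like this to test planning is to estimate how many users each version needs in order to detect the lift you are aiming for. Here is a rough sketch in Python using the standard two-proportion sample-size approximation; the rates below are illustrative (a single test rarely jumps from 7.5% straight to 20%, so you would usually size the test for a smaller step):

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p_a: float, p_b: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate users needed per variant to detect a change in
    conversion rate from p_a to p_b with a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 5% significance
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p_a + p_b) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p_a * (1 - p_a) + p_b * (1 - p_b))) ** 2
    return ceil(numerator / (p_b - p_a) ** 2)

# Detecting a modest lift from 7.5% to 9% takes a few thousand users
# per variant; larger lifts need far fewer.
n = sample_size_per_variant(0.075, 0.09)
```

The intuition: the smaller the lift you want to detect, the more users the test needs, which is also why tests on low-traffic pages take so long to conclude.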
3. Build the hypothesis
Formulating a hypothesis is the basis of any research. You have to generate a valid testing hypothesis – consider how and why a certain variation might perform better. For example, suppose you want to improve the completion rate of your contact form. Your hypothesis could be: “Research shows users hesitate to share their mobile number right after their name. If we place the mobile number field after the email field, more users will share their mobile number.”
4. Create A and B versions
Now that you have formulated your hypothesis, you can now have a clear idea of what to test and on what basis. You can now create two versions on which you will be carrying out the A/B test. The existing version is a control version, called ‘A’. If you’re testing a web page, this is the unaltered web page as it exists already. If you’re testing a landing page, this would be the landing page design and copy you would normally use.
From there, build a variation and call it ‘B’ — the website, landing page, or email you’ll test against your control. For example, if you’re wondering whether including a testimonial on a landing page would make a difference, set up your control page with no testimonials. Then, create your variation with a testimonial.
5. Test A vs. B versions
The stage is now set for you to invite your target group to your experiment. At this point, visitors to your site or app are randomly assigned to either the control or the variation. Their interactions with each experience are measured, counted, and compared to determine how each performs.
To run an A/B test, you will need an A/B testing tool, which provides the platform and environment to test both the control and the variation. Several A/B testing tools are available; commonly used ones include Optimizely, VWO, Google Optimize, HubSpot, Crazy Egg, and Omniconvert. This list is not exhaustive, and you are free to explore the web for the tool that best suits your needs.
A/B testing can be conducted remotely, just like other testing methods in UX design. We covered useful tools for remote UX design and testing in one of our previous articles: Remote UX Design Toolkit.
6. Result Analysis
Once testing is over, it’s time to analyze the results. The A/B testing tool will present the data from the test and show you the difference between how both versions of your page or the app performed, and whether they demonstrated a significant statistical difference.
Running A/B tests requires a good grasp of statistics and some prior experience. You will need to understand concepts like sample sizes, statistical significance, ANOVA, one- and two-tailed tests, and multivariate testing to know whether you are doing the right thing.
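For the common case of comparing two conversion rates, the significance check your tool performs is essentially a two-proportion z-test. A minimal sketch in Python (the conversion counts below are hypothetical):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int,
                          conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for the difference between two conversion rates.
    Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical results: 120/2000 conversions for A vs 160/2000 for B.
z, p = two_proportion_z_test(120, 2000, 160, 2000)
significant = p < 0.05  # reject the null at the 5% level
```

This is the simplest case; real tools layer corrections on top (sequential testing, multiple comparisons), which is why prior statistical experience matters.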
If the results show that the variation’s improvement is statistically significant, you can be reasonably confident that the ‘B’ version will perform better in the real world.
Advantages of A/B Testing
1. Each variable can be tested and improved
The basic tenet of A/B testing is to test one variable at a time. This way, you can test every element in question and, from the results, decide on the version that performs better. This is not only beneficial for your business but also improves the user experience for visitors to your website or app.
2. It increases return on investment (ROI)
A/B testing can boost ROI because it improves the elements that are strategically linked to your business goals. It prevents your business from losing prospective customers and helps convert incoming traffic. It also saves money on ad spend, since you are improving existing assets rather than buying more traffic.
3. Data-driven decision making
A/B testing cuts out the guesswork and provides insightful data that helps you make informed decisions. You can test conflicting ideas within your team and let actual users decide. Ultimately, user experience is the driving factor behind success in this highly competitive world.
4. Easy, Fast and Convenient
Since A/B testing examines only one variable at a time – be it the headline, copy, or the CTA button – the process does not take much time. It is easy for participating users, who simply interact with one of the two versions as they normally would. It can also be carried out remotely, with the help of advanced A/B testing platforms.
Drawbacks of A/B Testing
1. It is hard to uncover the ‘Why’
A/B testing is a great way to find out what works and what doesn’t, but keep in mind that it will not tell you exactly why one design element works better than another. Qualitative user research can help uncover those reasons.
2. It does not give the big picture
The results of an A/B test should not be considered conclusive, as you are only testing one variable at a time. It doesn’t reveal the relationships between elements or the influence of one element on another. To get the bigger picture, you may have to turn to other user testing methods.
Essential Tips for Designers to Conduct A/B Testing Effectively
A/B testing offers an avenue for testing the design elements on fine, granular levels to improve your user’s experience.
Since A/B testing is carried out on visual elements, i.e. the UI design, designers need to consider each element in question across the five planes of UX.
i. Surface plane: The user interface – all the visual elements that help users use the product/site/app: fonts, colours, styling, animations.
ii. Skeleton plane: The optimized organization of all on-screen elements takes shape here through interface design and navigation design.
iii. Structure plane: The ‘how’ plane: how much content is there? How is it organized? How is it prioritized? Interaction design and information architecture are the core focus here.
iv. Scope plane: The ‘what’ plane, as in what are we building? It outlines all the features and functions that will help achieve the strategic goals.
v. Strategy plane: The ‘why’ plane, as in why does the product/site/app exist? This applies to both the user and the product owner/designer, detailing the goals of each.
Based on the points above, here are some useful tips that you, as a designer, can use to conduct A/B testing effectively.
1. Focus on your goal metric
Your core focus should remain on the goal set at the start of the A/B test. A/B testing tends to generate a lot of data, but that should not distract you from the metric you want to track in support of your initial goal.
2. Take appropriate actions based on the results
If one variation is statistically better, go with that. If not, test again. Don’t count negative results as failures – it’s important to learn what doesn’t work and what to try next.
3. Use qualitative results to reinforce quantitative results
The results obtained from A/B testing are statistical, and it would not be wise to rely on them alone. As mentioned in the drawbacks above, A/B testing does not answer the ‘why’. So pair the numbers with findings from qualitative tests. This will help you achieve your long-term strategic business goals.
4. Don’t run tests for a long time
If you serve one variation of your page to a large percentage of users for too long, search engines may see it as an attempt to deceive them. Google recommends updating your site and removing all test variations as soon as a test concludes, so avoid running tests for unnecessarily long.
5. Account for external factors
Apart from internal, controllable factors, there are external factors to account for when planning an A/B test. Consider any public holidays or ongoing sales that may influence web traffic. Also note that some research tools may slow your site down or lack necessary qualitative features, e.g. heatmaps and session recordings.
A/B tests are a handy way to evaluate elements that appear subjective yet drive your product’s, website’s, or app’s success. It is therefore essential to plan the test carefully and ensure everything goes well, right from identifying the goal to analyzing the data. Done correctly, you will see a significant improvement in the performance of your product/website/app, along with an enriched user experience.