What is A/B testing and how does it work?

So what exactly is A/B testing? A buzzword, or the secret weapon of successful marketers? And most importantly, if it's so powerful, why doesn't it get the same attention and interest as SEO or email marketing? Let's find out.

According to Wikipedia: "In marketing and business intelligence, A/B testing is a term for a randomized experiment with two variants, A and B, which are the control and variation in the controlled experiment."

In a nutshell, it is as simple as it sounds: you compare two page versions (variants) against each other, and after collecting a sufficient amount of data you pick the winner (the champion). The goal is simple: improve your website's conversion rate, or any other metric you care about, such as sales numbers, bounce rate, or time on page.
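As a toy illustration (all numbers here are invented), the comparison boils down to each variant's conversion rate and the relative lift between them:

```python
# Hypothetical data: 1,000 visitors per variant, 40 vs. 55 conversions.
def conversion_rate(conversions, visitors):
    return conversions / visitors

rate_a = conversion_rate(40, 1000)   # control (A)
rate_b = conversion_rate(55, 1000)   # variation (B)
lift = (rate_b - rate_a) / rate_a    # relative improvement of B over A
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, lift: {lift:.0%}")
```

The hard part of A/B testing is not this arithmetic but deciding when the difference is trustworthy, which is what the tooling below takes care of.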

At the end of your test, you can be sure that the changes you make to your website are backed by real-world data rather than guesses that could harm your business.

Why should you do A/B tests?

Acquiring new quality traffic can be pricey, but optimizing an existing page can deliver quick results on a low budget, since you run split tests on the traffic (users) you already have. It is also a fascinating and challenging process.

How does A/B testing work?

First, you need to come up with a hypothesis: a list of changes to your website that you think will increase one of your metrics. It can be anything, but the most common things to test are:

  • Product description or headline text;
  • Layout and style of your website;
  • Form’s length;
  • Pricing model;
  • Promotional offers;
  • Images used on your page;
  • And of course, call-to-action (button) wording, color, design, and placement.

It is essential to stay creative, because a single bold change can make a huge difference and help you outperform your competitors.


Now create variations of your page that will compete against each other. If you don't have coding skills, a tool like Visual Composer will come in handy; with its simple drag-and-drop interface you can create new page versions quickly.

Remember to have just one difference between your competing variants. Otherwise, you can't tell what worked and what didn't, and your time will be spent for nothing. Send traffic to your competing pages evenly and serve them at the same time: if you test your variants one after another, there is a big chance that something else will affect the results.
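To see why serving variants simultaneously works, here is a minimal sketch (in Python, with a made-up visitor-id scheme) of deterministic 50/50 assignment: hashing the visitor id splits traffic evenly on average while keeping each visitor on the same variant for the whole test.

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    # Hash the id so the same visitor always sees the same variant,
    # while traffic splits evenly across variants on average.
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Real testing tools handle this for you (usually via a cookie), but the principle is the same: random, even, and sticky assignment.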

It might sound complicated, but the good news is that there are tools designed specifically for this, and they take care of all the complex parts. In fact, I'm sure you already use one of them frequently: Google Analytics. Yes, despite being known as a statistics tool, Google Analytics can help you with your A/B tests too.

Note: There are other great tools for the job, and some of them have more features, but personally I like the fact that this one is already in Google Analytics, so you don't need to install extra software or sign up for another online service.

Hands-on guide

Wondering why you have never heard about an A/B testing feature in Google Analytics? Because it's called "Experiments." Dive into your analytics dashboard and navigate to "Behavior – Experiments" to create and run your first test.

Doing tests with GA is pretty straightforward: simply follow the on-screen instructions. Start by clicking the "Create experiment" button and give your first test a name. Then select the objective (goal) of your experiment; this is how the champion variation will be determined. Under "Advanced Options" you can set how your traffic will be distributed, the minimum time the test will run, and the confidence threshold used in the decision-making call. In most cases the defaults are fine, but if you experiment with an existing page, it is a good idea to drive more traffic to the original page and a smaller share to the competing variations. That way you minimize the risk of doing harm if the assumptions behind your test turn out to be wrong.
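The uneven split described above can be sketched like this (an illustrative Python snippet, not GA's actual mechanism), sending, say, 80% of visitors to the original page:

```python
import random

def pick_page(weights):
    """Randomly pick a page according to the given traffic weights,
    e.g. {"original": 0.8, "variation": 0.2} to protect an existing page."""
    pages = list(weights)
    return random.choices(pages, weights=[weights[p] for p in pages])[0]
```

The trade-off: the less traffic a variation gets, the longer it takes to gather enough data on it, so use uneven splits only when the downside risk justifies the slower test.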

But if you launch an entirely new page, you can go with evenly distributed traffic to find the champion faster.

Creating Google Analytics Experiments for A/B testing

Now it's time to click "Next step" and provide your competing page variations. Google Analytics will spread the traffic among those pages automatically, and, most importantly, a single user will see only one variation during the experiment. This means a more consistent user experience and more accurate results. You can add as many variations as you want, but make sure each one is reasonable to test; otherwise the experiment will run longer and you will lose time. And keep in mind, it's better to make simple tests so you can learn and adapt faster.

Proceed to the next step when you are done adding pages to get the JavaScript code that should be inserted into the "Original" page. Now is the time to get your hands "dirty", and luckily this is the only part where we touch the code. If your website runs WordPress, your theme most likely provides an option to insert JavaScript into a particular page. Pay attention to the advice from Google Analytics: "Paste this experiment code immediately after the opening head tag at the top of your original page."

Now review your settings and start the experiment! That wasn't too complicated, was it? It will take some time for data to appear in your analytics dashboard, so simply relax and wait. And when you are rewarded with results, take action: stick with the winner and drop all the weak variations. Sometimes you will be surprised by counterintuitive results, but that is exactly where a data-driven approach pays off.

Common A/B testing mistakes

Tests called too early

To make decisions you should be confident, and to be confident you need enough data to make the call. So make sure the confidence threshold for a test is high enough; you don't want to make changes to your website based on a coin flip. The lowest Google Analytics will allow you to go is 95%, and that is wonderful.
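For the curious, the significance check behind that threshold can be approximated with a two-proportion z-test, sketched here in Python with invented numbers (GA's internal statistics may differ):

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    # Two-proportion z-test; |z| >= 1.96 roughly corresponds to
    # 95% confidence for a two-sided test.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# A seemingly large observed lift (4.0% -> 5.5%) on 1,000 visitors
# per variant still falls short of the 1.96 bar -- calling the test
# at this point would be calling it too early.
print(z_score(40, 1000, 55, 1000))
```

This is why the threshold matters: eyeballing the raw rates would have declared a winner that the statistics do not yet support.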

Also, it's critical to run tests in full weeks: how can you end a test launched on Monday if you don't know how your change will perform on a weekend? Exclude seasonality from your results by running tests in full weeks; Google Analytics itself suggests running experiments for two weeks.
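A rough answer to "how much data is enough" comes from a standard sample-size approximation, sketched below (the 1.96 and 0.84 constants correspond to roughly 95% confidence and 80% power; the baseline and lift values are invented):

```python
import math

def visitors_per_variant(base_rate, relative_lift):
    # Approximate visitors needed per variant to detect `relative_lift`
    # over `base_rate` at ~95% confidence and ~80% power.
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    avg = (p1 + p2) / 2
    n = (1.96 + 0.84) ** 2 * 2 * avg * (1 - avg) / (p1 - p2) ** 2
    return math.ceil(n)

# Detecting a 25% lift on a 4% baseline takes thousands of visitors
# per variant, which is why short tests mislead.
print(visitors_per_variant(0.04, 0.25))
```

Divide that number by your weekly traffic per variant and you get a realistic test duration, rounded up to full weeks.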

Tests performed on a website with low traffic

Who cares if your conversion rate improves by 30% and your sales jump from 1 sale a month to 1.3 sales? I doubt such a change would even be noticed. If your website has low traffic and few conversions, you are actually in a better position: you can run bolder experiments while you "shape up" your business, and see improvements of hundreds or thousands of percent rather than thirty.

Ignoring small gains

It is hard to come up with a change that will improve your metric by 50% unless your website is crap, and if you have been running it for some time, it should be "crap-free" already. So a 1% to 10% improvement is a big deal. There are websites where an improvement of 1% can lead to millions of dollars in revenue. Doesn't look so small now, right?

Running before and after tests

Never run one test for a period of time and then another one for the same period afterwards. Traffic quality varies a lot, which means sequential tests will send you the wrong message. Run simultaneous tests instead.

Running too many tests

It's very tempting to run a lot of tests at the same time, especially when you are just getting started. But doing so can lower your revenue significantly: each test has a chance of decreasing conversions (we all want to improve, but losses still happen sometimes), so the more tests you run, the more you can lose. A/B testing is a game with no room for rushing.

Testing too many variables

If your variants differ in more than one way, you can't tell what led to the positive or negative outcome of the test. Make granular improvements; it is the only path to success.

Falling in love with magical color-change tests

Forget about A/B tests of button colors. Don't spend your time testing insignificant changes; I'm sure there is more room for improvement than a simple color change on your call-to-action block. Once you are established, you will have plenty of time to test button colors as well.

Now that you know how to perform tests, I invite you to run one and share your results. And a small piece of advice: never stop experimenting.
