Common Mistakes Businesses Make Whilst A/B Testing


A/B Testing is not only Valuable – it’s Satisfying, too.


There’s something incredibly fulfilling about running a test that gives a clear overview of web page performance. However, it’s not all plain sailing, and mistakes are often made whilst A/B testing.


Here are the Top Mistakes we See Businesses Make Whilst A/B Testing:


Thinking A/B Testing Will Solve Low Demand for Your Product or Service

If your website is struggling to convert users, it’s easy to turn to CRO for answers. But sometimes, conversion rate optimisation isn’t what’s behind the low demand for a product.

Sometimes, it’s the product itself, or your marketing strategy – and A/B testing isn’t going to help you solve those problems. It’s important to take a look at your sales/marketing funnel and to assess all avenues to gain insight into the reasons for low product or service demand.


Running too Many Tests

In a similar vein to our first point, it can be easy to run test after test on a website without looking at the bigger picture – how you are going to get customers to your website in the first place. It is all well and good maximising the conversion rate of the people who find you, but you should never neglect the ways you are going to pull in new customers.


Testing Days or Weeks After Making a Change

This mistake is one we really wish businesses would iron out. It happens when a business makes a change to a web page, then makes another change further down the line, rather than running two versions of the page at the same time. The issue is that you are effectively wasting your own time, and the potential business benefits of testing pages side-by-side and gathering data about how each performs.
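Running both versions at the same time usually means splitting traffic between them. As a minimal sketch (in Python, with an illustrative function name – this isn’t the API of any particular testing tool), hashing a visitor ID gives a roughly even 50/50 split while ensuring each visitor always sees the same version:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a page variant.

    Hashing the visitor ID means the same visitor always sees the
    same version, while traffic is split roughly evenly overall.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Both page versions are live simultaneously, so their performance
# can be compared over the exact same time period.
print(assign_variant("visitor-123"))
```

Because the assignment is deterministic, a returning visitor isn’t bounced between versions – which would otherwise muddy the comparison.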


Using Somebody Else’s Data

It’s easy to assume that a business in your market with a similar structure will produce invaluable data for your own website. But this is not always the case – what works for them may not work for you. You can use another entity’s data to guide your changes, but you should never count on that data to improve your conversion rates. Count on your own.


Not Seeking User Input and Customer Feedback

The people who really understand your website are your users – so there’s no better source of information than them. And yet, a lot of businesses choose to gather data from tools and software instead of the people who are using their service.

This is obviously a mistake. And so, we recommend running a site survey or even a poll or two that asks your users how they would feel about certain user interface changes or how they would change your site to improve their shopping experience with you.


Running Tests that are Too Short

A/B testing requires time to work. Ideally, you want to test two or more versions of a web page for more than 7 days. However, a lot of businesses cut their tests short – and it is the businesses that test for longer that gather the best data.

You should do the same and run A/B tests for a good amount of time. Our data suggests that running a test for 7 days gives a 95% or higher likelihood of finding a winning version.
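Beyond watching the calendar, you can check whether a test has gathered enough data by testing whether the difference in conversion rates is statistically significant. A rough sketch using a two-proportion z-test – the 1.96 threshold corresponds to the 95% confidence level mentioned above, and the conversion numbers are made up purely for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates.

    conv_* = number of conversions, n_* = number of visitors.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: version A converts 200/4000, version B 260/4000.
z = two_proportion_z(200, 4000, 260, 4000)
# |z| > 1.96 corresponds to 95% confidence (two-sided test).
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

If the z-score hasn’t cleared the threshold, the honest conclusion is “keep the test running”, not “pick the version that happens to be ahead”.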


Not Testing Everything, and Making ‘Blind Changes’

This mistake is really, really common. For example, the CEO of a business may ask a website developer to add an image to a web page without it being tested. To the CEO, this is just an image and it can’t possibly do any harm. Yet the data may say otherwise.

And so, it’s important for those in charge of conversion rates and user experiences to stand up and say no – we’re not going to do that, at least not before the impact of adding that image or other element has been analysed and we are sure it won’t adversely affect us.

Watch: Startup Lab workshop – A/B Testing done right, via GV


Don’t Forget

You can monitor the progress of your A/B results using the Ruler Analytics Visitor Report. P.S. Make sure you follow us on Twitter so you can keep up to date with all our latest news here at Ruler Analytics.






Written by

Digital Marketing Manager at Ruler Analytics with a background in SEO, analytics, content marketing and paid social. I help people (like me) close the loop between marketing-generated leads and revenue.