Conversion Optimization 101: 3 Ways To Screw Up A/B Testing


Over the last decade I have done a lot of A/B testing. It started with my blog, and now we run new A/B tests at Fashion Metric just about every week. When it comes to conversion optimization, A/B testing is absolutely critical. Here's the problem, though: many people screw up their A/B tests without knowing it, leading to false conclusions.

There are a lot of great tools for running A/B tests, but my personal favorite is Optimizely. I like Optimizely because it's simple: you can quickly set up a test and see results in a few days. That being said, even simple tools can be used the wrong way, and running a bad test could cause you to change your site in a way that actually hurts your conversion rate.

So if you're just getting started with A/B testing, here are three things that could screw up your tests:

  1. Testing too many things at once – the best A/B tests focus on one specific element of your page. Maybe you move a button from the left to the right, change a headline, or change a button color. If you change four things at once, you'll never know which one made the difference, which could lead you to make changes to your site that are less than optimal.
  2. Testing with completely different traffic sources – testing one set of site changes on visitors coming from Facebook and another on visitors coming from Twitter is a recipe for disaster. Every traffic source brings different customers with different behavior, so you end up comparing audiences instead of variants.
  3. Not testing with enough traffic – you could easily send 50 people to your website and use that data to make a decision. The problem is you might see a completely different result (possibly the exact opposite) if you send 5,000 people to your site. The more traffic the better; too little and your results won't be statistically meaningful. There's a quick way to sanity-check this, sketched right after this list.
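To make point 3 concrete, here's a minimal sketch in Python of a two-proportion z-test, a standard way to check whether the difference in conversion rate between two variants is statistically significant. The function name and all the visitor counts below are hypothetical, purely for illustration; tools like Optimizely run this kind of math for you behind the scenes.

```python
# Minimal two-proportion z-test sketch (hypothetical numbers throughout).
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference in
    conversion rate between variant A and variant B."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 50 visitors per variant: a 10% vs. 16% conversion rate looks like a big
# win, but the p-value (~0.37) says it could easily be noise.
print(two_proportion_z_test(conv_a=5, n_a=50, conv_b=8, n_b=50))

# The exact same rates at 5,000 visitors per variant are highly
# significant (p-value effectively zero).
print(two_proportion_z_test(conv_a=500, n_a=5000, conv_b=800, n_b=5000))
```

Same conversion rates, opposite conclusions about whether to trust them; that's exactly why 50 visitors isn't enough to call a winner.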

Photo Credit: robertpmeade via Compfight cc

Morgan Linton