Eliminate the guesswork with A/B testing

Deciding exactly what to do with your business’s online marketing can be a real challenge. Let’s say, for example, that your website gets plenty of traffic but you don’t think it’s bringing in as many customers as it should. The site is informative and it looks nice — hey, there’s even a call-to-action featured prominently on the home page — but only a tiny fraction of your site visitors follow through by submitting a contact form or sending an email. A junior rocket scientist on your staff suggests that maybe something on the site should be changed to drive more conversions. All you have to do is figure out what sort of change will deliver the higher conversion rate your business needs.

Before you start making wholesale changes — or decide to bite the bullet and invest in a completely new site — stop and think for a minute. The problem very often lies in the details. Maybe you only need to make one or two small changes to see a significant increase in the number of conversions. But how do you decide which changes will have the biggest impact?

The answer is simple: Testing.

Online marketing is a data-driven business. Nearly everything of business value that happens on the Internet — website visits, ad click-throughs, email responses — can be logged, quantified and analyzed. As a result, it’s fairly straightforward to set up a testing environment that uses all the data collected through your website to help guide your decision-making process.

A/B testing, also known as split testing, is the easiest testing methodology to set up and probably the easiest type of testing to use as a guideline. The idea behind A/B testing is simple: you set up two variations of a web page and randomly direct each visitor to one of them. The “testing” part comes in when you measure the difference in visitor response between the two variations.
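To make that concrete, here is a minimal sketch in Python of how a split test works. The variation names and the simple in-memory counters are hypothetical stand-ins for whatever analytics tool you actually use:

```python
import random

# Hypothetical names for the two page variations being tested.
VARIATIONS = ["A_original_headline", "B_new_headline"]

# Tallies of how many visitors saw each variation and how many converted
# (for example, by submitting the contact form).
visits = {v: 0 for v in VARIATIONS}
conversions = {v: 0 for v in VARIATIONS}

def assign_variation():
    """Randomly pick which variation this visitor will see (a 50/50 split)."""
    return random.choice(VARIATIONS)

def record_visit(variation):
    visits[variation] += 1

def record_conversion(variation):
    conversions[variation] += 1

def conversion_rates():
    """Conversion rate for each variation: conversions divided by visits."""
    return {v: conversions[v] / visits[v] if visits[v] else 0.0
            for v in VARIATIONS}
```

In a real test you would assign each visitor a variation when the page first loads, keep serving that same variation on return visits, and let the test run until both versions have collected enough traffic to compare.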

You don’t have to invest in a ton of new design work in order to get a lot of value out of A/B testing. The changes between the two web page variations don’t have to be big. The differences can be as minor as switching out headlines or moving the location of the call-to-action. It’s surprising how often very small changes can cause big improvements in conversion rates.

A/B testing isn’t just limited to websites. In fact, a lot of people who run pay-per-click advertising campaigns may be using an A/B testing method without even knowing it. In Google AdWords, “ad variations” let you run different versions of text for the same ad, and AdWords then provides conversion metrics so you can see which version (if any) works better. Many of the popular email marketing programs, such as iContact and Constant Contact, also let you send different versions of an email to test groups to see which one gets the best response.

An important point to keep in mind is that A/B testing relies on statistical methods for its results. That means any kind of split test you want to set up — web page, online ad or marketing email — requires a large enough sample of responses to be effective. In simple terms, if your website doesn’t get enough traffic (or your ads/emails don’t get enough views), then your A/B test won’t return a statistically significant result. In that case, you need to work on increasing raw traffic numbers before you worry about the details of fine-tuning your conversions.
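As a rough illustration of why sample size matters, here is a sketch of a standard two-proportion z-test in Python. The visit and conversion numbers below are made up purely for the example:

```python
import math

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-test: is the difference in conversion rates
    between variation A and variation B statistically significant?"""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Made-up numbers: variation A converted 40 of 2,000 visitors (2.0%),
# variation B converted 60 of 2,000 visitors (3.0%).
z, p = two_proportion_z_test(40, 2000, 60, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a p-value below 0.05 is a common threshold
```

With those made-up numbers the difference just clears the usual 0.05 threshold. Run the same 2% versus 3% conversion rates with only 200 visitors per variation, though, and the test comes back nowhere near significant, which is exactly why a low-traffic site needs to build up its raw numbers first.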

 

For more information on setting up A/B Testing, please feel free to call us at 800-709-3240 and speak with one of our marketing consultants.