The Power of Split-Testing
by Steve Jackson
Direct marketing professionals don't guess. They base their decisions on statistics and calculate what the return on their investment will be.
Because of what is known as split-testing or test runs, direct marketers rarely get it wrong. Why then do Web “professionals” rarely if ever pay attention to this incredibly powerful strategy?
This article is about how to run a split test, or A/B test, on your Web site, and how doing so can only improve your conversion rate.
What has direct marketing got to do with it?
I used to work for one of the biggest direct mailing houses in the UK. It sent over 100 million letters a month for many different companies to many different types of prospect, over 1 billion letters a year.
Its success was largely due to its testing methods. If it had a mail shot of 100,000 sales letters, it would do a test run on 1,000 targeted people and measure the response. It's fair to say that if it got a 2% response from those 1,000 people, it would get the same response rate from the same target market in the larger mailing. So from 100,000 people it would bring in 2,000 sales.
In that way, it was very easy for this company to justify the expense incurred in sending the mail shot and optimizing the sales letter. The customers of this company knew what the return on investment (ROI) was going to be. So they lined up to get this company to sell their products.
There were still times when something went wrong, of course, but never by a large enough margin that a customer lost money. This was all because of the testing: the company would never do a mail-out without a measured test run, and it never tested with fewer than 1,000 people.
(Incidentally, the highest conversion rate we achieved with this method while I worked for the company was a 4% sales conversion from 1 million sales letters: 40,000 sales, which produced a profit for the customer of over $200,000.)
What has this got to do with Web marketing?
In the direct marketing world, there is a cost incurred for the test run: the letters have to be optimized, printed and posted, the response measured, and the potential ROI calculated. On the Web, you can do the same testing and achieve similar results at a fraction of the cost.
Why? Because on the Web it's easy to test and optimize your Web pages in the same way, simply by using a good measurement system to measure the results.
What is a split run?
A split run is where you measure a new idea or way to sell a product against a control or default that you know works.
So you might have two pages of sales copy: one that you know converts 2% of a targeted audience, and another that sells the same thing but at an unknown rate.
How do you do this online?
First, you need to invest in a decent measurement system: something that accurately records the number of people arriving at your Web site and the pages they land on.
Second, you need a system, or a Web developer at your disposal, that lets you alter your pages as required. This is your investment. If you aren't prepared to invest in your own future, you should question whether you want a Web site to sell products or services at all.
Then you need to define what variables will make a difference to your Web sales strategy.
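To make the first step concrete, here is a minimal sketch of what "a decent measurement system" looks like at its most basic: a script included at the top of every page that records each arrival, the landing page and the referrer. The file name and log path are hypothetical, and most people will simply use an off-the-shelf analytics package instead; the point is only to show what needs to be captured.

<?php
// log_pageview.php - minimal sketch of a page-view logger (hypothetical file name).
// Records the time of arrival, the landing page and the referrer for each request.
$logFile = '/var/log/site/pageviews.log';  // assumed path, adjust for your server

$entry = sprintf(
    "%s\t%s\t%s\n",
    date('Y-m-d H:i:s'),                                              // time of arrival
    isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/',   // landing page
    isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-'  // where they came from
);

file_put_contents($logFile, $entry, FILE_APPEND | LOCK_EX);
?>

Including a script like this on every page (for example with include 'log_pageview.php';) gives you the raw numbers that the split tests described below are built on.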
Here's where it gets interesting
You can write a Web sales page in much the same way as a direct mail letter. Plenty of small businesses do this quite effectively, and many of them have run tests and measured the response. The ones that succeed are the ones that test.
That, however, is a single Web page. A Web site is different. On a Web site, your landing pages should convince the visitor that continuing further into your site is worth the effort. In other words, you try to stop people from what is known as “bouncing”: arriving at the site, looking around, finding nothing relevant and leaving.
Whether you're dealing with a single Web page or a whole Web site, split-testing can and should be applied to the following three content sections in order to best cater for your readership. Remember, it's not your Web site that counts; it's your visitors' experience. The better their experience, the better your chance of improving your conversion rate.
1. Headlines
We have run tests on headlines that improved readership conversion by 36%. In other words, by using a better headline we stopped 36% more people from leaving our Web site.
We tested by using a split test. We knew that our home page converted (got people to click through to another page) around 50% of our audience. We considered this poor for our main landing page and so wrote a programming script that alternated between two headlines to see if it made a difference.
We ran this split test until 1,000 people had visited the page: 500 had seen the default headline and 500 had seen the new one. Of the 500 who saw the new one, just over 100 bounced, or left without doing anything. That was a distinct improvement over the default page, which was consistent in its mediocrity: only 243 of its 500 visitors actually went further into our Web site.
So we switched the headline, and the following month 86% of our new visitors went further into the Web site. The reason is quite logical: people read headlines. If the headline doesn't strike a chord, the reader won't bother reading the rest of the page and will simply leave.
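For readers who want to try this themselves, here is a minimal sketch of the kind of alternating script described above. The headline texts, file name and log path are placeholders rather than our actual test, and a real test should also record which visitors went on to convert.

<?php
// headline_test.php - minimal sketch of a 50/50 headline split test.
// The two headlines and the log path below are placeholders, not our real test.
$headlines = array(
    'A' => 'Default headline that currently converts around 50%',
    'B' => 'New headline we want to test against it',
);

// Pick a variant at random so roughly half of all visitors see each headline.
$variant  = (mt_rand(0, 1) === 0) ? 'A' : 'B';
$headline = $headlines[$variant];

// Record which variant was served; the measurement system can then compare
// how many visitors from each group went further into the site.
file_put_contents(
    '/var/log/site/headline_test.log',
    date('Y-m-d H:i:s') . "\t" . $variant . "\n",
    FILE_APPEND | LOCK_EX
);
?>
<h1><?php echo htmlspecialchars($headline); ?></h1>

In practice you would also set a cookie so a returning visitor keeps seeing the same variant, and you would stop the test once each headline has been shown to your target sample (we used 500 visitors per variant).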
2. Links
Links within content can also be split-tested. We tested two versions of one link, and the winner drew a 100% improvement in click-through.
It's not so much where the link goes as how attractive it is. When you use a link like a headline, or as part of an explanation, it draws the reader deeper into the site. The content the link leads to should continue the process.
In our split test, our default link said, “Click here to read the rest of this article.” Our tracking system reported that the link got 49 clicks from 500 visitors, or just under 10%. Our test version used the headline of the article as the link. The headline read, “Without Conversion Rates, You Don't Know If You're Mickey Mouse Or Mickey Mantle.” That improved click-through to 102 from 500, or just over 20%, a 100% improvement.
Again, our method was a simple PHP script that alternated between the two versions of the link.
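A sketch of that link test might look like the following. The link texts are the ones quoted above, but the target URL and query parameter are placeholders; tagging the destination with the variant is one simple way to let your measurement system count click-throughs per version.

<?php
// link_test.php - minimal sketch of the link split test described above.
// Link texts are from the article; the target URL and parameter are placeholders.
$links = array(
    'A' => 'Click here to read the rest of this article.',
    'B' => "Without Conversion Rates, You Don't Know If You're Mickey Mouse Or Mickey Mantle",
);

$variant = (mt_rand(0, 1) === 0) ? 'A' : 'B';

// Tag the destination with the variant so clicks can be counted per version.
$url = '/articles/conversion-rates.php?v=' . $variant;

echo '<a href="' . htmlspecialchars($url) . '">'
   . htmlspecialchars($links[$variant])
   . '</a>';
?>

Clicks then show up in your logs as requests for the tagged URL, so you can compare the two versions over the same number of impressions, as we did with 49 clicks against 102.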
3. Navigation Systems
We have just finished testing navigation on one of our Web sites. One version used DHTML drop-down menus, which meant a visitor could get to any section of the site with one click of the mouse.
The other was a simple set of links on the left-hand side: less efficient in terms of the number of clicks needed to reach a page, but very obviously a set of links you're supposed to click.
Because of this split test, we are going to change our entire Web site structure to use the simple links on the left-hand side. Of the 500 visitors who saw the ordinary navigational links, 30% more clicked them than clicked the DHTML menus. The DHTML version, while built to make navigation potentially easier, proved less effective in the results we've seen.
While this is true of our site, it differs for others. Sites where it is important to find things easily will likely benefit from DHTML menus of this sort. But our site doesn't really need it.
What can't you test?
I could go on. You could test different versions of graphics, subscription and sales forms, background and text combinations, link colors, and button wording (“buy now,” “order now,” “buy,” “order”); you can even test paragraphs of content. There is nothing on a Web site that cannot be split-tested.
How do minor changes like this affect conversion?
Apart from the improvements in click-through and readership, the changes affect sales.
Better-targeted headlines get more immediate readership. That starts a chain reaction: improved links get more click-through, and more click-through means more interest in your products and services. In the end, you get a better prospect and sales acquisition rate; in other words, more sales conversions.
Split-testing shouldn't be a new idea on the Web; it should be the norm.
---
Steve Jackson (steve@conversionchronicles.com) is editor of the Conversion Chronicles and CEO of Aboavista, a Finnish company that improves Web conversion rates.