There are millions of Web pages online that represent the "best guesses" of the marketing and Web experts who created them.
A large number of these pages appear to be working just fine. Indeed, many are profitable and clearly deliver good conversion rates.
But could they do better? Could the same pages have their design and writing tweaked to deliver better results?
Nobody will ever know. Unless, of course, the pages are tested.
A Shocking Truth for Intuitive Experts
Put simply: However experienced and smart you may be, you really can't tell—on the basis of your personal expertise alone—whether a Web page is working at its best.
You think you can? Then consider this...
My company, MarketingExperiments.com, shares test data on some aspect of online marketing every two weeks. During one of these teleconference calls we showed our 200 participants three versions of the same page.
Version A was the original page. Test pages B and C were two other versions designed to beat A.
Before sharing the results with our 200 call participants, we asked them to tell us which version—A, B or C—they thought was the winner.
In other words, we asked them to put their expertise on the line and guess which page delivered the best results. All they had to do was look at all three versions on a page we created online, and then check one of three boxes, selecting which page they thought was the winner.
The results of our little poll were interesting, to put it mildly.
The actual results of our test showed us that version B outperformed the original, A, by 15.57%.
Version C underperformed the original by a horrible 53.28%. It was a disaster.
Here are the figures:
A/B/C Split Test
|                    | Page A | Page B  | Page C  |
|--------------------|--------|---------|---------|
| Percent of Traffic | 34%    | 33%     | 33%     |
| New Sales          | 244    | 282     | 114     |
| Change             | N/A    | +15.57% | -53.28% |
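The "Change" column is a simple relative lift against the control page A, computed from the raw sales counts (the traffic split is close enough to even that this matches the article's figures). A minimal sketch of the arithmetic:

```python
def relative_lift(control_sales: int, variant_sales: int) -> float:
    """Percentage change of a variant's sales relative to the control's."""
    return (variant_sales - control_sales) / control_sales * 100

# Sales counts from the table above
lift_b = relative_lift(244, 282)  # +15.57%
lift_c = relative_lift(244, 114)  # -53.28%
print(f"B: {lift_b:+.2f}%  C: {lift_c:+.2f}%")
```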
Now look at the results of the poll we conducted with the call participants before revealing the actual test figures. These percentages represent their guesses at which page would perform best:
| Value | Label  | Count | Frequency |
|-------|--------|-------|-----------|
| 1     | Page A | 22    | 12.50%    |
| 2     | Page B | 51    | 28.98%    |
| 3     | Page C | 103   | 58.52%    |
|       | Total  | 176   | 100.00%   |
As you can see, over 70% of the 176 participants who voted failed to pick B as the winner. Fewer than 30% correctly anticipated that B would deliver the most sales. And almost 60% chose the page that decreased sales by over 50%.
Why Isn't Every Page Tested?
It is very simple and very inexpensive to test two or more versions of a Web page. So why don't we test all Web pages as a part of a discipline that is integrated into our processes?
First, designers and writers justify their salaries and reputations on a foundation of assumed expertise. In other words, a Web copywriter or designer will rise through the ranks on the assumption that he or she has significant, money-making expertise.
But what happens to that person's reputation if he or she is shown to be consistently wrong through the practice of simple A/B split testing? Is it really surprising, then, that one will sometimes meet resistance to testing from online writers and designers?
But let's not just point fingers at the creative group. The practice of testing is also a perceived threat to many senior executives.
Here's another case in point. We recently designed and wrote a new page to test against a client's version. The evening before the test took place, our contact at the client called to cancel the test. Why? Because he feared that our version would win. And, if it did, the level of political fallout at the company would be unacceptable.
In other words, too many senior people within the company had put their stamp of approval on the original version. And the internal culture of the company was such that it couldn't tolerate the possibility that those senior people might be wrong.
Consistent Testing Will Challenge a Culture
From a rational and commercial standpoint, testing alternative versions of Web pages makes complete sense.
Online testing is fast and inexpensive, and it yields optimized pages that deliver better results.
The challenge that the online industry faces is not one of justifying the practice of testing. It needs no justification. Its benefits are self-evident.
The challenge is to the culture within companies. Testing is a huge threat to anyone who rises within a company on the foundation of assumed expertise.
Nevertheless, if the mandate of a company is to make money, then boards of directors everywhere should insist on Web page testing. And all those people whose reputations and incomes float on a bubble of assumed expertise should beware.
How to Get Started With Testing
The basic practice of A/B split testing is very simple.
Look at your original page and then create a new version designed and written to deliver better results, whether that means more sales, more subscriptions or more click-throughs.
If you are new to testing, you would do best to test one small change at a time. Perhaps just test a different headline, or the color of the page background, or the position of an image, or the size and position of the "Buy" button, or the price of whatever you are selling, if appropriate.
Then ask your IT team to deliver each version of the page, alternately. In other words, visitor number one to the page sees version A, visitor number two sees version B, visitor number three sees version A... and so on. And, of course, you need to track the performance of each page.
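The alternating-delivery scheme described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the version labels, counters, and tracking dictionary are all hypothetical stand-ins for whatever your IT team or analytics tool actually uses.

```python
import itertools

# Round-robin assignment: visitor 1 sees A, visitor 2 sees B,
# visitor 3 sees A again, and so on.
versions = itertools.cycle(["A", "B"])

# Hypothetical per-version counters for tracking performance
stats = {"A": {"visitors": 0, "sales": 0},
         "B": {"visitors": 0, "sales": 0}}

def assign_version() -> str:
    """Give the next visitor the next page version in rotation."""
    version = next(versions)
    stats[version]["visitors"] += 1
    return version

def record_sale(version: str) -> None:
    """Credit a conversion to the version the buyer saw."""
    stats[version]["sales"] += 1

def conversion_rate(version: str) -> float:
    """Sales per visitor for one version of the page."""
    s = stats[version]
    return s["sales"] / s["visitors"] if s["visitors"] else 0.0
```

In practice you would also persist the assignment in a cookie or session, so a returning visitor keeps seeing the same version, and you would run the test long enough for the difference between versions to be meaningful rather than random noise.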
If you don't have the expertise to conduct the technical side of A/B split testing internally, you can always use an outside service like TestLab@MarketingProfs, Vertster.com, Optimost.com or Offermatica.com.
However you choose to proceed, make a move now. Do something. Make the commitment to start testing. It may be an uncomfortable change for some companies, but the ultimate outcome will be improved site performance and a healthier bottom line.