If you like tinkering with your website, then you have probably heard of A/B or multivariate testing. These let you quickly test new things on your website, such as copy, images, call-to-action buttons, and placement, and see which combination leads to more conversions. A/B testing pits two versions against each other that could be completely different, whereas multivariate testing varies multiple areas of the page and tests all possible combinations. So, if you had 3 versions of a headline, 3 versions of an image, and 3 versions of a button, you would have 27 possible combinations in a multivariate test.
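The combination count is just the product of the variant counts for each element being tested. A quick sanity check of the headline/image/button example above:

```python
# Each page element being tested contributes a factor to the total
# number of combinations in a multivariate test.
headline_variants = 3
image_variants = 3
button_variants = 3

combinations = headline_variants * image_variants * button_variants
print(combinations)  # 27
```

Add a fourth element with 2 variants and the count doubles to 54, which is why multivariate tests need so much traffic.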
Traditionally, multivariate testing has given each possible combination equal play, meaning each combination is shown to an equal share of your traffic. Once a visitor is exposed to one of the test combinations, they are given a cookie so that each time they return while the experiment is still in progress, they see the same combination. Google Website Optimizer is a traditional multivariate testing tool: its algorithm determines the winner based on how many combinations you have and how many visitors are required to declare a winner with statistical confidence. Their model shoots for a 12% minimum improvement in conversion rate at an 80% confidence level.
Now we have what’s called adaptive multivariate testing, offered by a company called Hiconversion. The setup is the same as a standard multivariate test, but instead of giving equal play to each possible combination, the tool concentrates traffic on the page combinations that are consistently producing conversions. They claim that their methodology dramatically reduces the amount of traffic required to reach a statistically correct “winner”. This real-time adaptation also reduces the number of lost leads or sales conversions that a typical test can produce.
This is sort of how Google AdWords works when you opt for the “auto-optimize” feature: instead of displaying your ads evenly, it displays the best-performing ads more often.
Well, I personally was never a believer in AdWords’ optimization feature. I thought it determined a “winner” way too early, so I’ve always turned the feature off and done my own split testing within the ad groups themselves.
I initially had the same skepticism about the Hiconversion tool. First of all, how could you really determine a winner if all combinations were not played evenly? If one was played more than another, then of course that combination would win!
So I began to ponder the mathematics involved (I know…that’s what I do). First let me say that I do not know how the Hiconversion algorithm works (I did ask them, but they said they would have to shoot me), so this is just my rationalization.
Let’s imagine that our multivariate test has 100 possible combinations. As the testing starts, each combination gets equal play, or 1% of the traffic. The algorithm can quickly see which combinations are starting to get higher conversion rates, until it reaches a point where each one begins to test itself. (Bear with me.) So if a few combinations are starting to look like good performers, the algorithm might say, “OK, you got 5 conversions on 100 plays in X time…let’s see if you can get the same 5 conversions or better over the next 100 plays in the same time period.” If the combination meets or exceeds the mini-test, it moves up to the next “level,” where it is now played, say, 3% of the time. If it doesn’t, and let’s say it falls to 4 or 3 conversions, then perhaps it stays among all of the other combinations tested at 1% play. The combinations keep getting tested in this manner until there are only a few left, and a “winner” prevails. All of this happens very quickly, and in real time. It’s a survival-of-the-fittest scenario, because the combinations that can’t leap to the next level get played less and less over time, since the ones that are performing are eating up their play time.
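The promote-or-demote loop above can be sketched as a toy simulation. To be clear, this is purely my rationalization, not Hiconversion’s actual algorithm; the 5% mini-test threshold and the 1.5x/0.8x weight multipliers are made-up numbers, and the “true” conversion rates are hypothetical:

```python
import random

# A toy "survival of the fittest" simulation: combinations that keep
# passing their mini-tests are promoted to more play time, the rest
# are demoted and slowly starved of traffic. Made-up parameters.

def run_adaptive_test(true_rates, rounds=50, plays_per_round=100, seed=42):
    rng = random.Random(seed)
    weights = [1.0] * len(true_rates)  # every combination starts with equal play
    for _ in range(rounds):
        total = sum(weights)
        for i, rate in enumerate(true_rates):
            plays = int(plays_per_round * weights[i] / total)
            conversions = sum(rng.random() < rate for _ in range(plays))
            # mini-test: did this combination convert at least 5% of its plays?
            if plays and conversions / plays >= 0.05:
                weights[i] *= 1.5  # promote: more play time next round
            else:
                weights[i] *= 0.8  # demote: the winners eat up its play time
    total = sum(weights)
    return [w / total for w in weights]  # each combination's share of play time

# Three hypothetical combinations converting at 2%, 5%, and 10%:
shares = run_adaptive_test([0.02, 0.05, 0.10])
print(shares)
```

After 50 rounds the 10% combination ends up with nearly all of the play time, which is the “winner prevails” behavior described above, and it got there without every combination receiving equal traffic for the whole experiment.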
Anyway, that’s my take on how it might work, but again, I didn’t design the tool. My ultimate experiment would be to run identical experiments in Google Website Optimizer and Hiconversion and see if they arrive at the same result. Of course, I can’t experiment with all of our clients’ leads and sales, so maybe I’ll try this on my own site one day.
So often we are concerned with the “big conversion” on the website, like purchasing something. We call this a macro conversion: it’s your ultimate goal. But what about other activities that may not be as valuable, but are still worth something?
We forget that marketing basically breaks down into three pieces: Awareness, Consideration, and Purchase.
Everyone, including upper management, is zeroing in on Purchase. But what about Awareness? Remember the circles-of-trust graphic? It’s highly unlikely that many people will purchase from you when they don’t know you.
My point: create micro conversions in the Awareness and Consideration stages and measure them!
Think of things like entering a zip code, joining a mailing list, or subscribing to your RSS feed. Now you have a chance to converse with some high-potential future customers through a permission-based marketing system, versus an interruption marketing system.
Now, assign a value to these micro conversions. A zip code might be worth $1 to you. Asking for a zip code is great because it captures a level of detail that geo-tracking in Google Analytics can’t. Now you know which zip codes your visitors are from, which takes some of the guesswork out of your next direct mail piece.
Use this value to compare against the costs you’ve put into each activity, such as SEO, PPC, or even web analytics. Before long, you will be able to see which activity is driving the most value.
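The bookkeeping is simple enough to sketch. All of the dollar values, channels, and conversion counts below are hypothetical, just to show the value-versus-cost comparison:

```python
# Hypothetical per-conversion values for each micro conversion.
micro_values = {"zip_code": 1.00, "mailing_list": 5.00, "rss_subscribe": 2.00}

# Made-up monthly cost and conversion counts for two activities.
activities = {
    "SEO": {"cost": 500.00,
            "conversions": {"zip_code": 400, "mailing_list": 50}},
    "PPC": {"cost": 800.00,
            "conversions": {"zip_code": 450, "mailing_list": 60,
                            "rss_subscribe": 25}},
}

def activity_value(conversions):
    """Total dollar value of an activity's micro conversions."""
    return sum(micro_values[kind] * count for kind, count in conversions.items())

for name, data in activities.items():
    value = activity_value(data["conversions"])
    roi = value / data["cost"]
    print(f"{name}: ${value:.2f} of value for ${data['cost']:.2f} spent "
          f"(ROI {roi:.2f})")
```

With these made-up numbers, SEO returns $650 of micro-conversion value on $500 spent while PPC only breaks even, which is exactly the kind of comparison that tells you where to put the next dollar.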