Website optimization and analytics

A/B testing may not be as powerful as multivariate testing, but if you are starting out with website testing I would personally recommend doing some A/B experiments first. They are just easier to set up and implement, and you are also far more likely to see results sooner than with a multivariate experiment.

So I was pretty excited about the announcement that Website Optimizer would have separate wizards for A/B and multivariate tests, and I can confirm that setting up an A/B test is much quicker than a multivariate one. There is just one annoyance: the wizard insists on validating the experiment tags on the live pages. That is fine for the original page – say the homepage – and the test page(s), but it makes no sense to assume the conversion page is directly accessible, because it could sit at the end of a checkout process. The workaround is to create a dummy conversion page, slap the tracking script on it, and let the wizard run its validation check against that. Once the wizard is happy, you can move the tracking code onto the actual conversion page.
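If it helps, here is a minimal sketch (Python, purely as an illustration) of generating that throwaway conversion page. The conversion tracking snippet itself has to be copied from the wizard – the HTML comment below only marks where it would go, and the file name is a made-up example.

```python
# Sketch only: write a throwaway "thank you" page that the Website Optimizer
# wizard can reach for tag validation. The real conversion snippet must be
# pasted in from the wizard itself; the HTML comment below just marks the spot.
from pathlib import Path

DUMMY_PAGE = """<!DOCTYPE html>
<html>
  <head>
    <title>Temporary conversion page</title>
    <!-- paste the conversion tracking script from the wizard here -->
  </head>
  <body>
    <p>Placeholder page used only so the wizard's tag check can pass.</p>
  </body>
</html>
"""

def write_dummy_conversion_page(path: str = "dummy-conversion.html") -> Path:
    """Write the placeholder page somewhere publicly reachable on the site."""
    out = Path(path)
    out.write_text(DUMMY_PAGE, encoding="utf-8")
    return out

if __name__ == "__main__":
    print("Wrote", write_dummy_conversion_page())
```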

Anyway, as reported earlier, Website Optimizer only reports the number of visitors and conversions. That can be a fairly crude measure – what about other metrics such as profitability, average order size, and so on?
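For what it's worth, one way to get at those richer metrics is to join the experiment's per-variation visitor counts with your own order data. Here is a rough sketch – every name and number below is hypothetical, and nothing in it comes out of Website Optimizer itself:

```python
from collections import defaultdict

# Hypothetical export: visitors per variation, plus an order log tagged with
# the variation each buyer saw. All names and figures are illustrative.
visitors = {"Original": 1250, "Variation 1": 1240}
orders = [
    ("Original", 54.00), ("Variation 1", 61.50), ("Original", 32.25),
    ("Variation 1", 88.00), ("Original", 47.10), ("Variation 1", 45.95),
]

order_count = defaultdict(int)
revenue = defaultdict(float)
for variation, value in orders:
    order_count[variation] += 1
    revenue[variation] += value

for variation, n in visitors.items():
    conv_rate = order_count[variation] / n
    avg_order = revenue[variation] / order_count[variation]
    rev_per_visitor = revenue[variation] / n
    print(f"{variation}: conversion rate {conv_rate:.2%}, "
          f"average order ${avg_order:.2f}, revenue per visitor ${rev_per_visitor:.3f}")
```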

Take this (real) example:

[Screenshot: Website Optimizer A/B test report]

Obviously no winner yet, but let's look at what analytics shows us:

[Screenshot: Google Analytics bounce rates for the original and test pages]

Here the test page has a (significantly?) lower bounce rate than the original. So my question is: is this sort of data valid to consider when picking a winner? Even though overall conversions are the same, I like a lower bounce rate…
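As a rough way of answering the "significantly?" part, a two-proportion z-test on bounces-per-visit is one sanity check. The sketch below uses made-up counts, not the figures from the screenshots:

```python
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for the difference between two proportions,
    e.g. bounces out of visits for the original vs. the test page."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts: 480 bounces out of 1,000 visits to the original page
# versus 410 bounces out of 1,000 visits to the test page.
z, p = two_proportion_z_test(480, 1000, 410, 1000)
print(f"z = {z:.2f}, p-value = {p:.4f}")  # p < 0.05 would suggest a real difference
```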