Speaker diversity at conferences

I enjoyed going to the Conversion Conference recently in San Francisco and picked up a few great nuggets to try out on my sites. I especially like it when speakers approach conversion rate optimization from a different perspective than A/B or multivariate testing. Michael Summers’ presentation on eye tracking was quite literally an eye-opener for me.

I also appreciate hearing directly from retailers and practitioners doing conversion optimization, rather than from consultants doing CRO for retailers. Don’t get me wrong, I got a lot of great ideas from consultants, but it is important to get different perspectives, if only to mitigate biases, however subtle they may be.

The bias I see is that many presentations talked about the successes of website testing. Somehow, most experiments ended up beating the original. The problem is that if only successful experiments get reported, you might get the impression that A/B testing itself is what produces the gains: get a tool and watch the improvements roll in.

This type of bias, called survivorship bias, is described in the book Fooled by Randomness by Nassim Taleb. An example he uses is the book “The Millionaire Next Door”, which studied the habits of people who became millionaires and then recommended following them. But as Taleb points out, “That all millionaires were persistent, hardworking people does not make persistent hard workers millionaires”, and “We are trained to take advantage of the information that is lying in front of our eyes, ignoring the information we do not see”.

So we need to know about the failures, even though they are less sexy than successes.

You shouldn’t care that site XYZ improved conversion rates by changing the call to action on the Add to Cart button. What would be more interesting to know is how many other sites tried exactly the same experiment, and what percentage of those experiments succeeded. Kinda like an experiment on the experiments ;-)

Wouldn’t it be much more informative to know that out of 100 different sites that changed the Add to Cart button, only 10 reported higher conversion rates? If I only hear about the 10 successful ones, I don’t get the full picture, because the other 90 are left out. Contrast that with a (hypothetical) meta-experiment where 30% of sites reported higher conversion rates.

Having this type of prior success probability would help you prioritize what to test in a more objective manner.
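To make that concrete, here is a minimal sketch in Python of how such priors could drive a test backlog. The test ideas, success rates, and lift figures below are entirely made up for illustration; the point is simply to weight each candidate’s average lift by its (hypothetical) historical success rate and rank the backlog by the result.

```python
# Hypothetical prior data: for each test idea, how often similar
# experiments beat the original across many sites, and the average
# lift when they did. All numbers are invented for illustration.
candidate_tests = [
    # (test idea, prior success rate, avg. lift when successful)
    ("Change the Add to Cart call to action", 0.10, 0.02),
    ("Simplify checkout to a single page",    0.30, 0.05),
    ("Add customer reviews to product pages", 0.25, 0.03),
]

def expected_lift(prior_success, avg_lift):
    """Crude expected value of running the test: the chance it wins
    times the lift you'd get if it does."""
    return prior_success * avg_lift

# Rank the backlog by expected lift, highest first.
ranked = sorted(candidate_tests,
                key=lambda t: expected_lift(t[1], t[2]),
                reverse=True)

for idea, p, lift in ranked:
    print(f"{idea}: expected lift {expected_lift(p, lift):.2%}")
```

A real version would also factor in implementation cost and the traffic needed to reach significance, but even a crude ranking like this is more objective than gut feel.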

I’m not saying it’s easy to get this type of data, but you should still be aware that you may not always get the whole picture. Conferences naturally attract like-minded people, so it’s very important to have a diverse group of speakers, and I hope that conferences like the Conversion Conference will continue to do that. I will certainly go again in the future.