October 3, 2018

How I became an A/B-testing addict

Olga Shavrina
Every product manager says "You must use A/B testing!" But do you know somebody who really does it on a day-to-day basis? I'm happy for you if you do :) I didn't realise the true value of A/B testing until I started working at Nautal – a marketplace with a volatile conversion rate that changes a lot depending on the season, the region and even the weather.

Any business related to sport, tourism or other outdoor activities feels this instability. In our nautical sector, for example, conversion is highest in summer and starts to drop in September. The average order value and the decision-making time move the opposite way: in winter it takes people days or even weeks to decide on a deal, while in summer they need a boat right away. The share of mobile versus desktop users also varies with the season.

And the weather – oh yes, the weather is a pain.
– Hey, what's happened to our conversion rate in France?
– Well, it's raining there today.
– O_o
Changes are fast and dramatic: one week can naturally perform 20-30% better or worse than another. In these circumstances it's practically impossible to say whether your UX improvement made any difference, so without A/B testing you are almost blind.
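To feel how easily pure noise can look like a real effect, here is a toy simulation. It's only a sketch with made-up numbers – a fixed 5% "true" conversion rate and roughly a thousand sessions a week, nothing from our real data:

```python
import random

# Toy model: the design never changes and the true conversion
# rate is a constant 5%, yet the measured weekly rate still swings.
random.seed(7)
TRUE_RATE = 0.05

for week in range(1, 5):
    sessions = random.randint(800, 1200)  # traffic also varies by week
    conversions = sum(random.random() < TRUE_RATE for _ in range(sessions))
    print(f"week {week}: {conversions / sessions:.1%} conversion")

# The printed rates easily differ by double-digit percentages from
# week to week, even though nothing about the product changed.
```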
Tools for A/B testing
I had previously heard about Optimizely – a popular A/B-testing tool with an awesome knowledge base. But it is so expensive that there aren't even any prices on the website :) It costs thousands of dollars per year.

So we chose Google Optimize instead. It has more or less the same functionality as Optimizely, but it is free. The cool thing is that it works together with Google Analytics, where you can view detailed reports.

OK then, let's start with the simple things and see how it works.
Testing colors, paddings, fonts etc.
Have you heard the phrase "A button can be any color, as long as it is red"? Now we can prove it right or wrong.

Optimize asks which page you want to use as the base for an experiment and loads it into the visual editor. There you can select any block and change its color, size, paddings, borders... in short, everything that can be done with CSS.
Google Optimize – editing the style of an element
Experiments with content
You can change headers, captions, links… basically all the static content on the page.

Together with audience targeting, this feature can be used to personalise the page. For instance, you can show custom headlines to people who came from an ad, visitors from a particular region, repeat visitors or mobile users.
Google Optimize – editing the content
Important! All of this must be checked and tested carefully, because it is very easy to break something, e.g. the website localisation. It is also important to be careful when changing dynamic content and interactive elements. Say, if you change a search field's placeholder, you can break the whole search.

The general rule is this – first the original page loads with all its static and dynamic content, then the Optimize changes are applied on top. We just have to take that into account.
Block sorting
One of my favourite types of experiment is block sorting. The position of blocks on a landing page or product page can influence the conversion rate pretty strongly.

Drag and drop blocks wherever you want. You can put a block anywhere on the page (which usually means breaking the HTML structure) or use the default option to reorder blocks of the same type – list items or same-level divs. At this point you will be filled with respect (or not) for your frontend developer and see how well the layout is done.
Google Optimize – reordering blocks
Reordering blocks can give a very good lift in conversion rate. Plus, this kind of experiment tells you whether a block is important for users at all.

Also, you can reorder blocks in one experiment variant, completely remove one of the blocks in another, and see what happens.

For example, our yacht description page has two blocks – "Secure booking" and "Contact the owner". Nobody knew whether they were necessary or which of them was more important.
That's why I ran an A/B test with three variants: the original, one with the blocks reordered and one without the "Secure booking" block.
An A/B experiment with three variants: the original, reordered blocks and a deleted block
The variant with reordered blocks consistently shows the best conversion rate. Interestingly, the variant without the "Secure booking" block performs worse than the original. So we can't delete the block, but swapping the two looks like a good idea.
How to test a new feature
Say you add something to a page and want to test whether it improves the conversion rate. It is super easy to check in Optimize: just remove the new block from one of the variants and launch an experiment. In a few days, compare the results and either celebrate or start pulling your hair out.
A bad result is also a result
Sometimes you don't really see a big difference between variants and the results are contradictory. It means that either the interface block is fine as it is and even your improvement can't hurt it :) or nobody is using the block in the first place.
Example of an A/B experiment where all variants perform more or less the same
You can try deleting the block or moving it to a different place and see how that influences the conversion rate.

But sometimes you see very clearly that one variant performs better than the others. That means you are on the right track, and it's a good idea to roll your improvement out across the whole website.
Example of an A/B experiment where one variant is a clear leader
To get a statistically significant result, it's better to wait for a couple of hundred conversions. Plus, Optimize itself recommends running experiments for at least two weeks to exclude the impact of intra-week fluctuations.
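If you want to double-check what "statistically significant" means for your own numbers, a standard two-proportion z-test is enough. Here is a minimal sketch – the session and conversion counts are invented for illustration:

```python
import math

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is B's conversion rate significantly
    different from A's? Returns the z statistic and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)    # rate under "no difference"
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Invented example: 4,000 sessions per variant,
# 180 conversions (4.5%) vs 220 conversions (5.5%).
z, p = conversion_z_test(180, 4000, 220, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")  # here p < 0.05 -> unlikely to be noise
```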
What if the result is positive
Say an A/B experiment confirmed that our idea was good and the new variant performs better than the old one. What should we do? We can (and in many cases need to) write up the task for the developers.

Even if we want the new feature in production right away, we know how long it usually takes to develop, test and deploy it. So, in order not to waste time, we can simply switch the best variant in Optimize to 100% of the audience. This way the best design reaches production fast, while the IT team gets the time to develop and test everything properly.
Switching the best variant to 100% of the audience
Targeting
Google Optimize has a pretty advanced rule builder for targeting the experiment audience – 11 groups of rules for making magic.
Targeting options in Google Optimize
I have been using two categories so far:
1. URLs → Contains… – specify a string that the page URL has to contain. Here you will (or won't) say "Thank you!" to your SEO engineer, because with a proper URL structure you can easily select whole groups of pages for your experiments.

E.g. at Nautal I can launch an experiment only on the landing pages for catamarans in France, or for sailing boats in Italy and Greece.

Since you can select different groups of pages, it can be a good idea to launch the same experiment on several groups, compare the results and learn something new about your audience (there's a small sketch of this after the list). E.g. our clients who rent motorboats behave differently from people who rent sailing boats.
Our awesome URLs
2. Technology → Device category – to run an experiment only on mobile devices or only on desktop. No comments needed here.
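As promised, here is a tiny sketch of why a clean URL structure makes the "contains" rule so handy. The URL patterns below are invented for illustration – they are not our real ones:

```python
pages = [
    "/rental/catamaran/france/cannes",
    "/rental/catamaran/france/nice",
    "/rental/sailing-boat/italy/naples",
    "/rental/sailing-boat/greece/athens",
    "/rental/motorboat/spain/barcelona",
]

def targeted(url: str, must_contain: str) -> bool:
    # The same substring check that the "URL contains" rule performs.
    return must_contain in url

# One substring selects a whole group of landing pages:
print([p for p in pages if targeted(p, "/catamaran/france")])
print([p for p in pages if targeted(p, "/sailing-boat/")])
```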
Experiment log
I don't find the way Optimize shows the list of running and finished experiments very informative. First, it doesn't show the experiments of all domains in one list, and the different language versions of our site run on different domains. Second, the list doesn't show which variant performed better or what conclusions can be drawn from it.

That's why I made my own table with information about each experiment (there's a code sketch of it below):
  • Is it on or off,
  • Date of launch,
  • Goal (what metric I want to improve),
  • Targeting (domain, group of pages...),
  • Hypothesis (what I want to check),
  • Result (which version won),
  • Insight (what conclusions can be drawn from this).
Experiment log in a Google Spreadsheet that I share with the whole team
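If you prefer code to spreadsheets, the same log is essentially this little data structure – a sketch with my own field names and an invented example row:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Experiment:
    """One row of the experiment log (field names are mine)."""
    running: bool                  # is it on or off
    launched: date                 # date of launch
    goal: str                      # what metric I want to improve
    targeting: str                 # domain, group of pages...
    hypothesis: str                # what I want to check
    result: Optional[str] = None   # which version won
    insight: Optional[str] = None  # what conclusions can be drawn

# An invented example row:
log = [Experiment(
    running=True,
    launched=date(2018, 9, 1),
    goal="conversion rate",
    targeting="French domain, catamaran landing pages",
    hypothesis="reordering the sidebar blocks lifts conversion",
)]
```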
It turned out to be very convenient. First, for myself – it motivates me to organise my work and write down conclusions, and it helps to analyse the results. Second, for the team – when they see something strange on the site, they can check the file and quickly understand whether it's my experiment or something actually broke :)
Everything would be great if...
There is one thing that worries me. When the page loads, the original design is displayed first, and only a second later are the experiment's changes applied. If I change the design at the bottom of the page, everything works perfectly; but if the changes affect the top of the page, the user notices the flicker.

Optimize recommends installing a plug-in that hides the page until it has fully loaded and the experiment changes have been applied. But this delays the page by a couple of seconds, and the slower the page loads, the lower the conversion rate. So we decided to abandon the plug-in and live with the flickering interface :(
So what do we have
Optimize is not perfect, and I probably use only 20% of its functionality, but I have definitely started feeling confident – something you really need when your conversion rate depends on the weather.

The more experiments you run, the better you understand the target audience and the market. You come up with more and more ideas and they are more and more successful. It's addictive :)

So, I think the tool is useful. The next step will be to learn how to run experiments with dynamic content: boat cards and popups. If you have such experience, I will be very glad to hear about it.
Olga Shavrina
Product manager. Human being