A/B Testing Examples

If you love getting more leads for your business (or your clients’ businesses), you should love A/B testing. Whether we’re doing local Portland conversion rate optimization or casting a wider net, A/B testing not only increases sales, it also helps us learn more about our audience, which sometimes doesn’t act the way we expect.

When you do web design and execute or study A/B tests, you get a general sense of how people interact with websites - what makes them stay, what makes them bounce, and sometimes what really pisses them off (a great place to study these reactions is WhichTestWon). With more experience, you can even get a specific feel for certain types of websites. We do a lot of eCommerce, professional services, and non-profit web design, so we’re very familiar with visitor mindsets on those types of sites. However, as these next five A/B tests will show, you can never assume anything without testing.

Test 1 - Social Proof Equals More Sign-ups

AB testing email signup

Assumption: Social proof, or showing that others like and use your product or service, results in higher conversion rates.

Result: Since you’re expecting a surprise, you probably know that the version on the right won. This does indeed go against all popular opinion, and even our own previous blog post recommending social proof on portfolio websites. But you probably wouldn’t guess that it beat the left version by a whopping 122%.

This is a very interesting result because social proof so often works, and unlike the button test below, there is no immediately apparent reason it wouldn’t work here. One hypothesis is that with many popular Twitter and Facebook accounts having hundreds of thousands of followers, 14K just isn’t impressive enough. Personally, I don’t know if I buy that, so I would want to follow up with a test that showed the same format with a more impressive number. If the higher number worked, the blog would know to change the design when their subscribers actually reached that level. If not, the difference in conversions could probably be chalked up to the simplicity of the form on the right, or to social proof being overkill for an email signup.
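As a quick sanity check on numbers like that 122% figure, relative lift is just the difference between the two conversion rates divided by the control’s rate. A minimal sketch in Python - note that the visitor and conversion counts below are made-up illustrations, not the actual data from this test:

```python
def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted."""
    return conversions / visitors

def relative_lift(control_rate, variant_rate):
    """Percent improvement of the variant over the control."""
    return (variant_rate - control_rate) / control_rate * 100

# Hypothetical counts for illustration only -- not from the test above.
control = conversion_rate(45, 5000)   # 0.9% sign-up rate
variant = conversion_rate(100, 5000)  # 2.0% sign-up rate

print(f"Lift: {relative_lift(control, variant):.1f}%")  # Lift: 122.2%
```

The same arithmetic applies to every test in this post: a "21% more purchases" or "12.6% more submissions" claim is a relative lift over the control, not a raw percentage-point difference.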

Test 2 - Icons Make Navigation Easier

AB test icons

Assumption: Because icons are more visual, they make navigation easier and will result in a lower bounce rate, and therefore higher sales.

Result: The variation without the icons won with 21% more product purchases.

Even though the icons added were for the most popular product categories on the site, visitors didn’t go for them. An obvious potential cause is that, as you can see, the icons clutter up the design, and clutter is a bane of usability and conversion. We don’t have screenshots of the lower portion of the screen, but I would like to have seen a follow-up test with these same categories, or fewer, moved into the body of the homepage.

Test 3 - Trust Symbols Help Form Submissions

AB test form trust symbol

Assumption: Adding a trust symbol, such as a privacy seal, will ease fears of information security and increase form leads.

Result: The variation without the trust symbol received 12.6% more submissions than the one with it.

You’ve seen similar trust symbols throughout the web, so why didn’t this one work? Well, consider where you’ve seen them. It’s usually during checkout, right? This is a simple lead generation form, and it’s possible the visitors took the symbol to mean they were purchasing something and left. Testing a different symbol or a simple text link to the site’s privacy policy would be useful follow-ups.

Test 4 - The Classic Button Color Argument

AB testing button color

Assumption: Green is the best color for a call-to-action button because it is associated with moving forward.

Result: Version B outperformed Version A by 14.5%.

And this even when the winning yellow variation was ugly as all get-out! Beware, though; my heading here about button color is a little misleading, to illustrate a point. Button color is only one of the differences between the two versions. While it is true that the winning version had a yellow button and the loser a green one, you may also notice that the white text on the green button isn’t very readable. So even if you intend to test only one variable, be aware that other incidental changes can and will affect results. A good follow-up test would be a version of the green button with black text (which would be more readable).

I should mention here that other case studies have shown that, in general, the most important thing about button color is that it stands out from the rest of the site, green or not. You can read about a test on HubSpot where red beat green.

Test 5 - Humans Help Sell

AB test with no human face AB test with human face

Assumption: Images of humans (especially smiling ones!) help conversions by establishing a connection with the visitor.

Result: The top version, without the image, won with 24% more form completions. Why? Well, I have my theories, but I’ll leave this up to you to guess, and if you have a good one, post it in the comments. It’s always interesting to see others’ interpretations of results.

What did we learn here?

This isn’t as obvious as it might seem. Even though I presented five cases that went against general principles, it doesn’t mean those principles are wrong. It only means they aren’t always right, and are sometimes drastically inferior. We start with best practices, but always test our assumptions, and learn not only what works and what doesn’t, but WHY. When we learn why, there are no real contradictions.

To that end, I hope you also gathered that conversion optimization isn’t a one-time thing. For each test above (except #5), I suggested follow-up tests. After those, there would be more. And then some more. Continuous optimization is important to truly understand why things are working and to find ways to generate ever more sales.
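One ingredient worth adding to any follow-up test: before declaring a winner, check that the difference between variations is statistically significant rather than noise. A minimal sketch of a standard two-proportion z-test, using only Python’s standard library (the visitor and conversion counts are hypothetical, not from the case studies above):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the control.
    conv_b / n_b: conversions and visitors for the variation.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF built from the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts for illustration only.
z, p = two_proportion_z_test(conv_a=45, n_a=5000, conv_b=100, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) means the lift is unlikely to be a fluke, which is what lets you trust a result enough to build the next follow-up test on it.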