A/B Testing: Guess the Winning Result

Corte Swearingen | October 26, 2017

A/B testing is the science (and art) of running controlled experiments on your website to validate a change. By running experiments and measuring the results, we set aside our subjectivity and preconceived notions about what we think will work best, and we gain a more refined, reliable method for improving both revenue and lead generation.
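To make “measuring the results” concrete, here is a minimal sketch of how a winner is typically declared: a two-proportion z-test comparing the conversion rates of the original and the variation. The visitor and conversion counts below are hypothetical, purely for illustration.

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(conv_a, visitors_a, conv_b, visitors_b):
        """Two-sided z-test: did variation B convert differently from original A?"""
        p_a = conv_a / visitors_a
        p_b = conv_b / visitors_b
        # Pooled rate under the null hypothesis that A and B perform the same
        p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical counts: 5,000 visitors per variation
    z, p = two_proportion_z_test(200, 5000, 230, 5000)  # 4.0% vs. 4.6%
    print(f"z = {z:.2f}, p = {p:.3f}")
    # p is about 0.14 here, so even a 15% relative lift is not yet
    # significant at this traffic level -- let the test keep running

Until the p-value drops below your chosen threshold (0.05 is common), the “winner” you see may just be noise.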

After 25 years of conducting split tests, I still find the practice humbling. Things I assumed would test well have fallen flat, and other tests I thought might be weak have won with strong results. At the end of the day, what I’ve learned most is that there is no such thing as “Best Practices” — everything must be tested to validate and quantify improvement.

Below are some example tests; see how good you are at guessing whether the original page or the test variation performed better. All the answers are at the end of this article, so don’t peek!

Test #1

What Was Tested: Whether removing the pre-order banner affected sales
Test Goal: Conversion Rate

Which variation had a higher conversion rate?

[Image: the SimCity game page, with and without the pre-order banner]

Test #2

What Was Tested: Headline copy
Test Goal: Clicks on the “Try it Free” button

Which variation had more clicks on the “Try it Free” button?

[Image: the two headline variations, each with a “Try it Free” button]

Test #3

What Was Tested: TRUSTe security seal on a form page
Test Goal: Clicks on the “Submit” button

Which variation had more clicks on the “Submit” button?

[Image: the form page, with and without the TRUSTe privacy seal]

Test #4

What Was Tested: Radical redesign of landing page
Test Goal: Clicks on the orange button

Which variation had more clicks on the orange button?

[Image: the original landing page and the radical redesign, each with an orange button]

Answers:

Test #1
While common sense may have led you to believe a promo banner would increase orders of the SimCity game, the reverse was true. The conversion rate with the banner was 5.8%, but jumped to 10.2% when the banner was removed.
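Those two rates amount to a relative lift of roughly 76%, which you can sanity-check with one line of arithmetic:

    baseline, variation = 0.058, 0.102          # conversion rates from the test
    lift = (variation - baseline) / baseline    # relative improvement
    print(f"{lift:.1%}")                        # 75.9%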

Test #2
A simple headline change can make all the difference. In this test, the variation increased signups by 38%!

Test #3
The supposed best practice is to always display a trust seal to help reduce form or purchase anxiety. In this case, the test variation (without the trust seal) produced a 12.6% lift in “Submit” clicks. So much for best practices!

Test #4
Sometimes we completely redesign a page in an effort to produce radical improvement. In this example, our redesigned variation produced an 89.8% lift in account signups. The test variation used easy-to-scan bulleted benefit statements, a photo of a real person instead of stock photography, and reinforced “Free” in the button’s call to action.

We encourage you to stop guessing at which changes might improve your website and embrace the art and science of A/B testing. The first step we recommend is a conversion assessment, which goes through your entire site, identifies potential friction points, and offers 30-40 A/B testing ideas to help eliminate or reduce them. If you are interested in learning more about our conversion assessments, please contact your sales representative for additional information. We’d love to have you aboard the testing team.
