
Background

As LiveChat wrapped up a site redesign in 2011, the team was happy with the new website’s look and feel but uncertain how it would perform. They wanted an elegant, beautiful website, but the primary goal, according to CMO Szymon Klimczak, was functionality. In particular, LiveChat wanted the new site to nudge more visitors to sign up for a free trial. As a small team with limited resources, LiveChat isolated a single goal and began testing against it with Optimizely, chosen for its design, ease of use, and ability to make complex testing statistics digestible. Klimczak said approaching A/B tests from a goal-oriented standpoint produces the most concrete results. “We do A/B tests to improve particular elements of our website – various descriptions, buttons with different texts, etc.,” he said. “We realize that there’s always something we can improve.”

Tests and Goals


Lucy Frank of LiveChat

Lucy Frank, LiveChat’s visual designer, isolated a specific goal the company wanted to achieve using A/B testing with Optimizely: more product sales. She examined the steps most people take towards becoming a customer. More people signing up for free trials, the company reasoned, would result in more sales. Lucy chose to experiment with the text on the big shiny “sign up” button first. Lucy’s goal, then, was to test alternate phrases or call-to-action words and see which enticed more users into taking up the offer of a free trial. She began the testing with a simple change in wording:

Original:

[Image: sign-up button with the original wording]

Variant:

[Image: sign-up button with the variant wording]

Results:

With such seemingly small changes, Lucy didn’t expect much in terms of results. She says she was pleasantly surprised, though, “by how big an influence even the smallest changes could have on the statistics.” In LiveChat’s experience, “Try it free” generated about 15% more clicks than “Free Trial”.


Apart from the unexpected results, LiveChat’s experiment also produced another surprising, equally important realization about the amount of time needed to run a test. “It was especially evident when testing the buttons,” Lucy says, “that sometimes, even if the results said one thing at the beginning, with time they could show a different outcome. A button that was initially receiving many clicks would later turn out to not really be the best fit.”

Key takeaways:

  • Words matter — Action-oriented words are often more effective and engaging. Use words that tell your visitors exactly what you want them to do, and be direct in your call-to-action buttons. Instead of advertising something like “Free Trial”, instruct the visitor to take an action.
  • Time matters — Tests need enough time to level out. There are bound to be fluctuations in the beginning, and you want to ensure your test is valid. That means giving experiments enough time to ride through the ups and downs before you can deduce a clear result.
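The “time matters” takeaway has a statistical underpinning: early differences between variants often fall within random noise, and only enough traffic makes a real lift distinguishable from chance. As a minimal sketch of this idea (the visitor and click counts below are hypothetical, not LiveChat’s actual data), a two-proportion z-test shows how the same ~15% relative lift can be inconclusive early on and clearly significant once more data accumulates:

```python
from math import sqrt

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled rate under the null hypothesis that both buttons perform equally.
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant B shows a ~15% relative lift in both cases.
early = two_proportion_z(100, 1000, 115, 1000)      # after a few days
late = two_proportion_z(1000, 10000, 1150, 10000)   # after much more traffic

print(round(early, 2))  # ~1.08: below the 1.96 threshold, not yet significant
print(round(late, 2))   # ~3.42: well above 1.96, significant at the 95% level
```

The lift is identical in both snapshots; only the sample size changes. That is exactly why a button that looks like a winner in week one can turn out not to be — the early z-statistic simply isn’t large enough to rule out chance.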