
While the examples I give are geared towards e-commerce sites, any type of site can find ideas here.

Before you can come up with meaningful A/B tests for your e-commerce store, you need to understand what your customers value when making purchases and what turns them away. Surveys directly on your site or through email are a popular method of collecting qualitative customer feedback. Surveys have some drawbacks, though: it takes a lot of time to collect enough data from them, many people will ignore them entirely, and the results can be biased if you ask leading questions.

Other, more unconventional, sources of information exist that can be extremely helpful in generating ideas for A/B tests. These sources are:

  1. Customer service records
  2. Public forums
  3. Competitor data
  4. Internal search keywords

In this post we’re going to look at how you can use the data from these sources to seed meaningful A/B tests.

1. Customer service records and tickets

Your customer service calls and chat transcripts provide a treasure trove of data. You can choose to look at them as complaints about your store and products, or as advice about where you can improve. The questions, complaints, and requests customers submit are important enough to warrant a phone call or an email — do not ignore them as a source of ideas for optimization.

Waterfilters.net received numerous complaints from customers having trouble finding products on their site. Even the support team had a tough time finding certain products when customers asked.

Instead of ignoring the complaints, the team at Waterfilters.net realized something was wrong with their store search. They decided to test a new search function on their store, which led to an 11% increase in conversion rates. Check out the full case study from Google here.

Tip: Categorize tickets for easier discovery

Scanning through all of your past customer service records can reveal great test ideas, but that’s tough when you have hundreds of them. Make the process easier by grouping tickets into categories.

For example, issues related to products are one group; issues related to checkout are another. Make sure there’s no ambiguity regarding which group an issue should go into. By grouping tickets like this, you make it easy for your team to sort through them and identify common complaints.
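If your help desk doesn’t support tagging, even a rough keyword pass can do the first sort. Here’s a minimal sketch in Python; the categories, keywords, and sample tickets are all hypothetical, so swap in whatever patterns actually show up in your own support data.

```python
# Minimal sketch of keyword-based ticket grouping.
# Categories, keywords, and sample tickets below are hypothetical.
from collections import Counter, defaultdict

CATEGORIES = {
    "search":   ["can't find", "cannot find", "search", "looking for"],
    "checkout": ["checkout", "payment", "declined", "coupon"],
    "shipping": ["delivery", "shipping", "tracking", "arrived"],
}

def categorize(ticket_text):
    """Return the first category whose keywords appear in the ticket."""
    text = ticket_text.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"  # tickets that match nothing get reviewed by hand

tickets = [
    "I can't find the replacement filter on your site",
    "My card was declined at checkout",
    "Where is my tracking number? Delivery is late",
]

groups = defaultdict(list)
for ticket in tickets:
    groups[categorize(ticket)].append(ticket)

# The biggest groups are your first candidates for A/B test hypotheses.
counts = Counter({category: len(items) for category, items in groups.items()})
print(counts.most_common())
```

The biggest buckets point to the complaints worth testing against first, just as the search complaints did for Waterfilters.net.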

When daFlores.com started working with Conversion Rate Experts to optimize their online flower store, they started with lots of customer research. Apart from regular methods like Qualaroo surveys, they roped in their customer support team to make lists of common inquiries from customers.

They found that customers were often concerned about how long their flowers would take to arrive. This data led to a hypothesis for a new test: communicating urgency around delivery would increase checkouts. They tested a banner that urged customers to buy quickly to be eligible for same day shipping.

daflores.com

Et voilà, it increased conversion rates by 27%. A huge bump from addressing a common customer inquiry.

It’s important that the marketing or optimization team have a tight feedback loop with the customer service team. If a certain complaint keeps coming up, be proactive and nip it in the bud before it becomes a bigger problem. And after countless hours on the phone with customers, the support team will definitely have some test ideas they’d like to try out themselves.

2. Public feedback

These days, consumers are quick to complain about an experience on any public site. While that can be frustrating for brands, public feedback also offers a wealth of ideas for your A/B testing pipeline. Online forums, social media, and even your own product pages are all fair game when you’re looking for ideas.

For example, these Zappos customers commented on a Nike shoe product page. Both customers were misled by the sizing and found that the shoes were too thin.

zappos.com

The 3-star ratings they gave make the product look bad and will probably hurt its conversion rate. Data like this could lead Zappos to test out a different sizing chart.

Tip: Get public feedback delivered to your inbox

It’s easy to keep track of comments on your site and social media accounts, but what about all those other websites, blogs, and forums out there? Just like with customer service records, if you sort through public comments you can find trends that will lead to meaningful A/B tests. There are a number of online tools you can use to get these mentions delivered to your inbox:

  • Google Alerts – Enter your company name or other keywords you care about and you’ll get an email alert any time someone mentions you online.
  • Social Mention – Track mentions on blogs, social media and bookmarking sites.
  • Mention – A paid tool with real-time tracking and an analytics dashboard.

Dairy Queen has developed a monitoring system for brand mentions across various channels, from social media to their blog pages. They track anyone who has complained about them online and redirect them to a feedback form on their site.

By consolidating mentions across all channels, including their core customer support, DQ makes it easy for their team to handle issues and identify trends. Creating a similar process for your store will save you a lot of time and help you identify major issues before it’s too late. Read more about DQ’s approach to customer service on MarketingSherpa.

From cumulative data via feedback forms, service records, and other research, the team at TopCashback.co.uk learned that customers lacked trust in the product. However, they had a lot of good press going for them, so they thought that testing out press logos would help.

topcashback.co.uk

Sure enough, with the social proof from the added logos, conversions increased. Here’s the full TopCashback case study from Conversion Rate Experts.

Another example of ideas from public feedback: mobile retailer Wanelo wanted to test a new product feed for their mobile app. The initial data suggested that the new feed increased engagement, but it also drew a lot of negative feedback.

Instead of relying only on their engagement metrics, Wanelo listened to customers and kept experimenting. They hypothesized that adding a second feed would resolve customers’ complaints while maintaining the engagement their existing feed created.

The new test proved to be a success and the positive reviews were off the charts.

3. Competitor data

Newsflash: customers are also talking about your competition, and you have access to that data too. Complaints and compliments about your competitors’ sites are another fertile ground of ideas for new tests.

There’s no reason why you shouldn’t find out what’s going wrong with your competitors and fix it on your site. When customers have a problem with a competitor’s site, they’ll come to you because you have a solution.

Just like you should set up Google alerts for your own site, set them up for competitors’ sites. Track all the public feedback, good and bad, about your competitors. This will help you analyze what they are doing so that you can do it better.

When travel company Sunshine.co.uk ran usability tests on their site, they also thought to run them on their competitors’ sites. This gave them a direct comparison and showed them exactly what their competitors were doing right.

By comparing themselves with travel competitors, Sunshine discovered that people prefer to book with Sunshine because of specific key benefits they offer. So, they tested different ways to play up these benefits on their site. These tests, generated from usability tests on other websites, doubled their conversion rate, bringing in an additional £14 million per year!

sunshine.co.uk

Voices.com also analyzed competitors’ websites to determine their strategies. This helped them find opportunities to position themselves against competitors. The A/B tests that came out of this analysis contributed to a 400% increase in conversion rates!

4. Internal search keywords

If your website has a search bar, it’s likely shoppers are using it to find products quickly. The keywords shoppers use are a great source of A/B test ideas. Is one term appearing all the time? If so, try highlighting that product on the homepage. The moral is: if you’re not tracking the terms shoppers enter into your search bar, start now.

To track this, you need the right tool. Check out Elicit, Celebros, and the internal site search offered by e-commerce platforms like Shopify, Drupal, and Magento.
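If your tool can export the raw query log, even a quick tally will surface the terms worth testing around. Here’s a minimal Python sketch; the file name and the “term” column are assumptions about a generic CSV export, not any particular tool’s format.

```python
# Minimal sketch: tally an exported internal-search log to find the
# terms shoppers use most. File name and column are assumed, not real.
import csv
from collections import Counter

def top_search_terms(log_path, n=10):
    """Count the most frequent search terms in a CSV export."""
    counts = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["term"].strip().lower()] += 1
    return counts.most_common(n)

# Terms that dominate the tally are candidates for a homepage-promotion test.
for term, count in top_search_terms("site_search_log.csv"):
    print(f"{count:5d}  {term}")
```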

4 Wheel Parts offers a good example of using search queries for A/B tests. They started using a new tool, Celebros site search, to track search queries across every page on their site, giving them valuable data they never had access to before. After a couple of months, they had enough data to show them trends like which products were searched for most often and from which page.

They found that consumers searched for a particular brand name on one of their product pages. Testing out brand logos and links on that page was the next logical step for them. The result was a 59% increase in revenue from that page.

They also tested different ways products showed up in search results. By emphasizing certain promotions, they were able to increase conversion rates across their site by 4.65%. Read the full case study about 4 Wheel Parts on MarketingSherpa.

Stock photo website Bigstock experimented with its search algorithm, testing whether “fuzzy” or “exact match” results made searchers more likely to select a photo.

bigstock.com

Start (and don’t stop) digging

Hopefully, you’ve already been keeping records of your customer service calls and internal site search queries. If you haven’t, it’s time to start now. Public feedback and competitor data already exist, but they need to be collected and sorted. The only thing left for you to do is dig through this data and find insights that will lead to new A/B testing hypotheses.

This isn’t a one-time task, though. Creating a continuous process that tracks, gathers, and sorts all this data as it comes in will make things easier the next time you run out of ideas to test. Add this data to the insights you’re pulling from other sources like surveys, brainstorming sessions, usability tests, and your site analytics, and you have an endless source of A/B testing ideas!

What are some other unconventional sources of inspiration you’ve found that have sparked interesting A/B tests on your site?

From day one, Mitt Romney’s digital campaign team understood a common truth: the campaign is not a creativity contest – what looks best and what works best for the website is not always the same.

“We tried to be very conscious that this team doesn’t have creative opinions, this team has data,” says Ryan Meerstein, a senior political analyst from Targeted Victory, the agency that ran testing and optimization for the Romney campaign. “It’s hard for the team to argue with a graph that proves what works and what doesn’t.”

The graphs were results from A/B testing – lines that showed how two different versions of a web page performed over time. Rather than have protracted discussions about which design might work best, the team tested and gathered data to inform every design decision.

The team went for the low-hanging fruit first: email sign-ups. They formed hypotheses about how different combinations of graphics, headlines, forms, and colors affected a visitor’s decision to sign up for email updates from the campaign.

From the start, the team considered increasing email sign-ups on mittromney.com a primary goal.

“Email is still the golden goose of fundraising when you’re making direct solicitations,” Meerstein says. “We’re seeing each email valued at anywhere between $7-8 in future revenue.”

Knowing how beneficial email was to raising money for the campaign, they tested heavily on the homepage and splash pages of mittromney.com, always optimizing for email sign-ups. Among A/B testing tools, Optimizely was their platform of choice.

“There were some hesitations in our shop to use Optimizely because of past connections,” Meerstein says. “But we got past that and started to use the product and found it just far superior to any of the other ones we were using prior.”

Between May 2011 and November 2012, the Romney campaign’s 140-person digital team along with Targeted Victory ran hundreds of tests.

“Once we saw the ease of using Optimizely, the ideas started flying. We wanted to start testing just about everything,” Meerstein says. “We started on the splash page and when we saw success, we continued to build from there.” 

Call-to-action button test 

The team started optimizing for donations with a test on the main call-to-action in the upper-right corner of the homepage. They wanted to see whether button color – blue, green, yellow, or red – and word choice – “Contribute,” “Support,” or “Donate” – affected how likely a visitor was to click.

Overall, they found that color did not have a definitive impact, but the word “Contribute” did show a statistically significant 10% improvement.

Knowing that “Contribute” drove clicks more often than “Donate,” the team changed the verbiage all over the site – and in all email messaging – to reflect the test results.
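The case study doesn’t publish the underlying numbers, but a claim like “statistically significant” usually rests on something like a two-proportion z-test. Here’s a sketch with made-up visitor and click counts, purely to show the shape of the check:

```python
# Two-proportion z-test sketch. All counts below are invented for
# illustration; they are not the campaign's actual data.
from math import erf, sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates A vs. B."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail
    return z, p_value

# Hypothetical: "Contribute" gets 1,100 clicks of 20,000 views vs.
# "Donate" at 1,000 of 20,000 (a 10% relative lift, echoing the result above).
z, p = two_proportion_z(1100, 20000, 1000, 20000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the lift isn't chance
```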

Home page carousel test

Still armed with the goal of maximizing email sign-ups, the team focused the next iteration of testing on the carousel images on the homepage. A carousel is a rotating slideshow of images that designers frequently use to showcase featured content. They tested using a carousel versus a static image offering visitors the chance to win a trip to the Republican National Convention with the headline, “Be There in Tampa.” The main metric they measured was the percentage of visitors who reached the email sign-up confirmation page.

They tested four variations:

1. The control – A full-page moving carousel.

2. A half-height moving carousel.

3. A static image with an “enter to win” form.

4. A static image with a “learn more” button.

Adding the form to the homepage image increased the percentage of visitors who signed up by 632%.

In this case, visitors seldom clicked the “Learn More” button. They reacted extremely well to the immediacy of the sign-up form, which gave them the chance to win by filling out just two form fields.

State-specific splash pages

Next, the team used geographic location as a lever to encourage visitors to sign up for email updates. They wanted to gauge whether visitors signed up more often when shown a message specific to their state rather than a generic one.

Simply by adding “Florida” to the call-to-action text, visitors who saw this page entered their email and zip code 19% more often.

The data clearly showed that personalizing the message worked. With this result in hand, the team decided to make the splash page specific to each state. They used geotargeting in Optimizely to send visitors from each state to a page with a message tailored to that state, so visitors from different states never saw the same message. Using Optimizely, the team delivered a unique one-to-one experience for every visitor to the site from September 2012 to election day.

Personalization proved to be a powerful tool for the Romney campaign. They saw greater signups on the splash page and more interaction with local events advertised on the site, especially as voting started.

They personalized calls to action based on whether a state offered absentee or early voting. Visitors from Ohio saw messages directing them to early voting locations and the hours they were open. Visitors from Colorado saw targeted messages about how to get an absentee ballot.
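The campaign built this with Optimizely’s geotargeting, but the underlying logic is just a lookup from a visitor’s state to a message. Here’s a minimal sketch; the copy and state list are hypothetical stand-ins:

```python
# Minimal sketch of state-specific splash-page messaging. The messages
# are hypothetical; a real setup would live in your testing tool's
# geotargeting (as the Romney campaign's did in Optimizely).
DEFAULT_MESSAGE = "Sign up for campaign updates."

STATE_MESSAGES = {
    "OH": "Ohio: find early voting locations and hours near you.",
    "CO": "Colorado: here's how to request your absentee ballot.",
    "FL": "Florida, sign up for campaign updates.",
}

def splash_message(visitor_state):
    """Pick the call to action for a visitor's (geo-detected) state."""
    return STATE_MESSAGES.get(visitor_state, DEFAULT_MESSAGE)

print(splash_message("OH"))  # early-vote message
print(splash_message("WY"))  # unmapped states fall back to the generic CTA
```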

“The thing that was great about it was that we could go in there and set up the personalized experiences in 30 minutes,” Meerstein says. “In the final weeks of the campaign, there’s a huge difference between something being live on Tuesday morning and Thursday night.”

These tests demonstrate how critical time-to-test and time-to-results are when the stakes are incredibly high. Waiting on results, or on bottlenecks created by team dynamics, is not an option for a presidential campaign team. In a matter of days, the team had conclusive results about which variations won. Without relying on the creative or engineering teams, the analysts themselves used the tools within Optimizely to create huge gains in email sign-ups.

“You really can never test too much,” Meerstein says.