Showing posts in “Best Practices”

Why An Experiment Without A Hypothesis Is Dead On Arrival

Experiments without a hypothesis are dead on arrival

Imagine you set out on a road trip. You’ve packed the car, made a playlist, and planned to drive 600 miles, but you don’t actually know where you’re headed. When you finally arrive somewhere, it’s not at all what you imagined it would be.

Running an experiment without a hypothesis is like starting a road trip just for the sake of driving, without thinking about where you’re headed and why. You’ll inevitably end up somewhere, but there’s a chance you might not have gained anything from the experience.

In this post, we’ll show you how to craft great hypotheses, how they fit into your experiment planning, and what differentiates a strong hypothesis from a weak one.

How to Prioritize Your Test Ideas and Other Critical Questions

Kyle Rush

When I’m not running experiments on Optimizely’s conversion funnels, I love to interact with the optimization community. GrowthHackers has one of the best communities out there, and last week I hosted an Ask Me Anything (AMA). The questions were very high quality and covered topics like running multiple tests at the same time, overcoming technical hurdles, how multi-armed bandits can be helpful, what to do with inconclusive tests, and more.

If this piques your interest, have a read through the questions and, of course, continue to ask me anything.

5 Traits of Best-in-Class Optimization Teams


What are your best customers doing?

That is the #1 question I hear from customers on a day-to-day basis. How do other companies do optimization and testing? It’s a great question.

Based on thousands of interactions with Optimizely customers and four years of enterprise enablement, I can confidently point to five traits that all best-in-class optimization teams possess…

2 Controversial Site Redesigns That Should Inspire You to A/B Test

Original version of Netflix page.

When it comes to making better, data-driven decisions, the sooner you start, the better. The temptation is often to wait until after the redesign is done or the feature is rolled out: “Oh, we’re doing a redesign; we’ll do the A/B testing afterwards.” The fact is, you actually want to A/B test the redesign itself. One story from Digg and another from Netflix show why.

Around 2010, Optimizely co-founders Dan Siroker and Pete Koomen were introduced to the folks at Digg by Digg’s new VP of Product, Keval Desai, to talk about using Optimizely. Their response was, “We are busy working on a complete overhaul of our site. After we do that, then we’ll do A/B testing.”