3 Common A/B Testing Mistakes Made by Ecommerce Stores

I listened to a pretty amazing podcast this past weekend – Episode 071 of the Smart Passive Income Podcast, by one of our absolute favorites in the industry – Mr. Pat Flynn.

His guest for this episode?

A Lifestyle Entrepreneur who has been at the forefront of cool and trendy internet-based businesses since 2005. His prestigious resume includes working directly for Mark Zuckerberg as employee No. 30 at Facebook, then becoming employee No. 4 at Mint, and he is now the founder of AppSumo – a brand that is absolutely killing it as the “Groupon for App Retailers”. I speak of none other than Noah Kagan, a true ecommerce and Lifestyle Business Thought Leader (LBTL).

While Noah’s illustrious career has brought great success and millions of dollars along the way, it has also included significant mistakes and losses. The failures in his career taught him some valuable lessons.

“I want people to fail if it means they will go on to greater things,” Noah explains in the interview.

While failure is a painful experience most entrepreneurs go through at one point or another, a key differentiator between chronic underachievers and successful entrepreneurs is the ability to learn the lesson the first time. In other words, you won’t continue to fail if you post-mortem all of your mistakes, learn the valuable lesson from each, and never repeat them.

Among the many excellent insights Noah and Pat discussed was the point that too much research can absolutely paralyze a business, while the right research yields the desired results as expeditiously and efficiently as possible and mitigates risk along the way. Lifestyle Entrepreneurialism is all about efficiency and mitigating risk. Therefore, it is absolutely imperative that we do not spend too much time on research, that we get our products out on display ASAP (as Seth Godin would say, “Artists ship!”), and that we gather the information that allows us to both validate and optimize, with as little of our own time and effort as possible.

To put it another way – we simply seek the right information, through asking the right questions.

Why A/B Testing?

That brings us to today’s subject – A/B Testing. One of the greatest tools we are given in the business world is the ability to Validate and Optimize by pitting a control (A) against a variant (B). Boiling your business down to the most basic conversion funnel, what tweaks can you make to increase the size of that funnel (idea validation), and which tweaks could be made to increase the sales conversion rate (optimization) once your ideas are validated?

While we’re on the topic of learning from our mistakes, I felt it apropos to take a look at 3 of the most common A/B test mistakes made by ecommerce stores today. Before I introduce the mistakes, and walk you through my personal takeaway, let me tell you up front that I’m trying something a bit different this week. Please make sure you read all the way through this post, so you can participate in a little social experiment I’m conducting.

The Mistakes

Mistake No. 1 – Too Little Data Acquired Over Too Short a Duration

Running an A/B test over too short a duration can leave you with insufficient data to draw statistically significant conclusions. For instance, if your site has a black “checkout” button now, and you want to test your idea of a red “checkout” button, you would want to run the test long enough to get the following: a) a representative sample size; and b) enough of a difference in conversion percentage that you can say the red button converts a greater (or lesser) percentage of shoppers by a statistically significant margin.

If the sample size is too small, you run the risk of acquiring not only inaccurate data, but data from which you cannot reasonably draw conclusions. If only a handful of shoppers participate in your A/B test (say, 5 who see the red button and 5 who see the black) and 3 of the 5 shoppers with the black button make purchases, while 4 of the 5 shoppers who saw the red button convert, you have a serious problem. The difference between 3 conversions and 4 is quite small, and at that sample size the difference is statistically insignificant. Conversely, if you had 500 of each type of shopper, and 300 converted on the black button while 400 converted on the red button, the percentage difference from A to B would be the same, but the volume of test data would yield a statistically significant result. You could therefore reasonably draw your conclusion.
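To make that concrete, here is a minimal sketch of a two-proportion z-test – a standard way to check whether a difference like the one above is statistically significant. The function name is mine and the numbers are the hypothetical ones from the example above, not something prescribed by any particular testing tool:

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    # Two-sided p-value from the normal CDF, computed via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same 60% vs. 80% split in both cases -- only the sample size differs
print(two_proportion_z_test(3, 5, 4, 5))          # p ~ 0.49: not significant
print(two_proportion_z_test(300, 500, 400, 500))  # p << 0.01: significant
```

With 5 shoppers per variant the p-value hovers around 0.49 – essentially a coin flip – while the 500-per-variant version of the exact same percentage split is significant by any reasonable threshold.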

Mistake No. 2 – Different Customer Segments Were Not Accounted For

Although you’ve created your site with Sam Shopper in mind, not all of your customers are going to be like Sam. In fact, depending on how Sam arrives at your site and interacts with it, even Sam can have a different experience or desire on each visit. Because of subtle, and not so subtle, nuances between your customer segments, there are times when it will be important to account for these segments in your A/B Testing Protocol.

When running an A/B test, you’re typically going to run the test on all of your inbound traffic – your current and potential customers. But you’re occasionally going to need to tweak your testing if you want to see improvements in certain types of customers. For instance, you may want to see if returning customers prefer a more direct path to goods they commonly browse or purchase, or you may want to know if a landing page with blue and white accents is more appealing to customers who arrive at your site from Facebook.

The fallacy in not accounting for segments is that the picture can be too general or too muddy when you attempt to design testing for your entire website traffic volume. When we talk about segmentation, we’re not talking about the validation aspect of A/B testing – we’re smack in the middle of the optimization aspect. We’re looking for that fraction of a percent that puts us over the top at this point. You’re not going to get there without accounting for different customer segments.
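As a rough illustration of what segment-level analysis might look like, here is a minimal sketch that tallies conversions per traffic segment per variant instead of pooling everything together. The records, segment names, and numbers below are entirely hypothetical:

```python
from collections import defaultdict

# Hypothetical visitor records: (segment, variant, converted) -- illustrative only
visits = [
    ("facebook",  "A", True),  ("facebook",  "B", False),
    ("returning", "A", True),  ("returning", "B", True),
    ("search",    "A", False), ("search",    "B", True),
    # ... in practice, thousands of rows exported from your analytics tool
]

# Tally conversions and visitors per (segment, variant) instead of one big pool
tallies = defaultdict(lambda: [0, 0])  # (segment, variant) -> [conversions, visitors]
for segment, variant, converted in visits:
    tallies[(segment, variant)][0] += int(converted)
    tallies[(segment, variant)][1] += 1

for (segment, variant), (conversions, visitors) in sorted(tallies.items()):
    print(f"{segment:10s} variant {variant}: {conversions}/{visitors} = {conversions / visitors:.0%}")
```

Once each (segment, variant) pair has its own tally, you can run the same significance check from Mistake No. 1 on any segment that matters to you – provided, of course, that the segment has enough traffic of its own.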

Mistake No. 3 – Test Was Not Designed for a Subtle Outcome

What if you’re seeking an advantage that is really subtle, because you’re already highly optimized – converting right at the proper rate for your position in the industry? You’re less likely to make a drastic change at this point, and the desired effect is going to be subtle.

We’re talking about a scenario where your variable (B) may yield 0.01% more (or less) than your control (A).

When that is the case, the test must be properly designed for a subtle outcome. You’re going to need a ton of data to show statistical significance, and the variable must be nearly 100% isolated. The data must be acquired quite carefully – so carefully that you can confidently state that the variable was truly the only difference between Group A and Group B, and that there were no other factors that could have affected the outcome of one group or the other.

To prevent this problem from occurring, calculate the required sample size before you run the test. Don’t worry, though – it’s not calculus, nor anything beyond basic algebra, and I have found a tool to help you perform the calculation, here.
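If you would rather see the arithmetic than trust a black box, here is a minimal sketch of the standard two-proportion sample-size formula at 95% confidence and 80% power. The baseline and target rates below are made-up numbers for illustration, not figures from this post or from the linked tool:

```python
import math

def sample_size_per_variant(p_baseline, p_target):
    """Visitors needed in EACH variant for 95% confidence and 80% power."""
    z_alpha = 1.96  # two-sided 95% confidence level
    z_beta = 0.84   # 80% statistical power
    variance = p_baseline * (1 - p_baseline) + p_target * (1 - p_target)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p_baseline - p_target) ** 2)

# Detecting a lift from a 3.0% to a 3.3% conversion rate (made-up numbers)
print(sample_size_per_variant(0.030, 0.033))  # roughly 53,000 visitors per variant
```

Notice how even a 0.3-percentage-point lift already demands tens of thousands of visitors per variant – which is exactly why subtle outcomes require so much more data and so much more care in isolating the variable.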

The Takeaway

These are but a few of the mistakes that are commonly made during A/B testing. They’re among the mistakes you’re most likely to make if you have not taken the time to research careful A/B testing and study the follies of those who came before you.

Are you committing any of these potentially harmful errors in your A/B testing? Remember, a ton of research isn’t our desire – only the right research. Also, we don’t want any repeat lessons, right? Learning each lesson once doesn’t mean you have to make all the mistakes on your own, after all. So, let’s learn from these common mistakes and vow that we will not make them again (or in the first place).

What you can further take away is that you don’t need an MBA or a master’s degree in statistics to logically identify the information that is critical to the conversion funnel for your business – nor to come up with ways to effectively increase the size and conversion rate of that funnel (validate and optimize).

Do you A) Comment and Share; or B) Keep This Post to Yourself?

We’re having a lot of fun sharing all of this valuable information, here at Ecommerce Rules.

You know what’s even more fun, though? When we get to hear from you! So, dearest, valued, deeply intelligent people of the Ecommerce Rules Tribe, please weigh in.

Are you regularly validating and optimizing with A/B testing strategies? Would you be so bold as to share some of your great successes (and failures) with us, here on this public forum?

Remember that we’re all here for the betterment of each other. Lifestyle Entrepreneurialism is just that, a way of life. We’re individuals that belong to an elite and diverse group of brilliant, creative minds – and we understand that, though we might compete for each and every dollar that will be spent in ecommerce shops, we’re all more successful when we carry on as a community, sharing what works, what doesn’t work, and a spirit of good-natured, healthy competition.

In fact, you know what? I’m going to do a bit of my own A/B testing here.

  • If this post gets to 50 [meaningful] comments + social shares (Twitter & Facebook) by the end of the month, we’ll draw names from those who commented and/or shared, and someone will win something pretty cool.
  • If we don’t reach 50 comments + social shares, we’ll draw names anyhow and dedicate an entire post to roasting the “winner.”

Did you see the fallacy in that test setup? Let us hear it! And, lay it out there – your takeaway, something you can share that would help the group, or just a comment for us to chew on over here at Ecommerce Rules.

Published by

Joseph Yi

Since he was a freshman in college, Joseph has worked in several internet startup companies and has developed campaigns and digital strategies for Fortune 500 companies and brands including the Los Angeles Lakers, Manchester City FC, the Oakland Raiders, Sephora, and Whole Foods.

One thought on “3 Common A/B Testing Mistakes Made by Ecommerce Stores”

  1. You know, the point where you got me to comment (I’ll definitely keep this post to myself, as I’m in the middle of research which I expect to be quick and efficient) was when you mentioned that “I” existed and you wanted “my” feedback. Exactly at this phrase: “Do you A) Comment and Share; or B) Keep This Post to Yourself?” Somehow, that touched me enough to say hello. And I guess that is more of a point to make than the prize promise, indeed. Besides: is “something really cool” to be given away really a marketing strategy? Think about that.
    Best regards, keep up the good writing
    Pri
