Episode 59: Why Not Test Everything?
Announcer:
You're listening to Drive and Convert, a podcast about helping online brands to build a better e-commerce growth engine with Jon MacDonald and Ryan Garrow.
Ryan:
Jon, I get clients who talk to me about CRO and optimizing their conversion rate. They tell us about the companies they're talking to, so I get some insight into some of your competitors. A recent one came to me and was like, "Well, we're looking at this CRO company, we're talking to them, and they're trying to get us to do all these tests at the same time, like tons of them." They'd kind of touched on everything: we're going to test this page, this page, this page, we're going to test these things on this page, probably hundreds of tests at the same time.
This site was not, by any means, getting millions upon millions of visitors a month. It was a pretty standard e-comm site doing, I don't know, 40 or 50,000 a month in traffic. I kind of like the idea, hypothetically, since I tend to be aggressive and I'm like, "Let's just blast all these tests really quick, learn as quick as we can. Go fast, break things." That's my typical mantra.
Jon:
Yeah. I'm over here in pain.
Ryan:
Yeah, I'm sure you are. Thankfully I didn't tell them that. I was like, "You know what? You should probably talk to Jon. It doesn't sound like you've got enough..." My initial reaction was, "You don't even have enough traffic to do that, and if somebody thinks you can, they're probably smarter than I am on conversion optimization." I would hypothesize that there are some risks to doing hundreds of tests all at once on a site. I would love to get some of your feedback on it. I guess a couple of questions would be: why would a company tell somebody that? And then, is there any validity to it? It sounds really different and, I guess, conceptually cool, but what's going on there?
Jon:
Well, I think you just answered the question of why these companies tell that to leads, if you will, or to the brands they want to work with, right? Because to folks who like to move fast and break things, which is most entrepreneurs, it sounds great. Right?
Ryan:
Except for you. You've gotten through the go fast, break things phase, it feels like, and you're much more just relaxed. I'm like, "Gosh, I want to be Jon when I grow up sometime."
Jon:
Well, I mean, look, today is actually our 13th anniversary of The Good.
Ryan:
Oh that's awesome. Congrats.
Jon:
It took a lot of years, a decade plus, right, to get to that level of acceptance, if you will. I don't know if it's calm, but I would say acceptance. There's always going to be a storm brewing and you just go with the flow. When brands come to us, I hear the same thing: "Well, how many tests are you going to run? Because I could go over there and get hundreds of tests run."
It's like, "Well, you could, I guess, but are you going to actually get anything out of that?" The answer is probably no. In 99.99% of cases, it's no. Now, who's running hundreds of tests? Let's just set the benchmark. Amazon, Walmart, Nike, are you any of those brands? Probably not. If any of those brands are listening to this, please give me a call. Would love to work with you more.
Ryan:
Yeah. I want to work with you. Like, "Hey, Nike, Amazon, come see me."
Jon:
Yeah. We've worked with Nike in the past and they have a whole team doing optimization, right? The reality here is 99% of brands are not brands that can truly run lots and lots and lots of testing, as good as that sounds. You really need to be informed and measured with your testing, and that's how you're going to see results. At The Good, what are we experts at? We're not experts at building or running tests. Anybody can do that. You can go get a Google Optimize account and set up a test and run it.
You can train yourself in a couple of hours on how to set up and run a test. Now, that doesn't mean that you're going to run the right test. Where we come in is analyzing the data and helping brands understand what they should be testing for the highest impact. Right? Could we go through and test hundreds of things? Yeah, of course, we could come up with hundreds of test ideas, but the vast majority of those aren't going to move the needle. They're just inconsequential, and for the most part there are other challenges there too. I'm sure we'll talk about those.
Ryan:
Got it. Okay. So, yes, you're probably going to get to a lot of those tests eventually, well, potentially, but at least with somebody like you or your team looking at it, you're going to be able to say, "Yeah, that's kind of a cute test, but this one we think is going to move the needle the most across all of your channels, and it's not going to be testing red versus blue checkout buttons."
Jon:
Right. Where we are experts is in bringing the voice of the consumer to the brand and helping brands understand what challenges their consumers are having at each step of the funnel. Right? That helps inform what the tests should be. Just today I got an email from somebody who is an aspiring conversion strategist. He works at a generalist digital marketing firm right now and is their only conversion strategist, and apparently he's been applying to work at CRO-focused agencies like The Good.
He wrote me an email and he said, "Hey, I really loved your email newsletter article you sent out this week because I just interviewed at a firm. They asked me to do a test, like a pre-qualification exam, if you will, whatever. The activity was to put a testing plan together, and all they gave me was analytics data." They said, "Based on this analytics data, form five tests you would run for this brand."
He said, "Based on all the training that I've done and all of my experience, I told him, I would never do this for you. Instead, I would want to run user testing, talk to consumers and then look at the data." He said, "So without that, I can't put five tests together." He's like, "I honestly thought it was a trick question." I was like, "Let me introduce you to our director of client services, who is looking for a new strategist [inaudible 00:06:07]." Because that is exactly the type of answer that I would want in an interview to hear.
It goes to show you that a lot of the people out there who are, in air quotes you can't see right now, "doing CRO" are misguided. They're offering what they think the brands they're selling to want, which is just A/B testing. They say, "Oh, well, we'll do hundreds of those for you." Business owners like yourself, who move fast and break things, love that. It sounds great to them: "Yeah, I'm going to learn a lot. I'm going to move really quickly. We're going to get this done."
Then, they call us and they say, "Okay, well..." And I'm like, "Well, how much traffic do you have? Honestly, you shouldn't run more than four or five concurrent tests." They're like, "Wow, how are we actually going to get a return on that?" It's because I'm testing the highest-impact items and we're going to solve those within a monthly cycle, whereas if you're running hundreds of tests, you're not going to be able to do that.
Ryan:
It's almost like, if I can oversimplify, if somebody is just looking at analytics, they haven't done anything on your site or understood your user at all, and they say, "Hey, here's the hundred tests we run out of the box," run for the hills. You don't want that. I mean, maybe it would help with what we have affectionately referred to as CRI. Yeah, you might get some improvements, but it's not optimization. Some of those improvements might be accidental.
Jon:
Well, and those improvements should just be made. I mean think about a test. Should you actually run a test or should you just make the change?
Ryan:
Yeah, it's true. If you have no product recommendations on a PDP, you probably don't need an A/B test for that. Just put them out there, because we know that once people click those, conversion rates on the site increase anyway. That's pretty basic data.
Jon:
Yeah.
Ryan:
Okay. What's the big risk of somebody executing this and saying, "Yeah, we're going to run a hundred tests all at one time"? What's going to happen to them in reality?
Jon:
Well, I think there are three things that always come to mind for me on this. The first is that it's going to take much longer to prove out which test is a winner, which variant is a winner, when you're splitting the traffic so many ways. Right? Without getting too deep into the math, let's say you have a hundred people coming to your site. You're going to split those into segments, so like groups. You're going to say, "Okay, I want to run a test that is specifically targeting people who have been to the site before." Right? Then, via Google Analytics, you're able to know who's been to the site within the last seven to 14 days.
That would indicate that somebody's at the next step in the buyer journey because they're doing their research. They were at your site, then they went to look at other options, now they're coming back, right? We want to run a test against that audience, but it's going to be impossible to reach statistical significance on the test against that group if you're running a ton of tests, because you need to keep that traffic separate, right?
You only have a hundred visitors and you want to run 15 tests. Now you've got to divide that hundred visitors by 15, actually 16, because you need a control group that sees no changes. Right? Now you have very few visitors that you're testing with. It's going to take a very long time for you to feel comfortable with the math, with statistical significance, that you have actually proven out that a specific change is moving the needle for you. The first risk is just that it's going to take months for you to prove that out unless, again, you are an Amazon or a Walmart and you have millions of visitors coming a day.
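To make that traffic-splitting math concrete, here's a rough back-of-the-envelope sketch in Python. The monthly traffic, 2% baseline conversion rate, and hoped-for 10% relative lift are illustrative assumptions, not figures from the episode; the sample-size formula is the standard two-proportion approximation at roughly 95% confidence and 80% power.

```python
import math

def visitors_per_variant(baseline_rate, relative_lift, z_alpha=1.96, z_power=0.84):
    """Rough visitors needed per variant for a two-proportion test
    (about 95% confidence, 80% power)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    pooled_variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * pooled_variance / (p1 - p2) ** 2)

def days_per_test(monthly_traffic, concurrent_tests, baseline_rate=0.02, relative_lift=0.10):
    """Days to finish one test when traffic is split evenly across concurrent
    tests, each with its own variant and control group."""
    needed = visitors_per_variant(baseline_rate, relative_lift)
    daily_per_variant = monthly_traffic / 30 / (concurrent_tests * 2)
    return needed, math.ceil(needed / daily_per_variant)

# Illustrative site: ~45,000 visitors a month, similar to the one in the conversation.
for tests in (1, 5, 15):
    needed, days = days_per_test(45_000, tests)
    print(f"{tests:>2} concurrent tests: ~{needed:,} visitors per variant, ~{days:,} days per test")
```

With those assumptions, even a single test needs on the order of 80,000 visitors per variant, and splitting the same traffic across 15 concurrent tests stretches each test from a few months to several years.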
Ryan:
Or you just make a bad decision saying, "Hey, one of those five visitors converted, so that must be good."
Jon:
Right. That's where statistical significance comes in, right, where you're saying, "Okay, how many people actually saw this test and how many converted?" If it's one out of one, an inexperienced optimization person would say, "Well, that's the one. Somebody converted and nobody converted on any of the others," but the confidence levels can be very low on that.
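As a quick illustration of why one conversion out of one visitor proves nothing, here's a small sketch that computes a Wilson score confidence interval for a conversion rate; the interval stays honest even at tiny sample sizes. The numbers are made up purely for illustration.

```python
import math

def wilson_interval(conversions, visitors, z=1.96):
    """Approximate 95% Wilson score interval for a conversion rate.
    The interval is very wide when the number of visitors is tiny."""
    if visitors == 0:
        return (0.0, 1.0)
    p = conversions / visitors
    denom = 1 + z ** 2 / visitors
    center = (p + z ** 2 / (2 * visitors)) / denom
    margin = z * math.sqrt(p * (1 - p) / visitors + z ** 2 / (4 * visitors ** 2)) / denom
    return (max(0.0, center - margin), min(1.0, center + margin))

# One visitor who converted looks like a 100% conversion rate...
print(wilson_interval(1, 1))      # roughly (0.21, 1.00): anywhere from 21% to 100%
# ...while 30 conversions out of 1,000 visitors pins the rate down much more tightly.
print(wilson_interval(30, 1000))  # roughly (0.021, 0.042)
```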
The second challenge is that when you're running hundreds of tests, you really lose your focus. It becomes difficult to determine which tests are actually improving things, partly for the same math reasons. You're much better off saying, "Okay, right now, I know I can't run tests across the entire customer journey. I'm going to focus on improving this step of the journey, and how we're going to measure success is whether somebody gets to the next step." Right? The goal is always to help people get to that next step in the conversion journey. If you realize there's a step where people are dropping off, that's the step you want to start improving.
If you are trying to optimize the entire journey, you really end up running into the third issue here, which is crossing the streams, right? Like in Ghostbusters, where they don't want to cross the streams of the ray guns because it essentially explodes. Well, here's the thing. You will have no idea which tests improved that step of the customer journey, because you don't know if a test you ran earlier in the journey is influencing what happens later in it. Does that make sense? In essence, you end up needing to run multiple tests on the same segments of users and you just can't cleanly tell which test is having an impact and was actually the driver for them to convert.
Ryan:
Well, I think you just simplified my response to anybody coming to me with this problem: "Well, do you want to cross streams?" "No, no. That's just a general best practice, do not cross streams." Well, that's what you're going to do if you go work with that crappy CRO agency. For people who have listened to us before, high-volume or high-velocity testing might sound similar, at least in vernacular, to rapid testing, which we've done a podcast about. I know that rapid testing is good, and high-velocity testing, hundreds of tests at a time, is bad. But I can hear somebody pitching hundreds of tests saying, "This is going to be really rapid testing. We're going to solve all your problems very quickly." How are they different?
Jon:
Yeah. I think that's a really good point, because a lot of people just don't know the difference, quite honestly, and it causes confusion. I think it's because they both have testing in the term. Right? Really high level, what's the difference between the two? Rapid testing is a type of testing different from the A/B testing that most folks think about when they think of conversion optimization, right?
Rapid testing is very small, discrete tests that are not necessarily onsite testing. I'll explain what that means in a second, but in reality these are things like five-second tests: I'll put a page in front of you for five seconds, take it down, and then run a survey or record a video of you and ask what you thought. What stuck in your mind? Did you like that page or not? What was your sentiment? Did you know what action to take, what you needed to do next? Right?
What somebody can gather in the first five seconds is very indicative of how clear that page is. Is the messaging clear? Are the actions you want them to take clear? If not, you should really take a step back and probably redesign that page.
There are also things like preference testing: "Hey, here are two options. Did you like A or B more?" Or, "If you were going to take this next step, would you prefer A or B?" There's a lot of preference work you can do here. Then there's card sorting, which is great for navigation, where you get consumers involved and say, "Okay, here are all the navigation options. How would you sort these into a realistic navigation?"
Ryan:
That's cool. I've never actually heard of that. That's cool.
Jon:
Yeah. The whole point of that is: what's their preference? What do they think is clear? Because, as the brand, you're too close to it, right? You know what all the products are and how, in theory, they should be organized. I've seen brands where I ask them why their navigation is organized in a certain way, and they say it's because that's how their warehouse is organized. Well, that might be great for the person picking and pulling the packages, because it's really efficient for them, but I'm a consumer. Who's trying to help me? Don't just make it easy for your dude who's picking the packages.
The point of rapid testing is that these are usually all done offsite, right, with testers who are paid and driven to that test specifically. Okay? That's a good way to think about it. These are things that should take a couple of minutes at most, really quick testing. That's where the rapid comes from, because you can learn a lot very, very quickly by sending a hundred people to a five-second test. Right? It's cheap to do, in theory. Again, you still have to know what to test, what questions to ask, et cetera. There's some experience involved for sure, but that's where rapid testing comes in.
Now, A/B testing, on the other hand, is what most folks think about when they hear about CRO. This is done onsite, and that's the big difference. It's done by taking actual site visitors and segmenting them into groups based on, maybe, where they came from or whether they've been to the site before, the things I mentioned earlier, right? Then you're altering the site experience in small ways to influence their actions on your site. These folks usually have no clue that they're being tested.
I mean, this is happening to you all over the internet, and you have no idea that you're being opted into a test, which is what you want, because then you're not poisoning the well, right? You're not influencing the behavior. I'd be poisoning the well by telling you, "Hey, you're being opted into a test," because now you're looking for the test. You don't want to be tricked. You don't want to be tested, so now you're trying to figure it out instead of just doing what you would normally do.
The big difference is that in the cases you and I are talking about today, what brands are hearing, these aren't rapid tests. They're being told, "We're going to run hundreds of A/B tests," and that is where it becomes a challenge. You're right that there are these different types of testing and it can be confusing for folks, but rapid testing and A/B testing are completely different. When we hear about this, most folks are really talking about A/B testing, and they're offering hundreds of A/B tests.
Announcer:
You're listening to Drive and Convert, a podcast focused on e-commerce growth. Your hosts are Jon MacDonald, founder of The Good, a conversion rate optimization agency that works with e-commerce brands to help convert more of their visitors into buyers, and Ryan Garrow of Logical Position, the digital marketing agency offering pay per click management, search engine optimization, and website design services to brands of all sizes. If you find this podcast helpful, please help us out by leaving a review on Apple Podcasts and sharing it with a friend or colleague. Thank you.
Ryan:
Now, these companies that are doing these hundreds of A/B tests all at once, from a technology standpoint, is there a way to say, "Hey, I'm testing this product page that we're landing this traffic on. If I exit the product page, can they keep me from seeing any other test or is it once I've seen ..."
Jon:
Yeah. With A/B tests, you can segment your audience and declare who can see which tests based on those segments. This is where having a very clear strategic plan for your testing matters, because you really want to be able to say, "There is a test at this step in the funnel. If I go to the next step down, am I being influenced by what happened before?" You really need to map out your tests. This is where, like I said earlier, it gets impossible to track all of those tests, what's being influenced and what's not, and how to separate all of that, if you don't have enough traffic to say, "I only want to run this one test with this segment at a time." That's really important.
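To show what declaring who can see which tests might look like in practice, here's a minimal sketch of one common approach: deterministically hashing a visitor ID into mutually exclusive experiment slices, so each visitor is only ever enrolled in one test (or none). The experiment names and traffic split are hypothetical illustrations, not anything specific to The Good's process or any particular testing tool.

```python
import hashlib

# Hypothetical, mutually exclusive experiments and the share of traffic each gets.
# The remaining 50% of visitors are held out and see no test at all.
EXPERIMENTS = [
    ("pdp_recommendations", 0.25),
    ("checkout_trust_badges", 0.25),
]

def bucket(key: str) -> float:
    """Map a string to a stable number in [0, 1) so assignment never changes."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest[:8], 16) / 0x100000000

def assign(visitor_id: str):
    """Return (experiment, group) for this visitor, or None if they're held out."""
    point = bucket(visitor_id)
    lower = 0.0
    for name, share in EXPERIMENTS:
        if point < lower + share:
            # Split this experiment's slice 50/50 between control and variant,
            # using a second hash so the two decisions stay independent.
            group = "variant" if bucket(name + ":" + visitor_id) < 0.5 else "control"
            return name, group
        lower += share
    return None  # not enrolled in any experiment

print(assign("visitor-123"))
print(assign("visitor-456"))
```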
Ryan:
Until you're working with a brand, you don't actually tell them what you're going to test, right? You realistically couldn't.
Jon:
I couldn't, no, because, like a doctor, we have to go in and diagnose before we prescribe. A brand comes to us and says, "Hey Jon, we're not converting very well. I know I need a knee replacement." Well, if the first thing I do is go and operate on their knee because they said they needed a knee replacement and it doesn't solve their problem, guess who gets fired? Not only that, but if a doctor did that, it'd be malpractice; they'd lose their license. Right? The reality here is we really need to diagnose and do our research before we prescribe what we're going to do. Otherwise, it's really just guessing, and you know where that gets us.
Ryan:
Yes. To summarize: to do real CRO, if we're saying there's a difference and real CRO is what Jon does, you have to have enough traffic and a hyperfocused testing plan, and that plan only comes after you've begun working with the CRO company. There's no way a CRO company can tell you what's wrong with your site until they've understood your user and what you're trying to accomplish, and uncovered a lot more than can be seen from an external analysis.
Jon:
Yeah, exactly. With every client we work with, we have about a three-week onboarding period where we're doing that diagnosis. We're making sure all the data is correct. We've talked a little bit in previous recordings about clients who don't have clean data. Well, we want to go in and, at least moving forward, make sure that they have clean data as much as we can. Right? We have to correct any issues that are there.
There's a lot of work being done there that is, a, valuable and, b, to keep that same analogy, when you go into a doctor's office, you don't walk in and say, "Hey doc, thanks for meeting today. Can you tell me everything that's wrong with me? Then, yeah, I'm not going to have you do the surgery, so I'm not going to pay you for this appointment." The doctor still gets paid for that appointment. They still get paid to diagnose you, because it's valuable to you to understand exactly what's wrong.
Whether or not you choose to resolve it and actually get the surgery is up to you, but they still need to go do the MRI, do the X-rays, take your blood panel, whatever it might be. Those things take time and expertise to diagnose. I think that doctor patient analogy is about as close as I can get to how we operate.
Ryan:
Mm-hmm. That's great. For any of you with my personality that wants to go fast and break things, if you are trying to do that in CRO, you're going to be very frustrated and not get what you want. You have to adopt patience to get the best care.
Jon:
That's fair. I think the same goes for driving traffic, right? Are you going to move fast and break things and just start spending money all over the place driving traffic? Probably not. You want to diagnose: whoa, let's test a few things here and there, see where you've spent in the past, which pages and collections we should advertise. Right? I mean, you're going to put a plan together and then execute on it.
Ryan:
Yeah. I mean, we do have companies that come to us like, "Hey, we really need to be launching something next week." We're like, "Okay, well, that's not going to be best for you. You're going to have some inefficiencies and holes in what's going on, so if you rush it, you'll waste money." Oh man, e-commerce requires patience, and I'm having to learn that constantly.
Jon:
I'm glad we balance each other out.
Ryan:
Yes. Jon, I appreciate you educating me so I can be more valuable to my clients talking to them about high velocity testing.
Jon:
Love it.
Ryan:
And how it's not in the best interest of companies. Thank you for your time.
Jon:
Thank you.
Announcer:
Thanks for listening to Drive and Convert with Jon MacDonald and Ryan Garrow. To keep up to date with new episodes, you can subscribe at driveandconvert.com.