8 Ways to Evolve Beyond Tactical CRO


Although my digital marketing career started on the “driving website traffic” side of the equation (a noble task, to be sure!), over the past few years, my passion and attention have shifted to a “get more traffic to convert” viewpoint. (I outlined the reasons for my change in focus in an earlier post, Why More Traffic Is The Lazy Marketer’s Answer.)

In the process of building out a Conversion Rate Optimization (CRO) team and evangelizing CRO to anyone who will listen, I’ve noticed some common mistakes that trip up marketers who are just dipping their toes into the waters of optimization and testing.

Balance Due Diligence With Taking Action

To do Conversion Rate Optimization right, you’ve got to do a lot of research upfront:

- Learn about the client’s business and pore over their website.
- Dig into their web analytics to find out where the problem areas are.
- Analyze their competitors’ websites.
- Install software tools on the site, then wait for data to accumulate so you can see heatmaps, scroll maps, session recordings and form analytics.
- Research and develop user personas, determine which questions and challenges they’ll have at each stage of the buyer journey, and ensure content and assets are available to address them all.
- Launch visitor surveys and analyze the data.
- Perform user testing, either quickly online or more thoroughly in person.

Performing this kind of due diligence is critical to creating your CRO strategy, tactics and priorities. The only other options are to blindly follow “best practices” (which may or may not apply to your situation; more on that below), randomly tackle the parts of a website you personally don’t like, or jump on the HiPPO’s (Highest Paid Person’s Opinion) current whim.

Sounds like a wise approach, doesn’t it? It most certainly is … but it’s also easy to get stuck in analysis paralysis as you gather more and more data to make your case.

Good marketers have a bias toward action and realize that testing is where the rubber meets the road. So gather data and do your research, but start taking action at the same time. Usually you’ll uncover some obvious opportunities for improvement early in the process. For example, web analytics will tell you which pages have the highest bounce and exit rates. Do some quick user testing or watch session recordings to figure out why the page may not be compelling, come up with a hypothesis and run an experiment with a different version of the page to create some quick wins.
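As a rough sketch of that first pass, here’s how you might surface high-bounce pages from an analytics export in Python; the filename, column names and session threshold are all hypothetical and will vary by platform:

```python
import pandas as pd

# Hypothetical page-level export from your analytics tool;
# column names differ by platform.
df = pd.read_csv("page_metrics.csv")  # page, sessions, bounce_rate, exit_rate

# Skip low-traffic pages (their rates are noisy), then surface
# the worst offenders as candidates for user testing and experiments.
candidates = (
    df[df["sessions"] >= 500]
    .sort_values("bounce_rate", ascending=False)
    .head(10)
)
print(candidates[["page", "sessions", "bounce_rate", "exit_rate"]])
```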

Be Judicious in Choosing What (or Whether) to Test

A huge part of CRO is testing, running either A/B or multivariate tests to see which version of a web page encourages more visitors to convert. It’s hard to beat gathering real data about real users’ behavior when determining how to improve your site.

But it’s not always advisable, or even possible. To ensure that the data you’ve gathered is reliable, you’ve got to collect a large enough sample to reach statistical significance. The required sample size depends on a number of factors, including your starting conversion rate (the lower your baseline, the more data you’ll need in order to be confident that a lift isn’t just a random fluke).
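To make that concrete, here’s a back-of-the-envelope sample size calculation using the standard two-proportion approximation; the baseline, lift and traffic figures are made up for illustration:

```python
from scipy.stats import norm

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant for a two-proportion A/B test."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance
    z_beta = norm.ppf(power)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p2 - p1) ** 2

# A 2% baseline conversion rate, hoping to detect a 20% relative lift.
n = sample_size_per_variant(0.02, 0.20)
daily_visitors = 50  # hypothetical total traffic, split across both variants
months = 2 * n / daily_visitors / 30
print(f"~{n:,.0f} visitors per variant; ~{months:.0f} months at {daily_visitors}/day")
```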

For high-traffic websites, this usually isn’t an issue; there’s typically enough traffic to run tests until the cows come home. But for niche websites with lower visitor volume, it can present a challenge. I once did the math on how many visitors we’d need for a statistically valid test on an important but low-traffic page of a client’s website. It would have taken 18 months for the test to run.

That doesn’t make any business sense. This is a situation where relying on qualitative data such as from surveys or user testing makes more sense than waiting an eternity for quantitative data.

Sites that don’t enjoy an overflowing abundance of visitors need to be judicious in deciding which tests to “spend” their traffic on. Focus on experiments that are the most likely to have a business impact. You might be curious about the possible impact of changing the button color from burgundy to garnet, but your effort would be better spent testing a more explicit value proposition on the page.
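One common way to ration limited traffic is a simple scoring framework such as ICE (Impact, Confidence, Ease). The test ideas and scores below are invented purely to show the mechanics:

```python
# Hypothetical backlog scored with ICE (Impact, Confidence, Ease), 1-10 each.
ideas = [
    {"test": "Clarify value proposition in hero", "impact": 9, "confidence": 7, "ease": 6},
    {"test": "Button color: burgundy vs. garnet", "impact": 2, "confidence": 3, "ease": 10},
    {"test": "Shorten lead form to three fields", "impact": 7, "confidence": 6, "ease": 8},
]

# Rank by the product of the three scores; run the top tests first.
for idea in sorted(ideas, key=lambda i: i["impact"] * i["confidence"] * i["ease"], reverse=True):
    score = idea["impact"] * idea["confidence"] * idea["ease"]
    print(f"{score:4d}  {idea['test']}")
```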

Beware of “Regression to the Mean”

In one of the first experiments I ran personally, I was thrilled to see that the alternate version of a page (which we had conceived of and designed) was beating the original by 32 percent. It was only a week into the test, but I was really excited by these early results.

Our client was located across the country and, though we’d been working with them for some time, we had never met face to face. I was traveling to their area to visit family and arranged for an in-person meeting. Since we had these early results, I was thrilled to show them to the client, who became equally excited.

And then … over the next several weeks until the test ended, the delta between the original and our variant steadily shrank from 32 percent to 8 percent. Yes, “our” version of the page still won, but not by a landslide. What happened? “Regression to the mean” happened, as it typically does.

I did take statistics (what I not-so-fondly called “sadistics”) in college, and it was the only C I ever earned there. After this embarrassing client incident, I brushed up on stats and relearned the phenomenon of regression to the mean.

This is the best way I’ve found to explain it to other sadistics-haters. Let’s say you flip a quarter ten times. In theory, you should get heads five times and tails five times, but in reality, chances are decent that you won’t get an even split. You could get heads six, seven or maybe even eight times.

Flip the quarter 100 times, though, and you’ll come much closer to a 50/50 split between heads and tails. You might still get tails 55 or 60 times out of 100, but proportionally the deviation from an even split will tend to be smaller than it was when you flipped it ten times. Flip it 1,000 times, and it would be very unusual for the split to stray far from 50/50.
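If you’d rather see it than take my word for it, a few lines of Python simulate the shrinking spread (a toy illustration, not real test data):

```python
import random

random.seed(7)  # make the illustration reproducible

# For each sample size, simulate many "tests" and see how far the
# observed percentage of heads can stray from the true 50%.
for flips in (10, 100, 1000):
    outcomes = [
        sum(random.random() < 0.5 for _ in range(flips)) / flips
        for _ in range(2000)
    ]
    print(f"{flips:5d} flips: heads ranged from {min(outcomes):.0%} to {max(outcomes):.0%}")
```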

So when you look at early test results based on small sample sizes, your data is much more likely to be skewed than when you (impatiently) wait to collect a statistically significant sample. I try not to peek now; if I inevitably do check in on early results, I don’t get excited by them, and I certainly don’t share them with anyone until I know they’re fully baked.

Some Tests Are Losers and Some of Your Assumptions Will Be Wrong

Ah, this one hurts. It feels so good to launch an experiment (based on your research and a solid hypothesis) and have it win, as in the example above. And it happens more often than not. But it doesn’t always happen. Sometimes, even a well-thought-out test fails.

One client had a form that consisted of one field. After the user filled it out, a pop-up appeared where they had to fill out additional fields before moving forward. We hypothesized that putting the entire form (it was brief) on the page itself might increase conversions since we’d be asking users to click fewer times overall, and there’d be less of a “bait and switch” feel.

We set up the experiment, and 25 percent fewer people converted. Oops! Apparently, once people had mentally “committed” by filling out the first, visible field on the original version, they were more likely to finish the process on the pop-up. Our straightforward version may have appeared more cumbersome to fill out.

Losing tests happen. But continuous testing will lead to long-term improvements, even if some of your assumptions are proven wrong along the way.

Some Tests Are Inconclusive

What’s more frustrating than a losing test? An inconclusive test! Sometimes you run the test for the appointed amount of time and the original and variant perform the same. Again, it happens. Assuming the experiment was set up correctly, data was gathered accurately, and you have a high confidence level in the validity of the test, this usually means that what you chose to test just doesn’t matter that much in a visitor’s decision whether or not to convert.
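For the statistically curious, this is roughly the calculation a testing tool runs under the hood; the conversion counts here are made up to show what “inconclusive” looks like as a p-value:

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - norm.cdf(abs(z)))

# Hypothetical results: nearly identical rates, so the p-value is far
# above any conventional significance threshold; an inconclusive test.
print(f"p-value: {two_proportion_p_value(412, 10000, 424, 10000):.2f}")
```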

So you’ve learned something. Make a note of it so you don’t waste time in the future, and move on to a test that might be more impactful.

Best Practices May or May Not Be “Best” for You

A lot of smart people have spent a lot of time and money on CRO, as well as on User Experience (UX), and some general best practices have come to be accepted. Put the form “above the fold.” Human faces in images work best. Ensure the button color contrasts with everything else on the page. Focus on one clear call-to-action (CTA) per page.

Best practices are best practices because, a lot of the time, they work. But not all the time. If everything always worked the same for every company and every product, then every web page would be identical. This is why we test: we never know for sure what our specific target audience will respond to when they’re thinking about our product or service.

Why do sports teams play the game despite the experts’ predictions as to which team is best? Why do we hold elections when the pollsters tell us who’s going to win in advance? Because we never really know until the game is played, the votes are counted, the experiment is run.

Segment, Segment, Segment

I’ll never forget hearing a presentation from the colorful Avinash Kaushik where he said “Failure to segment your data is a crime against humanity!”

He’s right. Here’s one example. We tested two paid media landing pages that had identical offers and similar layouts. One was short and to the point. The other, in keeping with industry best practices, added more compelling information, such as a visual of the offer, a customer testimonial, and a brand-building video.

When we looked at the data in aggregate, the “best practices” longer landing page appeared to have won. But then we segmented results by device, to look at responses from people on desktops, tablets and mobile devices.

The results completely flip-flopped. Data for desktops was inconclusive; those visitors were about equally likely to convert on the long and short pages. But 27 percent of tablet and mobile visitors converted on the short page, making it the clear winner. We would have missed that entirely had we not looked deeper and segmented our data.
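In code, the difference between the aggregate view and the segmented view is a single groupby; the file and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical experiment log: one row per visitor.
df = pd.read_csv("experiment_results.csv")  # variant, device, converted (0/1)

# The aggregate view can hide what's really happening...
print(df.groupby("variant")["converted"].mean())

# ...so break results out by device before declaring a winner.
print(df.groupby(["device", "variant"])["converted"].mean().unstack())
```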

The Most Important Thing

I actually have two “most important things.” The first, and I’m on my personal soapbox here, is to ensure your site loads as quickly as possible. The rule of thumb is that it shouldn’t take more than two seconds to load (maybe slightly longer on mobile). Your analytics will tell you how you rate here.
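If you want a quick gut check outside your analytics, here’s a minimal sketch (with a placeholder URL) that times the server’s HTML response. Note that this captures only one slice of load time, so treat it as a floor and rely on your analytics or a dedicated page-speed tool for the full picture:

```python
import time
import requests

URL = "https://www.example.com/"  # placeholder; use your own page

# Time the raw HTML response a few times. This ignores images, scripts
# and rendering, so the real page load will be slower than this number.
timings = []
for _ in range(5):
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    timings.append(time.perf_counter() - start)

print(f"median HTML response: {sorted(timings)[2]:.2f}s (full load target: under ~2s)")
```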

It doesn’t matter what you do in terms of CRO and testing on your site if it loads so slowly that people abandon it before they see any of your hard work. With more people accessing the web on mobile devices, site speed will only grow in importance as a factor in whether your website contributes to or detracts from your business success.

The other “most important thing” is to keep your customer at the forefront. Yes, do research, create hypotheses, run tests. But if you don’t truly understand your customers’ varied personas, their needs/desires/concerns/questions/wants, the stages of their buyer journeys … it’s all for naught. No one will care where the form is on the page, what the button copy is or how the headline reads if you’re not solving your customers’ problems and speaking authentically and directly to them.

It’s like taking a lot of time to decorate the prettiest chocolate cake in the world when your target audience prefers lemon cake instead. To mix food metaphors, you’ve got to put the steak before the sizzle. Ensure you’ve got the content and messaging right before you start fiddling with page design, labels on form fields and button copy.

In Closing

Conversion Rate Optimization is fun (if you’re geeky like me). CRO can have a huge impact on your company’s web success, and your bottom line. Even small investments of time and money can significantly improve results. It’s not uncommon to see the number of online leads or sales double, triple, quadruple or more when CRO is done right. I have a hard time thinking of an investment with a better ROI.

That said, go into it with your eyes wide open and avoid these common mistakes.

Written by Stacy Sutton on July 5, 2016
