Optimize for Customer Experience, Not Conversions
One of the reasons I love marketing and advertising is that it’s filled with incredibly smart and talented people. That makes sense given that the biggest brands in the world spend billions each year in an effort to better connect with their customers.
However, like any other industry, there are deeply embedded norms and practices that don’t make sense. I could go on a long rant about all of the things that are broken in our industry, but I’ve done that a few times already in previous posts (here, here and here).
What I want to focus on here is how, despite all of the effort and brainpower marketers put into our craft, we fail over and over again to put our time and efforts where they matter most.
Customer experience is the game. It’s not a channel. It’s not a campaign. It’s not our clever tactics. Those things matter. However, what we often fail to see and understand is the 360-degree online/offline customer experience.
And if we optimize that, we win. We win because when our customers win, we win. It’s a virtuous cycle.
The CRO Myth
This is where I hear smart and savvy marketers bring up Conversion Rate Optimization (CRO).
However, most CRO campaigns fail to make permanent and sustainable improvements. Too often, marketers are trained on the tools, but not in the practices that lead to iterative, systematic improvement of brand experiences.
Moreover, the craft was born with a focus on the wrong end of the equation: conversion.
How CRO Puts Conversions Over Customers
Of course, everyone wants more conversions. But wanting more conversions doesn’t mean you will get more.
Focusing on conversions (and conversion-oriented touchpoints) is limiting.
CRO isn’t focused on what the customer wants or needs, but rather on getting a potential customer to click, download, buy, etc. more efficiently.
To make matters worse, experiments are often set up without regard for the 360-degree customer experience. Preferences are tested in a single channel, in a single session, with no insight into what stage of the buyer journey visitors are in, whether they’re existing customers, or whether they’re even potential customers at all.
I’m not personally attacking people who do CRO for a living. It’s just that the craft and industry are flawed, which forces those marketers to conform to broken industry norms. This is true of most of the leading CRO tools as well. Too often, tools like Optimizely, HotJar and others are used to run A/B or multivariate tests of various messaging, creative, or other UX variations.
But these tools are incomplete without taking into consideration audience types, stage in the buyer journey, channel path, UX best practices, design thinking, actual statistical rigor, and a testable hypothesis that can be implemented across the customer experience where applicable.
For example, the CRO tool may crown a winner of an experiment based on “statistically significant” data. The statistics are there, right? We had 300 conversions across 10,000 sessions and “Learn More” got 175 conversions, while “Get Started” received merely 125. Stats don’t lie, right?
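For illustration, here’s the pooled two-proportion z-test that a typical testing tool runs under the hood, assuming an even 50/50 split of the 10,000 sessions (the scenario doesn’t specify the split):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test: the check a CRO tool uses to crown a winner."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# "Learn More" (175 conversions) vs "Get Started" (125), 5,000 sessions each
z, p = two_proportion_ztest(175, 5000, 125, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

On these numbers the test reports a p-value below 0.01, so the tool confidently declares “Learn More” the winner. The math is correct; the problem is everything the math doesn’t see.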
First, there isn’t a testable hypothesis here. This means that even if the results were statistically significant and accurate, we can’t apply the learning in a meaningful way across customer touchpoints to iteratively improve the experience.
Moreover, you can often take a simple experiment like the one above (whether it’s a CTA, image, or layout change), run the test 10 times, and get a different result half the time.
How is that possible? We had more than enough data to call a winner.
But we didn’t.
Most of the time it’s nearly impossible to control the variables sufficiently to get a clear winner.
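A rough simulation makes the point. Suppose the true lift is much smaller than the single test suggested, say a diluted 0.2 points (3.2% vs. 3.0%), because only a slice of the traffic actually responds to the CTA wording. The figures are assumptions for illustration, not real data:

```python
import random

def simulate_replications(p_a, p_b, n_per_arm, reps, seed=42):
    """Re-run the same A/B test many times and count how often the original
    "variant A wins" result replicates, i.e. A's pooled z-score clears 1.96."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(reps):
        conv_a = sum(rng.random() < p_a for _ in range(n_per_arm))
        conv_b = sum(rng.random() < p_b for _ in range(n_per_arm))
        pooled = (conv_a + conv_b) / (2 * n_per_arm)
        se = (pooled * (1 - pooled) * 2 / n_per_arm) ** 0.5
        if se and (conv_a - conv_b) / (n_per_arm * se) > 1.96:
            wins += 1
    return wins / reps

# Diluted true lift: 3.2% vs 3.0%, 5,000 sessions per arm, 200 re-runs
print(simulate_replications(0.032, 0.030, 5000, 200))
```

Under these assumed numbers, the “winner” replicates at significance only a small fraction of the time. The exact fraction depends on the assumptions, but the instability is the point.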
If 10,000 visitors came, the breakdown may have been similar to the following:
- 5,000 are potential customers:
  - 1,000 are in the initial stage of their journey
  - 3,500 are in the middle of the funnel (at various mid-funnel stages)
  - 500 are lower funnel
- 2,000 are existing customers
- 500 are visiting because they’re trying to sell your company something
- 500 are stakeholders (employees, stockholders, others in the industry, etc.)
- 500 are relevant, but random (media, competitors, etc.)
- 1,500 are irrelevant (non-potential customers, relatives of employees, accidental visitors, etc.)
Even if you design experiments to just focus on the potential customer group, trying to design the experiment to control for stage in the buyer journey, device, channel, etc. is incredibly difficult.
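A back-of-envelope power calculation shows why. This sketch uses the standard normal-approximation sample-size formula for a two-proportion test, with an assumed 2.5% baseline and a 1-point lift:

```python
import math

def sample_size_per_arm(p_base, lift):
    """Rough per-arm sample size for a two-proportion test
    (normal approximation, two-sided alpha=0.05, 80% power)."""
    z_alpha, z_beta = 1.96, 0.84  # critical values for alpha=0.05, power=0.80
    p1, p2 = p_base, p_base + lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / lift ** 2)

# Detecting a 1-point lift on a 2.5% baseline takes thousands of visitors per arm
print(sample_size_per_arm(0.025, 0.010))
```

The answer lands in the mid-four-thousands per arm. A segment like the 500 lower-funnel visitors above is an order of magnitude too small to test on its own, which is exactly why controlling for journey stage is so hard.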
Hence, many times CRO leads us to walk in proverbial circles.
Don’t get me wrong, I’m not saying not to test. We can learn from experiments if we can control enough variables and understand the limitations.
My point here is that CRO does not optimize customer experience. At its worst, it can actually make for poorer experiences; at its best, it simply removes friction from conversion-oriented touchpoints.
We Can Do Better
What we need to do is elevate our craft.
We need to combine a scientific-method approach with design thinking, UX best practices, statistical rigor and a relentless pursuit of improving our customers’ experience across all touchpoints.
We need to make our customer’s goal our goal. We need to build experiments that help us statistically and systematically move toward true empathy and sympathy for our clients. We need to design experiences that educate, entertain, inspire and ultimately empower our customers.
We call this User Experience Optimization (UXO).
Why UXO Is the New CRO
Note the difference. It’s a small, but profound change. We’re not worried about conversions. Conversions are a byproduct of caring about the customer and solving their problems, not ours. What we’re focused on exclusively and relentlessly is the user.
We start by working with strategists to look at quantitative and qualitative data, and with UX and creative teams to apply design and UX thinking to customer touchpoints, so we can holistically look at how to improve the customer experience in a Kaizen-like manner.
After that, we create a hypothesis that can be applied across similar customer touchpoints.
For example, we may hypothesize that certain landing pages are too focused on direct response and may not make potential customers feel comfortable with a high-consideration purchase.
So we build tests and testing strategies that don’t just test our assumptions about a mythical single path or a hypothetical single persona, but also work for the real world, where people shop on a lunch break, or look into a new SaaS technology over coffee. With UXO, it’s not just about improving a conversion by trying to continually smooth the same funnel. It’s about evaluating how the funnel works in the first place, and how it can work differently.
In order to reinvent that wheel, we have to break some traditions. First, we don’t do partner-driven sales processes, where you get sold on a tool and then an agency to go along with it. We talk to you like we’d want you to talk to us: as people with problems to solve. And we can bring an entire full-service agency to help you solve them, even if a testing and optimization program is the linchpin of your needs.
Likewise, we tailor our tools to our strategy and your needs — not the other way around. While budget is always a factor, we know the right tool for the job isn’t always the one that’s already in hand.
This also plays into how we test: carefully, with a focus on up-front usability research and heuristics, not just the seniority of an idea’s originator. And we know that revenue isn’t something we can always attribute, so we talk about it.
Instead of providing numbers based on irrelevant actions that turn into half-truths at the end of the year, we measure and report exactly what we can reliably track and attribute.
But you shouldn’t just test landing pages, websites or mobile apps. Taking an integrated approach is incredibly powerful.
Most social ads get incredibly low CTRs. Display and programmatic aren’t much better.
Hence, if you can iteratively improve your CPM buys, you give your customers better, more relevant ads and can dramatically transform your media effectiveness.
Many of our ads end up outperforming industry averages five- to tenfold. This means the $250k you spent to buy 100 million impressions drove 500k visitors instead of the 100k you would normally drive.
Taking this approach is even more important in our converging online/offline world with limitless interfaces and an always-connected customer.
We have to holistically improve their experience. We have to break our habit of focusing on single interactions and/or single channels when we’re testing. We have to recognize and appreciate how people experience brands in real life.
We also need to understand the value and importance of learning. Learning is more important than any winning test, campaign or conversion. Learning brings us closer to our customers. Learning helps us design experiences that empower customers.
Is the Future UXO?
Maybe UXO will never be a thing outside of the walls of Nebo. We’re just a small agency that’s trying to do good work while fighting the dehumanization of our industry.
But it might be a thing.
More importantly, we believe in this approach. We know it works. Our clients know it works.
Honestly, it’s almost too simple. Maybe I could’ve just said that being radically and relentlessly focused on the customer is our strategy. Regardless of the simplicity, I hope other marketers embrace this approach.