How Tactical Testing Can Be Unhealthy for CRO

There’s nothing wrong with testing, per se. There’s no doubt that it should be part of every content management system (CMS) – not building the ability to optimize into your CMS seems retrograde. Testing, however, is not equal to conversion rate optimization (CRO).

Right now, testing velocity is all the rage. Marketers are primarily concerned with how many tests they can run in parallel and whether they have enough stable, non-seasonal data. There seems to be a notion that more and faster testing is the best measure of CRO progress.

Here’s why it’s unhealthy to focus solely on testing:

The Limitations of Testing – Climbing in a Blizzard

Picture this: You’re trying to climb to the peak of a mountain range in a blizzard. There’s a hail of stuff impinging on you, and you don’t know which way you’re going. So you take a step in the direction that leads steepest upward, and when you get to that spot, you take another step which gives you the most uplift from there.

The next morning, you’re going to see the landscape that you couldn’t see in the blizzard. If you started on the shoulder of a mountain, and during the night you climbed until you couldn’t get any higher, you end up at the top of a little hill.

(Image: local maximum point – mountain analogy)

That blizzard is analytics data – we usually don’t have clean, reliable analytics data. So, we look for improvements and take steps in the direction that leads steepest upward, in the hope that, if we keep doing that, 100% of our visitors will take our desired conversion action.

That’s not how it works, however.

If you focus solely on testing, you’re only going to reach the top of the small hill – your local maximum point.

Where you want to go, however, is the top of the tallest mountain in the area (the global maximum). The problem is, via testing, you can’t get there from where you are, because getting there would first require your performance to get worse. You have to get off that little hill, down into the first Valley of Tears, climb up an intermediate-sized hill, go down the back side of that, and only then can you start ascending to the top.
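The analogy above describes greedy hill climbing, and the trap it falls into can be shown in a few lines of code. This is a minimal sketch on a made-up, illustrative landscape (the function shape, peak positions, and heights are all assumptions, not anything from a real analytics dataset): a climber who only ever takes the step that leads steepest upward stops at the small hill and never reaches the tall one.

```python
def landscape(x: float) -> float:
    """Illustrative terrain: a small hill peaking at x=2 (height 3)
    and a tall mountain peaking at x=8 (height 10)."""
    small_hill = 3 * max(0.0, 1 - abs(x - 2))
    tall_mountain = 10 * max(0.0, 1 - abs(x - 8) / 2)
    return max(small_hill, tall_mountain)

def hill_climb(start: float, step: float = 0.1) -> float:
    """Greedy ascent: always move to the neighboring point with the
    highest value; stop when neither neighbor is an improvement."""
    x = start
    while True:
        best = max((x - step, x, x + step), key=landscape)
        if best == x:          # no step improves things: a (local) maximum
            return x
        x = best

# Start on the shoulder of the small hill.
peak = hill_climb(start=1.0)
print(round(peak, 1), round(landscape(peak), 1))  # → 2.0 3.0
# The global maximum at x=8 (height 10) is unreachable from here,
# because every path toward it first goes downhill.
```

Note the stopping condition: the climber quits as soon as no single step improves the score, which is exactly what happens when a test program only ever ships “winning” variations – it converges on the nearest local maximum, not the best one available.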

Why Tactical Testing is NOT the Answer to Everything

Here are some reasons tactical testing will not get you to CRO nirvana:

  • Testing velocity becomes an end unto itself. You end up focusing on the quantity of tests you can launch simultaneously, rather than the overall impact the tests have on your site.
  • Testing trivial stuff is a crutch. Famously, one of Google’s creative directors quit when they decided to test the RGB value of the Google blue on the call-to-action button. Google has ridiculous data rates – they can reliably, statistically measure anything, but should you be measuring some of those things?
  • Diminishing returns as a page improves and game-changing ideas grow scarce. As your page gets better and your good ideas get used up, eventually there aren’t going to be giant improvements, because your remaining ideas aren’t as interesting or fundamental.
  • Undervalues and often preempts or compromises large-scale redesigns. Testing undervalues all other CRO activities: primary user research, experientially examining every touch point, and re-examining your business model and your organization. All of these things matter, and you can’t change more fundamental things – like the technology platforms you’re on – if you’re only paying attention to the web and on-site experience via testing.
  • Is largely incompatible with major improvements via redesigns, business model changes, or major technology platform implementations. Past some point, you’re not going to make those quantum leaps if you only ever test and improve what you have. Once in a while, you have to stop testing – in the middle of a major site redesign, for example. If you’re plugging in a marketing automation platform or other business intelligence tools that will change your on-site experience, or doing lead scoring to deliver a one-to-one experience based on your site’s whole history of interactions with a person, you can’t get there via testing. You have to invest resources in technology platforms and implementation to get that capability.

What You Should Be Doing Instead

  • Pick flexible testing tools that allow you to test non-trivial ideas.
  • Understand when testing is not appropriate.
  • Focus on profit improvements, not testing velocity. The focus should be on effectiveness, not efficiency – it’s not how many tests you’re running but the outcomes you’re getting from them. If results start flattening out in certain parts of the site, think about other CRO activities that might move the needle faster.
  • Run tactical and strategic testing tracks side by side. Have two separate testing tracks – one for the small, trivial stuff that you’re just pumping out, and another for more fundamental things, like reconfiguring your whole registration experience or checkout flow. Once a quarter, run one of these big strategic tests in a non-overlapping part of the site, with the tactical work going on alongside it.

Split tests and tweaks are important to the overall picture, but those tests are sometimes mistaken for the overall picture. That’s a big mistake – conversion rate optimization is a far richer, more involved process than that. Tactical testing is fine, but the bigger strategic objectives shouldn’t suffer for something as trivial as testing velocity.