
We’re currently designing an A/B test for a client’s product page. It’s the first experiment on the page, and we’ve recommended a number of changes.

The client pointed out that our recommended experiment contains, in effect, several variables. (Or as I called it, a “cluster” of variables.) How then will we know which variable had the greatest impact on the results?

A very good question. And the fact is, we won’t know which variable had the greatest impact.

For a first test, the goal is usually to achieve the greatest possible lift in performance. In most cases, you won't get there by making one isolated change (for example, the wording of the call to action or the color of a button). Usually, there will be a combination of elements under review.

How do you decide what to change? Well, it’s a mixture of art and science. In reviewing a page, you consider established best practices and what you’ve learned from past experience, then hypothesize as to how the page could be made to perform better. Just a few things you might consider:

  • Is there a simple, obvious call to action?
  • Why should the user do as you ask? What’s the payoff?
  • Does the page invoke urgency? Why should the user act now?
  • Are there any unnecessary distractions on the page?
  • Is there anything on the page that might undermine its trustworthiness or make the user hesitate?
  • Does the page communicate effectively with all different personality types? (For example, Humanistic, Competitive, Methodical and Spontaneous personalities?)
  • Are there any particular persuasion tactics that could be employed on the page? (For example, Social Proof, Liking, Authority, Reciprocity, the Contrast Principle… For more ideas, see this post.)

As you can imagine, you can usually spot a whole raft of issues. So on a first test, the redesigns are usually quite dramatic. You’ll have a large cluster of variables to test.

And yes, that means you won’t know which changes had the strongest impact. It’s even possible that some of your changes had a negative impact. From a scientific viewpoint, these experiments aren’t very “clean”.
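To make that concrete, here's a minimal sketch (in Python, with entirely hypothetical conversion numbers) of what a cluster test actually tells you: one aggregate lift and one significance figure for the whole redesign. Nothing in the result attributes the effect to any individual change.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: compares conversion rates of two page versions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: control vs. the redesigned "cluster" variant.
z = two_proportion_z(conv_a=200, n_a=10000, conv_b=260, n_b=10000)
lift = (260 / 10000 - 200 / 10000) / (200 / 10000)   # relative lift: 30%

# z is around 2.8 (significant at the usual 95% level), but it is one number
# for the whole bundle of changes; no per-variable breakdown exists.
```

If the variant wins, follow-up tests that isolate one element at a time are what tell you which change did the work.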

But that’s what follow-up tests are for. You can’t expect to get all your answers from one test.

As I wrote years ago (in discussing when it's advisable to end a test early), I think we should ask ourselves why we run these experiments. Almost always, it's for marketing. The goal is to improve performance, not to advance scientific knowledge. "Cleanliness" comes a distant second to enhancing the bottom line.
