A/B Testing with Google Analytics Content Experiments | Cardinal Path Blog

A/B Testing with Google Analytics Content Experiments

Google Analytics has announced a new A/B testing feature called Content Experiments. This is a significant evolutionary step for Google Analytics, turning it into both an analytics and an optimization tool. Think of it as Google Website Optimizer being baked right into the Google Analytics interface. Using Content Experiments in lieu of GWO will allow you to easily define content URLs and goals for your experiments, analyze your reports more efficiently, and eliminate the need for all those extra GWO tracking codes on your site.

I want to review the basics of A/B testing and running GA Content Experiments, as well as discuss some important technical details and advanced considerations. For existing GWO users it’s very important that you get familiar with GA Content Experiments because the standalone GWO product will be unavailable after August 1, 2012.

What is A/B Testing?

A/B testing takes many forms. Here, we are specifically talking about A/B page testing: you define a control page (page A) and a variation (page B) of that original page to test against. The purpose of the test is to expose your audience to the different versions of a page and determine which version results in more conversions for your site.

A/B testing is very easy (and free) with Google Website Optimizer. Read how we helped YouTube increase signups by 15.7% through conversion optimization.

How to Setup a Content Experiment

[Screenshot: Google Analytics Content Experiments start page]

You can access Content Experiments by logging into your Google Analytics account, opening the profile you want to run an experiment in, and clicking the Standard Reporting tab. In the left menu, click Content, then Experiments, then start experimenting!

The Google documentation for Content Experiments is pretty thorough so I don’t want to duplicate the info here, but I want to briefly describe the Content Experiment process and provide some additional commentary.

  1. Prepare. It’s very important that you identify the business objectives of your site, then set up GA goals that align with those objectives. Next, put some good thought into what to test. Identify poor-performing pages by reviewing your Landing Page reports for pages with high bounce rates, or your Page reports for pages with unusually high exit rates. Then hypothesize about what can be done to improve the performance of those pages. This is what you’ll want to base your experiment on.
  2. Configure & Modify. After you’ve identified your control page and created the variation pages, you’ll need to provide the appropriate URLs in the experiment setup. At this time a single content experiment can support up to 5 variation pages in addition to your control page. You will then configure some additional experiment options (such as identifying the GA goal you’re trying to improve) and grab the experiment code that will need to be placed in the <head> of your original page. Read the detailed instructions on setting up an experiment.
  3. Track Progress and Stop the Experiment. The reporting screen for an experiment shows the visits, conversions, and conversion rate that each experiment page has contributed. An experiment can end in a few different ways: a) GA determines a winning page; b) the experiment expires after running for 3 months; or c) you manually stop the experiment.
  4. #winning. If your experiment is able to identify a winner, congrats! If the winner was a variation page, you may want to consider replacing your control page with the winning version.
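You can also sanity-check the conversion numbers from the reporting screen yourself. The sketch below uses a standard two-proportion z-test; this is purely illustrative and is not GA's internal winner-determination algorithm, and the visit and conversion counts are hypothetical:

```python
from math import sqrt

def conversion_rate(conversions, visits):
    """Simple conversion rate: conversions / visits."""
    return conversions / visits

def two_proportion_z(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-statistic comparing variation B against control A.

    Uses the pooled conversion rate to estimate the standard error of the
    difference between the two observed rates.
    """
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se

# Hypothetical numbers read off an experiment's reporting screen
z = two_proportion_z(conv_a=120, visits_a=4000, conv_b=156, visits_b=4000)
print(round(z, 2))  # |z| > 1.96 suggests a real difference at ~95% confidence
```

If the statistic clears your chosen confidence threshold and your sample size requirements are met, the variation is a plausible winner; otherwise keep the experiment running.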

After you’ve completed an experiment, start the process again with another page and keep trying to make incremental improvements to your site!

Important Things to Consider

[Screenshot: Content Experiments reporting graph]

Be patient. Content Experiments will not choose a winning variation until an experiment has run for at least 2 weeks. This is a good thing. I can’t tell you how often I’ve seen erratic conversion activity in the opening days of a test; conversion trends usually take a few weeks to stabilize. Even if you reach a sufficient visitor sample size on your first day, day-to-day and week-to-week activity varies too much to trust. Just wait a few weeks.

Calculate your sample size estimations. Although you can configure up to 5 variation pages for an experiment, be aware that each additional variation requires more visitors to meet your sample size requirements and will increase the time your experiment needs to run. Use our sample size calculator tool to determine the proper sample sizes for your experiment and how long you can expect a test to run before a winner is determined.
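For a rough feel for the numbers, here is a sketch of a common sample-size approximation (aiming for ~95% confidence and 80% power). This is not necessarily the formula our calculator tool uses, and the baseline rate and target lift below are hypothetical:

```python
from math import ceil

def sample_size_per_page(base_rate, min_relative_lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors needed per page (control or each variation) to
    detect a relative lift of `min_relative_lift` over `base_rate`.

    alpha_z=1.96 corresponds to ~95% confidence (two-sided);
    power_z=0.84 corresponds to ~80% power.
    """
    p1 = base_rate
    p2 = base_rate * (1 + min_relative_lift)
    delta = p2 - p1
    p_avg = (p1 + p2) / 2
    n = 2 * (alpha_z + power_z) ** 2 * p_avg * (1 - p_avg) / delta ** 2
    return ceil(n)

# e.g. a 3% baseline conversion rate, hoping to detect a 20% relative lift
n = sample_size_per_page(0.03, 0.20)
print(n)
```

Note how the required traffic balloons as the lift you want to detect shrinks, and remember the total is this figure multiplied by the number of pages in the experiment (control plus variations) since traffic is split between them.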

Be very careful when running multiple concurrent experiments. The danger is that if visitors from one experiment interact with elements of another experiment running on your site, it’s very difficult to account for these interactive effects. This will skew your data and lead to faulty conclusions. Even though you can set up 12 experiments in a profile, that doesn’t mean you should run 12 experiments at once. Unless you’re absolutely confident in how to analyze multiple concurrent experiments, it’s okay to be safe and run one experiment at a time.

Content Experiment redirects are safe. In Content Experiments A/B testing, when a visitor requests your control page and GA decides to show them a variation page instead, it immediately performs a JavaScript redirect to the variation page. With GWO this commonly resulted in some wacky referrals showing up in your analytics data unless you reconfigured your analytics code. Content Experiments resolves this issue automatically by passing along the ‘utm_referrer’ query string parameter, which tells GA what the true referrer is. And in terms of SEO, there shouldn’t be any concern. Although I believe that search bots are pretty good at processing JavaScript, remember that Google Analytics is a JS- and cookie-based solution, which many bots don’t typically handle very well. In addition, the Google search team is well aware of Google-based tracking/testing codes and will preserve your original pages as they should be.

URL parameters pass through on redirects. If somebody visits your original page via a URL with extra query string parameters (such as GA campaign tags), those parameters will be preserved and visible in the redirected URL.
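To illustrate the behavior described above, here is a sketch in Python of what carrying the original query parameters, plus the ‘utm_referrer’ parameter, onto a variation URL might look like. This is not GA’s actual redirect code, and the URLs are made up:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def build_variation_url(original_url, variation_url, referrer=""):
    """Carry the original page's query parameters over to the variation URL,
    and append utm_referrer so analytics still sees the true referrer.

    Illustrative only -- a sketch of the redirect behavior, not GA's code.
    """
    orig = urlsplit(original_url)
    var = urlsplit(variation_url)
    # Variation's own params first, then the original page's params
    params = parse_qsl(var.query) + parse_qsl(orig.query)
    if referrer:
        params.append(("utm_referrer", referrer))
    return urlunsplit((var.scheme, var.netloc, var.path, urlencode(params), var.fragment))

url = build_variation_url(
    "http://example.com/landing?utm_campaign=spring",
    "http://example.com/landing-b",
    referrer="http://www.google.com/search",
)
print(url)
```

The campaign tag survives the redirect, and the URL-encoded referrer rides along so the analytics data isn’t polluted with self-referrals.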

Handling dynamic content. In the past I’ve always used GWO multivariate experiments to test dynamic templates like product display pages. We don’t have that option with Content Experiments (more on that later). If your dynamic content is served via query string parameters, you can rely on the previously mentioned pass-through of URL parameters to render your pages appropriately. But if your dynamic content is served via permalink-style URLs, there isn’t an easy way to do this currently with Content Experiments. Let’s keep hoping for Content Experiments MV testing in the future!

What About Multivariate Testing?

As of right now, Google Analytics Content Experiments only supports A/B testing, not multivariate testing. With GWO shutting down after August 1, this may (temporarily) eliminate any MV testing with a free Google tool. This is unfortunate, since I’m personally a big fan of MV testing; the extra flexibility of that mode allowed me to test anything regardless of website architecture hurdles. There are plenty of other tools out there that are great for MV testing, but I sure would like MV testing integrated with my Google Analytics profiles. Fingers crossed!

There are really so many opportunities and considerations with Google Analytics Content Experiments. Do you have any questions or opinions about the tool? Leave a comment or let me know on Twitter (@adrianvender).

This entry was posted in Technology, User Experience, Web Analytics. Bookmark the permalink.

Copyright © 2015, All Rights Reserved. Privacy and Copyright Policies.