Today I was reading a white paper on a collaborative study by Psychster Inc., and it reminded me of just why I like the kind of easy testing you get with tools such as GA and GWO. The study asked which ad types are more effective: banner ads, newsletters, corporate profiles with fans and logos, corporate profiles without fans or logos, get widgets, give widgets, and sponsored content. To answer this question the researchers ran a study with respondents from Facebook and Allrecipes. While the test was web based, it was modeled after a pretty standard focus group test: you sit a group of people down, make them interact with something, then ask their opinions. In this case they took 478 people from Allrecipes and 681 from Facebook, showed them a narrated video of someone interacting with an ad mock-up, and then presented them with a survey. The Psychster white paper then compared the following four opinions (I assume the actual study had more options) for each content type:

  1. Would they click links and interact as shown in the video?
  2. Would they buy products from that brand?
  3. Would they recommend the brand to a friend?
  4. Did they see that activity as an advertisement?

The question being asked seems innately quantitative: which is more effective? Which has a greater chance of producing a sale or conversion? The survey approach used here doesn’t really answer that, so much as it tests each user’s considered opinion (ignoring the role that irrational decision making plays in the buying process).

On the internet, though, it’s fairly easy to look at exactly how users are responding to advertising. Even just using GA a single person could set up the following test fairly easily:

  1. Create the seven above ads (alright, this step might take more than one person).
  2. Publish them across however many sites, complete with source, campaign, and medium variables.
  3. Buy a large number of impressions per site.
  4. Set up your conversion funnels.
  5. Create a custom report breaking each medium down by campaign and source, with basic stats (conversion funnel starts and finishes, visitor count, and whatever other stats you think will be useful).
  6. Compare your report, conversion funnels, landing page pageviews, and the number of impressions you received to see:
    • Which ads brought the most page views per impression.
    • Which ads brought the most conversions per impression.
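Step 6 above boils down to simple arithmetic once the numbers are in hand. Here is a minimal sketch of that comparison in Python; every figure below is invented purely for illustration (in practice they would come from your GA custom report and your ad buys), and the ad names just echo the seven types from the study.

```python
# Hypothetical per-ad numbers: (impressions bought, landing-page pageviews,
# conversions). All values are made up for illustration.
ads = {
    "banner":                      (100_000, 1200, 30),
    "newsletter":                  (100_000, 2100, 55),
    "profile with fans/logos":     (100_000, 1800, 48),
    "profile without fans/logos":  (100_000, 1500, 40),
    "get widget":                  (100_000, 2500, 60),
    "give widget":                 (100_000, 2300, 52),
    "sponsored content":           (100_000, 1700, 45),
}

# Compute the two ratios from step 6 for each ad type.
metrics = {
    name: {
        "pageviews_per_impression": pv / imp,
        "conversions_per_impression": conv / imp,
    }
    for name, (imp, pv, conv) in ads.items()
}

# Rank ad types by conversions per impression, best first.
ranked = sorted(metrics,
                key=lambda n: metrics[n]["conversions_per_impression"],
                reverse=True)
for name in ranked:
    m = metrics[name]
    print(f"{name:27s}  pv/imp: {m['pageviews_per_impression']:.4f}  "
          f"conv/imp: {m['conversions_per_impression']:.5f}")
```

Nothing fancier than division and a sort, which is rather the point: once the campaign variables are in place, one person can pull this straight out of a custom report.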

Sure, there is room for error here as well (it’s a pretty basic test), but given some smart analysts/testers I am sure this could become a much more effective testing environment. For instance, one could carefully design a GWO test instead, host each ad on a site that they can edit, and get even more statistically certain information.
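To make "statistically certain" concrete: the kind of check a GWO-style experiment runs under the hood can be approximated with a two-proportion z-test on the conversion rates of two ads. The sketch below uses only the standard library, and the conversion counts are invented for illustration.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z score and two-sided p-value for the difference between two
    conversion rates (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented numbers: 30 conversions vs. 60 conversions on 100k impressions each.
z, p = two_proportion_z(30, 100_000, 60, 100_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value says the gap between the two ads is unlikely to be chance; a large one says you need more impressions before declaring a winner, which is exactly the judgment a survey of stated opinions can never give you.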

Of course, where the Psychster test succeeds is in the offline queries, such as “I would recommend this brand to a friend”. I can’t think of a good way to test this using the kind of quantitative data that web analytics provides (though maybe our readers could leave their ideas in the comments) and perhaps that’s why they chose this method instead of looking at performance data.

That said, for the first query (“I would click…”) and perhaps even the second (“I would buy…”) this can give some pretty strong data, and illustrates just how much research power lies in the hands of anyone with a website.