
Site Tagging Best Practice: Please Define “Best”

The IAB gets its feet wet and attempts to define some standards around site tagging. It’s a welcome move – actually, I’m surprised the Digital Analytics Association didn’t take it on earlier. Then again, the IAB is a mature organization with greater capabilities, so in any case I’m in agreement with eConsultancy and the tag management vendors they interviewed: it’s a good and welcome initiative.

The first step the IAB is undertaking is a “Site Tagging Best Practices” draft, now open for public comment.

Tags, Tags Everywhere

If you are reading this post, you probably know about tags (or beacons, if you prefer that terminology). You also probably feel there’s something wrong with them… and you are right. The IAB offers a brief overview of what kind of little buggers tags can be:

“A tag is a lightweight fragment of code implemented on a website that, when called by the browser, can facilitate real-time transfer of data between the originating site and another party, or may interact with the site layout and content. These transfers make it possible to create a targeted website or provide opportunities to optimize creative messaging for a more personalized user experience. As interactive advertising evolves, so does the proliferation in tags available on websites. The increase in tags has created an increase in operational strain, negative impact to user experience through latency, and increased privacy concerns with unintentional data transfers.”
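To make the “real-time transfer of data” in that definition concrete, here is a minimal sketch of what a tag effectively does when it fires: it serializes page data into query parameters on a third-party URL and requests it. The collector endpoint and parameter names below are hypothetical, for illustration only.

```python
from urllib.parse import urlencode

def build_beacon_url(endpoint, params):
    """Assemble the kind of GET request a tag triggers: page data
    serialized into query parameters on a third-party URL."""
    return endpoint + "?" + urlencode(params)

# Hypothetical collector endpoint and parameters.
url = build_beacon_url(
    "https://collector.example.com/pixel.gif",
    {"page": "/products", "event": "pageview", "visitor": "abc123"},
)
print(url)
```

Everything the third party learns travels in that one request – which is also why an unintended tag means an unintended data transfer.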

I like the little visual history of tagging – although they could have started in the “pre-tags era” with logs (1992 – pretty much day one of the Web). Or right around when the very first ads appeared as early as 1993.

They summarize the top issues stemming from tags:

  • Operational strain: the supposed “Easy! Just one line of JavaScript on every page” pitch was a marketing ploy used by most vendors;
  • Customer experience impact: vendors claim to put “customers first”, yet bad tagging practices often put the site at risk;
  • Privacy is a hot topic these days, and rather than complaining about how most people simply “don’t get it”, it is our responsibility to act ethically and cautiously;
  • Data leakage is another one, as Joseph Lines from QuBit mentions in the eConsultancy article. I have never run into cases where data was inadvertently shared with others, although I have seen agencies managing multiple client profiles under the same web analytics account… a risky practice.

The one that I feel the IAB missed is very important:

  • Data quality: poor tagging practices and the lack of standards make every implementation unique and slightly different from the next. At Cardinal Path we have developed a robust framework for tagging – one of the skills that sets us apart. We make it a requirement to audit every client’s implementation before we do any type of analysis – and trust me, what we uncover is generally messy and nasty. There’s nothing worse than making decisions on the wrong data.
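The simplest form of the kind of audit described above can be sketched in a few lines: scan each page’s HTML for the tracking snippet and flag pages where it is missing or duplicated, two of the most common data-quality problems. This is only an illustration (not Cardinal Path’s actual framework); the snippet pattern and page data are made up.

```python
import re

# Pattern is an assumption: a classic Google Analytics snippet call.
SNIPPET = re.compile(r"ga\('create'")

def audit(pages):
    """Flag pages where the tracking snippet is missing or duplicated."""
    report = {}
    for url, html in pages.items():
        hits = len(SNIPPET.findall(html))
        if hits == 0:
            report[url] = "missing tag"
        elif hits > 1:
            report[url] = "duplicate tag"
    return report

pages = {
    "/home": "<script>ga('create','UA-1','auto')</script>",
    "/about": "<p>no tag here</p>",
    "/cart": "<script>ga('create','UA-1')</script><script>ga('create','UA-2')</script>",
}
print(audit(pages))  # flags /about (missing) and /cart (duplicate)
```

Run across a full crawl of the site, even a check this crude surfaces gaps that would otherwise silently skew the data.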

Tag Deployment Checklist

The proposed Tag Deployment Checklist is a great start, although it focuses strictly on the tagging aspect and avoids the configuration counterpart that is critical to good data quality and a sound analytics practice. At this point, the list is very high-level and generic, and I doubt the IAB will ever be able to provide the hands-on details and tactics needed to accomplish this type of audit efficiently.

Are there pre-existing issues?

  • Take a snapshot of the page before and after a tag configuration.
  • Document any loading errors that are pre-existing issues unrelated to the tag implementation.
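The before/after comparison above reduces to a set difference: errors present after the tag change but not before are the ones attributable to the deployment; the rest are pre-existing issues. A minimal sketch, with made-up error messages:

```python
def new_errors(before, after):
    """Return errors introduced by the deployment: present in the
    'after' snapshot but absent from the 'before' snapshot."""
    return sorted(set(after) - set(before))

before = ["TypeError: menu is undefined"]            # pre-existing
after  = ["TypeError: menu is undefined",
          "ReferenceError: _gaq is not defined"]     # appeared after tagging
print(new_errors(before, after))
```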

Are the intended tags firing?

  • Take inventory of existing and intended tag implementations and carefully assess all use cases and which tags should be active.
  • Using your browser QA tool of choice, confirm all tags are executing as they should. Make sure that there are no tags blocking the site at the top of the page and that requests and redirects are happening correctly.
  • Check for any JavaScript run-time errors identified by the browser tool in the tag load process. If any new errors appear, speak to the vendor and investigate the error type and reason.
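One way to confirm the intended tags are executing, as the steps above describe, is to record the network requests during a page load (for example via a HAR export from the browser’s dev tools) and check that every intended tag host actually fired. A sketch under that assumption – the hostnames below are examples:

```python
from urllib.parse import urlparse

# Intended tag hosts are assumptions for this example.
INTENDED = {"www.google-analytics.com", "collector.example.com"}

def missing_tags(request_urls, intended=INTENDED):
    """Return intended tag hosts that never appeared in the request log."""
    seen = {urlparse(u).hostname for u in request_urls}
    return sorted(intended - seen)

requests = [
    "https://www.google-analytics.com/collect?t=pageview",
    "https://cdn.example.com/app.js",
]
print(missing_tags(requests))  # ['collector.example.com'] did not fire
```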

Is data being collected and passed correctly?

  • Use tag QA software to review the parameters the tag is collecting.
  • Verify with the vendor, or in your account, that the data has been properly collected.
  • Validate the accuracy of data collection with each vendor.
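Parameter-level QA of the kind listed above can be sketched by pulling apart a captured beacon URL and checking that required parameters are present and non-empty. The parameter names below follow the Google Analytics Measurement Protocol (`v`, `tid`, `cid`, `t`), used here purely as an example of a vendor’s required fields:

```python
from urllib.parse import urlparse, parse_qs

# Required parameters per the GA Measurement Protocol (example vendor).
REQUIRED = ["v", "tid", "cid", "t"]

def validate_beacon(url, required=REQUIRED):
    """Return the required parameters that are missing or empty."""
    params = parse_qs(urlparse(url).query)
    return [p for p in required if not params.get(p, [""])[0]]

hit = "https://www.google-analytics.com/collect?v=1&tid=UA-12345-1&cid=555&t=pageview"
print(validate_beacon(hit))  # [] -> nothing missing
```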

Are tags affecting each other?

  • After each tag configuration, confirm tags are not negatively affecting each other.
  • Confirm that any tag dependencies are being met. For example, if an event-tracking tag in an analytics tool calls a library, the library tag must fire first.
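The dependency check in that last bullet can be verified mechanically against the chronological request log from a page load: the library must appear before the tag that needs it. A sketch with illustrative file names:

```python
def fires_before(requests, library, dependent):
    """True if `library` appears in the chronological request log
    before `dependent`; False if the order is wrong or either is absent."""
    try:
        return requests.index(library) < requests.index(dependent)
    except ValueError:
        return False  # one of the two never fired at all

load_order = ["ga.js", "jquery.js", "event-tracking.js"]
print(fires_before(load_order, "ga.js", "event-tracking.js"))  # True
```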

My Take

It’s nice to see the IAB taking on this issue with the help of TMS vendors. I guess independent TMS vendors have a vested interest in demonstrating their value in a market where both Google and Adobe – the two leaders – offer their TMS tools for free. Surprisingly absent from the IAB Data Council are the web analytics vendors themselves – only Google is listed as a member. Where are Adobe, Webtrends and IBM?

Also note how they mention using “your browser QA tool of choice” – frankly, the choices are quite limited if you don’t want to mess around with debuggers and proxies. If anything, the advent of TMSs makes it even harder to conduct sophisticated quality assurance of your implementation. Since I recently took back ownership of WASP – the Web Analytics Solution Profiler I created in 2006 – you can rest assured we are actively working on the next generation of data quality tools. In fact, it is so groundbreaking that several concepts are being considered for patents. The IAB’s emphasis on standards and quality comes at an interesting time!

This entry was posted in Technology, Web Analytics.
  • Todd Belcher

    Thanks for the emphasis on data quality, and for reminding that comments are “due” by Friday! Will be sure to weigh in… looking forward to the next generation of WASP.

  • Sébastien Brodeur

    I don’t see any value for the industry coming from the DAA lately. Their last standard dates from 2008, and the last time I really heard of them was the name change (a long time ago now).

    It seems its sole purpose is to promote eMetrics and the outdated UBC classes. I’m sorry for the people involved in the DAA, but this is my perception.



Copyright © 2014, All Rights Reserved. Privacy and Copyright Policies.