Last week I came across an analytics conundrum.
I had read many times that Average Time on Page (ATOP) and, alternatively, Average Time on Site (ATOS) were good metrics for user engagement. Specifically, knowing how long people spend reading and considering your posts can help you determine whether you are making them think. However, our ATOS and ATOP were low (like, REALLY low) and I didn't understand why.
It had never clicked with me before, but the way ATOP and ATOS are determined creates problems for blogs. Both are calculated from page calls. To illustrate:
p1's timestamp is 0:00 and p2's is 0:01, so the time on p1 must have been 0:01. p3 was loaded at 0:15, so the time on p2 must have been 0:14 (the load time of the next page, minus the load time of the page before it). Sensible, no? Except, what about your last page? With these calculations, the time on the last page is never counted.
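The arithmetic above can be sketched in a few lines of Python. The page names and timestamps are hypothetical; the point is that the subtraction only works between consecutive loads, so the exit page never gets a value:

```python
# Hypothetical page loads for one visit: (page, seconds since the visit began).
page_loads = [("p1", 0), ("p2", 60), ("p3", 900)]

# Time on each page = load time of the NEXT page minus this page's load time.
times_on_page = []
for (page, loaded_at), (_, next_loaded_at) in zip(page_loads, page_loads[1:]):
    times_on_page.append((page, next_loaded_at - loaded_at))

print(times_on_page)  # [('p1', 60), ('p2', 840)]
# Note: p3, the exit page, gets no entry at all --
# there is no later page load to subtract its timestamp from.
```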
Blogs are known for having huge bounce rates. Someone visits a blog, finds a new post, reads it, then leaves. In fact, blog post pages are very frequently exit pages. So on a blog, if a large portion of your traffic enters a page, reads it, then leaves, how do you calculate ATOP or ATOS?
Further, if a regular visitor sees a new post, reads it, then leaves, that last page (the page they spent the most time on) is not going to be counted. That makes ATOS and ATOP a pretty inaccurate sample.
In fact, it seems to me that ATOS and ATOP don't actually measure what their names suggest. ATOP calculates the average time on pages that are not exit pages. ATOS certainly doesn't calculate the average time on site, because it discounts a huge number of pages (every exit page). This creates a host of problems for a blog, as the content you want to measure is often exactly the content that is not being counted.
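To see how badly exit pages skew the average, here is a small sketch with two made-up visits. A bounce contributes zero seconds, and a two-page visit contributes only the gap between its two loads, no matter how long the reader actually stayed on the last page:

```python
# Hypothetical visits: each is a list of page-load timestamps in seconds.
visits = [
    [0],        # visit A: bounces on a post, reads for minutes -> measured 0s
    [0, 30],    # visit B: index, then a post read for minutes -> measured 30s
]

# Time on site per visit = last load timestamp minus the first.
measured = [loads[-1] - loads[0] for loads in visits]
atos = sum(measured) / len(measured)
print(atos)  # 15.0 -- even if both readers spent ten minutes on the post itself
```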
There are no useless metrics
Brian Katz often says that there are no useless metrics, so I thought it might be educational to ask him about this issue:
Brian, am I correct in assuming that time on page is a useless metric for a blog?
Based on a conversation on this side of the office, and this blog post by Avinash Kaushik, I understand that time on page is calculated by taking the time-of-load of one page and comparing it to the time-of-load of the next. In the case of a blog, the most time is going to be spent on the individual articles, but those will also have a huge bounce rate (since people come for an article, read it, and leave). That means two things:
- Your time for each post is going to be a flawed sample (counting only the people who read and then clicked through to the next article, or didn't read and clicked back to the index).
- Your average time on site is going to be TOTALLY off, since the people who landed on a blog post, read it, then left are not going to be counted at all.
Brian, enlighten us please.