
When analyzing online visitor behavior, focus above all on goals that are as concrete as possible and contribute directly to the bottom line of your business.

But to get a complete picture of visitor behavior, you sometimes want to include all available relevant data in your analysis, from different perspectives. In this context, "Time on page" and "Session duration" often come into play: in terms of engagement metrics, only a limited number are available within standard Google Analytics implementations. This applies especially to purely informational and publisher websites, where, due to the lack of hard conversion moments, steering on "engagement" metrics is sometimes the only option.

But quite apart from the fact that "Time on Page" and "Session duration" are typical vanity metrics, there are some considerable snags in how they are measured. The main shortcomings, as well as possible solutions, are discussed in this article.

How is Time on Page measured?

The method of measuring time spent on a website or page in GA has been in use since Google Analytics was still called Urchin, a time when JavaScript capabilities in browsers were far less extensive than they are today.

Simply put, GA calculates the time a visitor spends on a page by comparing the timestamp of the start of that page visit with the timestamp of the next page the user visits. This immediately reveals the biggest flaw of this measurement method: the time a visitor spends on the last page of the visit is not measured.
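To make the principle concrete, here is a simplified sketch with hypothetical timestamps; this only illustrates the calculation and is not GA's actual code:

```javascript
// Simplified illustration of the measurement principle (not GA's actual code).
// Timestamps are hypothetical, in seconds since the start of the session.
var pageviews = [
  { page: '/page-1', timestamp: 0 },
  { page: '/page-2', timestamp: 300 },
  { page: '/page-3', timestamp: 600 } // exit page: there is no "next" pageview
];

pageviews.forEach(function (hit, i) {
  var next = pageviews[i + 1];
  // Time on Page = timestamp of the next pageview minus this pageview's timestamp.
  // For the last page there is no next hit, so no time can be recorded.
  var timeOnPage = next ? next.timestamp - hit.timestamp : null;
  console.log(hit.page, timeOnPage === null ? 'not measured' : timeOnPage + 's');
});
```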

What are the 4 pitfalls?

1. Last page is not measured

Suppose (following an example from Google's support pages) that a visitor views three pages and stays on each of these three pages for 5 minutes. That is a total session length of 15 minutes. But GA sees this differently.


GA calculates a total visit duration of 10 minutes. For the last page, "Page 3" in this case, no time is recorded at all, only a pageview, which would theoretically report a Time on Page of 0:00 for Page 3. This leads to a large bias in the (average) time per page or session as reported in GA, and the bias can also vary considerably by page and visitor segment.

2. Large bias with high exit or bounce rate

For visitors who view only 1 page (bounces), no Time on Page is recorded at all. This is very common on publishers' content websites, for example, and as mentioned earlier, it is precisely these websites that rely more heavily on softer engagement metrics such as Session Duration and Time on Page.

The problem is also disproportionately present on pages with a high exit rate: it may then appear that engagement is low because of a low (distorted) time on page, when in reality visitors had simply reached their "destination", studied that page attentively and then closed it (ergo an exit; no time recorded in GA).

3. Does not take tab-browsing behavior into account

GA only looks at when the page, and the subsequent page, loads. But whether the visitor actually pays attention to the page is an entirely different question.

Nowadays it is commonplace to keep numerous tabs open and switch between them, especially on desktop devices. So it may well be that someone loads a page, "disappears" to other tabs that hold their attention, only comes back to that particular tab after 20 minutes and then continues surfing. In that case, 20 minutes will be reported as "Time on page" in GA (incidentally, if this takes longer than 30 minutes, a new session is started and no "Time on page" is logged in GA at all).

For someone who switches heavily between tabs over a period of 20 minutes, this adds up to many hours of "Time on page" logged across the GA profiles of all the websites visited (it is estimated that ⅔ of all websites have GA implemented).

4. No unambiguous interpretation possible

Leaving aside the measurement problems above: suppose the average "Time on page" and "Session duration" were measured exactly right. Even then, it is dangerous to draw too many conclusions from them, because they can be interpreted in different ways. Does a long "Time on page" mean that the visitor appreciated the content? Or, on the contrary, that the message was difficult to understand and/or that the visitor was confused about the next step?

Is a short "Average Session Duration" in a particular segment disappointing, or does it actually mean that the visitor found their information quickly and easily?

What are solutions & alternatives?

It is sometimes suggested that we should at least correct for the bias caused by bounces and exits, which leads to new metrics like "Average time over non-bounced views" (link). But this is only a stopgap measure: it does not solve the real problem of the missing measurements.

The crux lies in measuring the time on the last visited page. Until recently this was very difficult, because once a visitor leaves a page, all JavaScript processes and HTTP requests from that page are terminated as well. This made it impossible to quickly send a signal to Google Analytics saying "the user has left the page".

Over time, several hacks have emerged to measure time more accurately anyway, usually by sending an event to GA every x seconds (example), making it possible to measure in "buckets" how long visitors stay on all pages.

JavaScript illustration of timed events

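A minimal sketch of such a timed-event hack, assuming the standard analytics.js ga() command queue is loaded on the page, could look like this:

```javascript
// Minimal sketch of the timed-event hack, assuming the standard analytics.js
// ga() command queue is available on the page. The bucket size is arbitrary.
var secondsOnPage = 0;
var bucketSize = 15; // send an event every 15 seconds

setInterval(function () {
  secondsOnPage += bucketSize;
  // Send as a non-interaction event so it does not distort the bounce rate.
  ga('send', 'event', 'Timing', 'time-on-page', String(secondsOnPage), {
    nonInteraction: true
  });
}, bucketSize * 1000);
```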

But this approach is not ideal either: first, the total number of events can add up very quickly if many time-events are sent for each page visited, to the point that you might exceed the limit of 500 hits per session. Moreover, it still does not measure exact time, nor does it solve the pitfall of tab-browsing behavior. Finally, analyzing this event data is also cumbersome.

Accurate measurement of time

Fortunately, the development of JavaScript does not stand still. In particular, the introduction of two new features offers a solution for correctly measuring "Time on page."

  1. With the Page Visibility API, it is first of all possible to see whether a window has focus. If a visitor navigates to another tab and back, this can now be detected in JavaScript. This API is supported by almost all browsers.
  2. With the Beacon API it is then possible to quickly send a signal to an (analytics) server after a page has been closed. This solves the 'no measurement on the last page' problem. This API is also supported by more and more browsers; only Safari & iOS support is still missing, but that probably won't take long. A combined sketch of both APIs follows below.
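As a rough, non-authoritative sketch of how these two APIs can work together: the '/collect-time' endpoint below is a hypothetical collection URL on your own server, not an official GA endpoint.

```javascript
// Rough sketch combining the Page Visibility API and the Beacon API.
// '/collect-time' is a hypothetical endpoint on your own server, not GA's.
var visibleSince = document.visibilityState === 'visible' ? Date.now() : null;
var visibleMs = 0;

document.addEventListener('visibilitychange', function () {
  if (document.visibilityState === 'hidden' && visibleSince !== null) {
    visibleMs += Date.now() - visibleSince; // tab lost focus: pause the clock
    visibleSince = null;
  } else if (document.visibilityState === 'visible') {
    visibleSince = Date.now(); // tab regained focus: resume the clock
  }
});

// 'pagehide' also fires when the tab or browser is closed; the Beacon API
// still delivers the request even though the page is going away.
window.addEventListener('pagehide', function () {
  if (visibleSince !== null) {
    visibleMs += Date.now() - visibleSince;
  }
  navigator.sendBeacon('/collect-time', JSON.stringify({
    page: location.pathname,
    visibleMs: visibleMs
  }));
});
```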

So with these two features it is possible to measure time and attention correctly, as was noted several years ago. Back then, however, browser support was much more limited than it is today.

A library that makes good use of these features, including clear implementation instructions, is the Autotrack GA library, created by Google engineers themselves (but not officially supported by Google). This library offers several plugins, including the "Page Visibility Tracker".
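Based on the library's documented usage pattern, enabling the plugin roughly looks like the snippet below; the exact configuration options (for example, which custom metrics to write to) are described in the plugin's README.

```javascript
// Sketch of enabling the Autotrack "pageVisibilityTracker" plugin, assuming
// analytics.js and autotrack.js are both loaded on the page.
// 'UA-XXXXX-Y' is a placeholder tracking ID.
ga('create', 'UA-XXXXX-Y', 'auto');

// Require the plugin before sending the initial pageview.
ga('require', 'pageVisibilityTracker');

ga('send', 'pageview');
```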


After implementing this plugin, two new metrics are available in Google Analytics: Avg. Page Visible Time (per Session) & Avg. Page Visible Time (per Page). These metrics are essentially accurate for all pages: time is measured only when the page has focus (and is not a background tab), and time is also measured on the last page of the visit, according to a comment from the plugin's creator.


To get a sense of how much the actual "Page Visible Time" differs from the default time as reported by GA, I installed the plugin on a streetview website of mine that is an SPA (single-page application). In other words, a website where all visitors see only 1 page: the home page is the entire website.

In this particular case, we see that users spend on average much more time on my website (4:06) than the GA average (2:25) suggests. The probable cause is that for a lot of visitors "0" was previously measured as time, because they visited only 1 page.

Comparison with GA Session Time (second column) and Autotrack Plugin Page Visible Time (third column)


Alternative engagement metric: Scroll depth

One of the best-known metrics for tracking engagement on articles, namely measuring how far people scroll down, cannot go unmentioned here either. Countless GA plugins have been created over the years that measure this in different ways: for example, by firing events at specific breakpoints (25, 50, 75%) (link), or with the Autotrack plugin, where a custom metric stores the maximum scroll depth. And with the advent of the Google Tag Manager scroll depth trigger, setting up such a measurement within GTM has become much easier still.
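As an illustration of the breakpoint approach, a minimal sketch (again assuming analytics.js is loaded on the page) might look like this:

```javascript
// Minimal sketch of breakpoint-based scroll tracking (25/50/75/100%),
// assuming the analytics.js ga() command queue is loaded on the page.
var breakpoints = [25, 50, 75, 100];
var reached = {};

window.addEventListener('scroll', function () {
  var scrolledPct = (window.scrollY + window.innerHeight) /
                    document.documentElement.scrollHeight * 100;

  breakpoints.forEach(function (bp) {
    if (scrolledPct >= bp && !reached[bp]) {
      reached[bp] = true; // fire each breakpoint only once per pageview
      ga('send', 'event', 'Scroll Depth', 'percentage', bp + '%', {
        nonInteraction: true
      });
    }
  });
});
```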

In conclusion

It disappoints me a bit that Google Analytics is not taking the lead itself in adding new engagement metrics such as scroll rate and better time measurement, because judging from the large number of articles and questions about engagement metrics, the need is definitely there. And although Google Analytics in a broad sense is certainly not standing still in terms of further development, the basic metrics and interface within GA are, certainly the ones used most by occasional GA users and/or smaller organizations.

On the other hand, at least for the tinkerers among the analysts, plenty of fun challenges remain!

 

This article was published on April 12 at Webanalists.com