The fallacy of website ‘engagement’ KPIs

Posted by ClarityDX

Date posted 2nd Nov 2020

Category Marketing

Nearly every website project brief we receive lists engagement-related metrics as objectives or KPIs that will be used to decide whether the project has been a success.

But the reality is that these engagement-related metrics are often the wrong things to measure. A lower bounce rate doesn’t necessarily mean a more successful website, and average session duration can be just as misleading.

Here’s why you should think carefully about them before putting them into a website brief.

“We want to lower our bounce rate”

Bounce rate measures the proportion of visitors who land on a website page and then leave again without visiting any other pages. A higher number means more people arriving and leaving straight away, and a lower number is often taken by marketers to mean a website is performing better.
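
As a rough illustration of the arithmetic (the session data below is invented for the example, not pulled from any analytics tool), bounce rate is simply single-page sessions divided by total sessions:

```python
# Minimal sketch of how a bounce rate is calculated.
# The session data is made up; in practice it would come from your analytics platform.
sessions = [
    {"pages_viewed": 1},  # landed on one page and left: a bounce
    {"pages_viewed": 1},
    {"pages_viewed": 3},
    {"pages_viewed": 1},
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")  # 75% - high, but not necessarily bad
```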

But in certain situations, someone might come to a site, get what they need and then leave again. In fact, you could even argue that if one page gives the visitor everything they need, and they don’t have to click through to any other pages, then the job is done. In this case, a higher bounce rate could suggest a more successful website, not a less successful one.

A great example of this would be the BBC Weather landing page. What do you think the bounce rate is? I don’t actually know the answer, but could hazard a reasonable guess that it’s in the 90%+ range.

The most common journey is probably that users Google ‘BBC weather’, click on the page, get the weather information they need from this one page, and then leave again. I.e. they bounce.

Does this mean the website is not a success? No, quite the opposite.

The point here is that bounce rates are nuanced, and measuring them has to be done in the context of the overall customer journey the visitor is on and what they are looking for at that stage.

“We want a higher time on site”

Another so-called ‘engagement metric’ that comes up often in the briefs we receive is time on site, average session duration, or other time-related measures.
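
For reference, the calculation behind this metric is straightforward (the durations below are invented purely for illustration; your analytics tool works this out for you):

```python
# Average session duration is total time across sessions divided by the number
# of sessions. Durations here are hypothetical, in seconds.
session_durations = [30, 300, 45, 15, 120]

avg_session_duration = sum(session_durations) / len(session_durations)
print(f"Average session duration: {avg_session_duration:.0f} seconds")  # 102 seconds
```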

Marketers need to reframe what success looks like when it comes to how much time a user spends on their website.

Sure, if you’re a publisher running a website such as the Evening Standard, then increasing time on site will be a priority: the more time people spend on your website, the more money you make.

But if you’re a marketer in the B2B space, your currency is leads, pipeline and revenue. If a user spends 30 seconds on your website or 5 minutes on your website, there isn’t necessarily a direct link to the numbers that really matter.

Once again it could be argued that if you get your visitors to the information that they need and want as quickly as possible, then you should be aiming for your ‘time on site’ to come down, not to go up. If it goes up, it could be a bad thing – an indication that users are actually struggling to find what they need.

We have the numbers to back this up. We’ve launched numerous new website projects where bounce rates and session duration haven’t changed much, or have moved in the opposite direction to what our clients might have expected. But lead generation has gone up, and for our B2B technology and B2B services clients, that’s what matters most.

“We want pages per session to increase”

This one is really similar to the point above around time on site. A user viewing more pages on your website isn’t necessarily a good thing.

Why is forcing your users to read more content and click through more pages a good thing? Surely we should be optimising for the opposite: frictionless digital experiences that get the user to what they want as fast as possible, and therefore in the fewest pages per session possible.

Again, if you’re a publisher running an online magazine then pages per session is worth measuring. Ultimately your revenue is directly tied to users reading more content.

But in lots of other environments, pages per session is frankly a useless metric: not just irrelevant, but one that could send you in completely the wrong direction.

What about engagement when it comes to SEO?

It’s because of the points above that the argument that Google uses engagement metrics as part of its SEO rankings doesn’t really add up.

Engagement-related metrics differ significantly across websites, industries, and digital journeys. A weather site is different from a magazine, which is different from an ecommerce site, which is different from a B2B site.

So engagement metrics are murky at the best of times, and can’t be relied upon algorithmically to decide whether a website is successful or not and therefore how it should be ranked. Too many SEOs make blanket statements like “you need to improve your bounce rates to improve your rankings”. The reality is much more nuanced than this, and in most cases this statement is not correct. We’ve demonstrated above that a high bounce rate is not necessarily a bad thing, so why would Google rank a site poorly because of it?

So what should we measure?

The point here is not that these metrics should be ignored entirely, just that some careful thought needs to be put into when and how they are considered.

If you’re an online magazine aiming to build an engaged audience and increase your ad revenues, then these engagement measures are very important for you.

But if you’re a B2B company looking to increase awareness and drive lead generation, they might not be relevant at all. More useful will be things such as conversion rates, organic search visibility and content popularity.
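
As a simple sketch of what that reframing looks like in practice (the figures below are invented for illustration, not taken from any real project), the KPI worth tracking is the one that links sessions directly to leads:

```python
# Hypothetical monthly figures for a B2B website - the values are made up;
# the point is which number gets treated as the KPI.
sessions = 12_000            # total sessions in the month
leads = 180                  # form fills, demo requests, etc.
avg_time_on_site_secs = 48   # interesting context, but not the KPI

conversion_rate = leads / sessions
print(f"Lead conversion rate: {conversion_rate:.2%}")  # 1.50% - the number that ties back to pipeline
```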

To wrap up, digital KPIs used to assess website performance need to clearly link back to wider commercial objectives, and so it’s worth considering how relevant certain engagement metrics really are to you before you throw them into your next website brief by default.
