7 Questions to Improve Your Data-Based Web Analytics Decisions
How to make sure you reach the right data-based decision in 7 easy steps.
The following is a guest post by Claudiu Murariu, Founder of InnerTrends, author of The Experiment.
Picture the scene: It’s 10am on a Monday and you’re sitting in a meeting room surrounded by tired-looking colleagues, all hoping the speaker will hurry up already because it’s time for some coffee. Someone proposes a solution based on the latest data and you can feel the collective sense of relief that finally, a decision can be made. Coffee time?
Not so fast!
As a web analytics consultant for both large and small companies, I’ve been in many meetings like this. The client makes a claim based on web analytics data and wants to jump right into action and use it to make a business decision.
But here’s the catch: often the data isn’t telling them the story they think it is, and they’re actually jumping into a decision that may cost them dearly. That’s why I take my role of challenging them very seriously, so we can end the meeting confident that we are on the right path.
Based on the scientific approach to data (after all, web analytics is science) and the Baloney Detection Kit of skeptic Michael Shermer, the following questions help us move forward with certainty:
1. How reliable is the source of the claim?
The first thing you want to check when you see data that stands out is whether the tracking that collected it was implemented correctly. Incorrect implementation is the most common culprit behind spikes in web analytics data. Reading the data before checking how it got there is a total waste of time and effort!
True story: The marketing expert in charge of the AdWords campaigns for a very big online store decided to go ahead with a project to improve the site’s performance from paid advertising. After a full week of work he deployed the new campaign on a Monday.
At the end of the week he was called to present the results to the top management. Tens of thousands of dollars had been spent with the best results ever. There was an increase in conversion rate, but more importantly almost zero bounce rate.
He received a big bonus and was made employee of the month. Two months later, at a web analytics training I gave to the company, this example was given to me to illustrate how skilled they were at optimizing their AdWords campaigns.
If you’ve worked in web analytics for as long as I have, you know that there’s no such thing as a zero bounce rate, especially when there’s a lot of traffic involved. I ran a quick test in front of everybody, showing them that every landing page used in the advertising campaign had event tracking that fired the moment the page loaded, which automatically drove the reported bounce rate to zero.
The event tracking had been pushed to the website by another department just a few hours before the Google AdWords campaign was deployed.
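For anyone who wants to see the mechanics, here is a minimal sketch using Google Analytics’ analytics.js (the `ga` queue and the event names here are illustrative assumptions, not the company’s actual setup). An event sent on page load counts as an interaction hit, so single-page sessions stop registering as bounces unless the event is flagged as non-interaction:

```typescript
// Sketch: why an event fired on page load wipes out bounce rate in analytics.js.
// `ga` is the command queue created by the standard Google Analytics snippet.
declare function ga(...args: unknown[]): void;

// Fired as soon as the page loads. Because an event is an interaction hit,
// every single-page session now counts as a non-bounce, and the reported
// bounce rate collapses toward zero.
ga('send', 'event', 'engagement', 'page-loaded');

// The fix: mark automatic events as non-interaction hits so they leave
// the bounce rate calculation untouched.
ga('send', 'event', 'engagement', 'page-loaded', { nonInteraction: true });
```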
You can imagine how everybody felt at that meeting, especially the AdWords guy. Luckily, his campaign really was an improvement, but now the company had the accurate supporting data.
2. Have the claims been verified by someone else?
All aspects of any online business are strongly interconnected. When a specific department of a business is impacted by an event, its effects will be felt elsewhere too.
For example, if you have a very successful social media campaign, the email marketing department should feel it as well. Searches for your brand name in Google should also increase. Marketing is a mix of all of these different parts, not a lot of independent units each going about its own business.
Whenever you see a story in your data, take a step back and ask yourself, who else should be seeing results from this? Find that person and try to validate your findings.
True story: The CEO of a major local brand selling photo equipment sent an email to the marketing department telling them that he planned to cut the affiliate budget because they were spending a lot on it, and based on Google Analytics reporting, it only generated $200,000 worth of sales. The marketing department replied in seconds saying that they definitely needed to keep affiliate marketing because, based on the affiliate platform reports, it generated $1.5 million in sales.
I offered to help them investigate what was really happening. I found that affiliate marketing generated $200,000 in sales from click-and-buy behavior, based on last-click attribution, and assisted in sales totaling more than $1.5 million.
After helping them decide on a specific attribution model, we applied this model to the data and got them a more accurate picture. They decided to keep affiliate marketing.
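To make the disagreement concrete, here is a rough sketch of how the same purchase journeys get credited under last-click attribution versus an “assisted” view. The journeys below are invented purely for illustration:

```typescript
// Sketch: the same conversions, credited two different ways.
// Each journey lists the channels a buyer touched before purchasing.
type Journey = { touches: string[]; revenue: number };

const journeys: Journey[] = [
  { touches: ['affiliate', 'email', 'direct'], revenue: 500 },
  { touches: ['affiliate', 'organic'], revenue: 300 },
  { touches: ['affiliate'], revenue: 200 },
];

// Last-click: all credit goes to the final touch (roughly what the
// Google Analytics report showed the CEO).
function lastClick(js: Journey[]): Map<string, number> {
  const credit = new Map<string, number>();
  for (const j of js) {
    const last = j.touches[j.touches.length - 1];
    credit.set(last, (credit.get(last) ?? 0) + j.revenue);
  }
  return credit;
}

// Assisted: credit every journey the channel appears anywhere in
// (closer to what the affiliate platform reported).
function assisted(js: Journey[], channel: string): number {
  return js
    .filter((j) => j.touches.includes(channel))
    .reduce((sum, j) => sum + j.revenue, 0);
}

console.log(lastClick(journeys).get('affiliate')); // 200: last-click credit only
console.log(assisted(journeys, 'affiliate'));      // 1000: it touched every sale
```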
3. What else can be explained with the data you get?
This is one of my favorite questions by far, because the same data can tell many stories. It’s important to surface each of those stories so that you can challenge them directly and find out which one is the most conclusive.
When you see that the time spent on your website by users increases, it doesn’t necessarily mean that users decided to look at a specific product in more detail. It might mean that they didn’t find what they were looking for. The same goes for increases or decreases in the pages per visit rate.
True story: An eBay-like store was getting a lot of traffic to its expired products. Even more surprisingly, the conversion rate of that traffic was higher than the conversion rate of the people who visited available products.
That puzzled the site manager, whose first thought was that they should make their active products more like the expired ones. That didn’t seem like the right conclusion, but they didn’t know how else to explain the data.
Upon further inspection, they found that the category of expired products getting the most views was ‘antiques.’ When they then looked at the conversion rate for the active ‘antiques’ products they found it to be significantly higher than average.
The takeaway was that actually, the expired products didn’t have a better conversion rate than active ones, although it certainly looked that way at first glance.
4. Are the other metrics telling the same story?
After you have found a story that stands up to closer inspection, the next step is to make sure that all other relevant data is consistent with the story you have.
The metrics you measure for your website visitors are interconnected. When one metric is affected, you can bet that other metrics will be affected as well. What does it mean when pages per visit increases but time spent on the website doesn’t? What about the other way around?
True story: Before becoming a consultant I used to work in web analytics for a great startup, reporting to the marketing manager. One day as he was going through the Google Analytics reports he discovered a bounce rate of over 90% on the pricing page.
An urgent meeting was called and the whole department had to drop everything else and work together to change the pricing page so that the bounce rate dropped. The designer was tasked with creating a new layout, the webmaster had to change the registration form on the page and a budget was allocated for remote user testing after the changes were done.
I decided to dig into the data a bit more because the extremely high percentage did not sound right, and I was right to do so. After checking the conversion rate of the page I found it was far higher than 10%. How was it possible to have more people converting and bouncing than people visiting the page?!
The bounce rate was over 90% specifically for the people who arrived at the website directly on that page, but that accounted for only a few dozen visitors. Thousands of people reached the pricing page via other pages of the website, and they were not included in the bounce rate calculation at all.
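The definition is easy to forget: bounce rate is computed only over sessions that enter the site on that page. A rough sketch with invented numbers shows how a page can report a 90%+ bounce rate while thousands of its visitors are never counted:

```typescript
// Sketch: bounce rate only counts sessions that ENTER on the page.
type Session = { enteredOnPricing: boolean; pagesViewed: number };

function pricingBounceRate(sessions: Session[]): number {
  // Only sessions whose first page was the pricing page are eligible.
  const entrances = sessions.filter((s) => s.enteredOnPricing);
  if (entrances.length === 0) return 0;
  const bounces = entrances.filter((s) => s.pagesViewed === 1).length;
  return (bounces / entrances.length) * 100;
}

// A few dozen direct entrances, most of them bouncing...
const directEntrances: Session[] = Array.from({ length: 40 }, (_, i) => ({
  enteredOnPricing: true,
  pagesViewed: i < 37 ? 1 : 3, // 37 of 40 bounce
}));

// ...while thousands of visitors reach pricing from other pages and
// never enter the bounce rate calculation at all.
const internalVisitors: Session[] = Array.from({ length: 5000 }, () => ({
  enteredOnPricing: false,
  pagesViewed: 4,
}));

console.log(pricingBounceRate([...directEntrances, ...internalVisitors])); // 92.5
```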
In most cases, a high bounce rate should have an obvious impact on conversion rates as well. Don’t just look at one metric; make sure the other metrics tell the same story.
5. Have you allowed someone to challenge your conclusion?
Probably the easiest way to validate a story based on data, or a decision that you want to make, is to allow someone else to challenge it and try to disprove it.
If they fail to disprove your conclusion, it is worth going ahead and digging into it a bit more; that doesn’t always mean it is right. If they do manage to disprove it, you have most likely gone in the wrong direction and can go back to the drawing board.
True story: A customer was checking a conversion funnel and saw a very low conversion rate for the first step of the funnel.
Even though most funnels I’ve worked with have the highest abandonment on the first step, and things looked to be right, I still like to challenge each and every piece of data presented to me.
My question to the customer was: “Are all the people that get to the first step getting the option to go to the second step?” The customer paused for a few seconds and told me “I need to go and fix this.” There are many scenarios where people don’t necessarily have the option to clearly go to the second step.
Sometimes it just takes someone outside of your situation to take another view of the data and save you from making wrong decisions or wasting resources.
6. Is this how websites work?
When you’ve been in business online for a little while you start to get a feeling for how things work. You know there are no websites with 100% conversion rates or 100% returning (or new) visitors.
I don’t remember seeing the ten commandments of how the internet works, but I’m pretty sure one of them might be “You can’t have more people entering your website than exiting it.” Imagine having people locked in your website forever so that they never leave!
True story: An advertising agency was preparing a report for one of their customers on how to improve their marketing strategy and the agency wanted to present it to me before sending it to their customer.
The report focused on optimizing paid campaigns and showed that one of the campaigns, although well targeted from a keyword perspective, had almost a 0% conversion rate while others were closer to 3%.
The agency was recommending that the campaign be closed, but that made no sense to me, especially because the targeting looked to be really well done. A 0% conversion rate for a site with otherwise good conversions, on what looked to be a well-defined campaign, is just not how the web works.
I tried to find another explanation for the 0% conversion rate, and it turned out that the conversions from this campaign were being attributed to a different campaign due to the UTM variables placed on the landing page of that campaign.
The campaigns defined in those UTM variables had a conversion rate above 3%. Not too shabby!
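Mechanically, Google Analytics reads campaign attribution from the utm_ parameters on the URL a visitor lands on, and a later set of utm_ parameters re-attributes the session. A minimal sketch (the URLs and campaign names are invented):

```typescript
// Sketch: the last set of utm_ parameters a visitor hits wins the credit.
function campaignFrom(url: string): string | null {
  return new URL(url).searchParams.get('utm_campaign');
}

// The ad's destination URL is correctly tagged for campaign A...
const adClick = 'https://shop.example.com/cameras?utm_campaign=campaign-a';

// ...but the landing page links onward with its own utm_ tags, so the
// session is re-attributed to campaign B before the visitor converts.
const internalLink = 'https://shop.example.com/checkout?utm_campaign=campaign-b';

console.log(campaignFrom(adClick));      // "campaign-a"
console.log(campaignFrom(internalLink)); // "campaign-b" gets the conversion
```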
7. Are you taking it personally?
This is maybe the most difficult question to ask yourself, which is exactly why you have to ask it. With the amount of data available in any web analytics tool, it’s very easy to cherry-pick only the data that tells the story you want to hear. You have to always be a skeptic.
True story: I worked with one of my preferred clients for almost a year to push the organization towards data-driven decisions. Most of the decision makers had been with the company since the very beginning, when their intuition played a big role in the company’s growth, but intuition alone was no longer enough.
After deploying advanced web analytics tracking for pretty much every metric they asked for, we got to the point where everybody needed to present data for any change they wanted to apply to the platform.
An A/B test was proposed for the algorithm behind the listing of the products inside the website. We selected the target audience together along with the success metrics, and the A/B test was implemented.
Two weeks later the results were presented, but both the audience and the success metrics had been changed, altering the winner of the split test.
The problem was that the control version, the original implementation being challenged, had actually performed best. But the person who proposed the A/B test did not want his version to look worse, so he searched for the data that made it appear better, causing his version to win.
He didn’t set out to lie or deceive others. He was taking things too personally and could not accept that a version he suggested was not an improvement so instead he only used the data that showed it in a better light.
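One practical safeguard is to lock the audience and the success metric in code before the test launches, so the readout cannot be re-cut afterwards. Here is a rough sketch of evaluating a split test against the pre-agreed conversion metric using a two-proportion z-test (all counts are invented):

```typescript
// Sketch: judge a split test only on the metric agreed before launch.
// A standard two-proportion z-test; |z| > 1.96 is significant at the 5% level.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// Control vs. the proposed listing algorithm, on the audience and
// conversion definition fixed before the test started:
const z = zTest(520, 10_000, 450, 10_000);
console.log(z.toFixed(2)); // about -2.30: the control wins on the agreed metric
```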
In Summary
It’s always better to make the right decision than to just listen to the story you want to hear, no matter whose baby it is. And asking these seven questions should help you find the best solution.