Across all analytical marketing professions, data integrity is guarded like money in a vault. A major advantage of web analytics over traditional market research methods is the ability to collect and analyze all of the data, not just a sample.
Do you remember the exit polls of the 2000 Presidential election, when myopic sampling methods caused a major fiasco and TV networks prematurely declared Bush the winner of the presidency? If only pollsters could have polled every person voting in every county, not just samples, could the whole controversy have been avoided? How can 8,132 voters accurately represent a nation of 280 million? As pollsters learned, they can’t. This is the strength of web analytics: data collection across an entire population.
But what happens when web analytics don’t collect data of an entire population?
A Coming of Age Story
During the X Change show, I had the chance to talk to lots of analytics pros who work in the trenches every day. One story in particular caught my attention, and, it seemed, that of everyone around me.
It’s the story of a young analyst who worked as a consultant at Omniture for several years. He recently hit the jackpot: he landed a job heading up analytics for a rapidly growing SaaS company. When the time finally came to give his first big presentation to the executive team, he was disappointed to find that some of the data he wanted to present looked anomalous. For some reason, data on the entire population hadn’t been collected.
Without the time or resources to track down the cause of the wonky data before the presentation, the young analyst resorted to an asterisk on the PowerPoint slide. It was a heartbreaking story of a once bright and shiny young analyst learning about the real world.
Welcome to the Real World
We’ve all experienced what this young analyst felt. It’s like learning that Santa Claus isn’t real; it’s innocence lost; the kind of stuff that makes nice people become bitter, cynical, and jaded. It’s having to undermine your data with an asterisk because somebody else messed it up!
The problem is that analytics isn’t front and center in the requirements or processes of most companies. All too often, the analyst is left out of the conversation. When that happens, who is looking out for the data? It isn’t the content team, it’s not the director of marketing, and it definitely isn’t IT. How can an organization make data-driven decisions without the data?
Fact: When the right JS isn’t on the most important pages, your organization doesn’t get the data it needs to get ahead – or even to just stay in the game. Lost data causes the best-designed strategies to become little more than high-minded wishful thinking.
What you see in your analytics suite is skewed data. If you recognize it as skewed, that’s a lucky catch. But I’ve found that in so many cases data just isn’t being collected, and what’s worse, it’s really difficult to see that it’s not being collected: a classic case of the unknown unknown. Even when data is collected, it’s sometimes useless because the JS isn’t configured (no page names, for example). The result: an asterisk on your PowerPoint slide.
Bottom line: tracking down individual pages with individual data collection problems, in an environment where you don’t know what you don’t know, is like trying to find a needle in a haystack on the surface of Mars with the rover!
Improving Data Through Planning and Process
The adoption of some simple (and minimally invasive) practices can go a long way toward improving your data. The assistance of a few purpose-built tools can actually prove that the data collected is complete, valid, and actionable. Personally, I’m a fan of this three-pronged approach: development QA, quarterly audits, and continuous monitoring.
- After pages are built in dev, use a tool to audit the analytics implementation on each page.
- After the pages move to staging servers, every page should get a complete “pre-flight test”: run the auditing tool in live mode so that server calls are actually made, then verify in the analytics solution that those calls succeeded.
- Audit new pages after they are pushed out to production to ensure the implementation is still intact and ready for prime time.
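As a concrete illustration of that “pre-flight test,” the query string of each analytics server call can be parsed and checked for required variables. The beacon format below (a SiteCatalyst-style `/b/ss/` request carrying `pageName` and `g` parameters) and the required-variable list are assumptions for illustration only; adapt them to your own implementation.

```python
from urllib.parse import urlparse, parse_qs

# Variables we expect every tracking call to carry.
# This list is an assumption for illustration -- adjust it to your implementation.
REQUIRED_VARS = ["pageName", "g"]  # "g" = current page URL in SiteCatalyst-style beacons

def check_server_call(beacon_url):
    """Return the list of required variables missing from a tracking beacon URL."""
    query = parse_qs(urlparse(beacon_url).query)
    return [var for var in REQUIRED_VARS if var not in query]

# Example: a beacon that never sets pageName -- the classic "no page names" problem.
beacon = "http://metrics.example.com/b/ss/rsid/1/H.1?g=http%3A%2F%2Fexample.com%2Fhome"
print(check_server_call(beacon))  # -> ['pageName']
```

A check like this, run against every call captured during the staging pass, turns “the data looks wonky” into a concrete list of pages and missing variables.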
Every page on the entire site should be audited quarterly to confirm analytics are working correctly, to identify any problems, and to show complete implementation for reporting purposes. Specific items to watch for are:
a. All pages are tagged
b. JS File is linked and working
c. Pixels are loading
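A minimal sketch of those three quarterly checks, applied to a single page’s HTML. The patterns it looks for (an `s_code.js` include, an `s.pageName` assignment, and a `/b/ss/` image pixel) assume an Omniture/SiteCatalyst-style implementation and are illustrative, not definitive.

```python
import re

def audit_page(html):
    """Check a page's HTML for the three quarterly-audit items."""
    return {
        "js_file_linked": bool(re.search(r'<script[^>]+src=["\'][^"\']*s_code\.js', html, re.I)),
        "page_name_set":  bool(re.search(r's\.pageName\s*=', html)),
        "pixel_present":  bool(re.search(r'<img[^>]+src=["\'][^"\']*/b/ss/', html, re.I)),
    }

page = """
<html><head>
  <script src="/js/s_code.js"></script>
</head><body>
  <script>s.pageName = "home";</script>
  <noscript><img src="http://metrics.example.com/b/ss/rsid/1" height="1" width="1"></noscript>
</body></html>
"""
print(audit_page(page))  # all three checks pass for this page
```

Run over a full crawl of the site, the pages where any check comes back False become the quarterly fix list.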
Landing pages, checkout process, unsubscribe forms, and other mission-critical pages or multi-step processes should be continuously monitored to ensure the pixels always fire. This prevents long periods of data loss.
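One way to sketch that continuous monitoring: a small checker that fetches each mission-critical URL on a schedule and alerts the moment the pixel stops appearing. The URLs and the pixel pattern below are hypothetical; the fetch and alert functions are injected so the skeleton stays testable.

```python
import re

CRITICAL_URLS = [  # hypothetical mission-critical pages
    "http://example.com/landing",
    "http://example.com/checkout/step1",
]

def monitor_once(fetch, alert, urls=CRITICAL_URLS):
    """Fetch each critical page once; alert on any page whose pixel is missing.
    `fetch(url) -> html` and `alert(url)` are injected (e.g. urllib + email)."""
    failures = []
    for url in urls:
        html = fetch(url)
        if not re.search(r"/b/ss/", html):  # SiteCatalyst-style pixel path (assumption)
            alert(url)
            failures.append(url)
    return failures

# In production this would run from cron every few minutes with a real HTTP
# fetcher; here a stub shows the checkout page losing its pixel in a deploy.
pages = {
    "http://example.com/landing": '<img src="http://metrics.example.com/b/ss/rsid/1">',
    "http://example.com/checkout/step1": "<html>pixel removed in last deploy</html>",
}
print(monitor_once(pages.get, print))  # -> ['http://example.com/checkout/step1']
```

The point of the injection is the schedule, not the check itself: a quarterly audit catches a broken checkout pixel months late, while a monitor running every few minutes bounds the data loss to minutes.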
With some forethought and the right tools, you can make huge strides toward ousting the asterisk. Processes such as the ones I’ve discussed here can remove much of the ambiguity and doubt in analytics data. By confiscating the weapons that management uses to attack analytics recommendations, you can help your organization make use of the insights you find:
- Present metrics and significant data associations as the facts they are, not just hypotheses or observational data.
- Demonstrate the value of web analytics to management, advertisers, and publishers.
- Earn upper-management and executive action based on web analytics observations and inferences.
- Earn funding for marketing initiatives.
John Pestana is a co-founder of Omniture, and has recently co-founded ObservePoint, makers of web analytics tag auditing and monitoring software. More information on tag auditing and pixel monitoring tools can be found at http://www.OustTheAsterisk.com and http://www.ObservePoint.com
Comments on “Data Integrity: the fundamental characteristic of actionable analytics”
Interesting article, as we’ve been searching for ways to audit our tagging. The more we look in depth at our data, the more gaps we find because of browser incompatibility with web analytics, wrong or missing tagging,…