Written by Robin O’Sullivan originally published on the Newsweaver blog, September 22, 2016.
Data analysis can provide professionals in many walks of life, from sports to communications, with insights into their performance. However, without a focus on the bigger picture (your internal communication goals) the data may not be providing useful information.
The English Premiership has returned for its nine-month carousel of craziness, with the usual fanfare, hyperbole, razzamatazz and overexposure all present and correct.
Away from the showbiz side of the circus though, one of the more interesting trends is the use of metrics to analyze performance. It is becoming more pronounced season on season, and indeed is emerging now in most sports.
Metrics have always been important in some sports, particularly in the US, where both American Football and Baseball are a data analyst’s dream. But it is only in recent years, similarly to the field of internal communications (IC), that metrics and data analysis have really gained a position of prominence in analyzing a game of football.
When watching match analysis today, the average football fan is now exposed to player heat maps and individual player metrics, as well as team metrics.
One match from round two of the Premiership offers a great example that IC can learn from: it perfectly shows the worth of examining metrics both in an overall context and in detail.
Recently Burnley, a team just promoted to the Premier League, played Liverpool. Liverpool was one of the dominant teams of the previous century, and is still regarded as a solid Premiership side, despite not having won the league for 26 years and counting.
Liverpool were favorites to win the game, having strengthened their squad over the summer, and having had a good result in their first game of the season.
If we were to look at the metrics from the match, without knowing the final score, I would wager that most people would put their money down on Liverpool winning the game comfortably. That’s a logical presumption looking at the stats table. They had 80% possession. They had an impressive 26 attempts on goal, whereas Burnley mustered just three. They had 12 corners to Burnley’s one. And Burnley committed 14 fouls to Liverpool’s five, indicating that they were under more pressure during the game than their opponents.
Those are some of the basic outputs from the match. And it paints a picture of Liverpool superiority over Burnley. But now let’s dig a little deeper. Liverpool fired off 26 efforts on goal, but only five efforts were on target: less than 20%. Burnley had just three efforts on goal, but two were on target: a 66% return.
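That deeper dive is a one-line calculation. Here is a minimal Python sketch using the match stats quoted above; the dictionary structure is just an illustrative way to hold the numbers:

```python
# Shot accuracy from the match stats quoted above: raw shot counts flatter
# Liverpool, but the on-target percentage tells a different story.
stats = {
    "Liverpool": {"shots": 26, "on_target": 5},
    "Burnley": {"shots": 3, "on_target": 2},
}

for team, s in stats.items():
    accuracy = s["on_target"] / s["shots"] * 100
    print(f"{team}: {s['shots']} shots, {s['on_target']} on target ({accuracy:.0f}%)")
    # Liverpool: 19%, Burnley: 67%
```

The raw counts and the percentages point in opposite directions, which is exactly the trap the article is describing.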
So in fact, despite all the possession, all the corners and all the shots, Liverpool only had three more efforts on target than Burnley. And this goes to the heart of what we talk about in an IC context when discussing measurement: which IC metrics are important and give insight, and which provide indications but need a deeper dive.
IC metrics such as Pageviews can be interesting and give indications of employee behavior, but unless they are placed in a wider context or broken down further, they may give us an impression that is misleading.
For example, if I have 1,100 employees in my audience, let’s say I look at the Pageviews metric for a specific page that I want my employees to read. I find that the page has generated 1,000 Pageviews, which instinctively appears very positive.
But if I add greater context to the Pageview metric by combining it with the average time on page metric for that page, which is only ten seconds, then our view of the 1,000 Pageviews changes.
Now we begin to wonder why, even though we are generating Pageviews, employees are not staying on the page long enough to properly engage with and consume the content.
Let’s flip that example. Again we have our superficially positive 1,000 Pageviews for the specific page. And we have calculated that it should take about two minutes for an employee to consume the content on the page. Turns out our average time on the page is one minute 47 seconds. Pretty close to what we expect, so IC can be relatively happy. And then we see that across those 1,000 Pageviews, the average scroll depth on the page was 90%, telling us that the majority of employees accessing the page read most, if not all, of the content. So overall, looking at those metrics, IC can be pretty happy with how things are going.
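That kind of sanity check can be sketched in a few lines of Python. Everything here is illustrative: the function name, the 80%-of-expected-time threshold and the 75% scroll-depth threshold are assumptions made for the sketch, not any analytics product’s API or recommended cut-offs.

```python
def assess_page(pageviews, avg_time_s, expected_time_s, avg_scroll_pct):
    """Flag whether pageviews likely reflect genuine consumption.

    Thresholds are illustrative: time on page within ~80% of the expected
    read time, and average scroll depth covering most of the page.
    """
    time_ok = avg_time_s >= 0.8 * expected_time_s
    scroll_ok = avg_scroll_pct >= 75
    return {
        "pageviews": pageviews,
        "time_ok": time_ok,
        "scroll_ok": scroll_ok,
        "engaged": time_ok and scroll_ok,
    }

# Skim scenario: 1,000 views but only 10 seconds on a two-minute page
print(assess_page(1000, 10, 120, 30))
# Positive scenario from the article: 1m47s on page, 90% scroll depth
print(assess_page(1000, 107, 120, 90))
```

The same 1,000 Pageviews comes out as a warning sign in the first call and a healthy result in the second, which is the whole point of adding context to a headline metric.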
In the Premiership example, Liverpool had 26 efforts on goal. Sounds positive. But only five on target. Sounds like shooting practice is required. Let’s add the most important metric as the final piece of our analysis: the final score. Burnley won the match 2-0.
So while an initial look at the metrics from the match would suggest Liverpool dominance, a deeper look at the metrics, including the final score (in an IC context, the goal of the content item), reveals the full picture and shows that our initial assumption on viewing the basic metrics was incorrect.
Based on the result of the game, Liverpool might dig deeper into some of the other statistics to get a more complete picture of their performance. They had 80% possession, but how much of that was possession in areas of the pitch that might lead to a goalscoring chance, for example? Twelve corners, but how many of those did the team execute in the way they expected to, and in a way that led to a goalscoring opportunity?
In IC terms the result of the match, the most important metric, may be the number of unique employees that have engaged with the content. Back to our 1,000 Pageviews. Before really considering that metric in the broader context we need to know what the goal of that content was. Liverpool’s goal was to win the match against Burnley, so no matter what amount of possession or efforts on goal, they did not realize that goal.
IC’s goal in this example is to have as many employees as possible consume the page content. So number of Pageviews seems positive, the average time on page is around what IC was hoping for, and scroll depth indicates that those consuming were engaging with the entirety of the content. But the missing piece of the jigsaw is the number of employees we reach. Is it five users generating all the Pageviews, which would be a terrible return even though all other metrics look positive? Or was it 1,000 employees generating those Pageviews, which would give that page a reach of 90% of the audience: a tremendous result on every level?
Even breaking that down further, via employee segmentation, we may find that the 100 users that did not view the content are all from the same department: Sales (it’s always Sales, isn’t it?). This may indicate that while the campaign was successful in achieving its goal, IC may need to undertake some investigation as to why the Sales department is not engaging with content at the level the rest of the employee base is. Are IC not using Sales’ preferred channel of communication, for example? This holistic view of the metrics, complemented by as detailed a breakdown of the metrics as possible, facilitates IC learning and improving its campaign execution campaign on campaign.
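Putting the reach and segmentation questions together, a rough Python sketch might look like this. The department split and the viewer records are invented for illustration, mirroring the 1,100-employee audience and the absent Sales team from the example:

```python
from collections import Counter

AUDIENCE = 1100
# Hypothetical department sizes summing to the 1,100-employee audience
dept_sizes = {"Sales": 100, "Engineering": 500, "Operations": 500}

# Suppose the analytics export yields one department label per unique viewer;
# here Sales is entirely absent, as in the article's example
viewers = ["Engineering"] * 500 + ["Operations"] * 500

unique_viewers = len(viewers)
reach_pct = unique_viewers / AUDIENCE * 100
print(f"Reach: {unique_viewers}/{AUDIENCE} employees ({reach_pct:.0f}%)")

viewed_by_dept = Counter(viewers)
for dept, size in dept_sizes.items():
    pct = viewed_by_dept.get(dept, 0) / size * 100
    print(f"{dept}: {pct:.0f}% of department reached")
```

Overall reach comes out at roughly 90% of the audience, yet the per-department breakdown immediately exposes the one group generating zero views, which a single aggregate number would hide.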
So in terms of IC measurement, if you are planning, executing and reporting on a strategic or tactical communications campaign, to really understand how you are performing, you need initially to set a goal and identify the metrics that will indicate to you whether you have achieved that goal.
Then, analyze your results using as wide a view as possible to get the entire picture. Finally, dig deeper into specific metrics to see what you can learn about how your employees, or specific groups of employees within the organization, are adopting and engaging with the content you are publishing.
Or even better: Monitor your results regularly during your campaign execution and identify if you are on the right track to realize your campaign goals. If you’re not, pivot and make changes to your campaign content plan to reset your course toward realizing your campaign goal.
And take those insights and use them to ensure that your next IC campaign is more effective and successful, as you strive to continuously improve IC’s performance within the organization.