Is your content marketing as accountable and measurable as your other channels? Is it even possible to know how much revenue your content marketing brings in, directly or indirectly? The answer is now yes – on both counts.

Creating and trafficking content can cost thousands of dollars – from background research and ideation to project management, production, editing and approval, it’s both time- and resource-intensive.

For all the time and money we put into creating engaging content, we owe it to ourselves to measure it against clear objectives and know that it’s working hard for us. After many years of creating and tracking different types of content, we’ve identified a formula for not only creating great content, but also measuring and learning from it once it’s live.

What makes good content good?

In the performance realm, measuring creative is nothing new. We spend a lot of time A/B testing and optimising ads for paid search and display, but to date very few people have been able to apply the same level of objective assessment to longer-form content. The problem with measuring something like content is that we all have our own individual tastes and subjective opinions; there is no universal “good”.

But as strategists and creators, we know what emotion we want our readers and viewers to feel when they consume our content, and this gives us some clear objectives to work towards: all the content we create must inspire, excite, educate or convert. Experience has also taught us that coming up with great content isn’t a fluke. There’s a formula to it, and it relies on four key factors:

Audience + Topic + Format + Attribution

This formula gives us a framework that we can measure against – and it has culminated in what we call the Content Scorecard. The Scorecard is a methodology that helps make sense of the subjective… by making it objective. It employs hard metrics to measure audiences, topics and formats, and cross-references this with attribution reporting to give a comprehensive view of our content’s performance.
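To make the idea concrete, here is a minimal sketch of how a scorecard like this could roll engagement signals up into a single number. All metric names, values and weights below are hypothetical illustrations, not iProspect’s actual methodology.

```python
# Toy content scorecard: normalised engagement metrics are combined into
# one weighted score. Every metric name, value and weight is invented.

# Per-piece metrics, each scaled to 0-1 against a benchmark.
metrics = {
    "time_on_page": 0.72,
    "completion_rate": 0.55,
    "social_applause": 0.40,
    "subscriber_uplift": 0.30,
}

# Hypothetical weights reflecting how much each signal matters.
weights = {
    "time_on_page": 0.3,
    "completion_rate": 0.3,
    "social_applause": 0.2,
    "subscriber_uplift": 0.2,
}

def scorecard(metrics, weights):
    """Weighted average of normalised metrics -> a single 0-1 score."""
    return sum(metrics[k] * weights[k] for k in weights)

score = scorecard(metrics, weights)
print(round(score, 3))  # -> 0.521
```

A single comparable score per piece is what lets subjective content be ranked objectively, and cross-referenced against attribution data later.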

How do you measure the “right” format or audience?

The Scorecard measures whether we have the right topic and format by combining the content’s theme or subject matter with engagement and completion metrics, ranging from time on site and completion rate to actions on retargeting pools, subscriber numbers and social applause. To measure whether we have the right audience, the Scorecard again looks at engagement rate, theme, completion depth and subscriptions, but also considers increases in repeat traffic and other signals of audience growth and uplift. With the topic, format and audience performance data in hand, we can then conduct further attribution analyses to calculate overall performance.

The Scorecard in action

One of the first brands to benefit from the Scorecard was our travel client, Starwood Hotels. A batch of 60 pieces of long-form content and videos was scored on a quarterly basis by a panel of 10 people. After three rounds of scoring, we were able to group the content into broad categories and assign weightings according to location and market-based segments, which then informed future planning. The results helped us optimise the subsequent format mix and content lengths, ensuring maximum readership and viewership.

This framework helped us identify not only the topics that piqued our audiences’ interest, but also which of them directly impacted room bookings (i.e. had a direct causal impact on our ROI). In short, we were able to add an objective layer to the subjective and demonstrate a real impact on Starwood’s broader KPI set, which in turn built credibility and budget allowances in the longer term.

The attribution layer

The Scorecard measures how content is resonating with our audiences; the next question is whether that audience is actually having a positive effect on the brand’s bottom line. Most content marketers tend to track long-term trends in sales and look for a correlation over time, because direct conversions (e.g. bookings or sales) rarely represent the true scope of content’s benefits.

However, some brands want to see directly attributable value in the immediate term. To achieve this, we turn to the ABC framework:

1. Acquisition tells us where the traffic came from;

2. Behaviour tells us how users engaged;

3. Conversion tells us how we’ve performed against our objective, be it a micro or macro conversion.

The ABC framework is uniquely suited to defining how a particular channel, customer segment or section of a website is contributing to your overall goals. Much of its power stems from the fact that it can be applied to historical data – even if you don’t have a content scorecard set up yet. But to make it work, you need to know what questions to ask your data.
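As a rough illustration of applying the ABC framework to historical data, the sketch below summarises a handful of session records along all three dimensions. The field names, channel labels and figures are invented for the example; only the method mirrors the framework described above.

```python
# Toy ABC summary over session records. All data here is made up.

sessions = [
    {"source": "organic", "pages_viewed": 12, "viewed_blog": True,  "revenue": 80.0},
    {"source": "paid",    "pages_viewed": 3,  "viewed_blog": False, "revenue": 0.0},
    {"source": "organic", "pages_viewed": 7,  "viewed_blog": True,  "revenue": 0.0},
]

def abc_summary(sessions):
    """Per acquisition source: session count (Acquisition), pages viewed
    (Behaviour), and conversions/revenue against the objective (Conversion)."""
    summary = {}
    for s in sessions:
        row = summary.setdefault(
            s["source"],
            {"sessions": 0, "pages": 0, "conversions": 0, "revenue": 0.0},
        )
        row["sessions"] += 1               # A: where traffic came from
        row["pages"] += s["pages_viewed"]  # B: how users engaged
        if s["revenue"] > 0:               # C: macro conversion achieved
            row["conversions"] += 1
            row["revenue"] += s["revenue"]
    return summary

print(abc_summary(sessions)["organic"])
# -> {'sessions': 2, 'pages': 19, 'conversions': 1, 'revenue': 80.0}
```

Because it only needs session-level records, this kind of summary can be run retrospectively on data you already have, which is exactly what makes the framework useful before a scorecard exists.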

Asking your data the right questions

Data without context is meaningless, and the only way to define the right context is to know what you’re looking for. When preparing a retrospective report for one of our national retail clients, this is exactly how we approached it.

Following the ABC framework, the client’s “C” (Conversion) objective was ecommerce transactions, and for “A” (Acquisition) we wanted to look at traffic to the on-site content hub. This left “B” or Behaviour as our key focus for measurement.

To analyse user behaviour with respect to conversions and the content hub, we segmented users into two buckets: those who viewed the blog vs. those who didn’t. This was perhaps a predictable split, but the findings that emerged were anything but.
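A two-bucket segmentation like this is simple to express in code. The sketch below is a hypothetical version with invented user records (constructed so the revenue-per-user figures echo the ones reported below); it shows the method, not the client’s actual data.

```python
# Illustrative blog vs. non-blog segmentation. User records are invented,
# deliberately chosen so revenue per user matches the article's figures.

users = [
    {"viewed_blog": True,  "revenue": 25.66},
    {"viewed_blog": True,  "revenue": 0.0},
    {"viewed_blog": False, "revenue": 4.24},
    {"viewed_blog": False, "revenue": 0.0},
]

def revenue_per_user(users, viewed_blog):
    """Total revenue / user count for one segment."""
    bucket = [u for u in users if u["viewed_blog"] == viewed_blog]
    return sum(u["revenue"] for u in bucket) / len(bucket)

blog_rpu = revenue_per_user(users, True)    # 12.83
other_rpu = revenue_per_user(users, False)  # 2.12
print(round(blog_rpu / other_rpu, 2))       # -> 6.05, roughly 6:1
```

The same split can then be reused for conversion rate and average transaction value, giving a like-for-like comparison between the two audiences.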

The fruits of our analysis.

We found that just 6% of all converters were accessing the blog during their buying journey – a tiny 2.26% of all sessions. But this small blog audience was performing at an incredible rate, spending 66% more per transaction than non-blog users, and converting at a 65% higher rate too. Even more striking was the revenue per user: at $12.83, it was six times the site average of $2.12.

An incredible 6:1 difference in Revenue per User.

We then asked ourselves, was this a causative or correlative relationship? Were they converting and spending more because they were reading the blog, or were they reading the blog because that’s what converters just do? The numbers again gave us our answer: 274 of the 332 converting users viewed the blog before purchasing. We also saw that 222 of them were from organic search – though none of them were new, first-time users.

All in all, by understanding what we needed to learn from our data and intelligently segmenting it to answer our questions, we were able to uncover numerous valuable insights, none of which would’ve been visible from an aggregated data set.

1. Blog readers generated six times the revenue per user of non-readers.

2. Blog readers converted at a 65% higher rate than users who didn’t read the blog.

3. Blog readers spent 66% more per transaction than non-readers.

4. Blog readers who converted were never first-time users.

5. Blog readers viewed over 10 pages per visit, and 14 per converting visit (compared to 5.6 and 9.61 respectively for non-readers).

For the client, the key takeaway was clear: do more to engage this audience, and continue trying to grow it. For us, the key takeaway was that content isn’t just a performance channel – it can be the highest-performing channel of all. You just need a clear measurement system in place. So, we ask you again: is your content marketing as accountable and measurable as your other channels?

Divya Voleti

Divya Voleti is a Strategy Director at iProspect. In her current role, she drives consumer-first thinking and works with clients to co-create customer engagement strategies. She is also one of our Innovation Champions, driving collaboration and client success using the widely recognised design thinking framework.