In the age of store beacons, multi-channel attribution and real-time data logging, performance marketers are spoilt for choice when measuring and explaining results.

But what happens when a client doesn’t have access to these options? What if they still rely on physical instore data like foot traffic to measure marketing and sales performance?

Without a direct CRM integration, how could they work out which channels were delivering value, and how could we as agency folk provide the insights they needed?

This was the challenge that faced the Brisbane iProspect office when a big-box retail client, with around 50 locations nationwide and no robust means of cross-referencing online and offline activity, asked us to show how our digital efforts were affecting store sales.

Isolating the problem

Anytime one attempts to derive useful information from data, the biggest hurdle is always the same: eliminating variables.

Retail performance at this scale – especially when existing brand awareness and above-the-line (ATL) activity skew the results – is a difficult thing to break down into channel-specific findings.

Fortunately for us, the client had recently launched locations in two new states, and did so with relatively understated ATL support.

This presented us with a convenient environment for an experiment: with outside factors minimised, the influence of our digital efforts would be more visible and representative than ever before.

So we began a test.

The test

The premise was simple enough: we’d record every AdWords click and compare this against actual instore foot traffic and transaction data, looking for any significant correlation.

For a large, clean sample, we concluded we’d need to dominate paid search across both branded and non-branded terms, achieving as close to 100% impression share as possible. Total market saturation.

We bid aggressively and bought category-wide non-brand keywords. While spend on this type and volume of campaign would usually sit in the mid-thousands, we up-weighted by 4-5x to ensure maximum coverage. This also helped make up for the low ATL presence in these locations.

The results

The campaign launched a few months after the stores did, and continued for 6 weeks. For benchmarking, we compared our numbers against the preceding 6-week period.

While we were expecting to see some gains as our AdWords spend increased, we were surprised by how immediate and enormous the changes were.

Here’s a snapshot of results from the 6-week test:

• 1,934 store visits (1,296% and 848% increases at the two locations respectively)

• 27,196 clicks (688% and 1,000% uplift per location)

• 275% and 1,700% increases in transactions per store

• $6.28 cost per store visit on weekends

• 7.11% click-to-store-visit rate

In addition to these major improvements, we saw a significant correlation between day-to-day click volume and actual foot traffic: weekend peaks and mid-week dips in clicks, for example, were mirrored almost exactly in store visits.
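As a rough illustration of how a day-to-day relationship like this can be sanity-checked, here’s a minimal Python sketch using only the standard library. The daily figures are invented for the example; they are not our campaign data.

```python
# Minimal sketch: testing whether daily ad clicks and store foot traffic
# move together. All figures below are illustrative, not campaign data.
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical daily totals for one week (Mon-Sun)
daily_clicks = [410, 385, 352, 398, 455, 640, 612]
daily_visits = [31, 29, 26, 30, 35, 48, 45]

r = correlation(daily_clicks, daily_visits)
print(f"Pearson correlation: {r:.2f}")  # values near 1.0 mean clicks and visits rise and fall together
```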

Looking beyond CPA

A key knock-on benefit of this test was that it allowed us – and the client – to look past basic CPA.

Rather than simply asking “What was our AdWords spend vs. sales revenue this week?” we could now work backwards with our findings and inform decision-making at a far more granular level.

Knowing the average order value of an instore sale, and knowing that about 20% of foot traffic ended up converting, we could apply this to our 7.11% click-to-store-visit rate and adjust our budgets and bidding strategies accordingly.

More importantly, the client could accurately calculate where their spend needed to be set to achieve a particular foot traffic or sales target.
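To make that arithmetic concrete, here’s a minimal sketch of the back-calculation. Only the 7.11% click-to-store-visit rate and the roughly 20% visit-to-sale rate come from the test; the cost per click and average order value are hypothetical placeholders.

```python
# Back-calculating store-visit economics from the funnel rates.
# Assumed placeholders: avg_cpc and avg_order_value.
# From the test: click_to_visit (7.11%) and visit_to_sale (~20%).
avg_cpc = 1.20               # assumed average cost per click ($)
click_to_visit = 0.0711      # share of clicks that became store visits
visit_to_sale = 0.20         # share of store visits that converted
avg_order_value = 180.00     # hypothetical instore AOV ($)

cost_per_visit = avg_cpc / click_to_visit        # spend per store visit driven
cost_per_sale = cost_per_visit / visit_to_sale   # effective instore CPA
revenue_per_click = click_to_visit * visit_to_sale * avg_order_value

print(f"Cost per store visit: ${cost_per_visit:.2f}")
print(f"Effective instore CPA: ${cost_per_sale:.2f}")
print(f"Instore revenue per click: ${revenue_per_click:.2f}")

# Working backwards from a target, as the client could:
target_visits = 500
required_spend = (target_visits / click_to_visit) * avg_cpc
print(f"Spend needed for {target_visits} visits: ${required_spend:,.2f}")
```

The same back-calculation works for a sales target: divide the target by the visit-to-sale rate to get the visits required, then price those visits as above.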

Other findings and takeaways

We uncovered a number of interesting secondary findings too. Firstly, while mobile searches drove 69% of store visits, tablet users had the highest click-to-visit rate – a fascinating outcome we hadn’t predicted.

Secondly, our last-click attribution data showed brand terms accounting for around 70% of store visits, noticeably lower than the 80-20 brand/non-brand split we see in most verticals.

While we’re wary of jumping to conclusions, we suspect this lower branded share indicates we were driving more upper-funnel response alongside the uptick in foot traffic.

We’d have liked full visibility across the entire funnel, but even with this relatively simple analysis we’ve gained valuable insight into the offline effects of paid activity, which will help guide decision-making around future advertising investment and planning.

Bharat Tarachandani

Bharat Tarachandani is a Paid Media Manager for iProspect Brisbane. Bharat works across Search, Programmatic and Social, specialising in the innovative use of automation and data to drive performance for a wide range of clients in the Travel and Retail verticals.