
Reverse Engineering the Ad Budget

New research finds ways to make advertising more efficient, without spending extra money

  • A study found that brands in niche or growing categories rely more on analytics-based budgeting.
  • Brands within the same company use different ad budgeting methods to leverage unique conditions such as market leadership.
  • Shifting to analytics-based decision-making is not profitable for mature brands or brands that do not operate in uncertain categories. But analytics, particularly experimentation, pays off when managers are uncertain about advertising effectiveness.

 

Marketing managers who hold the advertising purse strings have a lot on their minds. How do we best reach the target consumer? What’s the minimum spend to get results? Can we ever really know with precision whether an ad was effective?

On this laundry list of considerations, perhaps the least sexy is the process by which the advertising decision is made. Do we go all-in with the new data analytics department that the firm recently created at great expense? Or do we rely on tried-and-true methods that we’re more comfortable with?

Despite advances in analytics, the answer is clear: the ad world is still ruled by intuition. Ceren Kolsarici, the Ian R. Friendly Fellow of Marketing at Smith School of Business, has shown that marketers base 70 to 80 per cent of their decisions on heuristics — essentially, easy-to-apply rules of thumb mixed with experience.

But Kolsarici is not about to make ad managers feel guilty. Her research shows that it is not an either/or decision between heuristics and analytics, or even a matter of picking one best decision-making method. Her advice to ad managers is to agonize less over optimizing their ad spend and to devote more time to optimizing the ad budgeting process itself.

How to do it? Kolsarici’s latest research, with Demetrios Vakratsas (McGill University) and Prasad Naik (UC Davis), shows the way. It offers a rare glimpse of the budgeting process, based not on surveys of managers but on how they actually spend money and on the sales results. The researchers use simulations to show how budgets can be improved by adding analytics to the recipe.

Dining at the advertising buffet

Generally, marketing managers treat the ad budgeting process as a buffet — they pick and choose from a number of approaches. This “multiple stakes in the ground” strategy may make sense since products are often at different stages and face varying levels of competition. But it also means managers never really know which approach results in the best sales performance.

Kolsarici and her colleagues confirmed that big advertisers commonly use a mix of four budgeting strategies; a rough sketch of all four appears after the list below. Two are based on heuristics or managerial judgement:


• Advertising-to-sales (A/S) ratio: the budget is set as a percentage of past or expected sales. This is effective in a growing market but can lead to sales erosion in a declining market.

• Competitive parity: budgets are set in proportion to market share. This approach can erode ad effectiveness as clutter increases while brands try to keep up with competitors.

And two are based on analytics:

• Baseline spending: the mathematically derived optimal budget, assuming managers are certain about their advertising effectiveness.

• Adaptive experimentation: continuous testing and refining in which the budget is deliberately pushed above or below the optimal level; the resulting over- or under-spending reveals how sales respond. Not to be confused with one-off field experiments, this approach gives an up-to-date, rolling view of ad effectiveness.
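
To make these four rules concrete, here is a minimal Python sketch of how each might set a single period’s budget. Every number, parameter name and sales-response curve below is invented for illustration; the study’s actual models are considerably richer.

import math

# A minimal sketch of the four budgeting rules. All figures here are
# assumptions for illustration, not values from the study.

def advertising_to_sales(expected_sales, ratio=0.05):
    """Heuristic: set the budget as a fixed percentage of past or expected sales."""
    return ratio * expected_sales

def competitive_parity(category_ad_spend, market_share):
    """Heuristic: set the budget in proportion to the brand's market share."""
    return market_share * category_ad_spend

def baseline_spending(sales_response, candidate_budgets, margin=0.30):
    """Analytics: pick the budget that maximizes expected profit, assuming the
    sales-response curve (i.e. ad effectiveness) is known with certainty."""
    return max(candidate_budgets, key=lambda b: margin * sales_response(b) - b)

def adaptive_experimentation(base_budget, period, swing=0.15):
    """Analytics: continuously nudge spending above and below the baseline
    (here, alternating +/-15% each period) so ad effectiveness can be
    re-estimated from the observed sales response."""
    direction = 1 if period % 2 == 0 else -1
    return base_budget * (1 + direction * swing)

def response(spend):
    """Assumed diminishing-returns sales curve (not from the study)."""
    return 4_000_000 * (1 - math.exp(-spend / 300_000))

optimal = baseline_spending(response, candidate_budgets=range(0, 1_000_001, 10_000))
print(advertising_to_sales(expected_sales=20_000_000))                     # 1000000.0
print(competitive_parity(category_ad_spend=5_000_000, market_share=0.25))  # 1250000.0
print(optimal, adaptive_experimentation(optimal, period=1))
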


Their study looked at how these four ad budgeting strategies were used at eight top-performing brands of hybrid cars, beer and yogurt. A few trends emerged. 

For one, the four advertising budgeting methods were deployed in different proportions by each brand, possibly to leverage unique conditions such as market leadership and category growth. Car brands, for example, tended to experiment more because hybrid cars are an emerging niche where little is known about consumer response to new auto technology or brand advertising.

By contrast, yogurt is a mature category. So yogurt brands used heuristics-based methods almost exclusively, balancing between advertising-to-sales ratio and competitive benchmarking. 

And beer brands used a mix of approaches. For example, Bud Light relied on baseline spending, Coors Light used a balance of analytics and heuristics, and Miller Lite relied on managerial judgement. 

“Brands that are in niche categories or that are in growing categories use more analytics, specifically experimentation, because managers are more uncertain about advertising effectiveness,” says Kolsarici.   

Not only do ad budgeting approaches vary for brands in different companies or industries, but they also vary within the same company. This highlights the importance of brand architecture.

Consider Bud Light and Budweiser, the top-selling brands of Anheuser-Busch. Bud Light (operating in a category where ad effectiveness is fairly well known) is the brand leader and wants to keep it that way. It based 78 per cent of its ad budget on baseline spending, an analytics approach. On the other hand, Budweiser—which is experiencing declining sales—based 78 per cent of its ad budget on the advertising-to-sales ratio, a heuristic approach. 

Beware the field test

Kolsarici says the study is the first to empirically show how big brands use adaptive experimentation to deal with uncertainty, in particular the great unknown about ad effectiveness. 

She cautions managers to be aware of the difference between field tests, which are episodic, and adaptive experimentation, which is dynamic. Field tests only capture insights at a particular point in time or for specific campaigns or test regions.

“There is a lot of research that shows that field experiments have very little power,” she says. “It’s statistically impossible to isolate something effective from not effective. We’re proposing continuous experiments that never end. So it’s almost an integral part of your decision-making process.”

In case you’re not sold on the value of adaptive experimentation, Kolsarici and her colleagues ran a series of simulations using the same brands and advertising data, but then tweaked the mix of approaches. They shifted 50 per cent of ad dollars from competitive benchmarking to adaptive experimentation while keeping the ad budget steady. 
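
As a rough illustration of that reallocation (a toy sketch, not the authors’ simulation), the snippet below shifts half of a hypothetical competitive-parity allocation into experimentation while holding the total budget fixed. The starting mix, sales-response curve and “learning bonus” are all assumptions; the point is the mechanics, not the numbers.

import math

# Illustrative only: a toy version of the reallocation described above.
# The budget mix, response curve and learning bonus are assumptions.

def sales(total_spend, experiment_share, base=3_000_000, scale=400_000, learning_bonus=0.10):
    """Toy sales response: diminishing returns to total spend, plus a small
    lift that grows with the share of budget run as experiments (a stand-in
    for better-informed spending)."""
    lift = 1 + learning_bonus * experiment_share
    return base * (1 - math.exp(-total_spend / scale)) * lift

budget = 1_000_000  # the total ad budget stays fixed throughout
mix = {"a_s_ratio": 0.30, "competitive_parity": 0.40,
       "baseline": 0.20, "experimentation": 0.10}

# Shift 50% of the competitive-parity dollars into adaptive experimentation.
shifted = dict(mix)
moved = 0.5 * shifted["competitive_parity"]
shifted["competitive_parity"] -= moved
shifted["experimentation"] += moved

before = sales(budget, mix["experimentation"])
after = sales(budget, shifted["experimentation"])
print(f"Sales lift from reallocating, same budget: {after / before - 1:.1%}")
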

The result: all brands experienced a sales lift. But car brands did especially well. Sales for the Honda Civic jumped 12 per cent and for the Toyota Prius 23 per cent. Clearly, some brand managers leave a lot on the table by undervaluing analytics.

“The moral of the story is shifting to analytics in terms of the decision-making process isn’t very profitable for brands that are mature or that don’t operate in uncertain categories,” says Kolsarici. “But if you’re uncertain about your ad effectiveness, like when you are running a totally new campaign or introducing a new product, experimentation really pays off.”

Kolsarici says managers can use these insights on the four advertising approaches to reverse engineer their own budgeting strategy as well as the strategy of their competitors. CMOs can use the same framework to understand, in a bias-free way, how brand managers really make decisions and whether or not it’s worth devoting more resources to analytics. The key metrics are ad spending and sales performance.

“Often when you look at ad optimization, we look at the optimal budget — should we increase or decrease the budget,” says Kolsarici. “Now, we’re not talking about increasing the budget at all. We’re talking about increasing the efficiency of the process, but with the same budget.

“In marketing more than any other sort of domain in business, we have to justify our decisions because we get the biggest part of the budget.”

 

Alan Morantz