
When it comes to ad revenue, there are a lot of moving parts that contribute to a digital publisher’s success or failure. Profitably matching the bid and the ask for a given ad unit may seem relatively straightforward in principle, but in practice it can be a tortuous affair. The sheer volume and speed of transactions is a complicating factor in its own right. And when you consider timeout settings, frequency caps, floor pricing, inventory types, contending line items, bidder mix, and geo targeting — the real scope of the job begins to shift into focus.

Add to that the delicate value chain linking requests, impressions, viewability, and CTR, and the immensity of the undertaking begins to show. Factor in the publisher’s different properties and pages, and you can see just how many moving parts there are to monitor and manage. Throw brand safety, privacy settings, and technical hiccups (load, server, and rendering issues, etc.) into the mix, and the challenge crystallizes.

An Evolving Approach to Analytics

The interactions that dot this data landscape only grow faster and vaster as programmatic becomes more dominant. At the same time, despite serving the same underlying business interests, direct and programmatic demand often compete for the same inventory. That's another important vector for potential misalignment and suboptimal revenue performance.

With so many interconnected monetization factors, expressed across so many different dimensions, there are tens of thousands of data points that need to be reviewed daily.

To make sure money isn’t being left on the table, publishers need to examine those data points in light of seasonality, recent performance, market influences, normal variance, and factor entanglement. To do that they'll need prescriptive analytics tools.

Historically, publishers have been slow to embrace new tooling, and technology providers have been happy to focus on other industries. But now that's beginning to change. In this article we'll explore the why and the how of that change, touching on the history and future of publisher analytics in the process.

From Native Reporting

For native reporting, publishers turn to the built-in portals of their SSPs, OMSs, and ad inventory management systems (e.g. GAM). These native reporting portals are quite basic though: they can be difficult to use, they’re only minimally customizable, and they lack any sort of robust analytics capabilities.

Most significantly, they aren’t unified. A typical mid-market publisher will work with multiple demand partners and may be hooked up to as many as 10 different SSPs. Working natively and separately in each platform’s reporting hub is simply not sustainable. 

And while many rely on GAM to pull in data across their various SSPs, GAM’s visibility and reporting are limited for non-AdX networks. Some are willing to make do with that type of high-level, broad-stroke overview, but others are not. That’s why there are so many tools devoted to aggregating and normalizing native reporting data for more detailed and centralized analysis.

To Export Analytics

For many publishers, each day begins with an export extravaganza: pulling reports from different systems and downloading the data for further (out-of-system) analysis.

Sometimes the data fetch is automated, other times it isn’t. In all cases, though, insights depend on the human labor behind them: if the people running the process don’t put in due effort, no real insights are extracted. Normally that effort involves a great deal of spreadsheet manipulation and formulaic analysis.

This allows for excellent data access, but mining that data is a very involved and laborious process that doesn’t scale very well. The smaller and less complex the publisher operation, the better this type of approach tends to work. When it comes to large and complex ad ecosystems, however, this approach often results in a lot of overhead and operational bottlenecks. 

It’s because of these limitations that more enterprising publishers look to invest in systems that can automate routine data entry, extraction, and exploration tasks. 

To Dashboards

This is where dashboards come into play. Dashboards — either built on top of mass-market BI solutions or designed specifically for the purpose of helping publishers keep a better handle on their data — look to distill large amounts of data into snackable KPI trackers or time series visualizations. As a technology, dashboards are no one-trick pony, but there's little doubt that visualization is the best trick in their bag. 

Assuming the individual reports are properly configured to capture the relevant metrics, dashboard tools (like Looker, Tableau, and even Power BI) make it easy to spot when a pattern of performance has been broken. Good dashboards are also easy to outfit with rule-based, threshold-triggered alerts.

For most publishers, BI dashboards represent the pinnacle. Many, in fact, are still working with an export-and-analyze approach and view dashboard-driven data reviews as aspirational. But dashboards are still a far cry from the type of prescriptive analytics tools that are really needed.

Too Much and Never Enough

While loading up on data capture and visualization capabilities has been a popular trend over the last decade, for most publishers it results in overload and underutilization. The larger the output dataset and the more siloed the data sources, the more difficult it becomes to see the story behind the data and make smart decisions accordingly.

Publishers today have more data than they know what to do with. They have dashboards and dashboards of the stuff, but it’s not enough.

To achieve an accurate and context-aware view of your ad or revenue performance through a dashboard, you’ll not only need that dashboard to be (wo)manned by a sufficiently talented and expert human manager, but you’ll also need a lot of manual fine-tuning of report settings and alert rules. And even then, it’s like swiveling a spotlight across the terrain as you look on from a watchtower. You get snapshot visibility of the places you chose to look rather than a continuous and panoramic view.

In the world of business analytics, technological advancement is said to follow three distinct stages. First come descriptive tools, then predictive and detective tools, and finally prescriptive tools. Descriptive tools tell us what has already happened. Predictive and detective tools help anticipate and prevent problems while automatically identifying any issues that happen to slip through. Finally, prescriptive tools do everything predictive tools do while also laying out a path to action for the issues they detect.

When we look at how publisher analytics have advanced, it's clear that despite real progress, the industry is still very much in the descriptive stage. None of the tools in the standard kit can detect issues, let alone predict them, and none of them come anywhere close to prescribing an ideal course of treatment. That shortcoming carries very real consequences for publishers.

No amount of manual data review and no dashboard-based alerting can deliver truly panoramic, truly end-to-end coverage. So there’s always a fear that important issues will be missed. To mitigate that risk, decision makers may look to tighten the weave of their alerting dragnets, so that even if they can’t achieve true 360° vision, they can derive a good enough sampling to make accurate and representative extrapolations. In practice, that means lowering thresholds and reconfiguring rules to dial up the alerting sensitivity. 

The problem with this approach is that when rule-based dashboard alerts are configured in a trigger-happy manner, the result is a cascade of false positives. When that is your starting point, the team will either spend way too much time investigating and invalidating irrelevant issues, or they’ll take each new alert with an increasingly large pinch of salt, to the point that they become altogether desensitized. Either way, the outcome is an operational nightmare that defeats the logic of increased alert sensitivity.

At its core, this damned-if-you-do, damned-if-you-don’t predicament reflects the limitations of rule-based logic and simple threshold triggers.

Monitoring schemes predicated on rigid rules and sterile thresholds will have no way to simultaneously account for seasonality, normal growth, natural fluctuations, factor interdependencies, known changes, and outside influences. Without accounting for all those things, there will always be a profound lack of context hamstringing any monitoring efforts. 

Hurting from crude alerting

You can set up an alert, for example, to be sent if revenue falls 10% or more below its 10-day trailing average. But that ignores the season, which might have you actually expecting a 10%-20% dip.

Or, you can set up an alert to sound whenever a key metric remains at least 5% above or below its 3-year adjusted average for a given 3-day period.

That would better account for long-term seasonality, but it would also sacrifice a ton of granularity and timeliness, while losing important anchoring to more recent norms. 

And with related triggers all sounding different alarms for the same underlying incidents (e.g. requests, fill rate, impressions, and revenue), you’ll also be stuck dealing with issues of redundancy.
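To make those limitations concrete, here is a minimal sketch of what the two rules above typically look like in practice. It assumes daily metrics sitting in a pandas DataFrame; the metric names, thresholds, and baseline choices are illustrative assumptions, not a reference to any particular platform's implementation.

```python
# A minimal, illustrative sketch of the two threshold rules described above,
# assuming daily metrics live in a pandas DataFrame with one row per day.
# Metric names, thresholds, and data shapes are assumptions for illustration.
import pandas as pd

METRICS = ["requests", "fill_rate", "impressions", "revenue"]


def trailing_average_alerts(df: pd.DataFrame, drop_pct: float = 0.10) -> list[str]:
    """Rule 1: flag a metric when today's value is 10%+ below its 10-day trailing average.

    Knows nothing about seasonality, so an expected holiday dip fires just as
    loudly as a genuine problem.
    """
    alerts = []
    for metric in METRICS:
        trailing = df[metric].iloc[-11:-1].mean()  # the previous 10 days, excluding today
        today = df[metric].iloc[-1]
        if today < trailing * (1 - drop_pct):
            alerts.append(f"{metric}: {today:,.2f} is more than {drop_pct:.0%} below "
                          f"its 10-day average of {trailing:,.2f}")
    return alerts


def long_term_deviation_alerts(df: pd.DataFrame, band_pct: float = 0.05) -> list[str]:
    """Rule 2: flag a metric when the last 3 days all sit 5%+ above or below a long-run average.

    Better seasonal anchoring, but it reacts slowly and loses touch with recent norms.
    """
    alerts = []
    for metric in METRICS:
        baseline = df[metric].mean()  # stand-in for a "3-year adjusted average"
        recent = df[metric].iloc[-3:]
        if (recent < baseline * (1 - band_pct)).all() or (recent > baseline * (1 + band_pct)).all():
            alerts.append(f"{metric}: 3-day run outside ±{band_pct:.0%} of "
                          f"its long-run average of {baseline:,.2f}")
    return alerts
```

Run these rules on a day when a single bidder times out and you get the redundancy problem in miniature: requests, fill rate, impressions, and revenue all drop together, so both functions fire a separate alert for each metric, four symptoms of one underlying cause, and neither rule can say whether the dip is an incident or simply the season doing what the season does.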

With rigid if-then rules and simple thresholds, every workaround that gets built into the system to level up its context-awareness adds more administrative complexity (and often creates unexpected problems elsewhere in the system). And still, at the end of the day, most shortcomings are never fully solved.

Getting to the root of the matter

Even when working perfectly, dashboards just aren’t enough. They’re powerless when it comes to investigating and resolving the issues they surface. This is because dashboards only reflect the signs of a problem. They don’t actually expose and explain the problems themselves. What’s missing is an understanding of the chain of events that got us here — traced all the way back to an initial development. That initial development, let’s call it the root cause, is the key to the whole thing. 

So regardless of detection, problems still need to be validated and investigated to understand what they’re really about. Without root cause understanding, it will simply be impossible to plan and enact effective interventions. And that's what prescriptive analytics tools are all about.

For publishers, going from detection to correction requires investigation. It's a process that is still mostly conducted manually and takes place outside of the dashboard — consuming a huge amount of time and internal resources. With a growing list of responsibilities, only so much time in the day, and limited (wo)manpower, it's a luxury publishers simply cannot afford.

Prescriptive Analytics Tools: Automated Monitoring, Guided Touch

Whether by way of native reporting portals, standalone analytics, or integrative dashboard solutions, the most prevalent approaches to ad revenue monitoring fail to deliver panoramic perspective and granular awareness.


These solutions don’t actually monitor anything so much as they set up a framework of tripwire alarms. These alarms sound when triggered, but based on predefined conditions rather than situational observation or contextual understanding. Just like tripwires, they can easily be circumvented and, just like tripwires, they tend to produce more false alarms than useful warnings.

These frameworks, at best, flag issues that they cannot explain. They cannot cross-correlate data points to untangle the mess of connected factors that underlie shifts in the data. They cannot tell whether a change is trivial or significant and they cannot tell the cause — let alone advise on the appropriate intervention.

This is why publishers need something more intelligent than dashboard alerts and something more efficient than manual investigation. What’s needed are prescriptive analytics tools that deliver holistic, ongoing, and actionable ad operations intelligence: a solution that monitors and analyzes all relevant ad characteristics, mechanics, and potential impact dimensions 24/7.

Publishers need a dedicated solution. A solution capable of taking in the whole picture at once and avoiding framing/anchoring biases. A solution that learns and understands context, and examines comparable situations rather than merely numbers. A solution that can understand which variables are entangled and which are independent and present the full picture in an informative and actionable manner. 

Such a solution would constitute a major leap forward when compared to dashboards that only reflect pre-defined parameters and only upon request. The type of prescriptive analytics tools needed would not only be designed to quickly detect problems, but to quickly diagnose their cause and prescribe suitable interventions. A solution that empowers employees rather than merely serving as the springboard for their open-ended investigations.  

A solution of this sort would provide a central gathering point and shared frame of reference around which all stakeholders could collaborate — breaking silos down rather than building them up. And it would provide expertise-in-a-can — directing the revenue and operations teams on how to leverage their limited resources to optimize impact.

Publishers need to know that they’ll never miss a critical revenue issue and that they won’t be wasting countless hours on manual data investigations.

Powered by machine learning and refined with context-aware predictive models, oolo monitors your entire datascape 24/7, end to end — delivering actionable, noiseless insights with true root-cause explanations.

