
Time is running out for today’s programmatic advertising ecosystem, as seismic changes taking place now catalyse a transition to a new way of thinking, and the re-engineering of an outdated ecosystem that has never truly been fit for purpose in the B2B world.

The traditional programmatic ecosystem, built around technologies developed organically over the past three decades, is struggling to adapt to the unique requirements of the B2B world against a backdrop of increasingly stringent privacy regulations and other changes, including the impending third-party cookie deprecation and Google’s rollout of GA4.

The programmatic ecosystem comprises on one side the seller, typically a web publisher, and on the other the buyer: the advertiser. Using data in a real-time framework, programmatic connects buyers with sellers in a real-time auction, putting a value on every individual web impression. As this process evolved, middleware emerged – a wide range of ad tech participants adding varying degrees of value to the supply chain, in some cases almost none, by acting as connecting conduits between buyers and sellers. A downside of this is the erosion of ad spend at each hop in the chain, from the demand-side platform (DSP) onwards, by anything from a few pence up to a much larger margin.
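The auction and margin erosion described above can be sketched in a few lines. This is a deliberately simplified model, not any real exchange's implementation: the second-price rule and the fee percentages are illustrative assumptions, and real supply chains involve far more participants and pricing nuances.

```python
# Minimal sketch of a real-time (second-price) auction over DSP bids,
# with each intermediary taking a cut before spend reaches the publisher.
# All names and fee values are illustrative, not from any real exchange.

def run_auction(bids, intermediary_fees):
    """bids: {dsp_name: bid in pence};
    intermediary_fees: fractional cuts taken along the supply chain."""
    if not bids:
        return None
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top_bid = ranked[0]
    # Second-price rule: winner pays the runner-up's bid (or their own if alone).
    clearing_price = ranked[1][1] if len(ranked) > 1 else top_bid
    # Each hop in the chain erodes the spend before it reaches the publisher.
    publisher_revenue = clearing_price
    for fee in intermediary_fees:
        publisher_revenue *= (1 - fee)
    return {"winner": winner,
            "clearing_price": clearing_price,
            "publisher_revenue": round(publisher_revenue, 2)}

result = run_auction({"dsp_a": 120, "dsp_b": 95, "dsp_c": 80},
                     intermediary_fees=[0.15, 0.10, 0.05])
# dsp_a wins and pays 95p, but after three cuts the publisher
# receives only around 69p of that spend.
```

Even with modest per-hop fees, the compounding effect illustrates how a meaningful share of the advertiser's spend never reaches the publisher.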

A system unfit for B2B purpose

Programmatic was always designed to be a B2C-centric advertising set-up that used cookies to identify individual people and their online browsing activities across domains, profiling their interest in certain products and then targeting them in a very precise manner. That system has never really worked for B2B, where transactions typically entail complex decision-making processes within longer sales cycles and where B2B stakeholders tend to be teams of people – decision-making units, not individuals – making higher-value, often multi-million-pound investments.

The B2B purchasing consideration period can be as long as two years, rendering the web cookie, with its average 30-day lifespan, wholly inadequate as an identifier. And the buying journey is still largely conducted via human meetings, demos, events and phone calls, augmented by deep research, including on the premium content internet. Unlike generic consumer-focused content, B2B content is highly specific and tailored to the needs of professionals seeking in-depth research and information. This is a huge advantage for digital ad delivery aligned with contextual targeting.

Contextual understanding of content

However, the B2B supply chain challenges extend beyond procurement to the inadequacy of the tools being used to understand page context and the wider webpage environment. To satisfy the nuances of B2B, the depth of contextual understanding must go beyond the topic-categorising tech previously deployed. The B2B supply chain of the future will use URL-level analysis to understand not just keywords but the topics in pages, videos and audio, alongside the pages’ written content. This will not only augment the targeting enabled in the wider supply chain, but provide a forensic level of page signal data, enabling the laser targeting of specific B2B offers to the environments where IT decision-makers are actively researching.
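The article does not specify how URL-level analysis would be implemented, so the following is a toy sketch only: a page's text scored against a small, invented topic taxonomy. Production systems would use NLP models over full page, video and audio content rather than keyword overlap, and the taxonomy here is hypothetical.

```python
# Toy illustration of URL-level contextual scoring: score a page's text
# against a small topic taxonomy. The taxonomy, topics and scoring rule
# are invented for illustration; real systems use NLP models, not
# keyword counts.

TAXONOMY = {
    "cloud_infrastructure": {"kubernetes", "cloud", "server", "devops"},
    "cybersecurity": {"ransomware", "firewall", "zero-trust", "breach"},
}

def score_page(text):
    words = set(text.lower().split())
    scores = {topic: len(words & keywords) / len(keywords)
              for topic, keywords in TAXONOMY.items()}
    # Return matching topics, strongest first.
    return sorted((t for t, s in scores.items() if s > 0),
                  key=lambda t: scores[t], reverse=True)

page = "How a ransomware breach exposed gaps in the firewall policy"
print(score_page(page))  # ['cybersecurity']
```

The output of an analysis like this is the "page signal data" the article describes: a per-URL topic profile an advertiser can target against, rather than a cross-domain profile of the reader.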

As the supply chain continues to evolve over the coming months and years, I expect the entire model to come under threat. It is a 30-year-old ecosystem that has been running on centralised data models, where the data of every participant, from publisher to advertiser, has been fed into a single ad tech vendor’s database, all enabled by cross-domain, third-party cookies and a vague ecosystem awareness of how the data is then leveraged. It is a flawed system. The big ABM (account-based marketing) platforms and companies operate on a model built around a tracking script based on third-party cookies. By placing the same tracking script on every website, they can build the data into a central repository. Effectively it’s a data grab exercise.

From centralisation to decentralisation

In an increasingly privacy-aware world, a future-facing supply chain could be rebuilt based on a decentralised model, with a standalone marketing technology stack, installable on a client’s own infrastructure, comprising three core pillars: measurement, identity and activation. This facilitates data sovereignty, avoids data leakage, and empowers advertisers and publishers to share first-party identifiers safely.

The three core pillars – measurement, identity and activation – are the foundation of the future advertising supply chain. Measurement involves analytics and attribution based on first-party cookies and first-party identity: the new wave of unique ID partners linked to deterministic data signals, on the vendor’s own domain. Identity must be enabled by world-class B2B data and analytics with audience segmentation capabilities, bridging the worlds of customer data platforms (CDPs), clean rooms and web analytics by working with first-party cookies, unique identifiers (uIDs) and first-party data, including email and CRM data and uID-based programmatic buying. There will be no cross-domain cookie replacement. Activation requires the ability to leverage all forms of identity alongside measurement, and to fine-tune the buying algorithms to match each B2B vendor’s distinct challenges, positioning and buying cycles.

Underpinned by these three pillars, the new supply chain will be built around a ring-fenced tech stack that runs on the enterprise vendor’s own servers and machines, comprising a set of sub-stacks that can be lifted and planted into their own cloud hosting infrastructure. With this they can build genuine and meaningful first-party cookie data and uIDs, satisfying both user privacy and organisational data sovereignty. Vendors can gather the data they need for wider marketing activities and turn it into a currency that can be used for smart activation, programmatic and other media buying strategies. Crucially, their competitors will be unable to leverage it.

Ownership of data

Data sovereignty, the verifiable ownership of data, is now a critical consideration in the new advertising paradigm, emphasising the importance of auditable and transparent data ownership, and that makes first-party data incredibly important. However, the supply chain of data is only auditable when you have a clear definition of any data segment that you are building. An example of that is intent data. Many of the incumbents operating in the B2B intent data space have crafted their own definition of things like ‘intent to buy’, but haven’t always shared that definition or allowed it to be customised to fit the vendor’s distinct cycles. But you cannot audit a supply chain of data unless you know how it’s been defined or built. In a decentralised system, every big vendor will be running their own version, from taxonomies to trading algorithms. It could be the same underlying tech, but they’re all running their own version of it. And with that, they can set their own data definitions.
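An auditable intent definition, in practice, means the definition is explicit data rather than a vendor's black box. The sketch below is hypothetical: the field names, thresholds and buying-cycle window are invented to show how a vendor could set, publish and audit its own definition, as the article argues.

```python
# Sketch of a vendor-customisable, auditable intent definition. All
# fields and thresholds are hypothetical; the point is that the
# definition is explicit, inspectable data, so the resulting segment
# can be audited against it.

from dataclasses import dataclass

@dataclass
class IntentDefinition:
    topic: str
    min_page_views: int          # research depth required
    window_days: int             # matched to the vendor's buying cycle
    min_distinct_visitors: int   # decision-making unit, not one person

def matches(definition, account_activity):
    """account_activity: observed first-party signals for one account."""
    return (account_activity["topic"] == definition.topic
            and account_activity["page_views"] >= definition.min_page_views
            and account_activity["days_span"] <= definition.window_days
            and account_activity["distinct_visitors"]
                >= definition.min_distinct_visitors)

# Each vendor sets, and can publish, its own definition of 'intent to buy'.
defn = IntentDefinition("data security", min_page_views=12,
                        window_days=90, min_distinct_visitors=3)
activity = {"topic": "data security", "page_views": 18,
            "days_span": 45, "distinct_visitors": 4}
print(matches(defn, activity))  # True
```

Because the definition travels with the segment, an auditor (or the vendor itself) can re-run it against the underlying signals, which is exactly what opaque third-party intent scores prevent.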

Forward-thinking brands are also turning to a relatively new technology stack area known as a data clean room: a piece of intermediary technology where the advertiser can upload their first-party data safely, from a privacy point of view, and the publisher can upload theirs. This creates a middle ground where the overlapping data can be matched and shared between publisher and advertiser without either side exposing proprietary, confidential information.
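The core clean-room mechanic is the overlap itself: each side contributes obfuscated identifiers and only the intersection is revealed. The sketch below shows the idea with salted hashes of email addresses; the salt, data and names are invented, and real clean rooms layer on access controls, aggregation thresholds and stronger cryptography (such as private set intersection) rather than relying on hashing alone.

```python
# Minimal sketch of a clean-room overlap: advertiser and publisher each
# contribute salted hashes of their first-party email lists, and only
# the intersection is revealed. Neither side sees the other's raw list.
# Salt and data are illustrative; real clean rooms use stronger
# protections than a shared-salt hash.

import hashlib

SHARED_SALT = b"agreed-per-deal-salt"  # hypothetical per-deal secret

def hash_ids(emails):
    """Normalise then hash each email so both sides match consistently."""
    return {hashlib.sha256(SHARED_SALT + e.strip().lower().encode()).hexdigest()
            for e in emails}

advertiser_crm = ["ana@example.com", "bo@example.com", "cy@example.com"]
publisher_subs = ["bo@example.com", "cy@example.com", "di@example.com"]

overlap = hash_ids(advertiser_crm) & hash_ids(publisher_subs)
print(len(overlap))  # 2 matched identities, no raw emails exchanged
```

The output each side gets is the size and membership of the overlap in hashed form, which is enough to plan and measure a campaign without either party handing over its customer list.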

Time to act

Data will increasingly be moving into more customisable, auditable and transparent frameworks. As third-party data declines, strategic partnerships between world-class content producers and advertisers will fill the gaps through the leveraging of first-party data.

The loss of third-party cookies is already weighing heavily on the buying and selling of digital media. In 2020, when Google announced the deprecation of third-party cookies, a snap poll of advertisers and marketers revealed that they estimated they would need to invest between 5% and 25% more in their marketing budgets to make up the gap in market addressability. In January this year, Google blocked third-party cookies for the first 1% of Chrome users. The impact on that 1% of users was a 30% reduction in ad spend going to publishers: advertisers were no longer getting performance and were unable to find their audiences, because they had disappeared overnight. That 30% reduction suggests the forecast of 5% to 25% in increased investment was badly underestimated, even at the high end.

That 30% drop from just 1% of Google’s cookie deprecation is a clear sign that the ecosystem is not ready for the changes that are coming. It’s going to be GDPR all over again, with everyone scrambling at the 11th hour to find a way to fix it. The solutions will not be immediate. First, we need to migrate our thinking from the centralised world of intermingled data sharing and connections to a decentralised world, and excel very quickly at first-party data collection.

The single biggest catalyst for change will be the advertisers’ acceptance of their ignorance and recognition of the urgency for change. Until now, the industry hasn’t had a reason to change. But the reality of the removal of 1% of cookies leading to a 30% reduction in the addressable market is a wake-up call for advertisers to get to grips with what’s been happening to their money.

It is an inconvenient truth. The old system worked, but it isn’t going to work any longer. There will be no direct replacement for the cookie. Advertisers will need to create their own supply chain, whether that is outsourced to a company, or moved in-house, at a huge cost of time and money. And it starts with migrating your thinking to the concept of a new supply chain model, and the transparency, control and better management of your data, and therefore your destiny, that the transformation will bring.