The declining value of a click: how changing behaviour is masking the impact of SEO

A few months ago, I was reviewing performance with a leadership team who were convinced their SEO had started to fail.

Traffic was down. Rankings had softened in a few key areas. The dashboard told a familiar story, one most teams recognise quickly and react to fast. But the commercial numbers didn’t line up with that narrative.

Inbound leads were holding. In some cases, improving. Sales conversations were moving faster than they had the previous year. The pipeline wasn’t showing the kind of stress you would expect if demand had genuinely dropped.

So we had two conflicting views of reality.

  • One defined by clicks, suggesting decline.
  • Another defined by outcomes, suggesting stability, even progress.

That gap is where most of the confusion around SEO measurement now sits.

Clicks used to act as the bridge between activity and impact. They gave us a visible, measurable connection between content and revenue. Not perfect, but reliable enough to make decisions with confidence.

That bridge is starting to disappear.

AI search, aggregated results, and changing user behaviour are removing the need to click. People are still researching, still forming opinions, still deciding. They’re just doing more of it without ever visiting the sites that influence them.

Which raises a more difficult question than “why is traffic down?”

If users no longer need to click, how do you understand what is actually driving demand?

The role clicks quietly played in decision-making

It’s easy to dismiss clicks as a vanity metric. In practice, they were more useful than that. They provided something rare in marketing. A consistent, scalable signal that linked effort to outcome.

The model most teams operated within was simple enough:

Content creates visibility. Visibility generates clicks. Clicks become sessions. Some of those sessions convert.

That chain gave structure to decision-making. It allowed teams to prioritise, to allocate budget, to justify continued investment in SEO as a channel.

It also created a sense of control. If clicks increased, things were working. If they declined, something needed attention.

Underneath that, there were always gaps.

A buyer might read three articles, leave, and come back weeks later through a different channel. Someone might see your content, mention it internally, and influence a decision you never see. Brand familiarity builds gradually, rarely tied to a single interaction.

Clicks never captured any of that.

But they didn’t need to. They captured enough.

The signal was incomplete, but directionally reliable. And for most growth teams, that was sufficient.

What changes when the click disappears

The shift with AI search is not subtle, even if the impact appears gradually in the data.

Instead of acting as a gateway, search is becoming an endpoint. Users ask a question and receive a synthesised answer. Multiple sources contribute to that answer, but the user often engages with none of them directly.

From a behavioural perspective, very little has changed. People still research. They still compare options. They still build confidence before making a decision.

From a measurement perspective, almost everything has changed.

Clicks used to act as a consistent signal of influence, a visible step between discovery and action. A user had a question, searched for it, evaluated options across multiple sites, and clicked through as part of that process.

Those clicks gave you a trace of how demand was being formed.

What has changed is not the need for information, but how that information is accessed. Increasingly, the early part of the journey no longer requires a click at all.

Search results answer questions directly. AI-generated summaries aggregate multiple sources into a single response. Buyers can compare options, understand categories, and narrow their choices without visiting individual sites.

The research still happens. The interaction just doesn’t.

So the sequence shifts.

Instead of:

Search → Click → Learn → Compare → Return → Convert

It becomes:

Search → Learn → Compare → Decide → Click → Convert

By the time a user clicks, they are no longer exploring. They are validating or acting. Clicks no longer represent the process of demand being built. They represent the point at which demand becomes explicit.

Nothing about the commercial reality has changed. Only our ability to observe it.

I see this play out in buying journeys more often now.

A potential customer spends time exploring a category through AI-generated summaries. They develop a clear view of the landscape without visiting individual sites. Days later, they search for a specific provider by name and convert.

In the data, the conversion is attributed to branded search or direct traffic. The earlier influence, where their understanding was shaped, is missing entirely.

This problem didn’t start with AI, it just became harder to ignore

It would be convenient to blame this entirely on AI search, but that would miss a more important point.

Attribution has been weakening for years. Restrictions on tracking reduced visibility across sessions. Users moved between devices, making journeys harder to connect. Buying decisions, particularly in B2B, became more distributed, involving multiple people interacting with different touchpoints at different times.

Even in relatively simple journeys, the idea of clean attribution was already an approximation.

AI has not broken a stable system. It has accelerated the decline of one that was already under strain.

The difference now is that the gaps are becoming harder to ignore.

When clicks start to fall away as well, the remaining signals become heavily skewed towards the end of the journey. What you can still measure becomes what you assume is working.

That is where distortion begins.

How measurement bias reshapes investment decisions

When you rely on incomplete data, you rarely feel like you are making a compromise. The numbers still look precise. The dashboards still update in real time. The reports still produce clear answers.

But those answers are based on a narrower view of reality.

Channels that capture demand, particularly branded search and paid activity, continue to show strong attribution. They sit closest to the point of conversion, so they retain visibility.

Channels that create demand become harder to defend.

Content that shapes understanding but doesn’t generate a click looks ineffective. PR that drives awareness but doesn’t produce trackable sessions appears invisible. Community and word-of-mouth remain largely outside the system altogether.

Over time, budgets follow the data.

I have seen this pattern repeatedly in growth-stage businesses. Investment shifts towards what can be measured with confidence. Bottom-of-funnel activity expands. Top-of-funnel work is questioned, then reduced.

Initially, performance often improves. Efficiency metrics look stronger. Cost per lead decreases. Conversion rates rise.

Then, gradually, something else changes.

Pipeline becomes less consistent. New demand slows. Growth becomes more dependent on existing brand awareness rather than new market creation.

By the time that becomes visible, the cause is usually several quarters behind.

What the data looks like when behaviour shifts

Across a number of accounts we manage, there is a pattern that appears often enough now to be worth paying attention to.

Organic traffic declines, sometimes steadily, sometimes in steps.

At the same time:

  • Conversion rates increase
  • Lead quality improves
  • Sales cycles shorten

On the surface, this can look like optimisation. Better pages. Stronger messaging. Improved targeting.

Sometimes that is true.

But in many cases, the underlying driver is behavioural.

Users are arriving later in their decision-making process. They need less convincing because they have already done the work elsewhere. The click happens closer to the point of action.

One client in the software space saw this clearly. Traffic from non-branded organic search dropped over the course of a year. Internally, that triggered concern.

But when we looked at pipeline data, inbound leads were more qualified than before. Fewer early-stage enquiries. More buyers asking detailed, specific questions.

The journey had not disappeared. It had moved. And the parts that moved were the ones we could no longer see.

Why attribution now overstates demand capture

Attribution models are often treated as a source of truth. In reality, they answer a narrower question. They tell you where a conversion happened, not what caused it.

When upstream signals weaken, that distinction becomes critical.

Branded search is a good example. It frequently appears as the dominant driver of conversions in analytics platforms. That can lead to the conclusion that brand demand is strong, and that branded search is the primary engine behind it.

But branded search is rarely the origin of demand. It is the point at which intent becomes explicit. The cause of that intent is distributed across multiple interactions:

  • Content consumed over time
  • Conversations with peers
  • Exposure to different channels
  • Repeated moments of recognition

Most of these interactions leave little or no measurable trace.

As a result, attribution increasingly reflects the end of the journey while obscuring the beginning.

Decisions made on that basis tend to reinforce demand capture at the expense of demand creation.

Trying to measure what happens before the click

If the core issue is a lack of visibility into upstream influence, the obvious response is to look for new sources of signal.

At Pieo, we have spent time trying to understand how to make that part of the journey more visible without relying on traditional tracking.

The approach we have taken is relatively simple. Ask customers directly.

At the point of conversion, we capture open-ended responses to the question “How did you hear about us?”

These responses are then structured using an AI-powered tool, First Signals, to identify patterns across channels, content types, and moments of discovery.

Like any dataset based on human input, it comes with nuance. Responses reflect perception as much as recall, and that introduces variation. But that is also where the value sits.

Over time, clear patterns begin to form.

Channels that appear underweighted in attribution data consistently surface as early drivers of awareness. Content that generates relatively few clicks is often cited as shaping understanding and influencing decisions.

What this reveals is not noise, but a different layer of signal. One that sits closer to how demand is actually created, rather than how it is ultimately captured.

It does not replace last-click attribution. It broadens the picture of what is actually driving demand.
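As a rough illustration of what this kind of analysis involves (the keyword mapping and channel names below are hypothetical examples, not how First Signals works internally), open-ended responses can be bucketed into discovery channels and tallied to surface patterns:

```python
from collections import Counter

# Hypothetical keyword-to-channel mapping. A production classifier
# (for example, an LLM-based one) would handle far messier language.
CHANNEL_KEYWORDS = {
    "podcast": "podcast",
    "chatgpt": "ai_search",
    "google": "search",
    "linkedin": "social",
    "colleague": "word_of_mouth",
    "friend": "word_of_mouth",
}

def classify(response: str) -> str:
    """Map an open-ended 'How did you hear about us?' answer to a channel."""
    text = response.lower()
    for keyword, channel in CHANNEL_KEYWORDS.items():
        if keyword in text:
            return channel
    return "unclassified"

def channel_counts(responses: list[str]) -> Counter:
    """Tally self-reported discovery channels across a batch of responses."""
    return Counter(classify(r) for r in responses)

responses = [
    "A colleague mentioned you in a meeting",
    "Saw a summary in ChatGPT while researching tools",
    "Googled alternatives and you came up",
]
print(channel_counts(responses))
```

Even a crude tally like this tends to surface channels, such as word of mouth or AI summaries, that leave no trace in click-based attribution.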

Accepting that certainty is no longer available

There is a natural tendency to look for a new model that restores clarity. A better attribution system. A more advanced tracking setup. A way to reconnect the journey end to end.

In most cases, that clarity is not coming back.

User behaviour is too fragmented. Too much of the journey happens in environments that are not observable. AI sits between the user and the source of information, absorbing interactions that used to be visible.

This shifts the role of measurement.

From providing precise answers to offering directional insight.

In practice, that means working with a combination of:

  • Quantitative signals that capture parts of the journey
  • Qualitative insight that fills in gaps
  • Pattern recognition over time rather than point-in-time certainty

For leadership teams, this creates a different kind of challenge.

Decisions still need to be made. Budgets still need to be allocated. But the data supporting those decisions is less definitive.

Judgement becomes more important, not less.

What changes when you stop expecting perfect attribution

When teams accept that measurement will be incomplete, behaviour tends to adjust in subtle but important ways.

There is less reliance on single metrics as definitive indicators of performance. Clicks, cost per lead, and last-click attribution are treated as inputs rather than answers.

More attention is given to system-level outcomes.

Is pipeline growing or shrinking? Are deals moving faster or slower? Is inbound demand becoming more or less qualified?

Trade-offs become more explicit.

Short-term efficiency can be increased by focusing on demand capture. Long-term growth depends on continued investment in demand creation, even when it is harder to measure.

This is where most of the tension sits.

Not in understanding what is happening, but in deciding how much uncertainty you are willing to accept in order to sustain growth.

If clicks tell you less, what are you really measuring?

The visible decline in clicks is easy to interpret as declining SEO performance. In some cases, that will be true.

In many others, it is a measurement problem rather than a demand problem.

The influence of content has not disappeared. It has become less visible.

That creates a risk.

Decisions made with confidence, based on clean-looking data that reflects only part of the system.

If your reporting shows you where demand is captured, but not where it is created, the question is not whether the data is wrong.

It is how much of the picture you are currently missing, and how that is shaping the decisions you make next.