THE INTERNET ISN'T JUST BUILT FOR YOU ANYMORE
And the companies that understand this first will own the next decade.
For most of the internet's life, we've all worked with the same assumption: there's a human on the other side of the screen. Every headline test, every layout tweak, every ad slot and paywall has been about one thing: getting a person to click, scroll, and read.
Over the last 12–18 months, that assumption quietly broke.
The Numbers Are In
A new State of AI Traffic analysis shows that automated traffic is now growing nearly eight times as fast as human traffic. Automated traffic grew 23.5% year over year in 2025, while human traffic rose just 3.1%. In many networks, automated activity has already overtaken human usage.
AI‑driven traffic is a major driver of that shift. Average monthly AI traffic increased 187% last year, and traffic from AI agents and agentic browsers — systems that don't just crawl but actively use the web — grew nearly 8,000% from a small base.
At SXSW, Cloudflare's CEO Matthew Prince put it bluntly: by 2027, bot traffic will exceed human traffic online, driven by the rise of generative AI agents. This is not a temporary spike. It is the new baseline.
AI isn't just changing your traffic. It's changing who your audience is.
This isn't just about "bad bots" and "good bots" anymore. That framing is already obsolete. The real question is no longer "How do we block machines?" It's "How do we build for them?"
Publishers Are Facing the Wrong Direction
Most publisher teams we talk to are still optimizing for a world that's disappearing. They are tweaking headlines for Google Discover, fighting over newsletter open rates, and treating AI crawlers as either a security problem or something they hope will go away on its own.
Meanwhile, the real shift is happening quietly in user behaviour. Your audience is asking ChatGPT and Perplexity the questions they used to type into Google. When AI answers, it cites someone. The question is whether that someone is you — or Wikipedia, Reddit, and your closest competitor.
On a recent Tollbit webinar, one publisher summed it up simply: we now have two audiences — humans and agents — and most newsrooms are still only writing for one. Humans still come for stories, depth, and voice. Agents come for structure, clarity, and signals they can trust in real time.
In most publisher setups we see, AI bot traffic isn't treated as an audience at all. It sits outside analytics, outside revenue models, and usually only shows up when the infra bill spikes or the site slows down. That's a problem, because the "invisible audience" is now often bigger than the visible one.
You don't just have readers anymore. You have systems deciding whether you show up at all.
The Napster Moment
We've been here before.
When Napster disrupted the music industry, the initial response was lawsuits and DRM. The industry spent years trying to put the genie back in the bottle.
The shift came from Spotify: a product experience so good that paying became easier than pirating — even if many artists would argue they never saw a fair share of that new value. Publishing is at risk of repeating that pattern: a new layer of AI products built on top of your work, without a business model that truly works for the people who make the content.
Publishing is having its Napster moment right now.
Some publishers are blocking AI crawlers outright. Others are signing licensing deals with OpenAI and Microsoft. There's already a boom in AI licensing marketplaces — but not yet a boom in the money flowing back to publishers.
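For publishers in the "block" camp, the simplest lever is robots.txt. A minimal sketch, using the crawler tokens each vendor has publicly documented (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI model training; these tokens change, so verify against current vendor docs before relying on them):

```
# Disallow OpenAI's documented crawler
User-agent: GPTBot
Disallow: /

# Disallow Common Crawl, a common source of AI training data
User-agent: CCBot
Disallow: /

# Opt out of Google's AI model training without affecting Search indexing
User-agent: Google-Extended
Disallow: /
```

Note that robots.txt is a convention, not an enforcement mechanism: well-behaved crawlers honor it, but it carries no technical teeth, which is exactly why the detection-and-paywall infrastructure below is emerging.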
And beneath that, a new layer of infrastructure is emerging. Arc XP has announced a native integration with Tollbit to let publishers automatically detect AI bots, set access rules, and route agents through a bot paywall for licensing and payment. HUMAN Security is partnering with Tollbit to combine sophisticated bot detection with programmatic monetization of authorised AI scrapers.
The ones who will win this phase won't be the companies that block the most or sign the very first deals. They'll be the ones who understand what this new machine audience actually needs — and build for it.
Two Audiences, Two Strategies
A human reader wants a compelling headline, a strong lede, and a well‑structured narrative. They want to understand not just what happened, but why it matters and how it feels.
A machine needs something else entirely.
Machines need:
- Structured data and unambiguous entities (who, what, where, when, how)
- Semantic clarity in HTML, schema, and metadata
- Stable URLs and canonical sources that don't move or break
- Clear authorship, licensing, and authority signals they can safely rely on in an answer
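Several of these signals live in schema.org markup today. A minimal sketch of a NewsArticle block exposing authorship, a canonical source, and a licensing pointer (all names and URLs are placeholders, not a prescribed template):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "mainEntityOfPage": "https://example.com/news/example-story",
  "datePublished": "2025-11-03T09:00:00Z",
  "dateModified": "2025-11-04T14:30:00Z",
  "author": {
    "@type": "Person",
    "name": "Jane Reporter",
    "url": "https://example.com/staff/jane-reporter"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example News"
  },
  "license": "https://example.com/content-licensing",
  "isAccessibleForFree": true
}
</script>
```

A human never sees this block. An agent deciding whether your page is safe to cite, and under what terms, reads it first.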
The publishers who thrive in this new reality will maintain two parallel content strategies — one for humans, who still want stories, depth, and editorial voice, and one for agents, which need structured, citable, authoritative content they can reference and attribute in real time.
This isn't about replacing one with the other. It's about accepting that you now have two audiences — and only serving one of them means losing the other.
The Economics Are Shifting
The cost of servicing bot traffic is real and still widely underestimated. AI agents crawl aggressively, often without passing meaningful traffic back, while consuming bandwidth, compute, and security overhead. At the same time, the opportunity side is growing quickly.
If AI‑driven traffic is compounding at 187% year over year, and if companies like OpenAI can charge tens of dollars per thousand AI searches, then the publishers whose content powers those answers have a claim on a new revenue stream that didn't exist two years ago.
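To make the compounding concrete, here is a back-of-envelope sketch. Every input is an illustrative placeholder, not a market figure; only the 187% growth rate comes from the traffic data above:

```python
# Hypothetical inputs -- placeholders for illustration, not real figures.
AI_QUERIES_PER_MONTH = 500_000   # AI searches your content currently powers
GROWTH_RATE = 1.87               # 187% year-over-year growth (from the article)
PRICE_PER_THOUSAND = 20.0        # assumed $ per 1,000 AI searches

def projected_monthly_revenue(queries: float, years: float) -> float:
    """Monthly revenue if AI-cited queries compound at 187%/year."""
    compounded = queries * (1 + GROWTH_RATE) ** years
    return compounded / 1_000 * PRICE_PER_THOUSAND

print(f"today:        ${projected_monthly_revenue(AI_QUERIES_PER_MONTH, 0):,.0f}/mo")
print(f"in two years: ${projected_monthly_revenue(AI_QUERIES_PER_MONTH, 2):,.0f}/mo")
# → today: $10,000/mo; in two years: $82,369/mo
```

The point isn't the specific numbers. It's that at this growth rate, a modest stream today becomes a material one in two compounding cycles, which is why the plumbing below is being built now.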
We're already seeing early infrastructure for this new content economy — Microsoft and others experimenting with publisher content marketplaces for AI licensing, Arc XP and Tollbit connecting real‑time bot detection with a commercial pathway, and HUMAN and Tollbit offering a combined stack for AI scraping agents.
These are not side projects. They are the early plumbing of a world where being cited by an AI system may ultimately be more valuable than a Google click.
The Question You Should Be Asking
The key questions for publishers and brands are shifting.
Not "How do I protect my content from AI?"
Not just "How do I get an AI licensing deal?"
The real question is: Are you showing up in AI answers at all?
If not, someone else is. And every day you wait, that someone else gets harder to displace. Answer ranking in AI systems — the invisible answer graph — compounds just like search rankings once did.
If you're not showing up in AI answers today, you're already losing ground. The good news: it's still early. The answer graph is being built right now — and you still have a chance to shape it.
The Problem Anseri Is Built to Solve
Most publishers today have no idea whether they exist in AI answers. They don't know when they're cited, where they're invisible, or what signals actually move the needle.
At Anseri, we're a small team obsessed with one question: how do we give publishers real visibility and leverage in the AI answer layer? We make your presence in AI answers visible — and then help you fix it.
We focus on AI citation infrastructure: the systems, signals, and structure that make your expertise the source AI trusts.
The internet isn't just built for you anymore. But the next one can be.