On March 26, 2026, HUMAN Security released its annual State of AI Traffic report, revealing that agentic AI traffic grew by nearly 8,000% in 2025, while total automated traffic is growing 8x faster than human traffic. The bottom line: your website is now visited — and used — by robots that act like buyers.
What HUMAN Security measured
HUMAN analyzed over one quadrillion interactions across its customer base in 2025. Automated traffic grew 23.5% year-over-year — versus just 3.1% for human traffic. But the breakdown is what matters:
- Training crawlers — still dominant at 67.5% of AI traffic, but declining in relative share
- Real-time AI scrapers — up 600% in 2025, fueled by AI search engines like Perplexity and ChatGPT Search
- Agentic AI systems — the smallest category, but the fastest-growing: +8,000%. These are systems that browse, compare, sign up, and purchase on behalf of users.
The most striking number: in 2025, 2% of observed AI agent actions reached checkout flows. This is no longer just browsing — it's transacting.
Why this is a turning point for SEO
Until now, we optimized for two audiences: humans and Googlebot. That era is over. AI agents like OpenAI's Atlas and Perplexity's Comet now actively navigate your pages to complete tasks for their users. They read your content, evaluate your offers, and act on them.
What this means: optimization is no longer just about "being found on Google." Your site also needs to be usable by AI agents executing autonomous tasks. Your content must be structured, factual, and machine-readable.
Cloudflare's CEO predicted in 2025 that bots would overtake human web traffic by 2027. Given these numbers, that timeline looks conservative: the tipping point may come as early as 2026.
What your website needs to do now
Three concrete adaptations are now urgent:
- Check your server logs — Identify which agents are already crawling your site. Since March 20, Google has been rolling out its own "Google-Agent" user agent, which now shows up in server logs and makes this traffic trackable. Others will follow.
- Structure your content for machines — AI agents parse well-structured content better: clear headings, factual data, schema.org markup. This isn't additional SEO work; it's the same recipe as optimizing for AI engine visibility (GEO, or Generative Engine Optimization).
- Ensure your forms and funnels work for agents — If an agent tries to request a quote or sign up for your newsletter, your web application firewall (WAF) shouldn't block it (unless you explicitly want it to).
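The first step above can be sketched in a few lines. This is a minimal example, assuming an Apache/Nginx access log in the common combined format; the user-agent substrings come from the list later in this article, and the sample log lines are invented for illustration:

```python
# Sketch: tally requests from known AI agents in raw access-log lines.
# The log format (Apache/Nginx combined) and sample lines are assumptions;
# point this at 30 days of your own logs.
from collections import Counter

AI_AGENTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "Google-Agent", "comet", "Atlas"]

def count_ai_agents(lines):
    """Count hits per AI agent substring, case-insensitively."""
    hits = Counter()
    for line in lines:
        lowered = line.lower()
        for agent in AI_AGENTS:
            if agent.lower() in lowered:
                hits[agent] += 1
    return hits

# Hypothetical log lines for demonstration only.
sample = [
    '1.2.3.4 - - [10/Mar/2026] "GET /pricing HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; GPTBot/1.2)"',
    '5.6.7.8 - - [10/Mar/2026] "GET /blog HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
]
print(count_ai_agents(sample))
```

In practice you would read the lines from your log file (or a gzipped rotation) instead of the `sample` list; the counting logic stays the same.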
The question is no longer "do AI agents visit my site?" — they do. The question is: can they do something useful when they get there?
The sectors most exposed right now
Not all websites face this transition equally. Some sectors are being impacted right now:
- E-commerce and retail — 77% of agent actions happen on product and search pages. If your prices, descriptions, and availability aren't structured (schema.org Product, Offer), you're invisible to comparison agents.
- B2B services — AI agents are already qualifying vendors for their users. An agent asked "find me an SEO agency in London" will read your service page before a human even clicks.
- Hospitality and restaurants — Reservations, hours, menus. If an agent can't read your structured hours and complete your reservation form, it moves to a competitor.
- Finance and insurance — Rate-comparison agents are becoming a reality. schema.org FinancialProduct markup is becoming critical.
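For the e-commerce case above, the schema.org Product/Offer markup agents look for is JSON-LD embedded in the page. A minimal sketch, with entirely hypothetical product values, generated here in Python for clarity:

```python
# Sketch: build schema.org Product/Offer markup as JSON-LD.
# Every product value below is hypothetical — substitute your own catalog data.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Shoe",  # hypothetical product name
    "description": "Lightweight trail running shoe.",
    "offers": {
        "@type": "Offer",
        "price": "89.90",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

jsonld = json.dumps(product, indent=2)
# Embed in your page as:
# <script type="application/ld+json"> ...this JSON... </script>
print(jsonld)
```

The price, currency, and availability fields are exactly what a comparison agent needs to include you in an answer; without them, it has to guess from free text or skip you.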
How to measure your AI agent exposure
Before acting, measure. Three concrete signals to monitor today:
- Analyze your Apache/Nginx logs — Search for user agents containing "GPTBot", "PerplexityBot", "ClaudeBot", "Google-Agent", "comet" (Perplexity), "Atlas" (OpenAI). Export 30 days of logs and filter for these strings. You'll be surprised by the volume.
- Check your robots.txt — Have you actively blocked any of these bots? If so, you may have blocked your own visibility in AI search engines. That's a valid choice, but it should be intentional.
- Validate your structured data — Use Google's Rich Results Test and Schema Markup Validator. If your structured data fails validation, agents can't extract your information cleanly.
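The robots.txt check above can be automated with Python's standard library. A minimal sketch: here an example robots.txt is parsed inline (the rules shown are hypothetical); in practice you would point `RobotFileParser` at your live file with `set_url()` and `read()`:

```python
# Sketch: test whether a robots.txt blocks known AI crawlers.
# The ROBOTS_TXT content and example.com URL are assumptions for illustration.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

AI_AGENTS = ["GPTBot", "PerplexityBot", "ClaudeBot"]

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for agent in AI_AGENTS:
    allowed = rp.can_fetch(agent, "https://example.com/pricing")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

With these example rules, GPTBot is blocked while the others fall through to the wildcard rule and are allowed — which is exactly the kind of unintentional asymmetry this audit is meant to surface.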
The monetization question
There's a fundamental tension that the HUMAN report doesn't resolve: AI agents extract value from your site without necessarily generating ad revenue. An agent that reads your catalog and makes a recommendation doesn't trigger a monetizable pageview.
It's the same debate as neighboring rights and AI — but at the scale of transactional traffic. The 2% of agent actions that reach checkout are the good news: there, you have a conversion. But the remaining 98% that read, compare, and pass summaries to a human user? That's editorial work consumed without direct remuneration.
The only viable short-term answer: make sure you're the source cited in those summaries, not an ignored source. Which brings us back to content quality, domain authority, and structural clarity.
Our take
At Cicéro, we've long argued that SEO and GEO are two sides of the same challenge. This HUMAN Security report is the quantified confirmation. Your content must now serve two masters simultaneously: the human who searches, and the agent who acts. Fortunately, what's good for one is good for the other: clarity, structure, expertise, proprietary data.
The number to remember: +8,000% in one year. This isn't a trend to anticipate. It's a reality to manage now. And in 12 months, businesses that built their presence around these new rules of the game will hold a structural competitive advantage that latecomers will struggle to match.
A growth and SEO content strategist, I founded Cicéro to help businesses build lasting organic visibility — on Google and in AI-generated answers alike. Every piece of content we produce is designed to convert, not just to exist.