Google Analyst Warns: AI Bots Risk Internet Gridlock By Server Overload

Google Search analyst Gary Illyes warns that the proliferation of AI agents and their intensive data processing demands are set to cause significant internet congestion and overload website servers, potentially degrading web performance for all users.


Gary Illyes, from Google’s Search Relations team, highlighted that while the internet itself is built for substantial traffic, the sheer volume and intensity of some AI crawlers present a novel challenge. This emerging issue, he cautioned, could degrade web performance for users and strain the digital infrastructure that underpins the online world.

The crux of the problem, Illyes explained during Google’s “Search Off the Record” podcast, isn’t merely the act of bots fetching web pages. The primary concern lies in the subsequent, resource-intensive tasks of indexing and processing the vast amounts of data these AI agents collect.

He noted that some AI bots attempt to crawl the entire internet in remarkably short timeframes; Illyes mentioned attempts to scan the web in as little as 25 seconds, a pace that can easily overwhelm servers. This intensive activity raises critical questions about the escalating tension between AI’s insatiable need for data and the operational health of the internet. Illyes captured the sentiment by stating, “everyone and my grandmother is launching a crawler.”

The Swelling Tide Of AI Crawlers

This surge in automated activity is largely fueled by businesses deploying a new generation of AI tools for diverse purposes, including content creation, competitive research, market analysis, and extensive data gathering. However, this increased crawling comes with consequences.

For instance, Search Engine Journal detailed how SourceHut faced service disruptions due to aggressive Large Language Model (LLM) crawlers, leading them to block several cloud providers. The scale of this traffic is substantial; data from Vercel, also reported by Search Engine Journal, showed OpenAI’s GPTBot generated 569 million requests in a single month, with Anthropic’s Claude accounting for 370 million.

Ad metrics firm DoubleVerify further noted an 86 percent rise in general invalid traffic (GIVT), bot traffic that shouldn’t be counted as ad views, during the second half of 2024, attributing much of the surge to AI crawlers and scrapers.

Compounding the issue, many AI crawlers tend to disregard the robots.txt protocol, the long-standing web standard intended to guide bot behavior. This non-compliance means website owners have limited control over which bots access their content and how frequently.
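For context, robots.txt is a voluntary convention: a well-behaved crawler parses the file and honors its rules before fetching anything, but nothing technically enforces this. A minimal sketch with Python’s standard library shows what compliant behavior looks like (the bot names and paths here are illustrative placeholders, not real crawlers):

```python
# Minimal sketch: how a well-behaved crawler consults robots.txt
# before fetching a page, using Python's standard library.
# "ExampleAIBot" and the paths below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Parse a robots.txt body and report whether user_agent may fetch path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

robots = """\
User-agent: ExampleAIBot
Disallow: /private/

User-agent: *
Disallow:
"""

print(is_allowed(robots, "ExampleAIBot", "/private/data.html"))  # False
print(is_allowed(robots, "SomeOtherBot", "/private/data.html"))  # True
```

The catch, as the reporting above notes, is that a non-compliant crawler simply skips this step, which is why robots.txt alone gives site owners so little control.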

This increased activity from non-search engine AI bots can consume server resources, potentially impacting the ability of legitimate search engine bots to crawl and index critical pages efficiently.


Cloudflare’s Countermeasures And The Evolving Defense

In response to these challenges, companies like Cloudflare have been developing increasingly sophisticated countermeasures. In March, Cloudflare introduced AI Labyrinth, a system designed to actively mislead and exhaust unauthorized AI crawlers by luring them into mazes of auto-generated content. The company’s reasoning is that if AI scrapers are busy consuming fake pages, they are not extracting real value.
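Cloudflare has not published AI Labyrinth’s internals, but the general “tarpit” idea can be sketched in a few lines: every auto-generated decoy page links to more auto-generated decoy pages, so a scraper that blindly follows links keeps consuming worthless content. This toy version is an assumption-laden illustration of the concept, not Cloudflare’s implementation:

```python
# Illustrative sketch only: a toy "tarpit" in the spirit of a decoy maze.
# Each generated page contains filler text plus links to further decoys,
# so a link-following scraper burns resources on fake content.
import hashlib

def decoy_page(seed: str, links: int = 3) -> str:
    """Return an HTML page of deterministic filler plus decoy links."""
    digest = hashlib.sha256(seed.encode()).hexdigest()
    paragraphs = "".join(
        f"<p>Generated filler section {digest[i:i + 8]}</p>" for i in range(0, 24, 8)
    )
    anchors = "".join(
        f'<a href="/maze/{digest[i:i + 12]}">more</a>' for i in range(0, links * 12, 12)
    )
    return f"<html><body>{paragraphs}{anchors}</body></html>"

page = decoy_page("/maze/start")
print(page.count("<a href"))  # 3 outgoing decoy links
```

The design point is that the maze is cheap for the defender (pages are generated deterministically on demand) but expensive for the scraper, which is exactly the asymmetry Cloudflare describes.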

This was not Cloudflare’s first foray into AI bot defense. In July 2024, the company launched a complimentary tool aimed at helping websites block AI bots. This was followed in September 2024 by the “Bot Management” suite, which provides live monitoring and more granular control over bot access. Cloudflare CEO Matthew Prince asserted that with their system, “every AI crawler gets flagged, even those employing disguises.”

The company even described its enhanced solution as an “armed security guard”, a significant upgrade from the passive “no entry” sign that robots.txt effectively represents.

The Shifting Web

The ineffectiveness of robots.txt against determined scrapers remains a central issue. Content licensing firm TollBit, as reported by Reuters, has stated that many AI agents simply bypass this web standard.

Even Google, with its vast infrastructure, faces challenges in managing crawling efficiency. Illyes acknowledged that while Google strives to reduce its crawling footprint, new AI product demands often counteract these efforts.

Looking ahead, the web’s user base itself may be undergoing a fundamental shift. Industry observer Jeremiah Owyang, speaking to The New Stack, predicted that “the most common visitor to a website in the future is probably not going to be humans, but AI agents that are surrogates reporting to humans.”

He further suggested this represents a significant transformation for the internet, explaining that “the data layer and the content layer is about to separate and decouple from the presentation layer,” fundamentally altering how web content is accessed and consumed.

This aligns with a Gartner forecast, cited by The New Stack, that search engine traffic could plummet by 25% by the end of 2025 due to AI’s influence. For businesses, ignoring the rise of AI agents risks decreased visibility and a significant decline in organic traffic.

As the web continues to evolve, website owners are urged to proactively assess their infrastructure, strengthen access controls beyond robots.txt, optimize database performance, and diligently monitor incoming traffic to differentiate between human users and the growing array of automated agents. Initiatives like Common Crawl, which crawls the web and shares data publicly to reduce redundant traffic, were also mentioned by Illyes as a potential model for a more sustainable future.
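One practical first step in that monitoring is triaging server logs by User-Agent header. The sketch below matches against substrings from the publicly documented user agents of some crawlers mentioned in this article; note that user-agent strings can be spoofed, so real deployments should also verify published IP ranges rather than trust the header alone:

```python
# Minimal sketch of user-agent triage for server logs. The tokens below
# correspond to publicly documented AI crawler user agents; treat this as
# a starting point only, since User-Agent headers can be spoofed.
KNOWN_AI_CRAWLERS = ("GPTBot", "ClaudeBot", "CCBot", "Google-Extended")

def classify_user_agent(user_agent: str) -> str:
    """Label a request as 'ai-crawler' or 'other' from its User-Agent header."""
    if any(token in user_agent for token in KNOWN_AI_CRAWLERS):
        return "ai-crawler"
    return "other"

print(classify_user_agent(
    "Mozilla/5.0 AppleWebKit/537.36 (compatible; GPTBot/1.2; +https://openai.com/gptbot)"
))  # ai-crawler
print(classify_user_agent("Mozilla/5.0 (Windows NT 10.0) Firefox/126.0"))  # other
```

Classification like this can feed rate limiting or outright blocking at the web-server or CDN layer, giving site owners the control that robots.txt alone does not provide.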

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
