Fake Googlebots used to run application-layer DDoS

Security experts at Incapsula are observing a surge in the use of fake Googlebots to launch application-layer DDoS attacks.

Googlebots are the crawlers Google uses to index and rank website content for its search engine. Their visits are not normally considered an indicator of malicious activity, which is why most defensive solutions do not block this kind of traffic.

Cybercriminals are aware that Googlebot traffic is considered harmless, so they have started using spoofed Googlebots to launch layer-7 (application-layer) distributed denial-of-service (DDoS) attacks. Fake Googlebots can also be used for site scraping and spam campaigns.

Researchers at security firm Incapsula have identified an increase in the use of fake Googlebot visits to hit targeted websites with malicious traffic: they estimate that for every 25 genuine Googlebot visits, a company is likely to receive one fake visit. The experts observed over 400 million search engine visits to 10,000 sites, amounting to nearly 2.19 billion page crawls over a 30-day period. The data on fake Googlebots comes from the inspection of more than 50 million Googlebot impostor visits.

Nearly 25% of the spoofed Googlebots are used by hackers to run DDoS attacks, according to expert Igal Zeifman, who explained that Incapsula's technology protects its customers by identifying malicious Googlebots, since genuine Google crawlers come from a predetermined IP address range.

“Hackers are looking for a loophole. The more advanced [mitigation] tools are able to identify Googlebots, which is done by a cross-verification of IP addresses. But this also shows a low level of understanding by hackers of how modern DDoS protection works. They assume you can’t do IP cross verification,” said Zeifman.
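To illustrate the cross-verification Zeifman mentions, a defender does not need to hard-code Google's address ranges: one widely documented approach is a reverse DNS lookup on the visiting IP followed by a confirming forward lookup. Below is a minimal Python sketch of that idea; the function name and the sample addresses are illustrative assumptions, not part of Incapsula's product.

```python
import socket

def is_verified_googlebot(ip_address):
    """Reverse-DNS check: a genuine Googlebot IP resolves to a host under
    googlebot.com or google.com, and that host resolves back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip_address)           # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        _, _, forward_ips = socket.gethostbyname_ex(host)       # forward confirmation
        return ip_address in forward_ips
    except (socket.herror, socket.gaierror):
        # No PTR record or unresolvable host: treat the claimed Googlebot as fake
        return False

# Usage sketch: a request whose User-Agent claims "Googlebot" but whose source IP
# fails this check can be challenged, rate-limited, or dropped.
print(is_verified_googlebot("66.249.66.1"))   # an address in Google's crawler range
print(is_verified_googlebot("203.0.113.7"))   # RFC 5737 documentation address -> False
```

The reverse-plus-forward pair is what defeats simple User-Agent spoofing: an attacker can forge the header, but cannot make an arbitrary IP reverse-resolve into Google's domains.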

Application-layer attacks are today the most insidious DDoS attacks due to their frequency and the volume of malicious traffic they generate. These attacks have grown dramatically in recent months as attackers exploit the capabilities of huge botnets to overwhelm victims' resources.

“You don’t have to create a big flood to generate 5,000 visits per second,” Zeifman said. “It’s easy to generate 5,000 per second. Layer 7 attacks are more common for sure than Layer 3 or 4 events. The reason is that it’s easier to execute and more dangerous, even in low volumes.”

Application-layer DDoS attacks consume less bandwidth and are less noisy than volumetric attacks; however, they can have a dramatic impact on targeted services.

Website designers typically over-provision for the expected number of visitors per interval of time, but an attacker can exceed that provisioned capacity by generating malicious traffic through fake Googlebots, as the sketch below illustrates.
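As a rough illustration of how such a per-visitor budget can be enforced per source, the sketch below keeps a sliding window of request timestamps per IP; the window length and threshold are arbitrary assumptions, and in practice this check would be combined with the crawler verification shown earlier rather than used on its own.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10           # assumed observation window
MAX_REQUESTS_PER_WINDOW = 50  # assumed per-visitor budget derived from provisioning

_recent = defaultdict(deque)  # source IP -> timestamps of its recent requests

def exceeds_budget(ip_address, now=None):
    """Sliding-window counter: True when a single source exceeds the
    provisioned request budget, regardless of the User-Agent it presents."""
    now = time.monotonic() if now is None else now
    window = _recent[ip_address]
    window.append(now)
    # Drop timestamps that have fallen out of the observation window
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_REQUESTS_PER_WINDOW
```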

Attackers can also adopt a mixed strategy, combining application-layer DDoS attacks with network-layer DDoS attacks simultaneously; in this case the resulting attack can be extremely dangerous for victims.

Let’s close with a few key findings from the study conducted by Incapsula, based on over 210 million Googlebot sessions:

  • Googlebot’s average visit rate per website is 187 visits/day.
  • Googlebot’s average crawl rate is 4 pages/visit.
  • 98.12% of Googlebot visits originate in the US

Pierluigi Paganini

Security Affairs – (fake Googlebots, DDoS)

