Fake Googlebots used to run application-layer DDoS

Pierluigi Paganini July 25, 2014

Security experts at Incapsula are observing a surge in the use of fake Googlebots to launch and carry out application-layer DDoS attacks.

Googlebots are the crawlers Google uses to index and rank website content for its popular search engine. It’s no mystery that Googlebot visits are not normally considered a possible indicator of cyber-threat activity; for this reason, the principal defensive solutions don’t block this kind of traffic.

Cyber criminals, aware that Googlebot traffic is considered harmless, have started using spoofed Googlebots to launch layer-7 (application-layer) distributed denial-of-service (DDoS) attacks. Fake Googlebots can also be used for site scraping and spam campaigns.

Researchers at security firm Incapsula have identified an increase in the exploitation of fake Googlebot visits to hit targeted websites with malicious traffic: for every 25 genuine Googlebot visits, companies are likely to be visited by a fake one. The experts observed over 400 million search engine visits to 10,000 sites, a total of nearly 2.19 billion page crawls over a 30-day period. The information about fake Googlebots comes from the inspection of more than 50 million Googlebot impostor visits.

Nearly 25% of spoofed Googlebots are used by hackers to run DDoS attacks, according to expert Igal Zeifman, who explained that Incapsula’s technology protects its customers by identifying malicious Googlebots, since genuine Google crawlers come from a pre-determined IP address range.

“Hackers are looking for a loophole. The more advanced [mitigation] tools are able to identify Googlebots, which is done by a cross-verification of IP addresses. But this also shows a low level of understanding by hackers of how modern DDoS protection works. They assume you can’t do IP cross verification.” said Zeifman.
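Incapsula’s exact detection method is proprietary, but the IP cross-verification Zeifman describes can be illustrated with the approach Google itself documents for verifying its crawlers: a reverse DNS lookup on the claimed Googlebot IP, a check that the hostname belongs to googlebot.com or google.com, then a forward lookup to confirm the hostname resolves back to the same IP. A minimal sketch in Python (function names are my own; this is an illustration, not Incapsula’s implementation):

```python
import socket

# Genuine Google crawlers reverse-resolve to these domains
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_google_host(hostname: str) -> bool:
    # Pure check: does the reverse-DNS hostname belong to Google?
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def is_verified_googlebot(ip: str) -> bool:
    """Cross-verify a claimed Googlebot IP by reverse DNS,
    then forward-confirm the hostname resolves back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]                 # reverse lookup
        return looks_like_google_host(host) and \
               ip in socket.gethostbyname_ex(host)[2]      # forward confirm
    except OSError:
        return False                                       # lookup failed: treat as fake
```

A spoofed Googlebot typically forges only the User-Agent header; its source IP fails this reverse/forward cross-check, which is why Zeifman calls relying on the header alone a low level of understanding of modern DDoS protection.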

Application-layer attacks are today the most insidious DDoS attacks due to their frequency and the volume of malicious traffic they generate. These attacks have grown dramatically in recent months as attackers exploit the capabilities of huge botnets to overwhelm victims’ resources.

“You don’t have to create a big flood to generate 5,000 visits per second,” Zeifman said. “It’s easy to generate 5,000 per second. Layer 7 attacks are more common for sure than Layer 3 or 4 events. The reason is that it’s easier to execute and more dangerous, even in low volumes.”

Application-layer DDoS attacks consume less bandwidth and are less noisy than volumetric attacks; however, they can have a dramatic impact on targeted services.
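Because these attacks are low-volume, defenses generally have to throttle individual clients rather than look for a bandwidth spike. A generic per-client token-bucket rate limiter, sketched below, shows the idea; the class and parameters are hypothetical and are not a description of Incapsula’s product:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client token bucket: allows `rate` requests/sec on average,
    with bursts up to `burst`. Illustrative sketch only."""
    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.tokens = defaultdict(lambda: burst)      # start each client full
        self.last = defaultdict(time.monotonic)       # last-seen timestamp

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last[client_ip]
        self.last[client_ip] = now
        # Refill tokens in proportion to elapsed time, capped at burst
        self.tokens[client_ip] = min(self.burst,
                                     self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False
```

The weakness the article highlights is precisely that such throttling is usually relaxed or skipped for traffic identifying itself as Googlebot, which is the loophole spoofed crawlers exploit.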

[Figure: Googlebot statistics]

Typically, website designers over-provision for the expected number of visitors per interval of time, but an attacker can exceed that capacity by producing malicious traffic through fake Googlebots.

[Figure: Googlebot statistics 2]

Attackers can also use a mixed strategy, combining application-layer DDoS attacks with network-layer DDoS attacks simultaneously; in this case the power of the attack can be very dangerous for victims.

Let’s close with a few interesting key findings from the study conducted by Incapsula, which processed over 210 million Googlebot sessions:

  • Googlebot’s average visit rate per website is 187 visits/day.
  • Googlebot’s average crawl rate is 4 pages/visit.
  • 98.12% of Googlebot visits originate in the US.

Pierluigi Paganini

Security Affairs – (fake Googlebots, DDoS)
