Detect Googlebot from User-Agent String

Identify Googlebot desktop and smartphone crawlers from User-Agent strings. Learn the different Googlebot variants and how to verify authentic Googlebot requests.

Detailed Explanation

Identifying Googlebot in User-Agent Strings

Googlebot is the web crawler used by Google Search to index web content. It uses several different User-Agent strings depending on the crawl mode.

Googlebot Desktop

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

This is the classic Googlebot UA for desktop crawling. It is straightforward to identify by the Googlebot/2.1 token.

Googlebot Smartphone

Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.6099.144 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

The smartphone Googlebot mimics a real Chrome mobile browser but appends the (compatible; Googlebot/2.1;...) identification at the end.
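The two UA strings above can be told apart by token matching alone. A minimal sketch (function name is illustrative): look for the Googlebot/2.1 token first, then check for the Android/Mobile tokens that only the smartphone variant carries.

```python
import re

# Matches the shared "Googlebot/2.1" token present in both variants.
GOOGLEBOT_TOKEN = re.compile(r"Googlebot/2\.1", re.IGNORECASE)

def classify_googlebot(user_agent):
    """Return 'smartphone', 'desktop', or None based on UA tokens alone."""
    if not GOOGLEBOT_TOKEN.search(user_agent):
        return None
    # The smartphone variant embeds an Android Chrome mobile UA before
    # the (compatible; Googlebot/2.1; ...) suffix.
    if "Android" in user_agent and "Mobile" in user_agent:
        return "smartphone"
    return "desktop"

desktop_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
mobile_ua = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.6099.144 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")
```

Remember that this classifies the claimed identity only; the DNS verification described below is still needed before trusting it.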

Other Google Crawlers

  • Googlebot-Image/1.0 — Image search crawler
  • Googlebot-Video/1.0 — Video search crawler
  • Googlebot-News — Google News crawler
  • AdsBot-Google — Checks ad landing page quality
  • Mediapartners-Google — AdSense content matching
  • APIs-Google — Delivers push notifications for Google APIs

Verifying Authentic Googlebot

Important: Anyone can spoof the Googlebot User-Agent string. To verify that a request really comes from Googlebot:

  1. Reverse DNS lookup on the requesting IP address — the hostname should end in googlebot.com or google.com
  2. Forward DNS lookup on that hostname — it should resolve back to the original IP
  3. Alternatively, match the IP against the Googlebot IP ranges that Google publishes in JSON format

Mobile-First Indexing

Since Google switched to mobile-first indexing, the smartphone Googlebot is the primary crawler for most sites. The desktop Googlebot is used as a secondary crawler. Understanding which Googlebot variant is visiting your site is important for debugging indexing issues.

Use Case

SEO engineers and DevOps teams detect Googlebot to serve pre-rendered pages, manage crawl budget, debug indexing issues, and ensure important content is accessible to Google's crawler. Detection is also used to keep bot traffic from inflating analytics dashboards.
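For the analytics use case, a simple token filter is often enough, since the goal is excluding crawler traffic rather than proving authenticity. A sketch (token tuple and function name are illustrative):

```python
# Tokens covering the Google crawlers listed earlier; the bare "Googlebot"
# token also matches Googlebot-Image/-Video/-News.
GOOGLE_BOT_TOKENS = (
    "Googlebot",
    "AdsBot-Google",
    "Mediapartners-Google",
    "APIs-Google",
)

def should_count_in_analytics(user_agent):
    """Return False for requests whose UA carries a Google crawler token."""
    return not any(token in user_agent for token in GOOGLE_BOT_TOKENS)
```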
