AdsBot Google: The Ad Quality Crawler That Ignores Wildcard robots.txt
What Is AdsBot-Google?
AdsBot-Google is Google's specialized crawler for ad quality verification. Unlike Googlebot, which crawls pages for search indexing, AdsBot-Google crawls the landing pages that Google Ads advertisers link to — checking those pages for compliance with Google's advertising policies and for quality signals that influence ad auction outcomes.
If you run Google Ads campaigns and link to a landing page, AdsBot-Google will periodically crawl that URL. The information it collects affects your Quality Score, which in turn affects your cost-per-click and ad position. If AdsBot detects policy violations on a landing page (prohibited content, misleading claims, malware), the result can be ad disapproval or account suspension.
AdsBot-Google is part of Google's broader effort to ensure that the ads it shows lead to high-quality, policy-compliant destinations. It operates independently of both the search indexing system and the organic ranking system — AdsBot findings do not directly affect organic search rankings, only ad quality and eligibility.
AdsBot-Google User Agent Strings
AdsBot-Google has two user agent variants — one for desktop landing page checking and one for mobile:
Desktop (AdsBot-Google):

```
AdsBot-Google (+http://www.google.com/adsbot.html)
```

Mobile (AdsBot-Google-Mobile):

```
Mozilla/5.0 (iPhone; CPU iPhone OS 14_7_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.2 Mobile/15E148 Safari/604.1 (compatible; AdsBot-Google-Mobile; +http://www.google.com/mobile/adsbot.html)
```
Note that the desktop AdsBot user agent string is minimal — it does not pretend to be a browser. The mobile variant includes a full iOS browser user agent to simulate how a mobile user would experience the landing page, since many advertisers serve different content to mobile devices and Google needs to check the actual mobile experience.
Both variants resolve to Google IP addresses and pass the reverse DNS verification test (hostname ends in google.com). If you see these user agents in your server logs, you can verify them using the same two-step DNS lookup used for Googlebot.
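The two-step lookup can be sketched with the Python standard library. This is a minimal sketch: `verify_adsbot_ip` and `is_google_host` are made-up names, the network calls assume working DNS resolution in your environment, and the hostname suffixes checked are the ones Google documents for its crawlers:

```python
import socket

def is_google_host(hostname: str) -> bool:
    """Check that a reverse-DNS hostname belongs to Google."""
    return hostname.rstrip(".").endswith((".google.com", ".googlebot.com"))

def verify_adsbot_ip(ip: str) -> bool:
    """Two-step verification: reverse DNS, then forward-confirm.

    Step 1: PTR lookup on the IP must yield a Google hostname.
    Step 2: an A lookup on that hostname must map back to the IP,
    proving the PTR record was not spoofed.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # step 1
    except socket.herror:
        return False
    if not is_google_host(hostname):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # step 2
    except socket.gaierror:
        return False
    return ip in forward_ips
```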
Why AdsBot Ignores Wildcard Disallow Rules
This is the most important and counterintuitive behavior of AdsBot-Google: it does not respect wildcard Disallow rules in robots.txt. Specifically, AdsBot-Google ignores any Disallow directive under User-agent: * unless the advertiser has explicitly opted out of ad serving on the affected URLs.
Google's rationale is straightforward: if an advertiser is paying to run ads that point to a URL, Google needs to be able to check that URL for policy compliance regardless of robots.txt. A wildcard Disallow cannot block this verification because blocking it would prevent Google from detecting policy violations, which would harm users who click on those ads.
```
# AdsBot-Google IGNORES this:
User-agent: *
Disallow: /landing-pages/

# AdsBot-Google RESPECTS this:
User-agent: AdsBot-Google
Disallow: /landing-pages/

# AdsBot-Google-Mobile also needs its own rule:
User-agent: AdsBot-Google-Mobile
Disallow: /landing-pages/
```
This behavior surprises many site owners who assumed that a wildcard Disallow would keep all bots away from certain directories. For AdsBot, that assumption is wrong. If you want to block AdsBot from specific pages, you must use an explicit User-agent: AdsBot-Google rule.
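The group-selection difference can be sketched as follows. This is not Google's actual parser: `is_blocked_for_agent` is a hypothetical helper, real robots.txt matching also handles `Allow` rules and `*`/`$` wildcards in paths, and the `groups` dict is a simplified stand-in for a parsed robots.txt file:

```python
def is_blocked_for_agent(groups: dict[str, list[str]], agent: str,
                         path: str, honors_wildcard: bool = True) -> bool:
    """Return True if `path` is disallowed for `agent`.

    `groups` maps a User-agent token to its Disallow path prefixes.
    A crawler like AdsBot-Google passes honors_wildcard=False: it
    only obeys a group that names it explicitly, never "*".
    """
    rules = groups.get(agent)
    if rules is None and honors_wildcard:
        rules = groups.get("*")
    if rules is None:
        rules = []
    return any(path.startswith(prefix) for prefix in rules if prefix)
```

With only a wildcard group present, an ordinary crawler is blocked but AdsBot is not; adding an explicit `AdsBot-Google` group blocks it.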
How to Block AdsBot Specifically
To prevent AdsBot-Google from crawling specific pages or directories, you must address both the desktop and mobile variants explicitly in your robots.txt:
```
# Block both AdsBot variants from a specific directory
User-agent: AdsBot-Google
Disallow: /private-landing-pages/

User-agent: AdsBot-Google-Mobile
Disallow: /private-landing-pages/

# To block AdsBot from the entire site:
User-agent: AdsBot-Google
Disallow: /

User-agent: AdsBot-Google-Mobile
Disallow: /
```
Important caveat: blocking AdsBot-Google from your landing pages can hurt your Google Ads performance. If AdsBot cannot crawl a landing page, Google cannot evaluate its landing page experience or complete its policy checks, which may lower your Quality Score, increase your cost-per-click, or restrict ad serving. Only block AdsBot from pages that you are not using as Google Ads landing pages.
If you are blocking AdsBot from your entire site because you do not run Google Ads, this is perfectly safe and has no effect on organic rankings.
What AdsBot Checks on Your Landing Pages
AdsBot-Google evaluates several dimensions of your landing pages:
- Policy compliance: AdsBot checks for content that violates Google Ads policies — prohibited products (certain pharmaceuticals, weapons, financial scams), deceptive claims, adult content in non-adult-approved campaigns, and similar violations. A landing page with prohibited content will cause the linked ad to be disapproved.
- Landing page experience: Google evaluates the user experience of the landing page for its Quality Score calculation. This includes page load speed, mobile-friendliness, whether the page content is relevant to the ad's keywords, transparency about what the page offers, and ease of navigation.
- Malware and security: AdsBot scans for malware, deceptive downloads, and phishing attempts on landing pages. If malware is detected, the associated ads are immediately suspended.
- Page accessibility: If the landing page returns errors (404, 500), redirects to a different page than the advertised URL, or requires login to access, AdsBot records this and it may affect ad eligibility.
- Content relevance: The content on your landing page should be relevant to the keywords and ad copy in your campaign. AdsBot extracts text content to assess this alignment.
AdsBot vs Googlebot: Key Differences
AdsBot-Google and Googlebot have fundamentally different purposes and behaviors:
| Dimension | Googlebot | AdsBot-Google |
|---|---|---|
| Purpose | Search index and ranking | Ad quality and policy compliance |
| robots.txt wildcard | Respected | Ignored (must use explicit rule) |
| Pages crawled | All discovered pages | Only Google Ads landing pages |
| Affects organic rankings | Yes | No |
| Affects ad quality score | No | Yes |
| Mobile variant | Googlebot Smartphone | AdsBot-Google-Mobile |
| Crawl rate control | GSC crawl rate limiter | No GSC control; robots.txt only |
AdsBot and Your Sitemap
AdsBot-Google does not read your XML sitemap. It crawls only the specific landing page URLs that are designated as ad destinations in Google Ads campaigns. The sitemap is irrelevant to AdsBot's operations — it crawls on demand based on active ad URLs, not based on your site's crawlable content graph.
This means that including or excluding a URL from your sitemap has no effect on whether AdsBot crawls it. AdsBot will crawl any URL that an active Google Ads campaign links to, regardless of sitemap status, as long as it is not explicitly blocked by a named AdsBot rule in robots.txt.
Conversely, if you have landing pages that you do not want to appear in organic search — perhaps they are thin, conversion-focused pages that would look poor in organic results — you can noindex them while still keeping them accessible to AdsBot. A page can be noindexed for organic search while remaining fully crawlable for ad quality verification. Use the noindex robots meta tag on the page itself:
```html
<!-- On your landing page <head> -->
<meta name="robots" content="noindex, follow">
```

Or via HTTP response header:

```
X-Robots-Tag: noindex, follow
```
This configuration keeps the page out of Google's search index while keeping it accessible to AdsBot for quality checking, which is the correct setup for dedicated paid search landing pages that you do not want to compete with organic content.
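A quick audit that a landing page carries the noindex signal in either location might look like the sketch below. The HTML matching is a deliberately crude regex check, not a full parser, and `has_noindex` is an illustrative name:

```python
import re

def has_noindex(headers: dict[str, str], html: str) -> bool:
    """True if the response signals noindex via the X-Robots-Tag
    header or a robots meta tag. Attribute order is not assumed."""
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Scan each <meta ...> tag for name="robots" plus a noindex token.
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.IGNORECASE):
        if (re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE)
                and re.search(r'content\s*=\s*["\'][^"\']*noindex',
                              tag, re.IGNORECASE)):
            return True
    return False
```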