Cloudflare is one of the most widely used CDN and security providers in the world — and one of the most common reasons sites silently lose search rankings.
When Cloudflare's Bot Fight Mode, Under Attack Mode, or overly aggressive WAF rules are active, they can block Googlebot alongside malicious traffic. Google's crawler gets a 403, a CAPTCHA challenge page, or a JavaScript verification loop — none of which it can pass — and stops indexing your pages.
How to tell if Cloudflare is blocking Googlebot
There are three signals to check:
- Google Search Console crawl errors. Open GSC → Settings → Crawl Stats. A sudden drop in "crawled pages per day" is a strong indicator. Look for 403 status codes in the crawl response breakdown.
- Test Googlebot manually. Use GSC's URL Inspection tool → "Test Live URL" → "View Tested Page." If the screenshot shows a Cloudflare challenge page, you have your answer.
- Fetch with a Googlebot user agent. Run this command from a non-Cloudflare IP:

  curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://yourdomain.com

  If you get a Cloudflare challenge HTML response instead of your page, Cloudflare is blocking the crawler.
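The third check can be scripted so you can run it repeatedly while testing fixes. This is a sketch: the marker strings below ("Just a moment", `_cf_chl_`, `cf-browser-verification`) commonly appear in Cloudflare challenge pages but are an assumption and may change over time.

```shell
# Returns success (0) if the given HTML resembles a Cloudflare challenge page.
# Marker strings are an assumption based on common challenge-page content.
looks_like_cf_challenge() {
  echo "$1" | grep -qiE 'just a moment|_cf_chl_|cf-browser-verification'
}

# Fetch a URL with a Googlebot user agent and report whether the
# response looks like a challenge rather than your real page.
check_googlebot_access() {
  local url="$1" body
  body=$(curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "$url")
  if looks_like_cf_challenge "$body"; then
    echo "BLOCKED: Cloudflare challenge served to Googlebot UA"
  else
    echo "OK: origin content returned"
  fi
}

# Usage: check_googlebot_access https://yourdomain.com
```

Run it from a machine outside Cloudflare's network, since requests from Cloudflare's own IPs may be treated differently.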
Why this happens
Cloudflare cannot perfectly distinguish real Googlebot from bots spoofing Googlebot's user agent. By default, Cloudflare's Bot Fight Mode is aggressive and may flag Google's crawlers — especially if your site's traffic patterns look unusual, if you're on a shared IP range, or if you recently activated a new Cloudflare product.
The specific Cloudflare features most likely to cause problems:
- Bot Fight Mode — blocks bots that mimic legitimate user agents
- Super Bot Fight Mode (Pro/Business plans) — aggressive mode, known to block Googlebot in some configurations
- Under Attack Mode — JavaScript challenge blocks all non-browser clients
- Custom WAF rules — a rule blocking all non-browser user agents will block Googlebot
How to fix it
Option 1: Verify Googlebot's IPs in Cloudflare
Google publishes its crawler IP ranges. You can allowlist them in Cloudflare → Security → WAF → Tools → IP Access Rules, adding each of Google's ranges with the "Allow" action.
Google's IP ranges are available at https://developers.google.com/static/search/apis/ipranges/googlebot.json.
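If you want to check an address against those ranges programmatically (for log auditing, or to generate the Access Rules), the published JSON has a `prefixes` list whose entries carry either an `ipv4Prefix` or an `ipv6Prefix` key. A sketch using the standard library, assuming you fetch the JSON yourself:

```python
import ipaddress

def parse_googlebot_ranges(data):
    """Parse the structure of Google's googlebot.json: a 'prefixes'
    list with 'ipv4Prefix' or 'ipv6Prefix' entries."""
    nets = []
    for entry in data.get("prefixes", []):
        cidr = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if cidr:
            nets.append(ipaddress.ip_network(cidr))
    return nets

def ip_in_googlebot_ranges(ip, nets):
    """Check whether an address falls inside any published range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in nets if net.version == addr.version)
```

Note that Google updates this file periodically, so a static allowlist should be refreshed rather than set once and forgotten.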
Option 2: Create a WAF exception for Googlebot
In Cloudflare → Security → WAF → Custom Rules, create a rule with:
- Field: http.user_agent
- Operator: contains
- Value: Googlebot
- Action: Skip all WAF rules
Note that a user-agent match alone can be spoofed — any bot can claim to be Googlebot. Real Googlebot traffic also comes only from Google's published IP ranges, so combine this rule with an IP allowlist for defense-in-depth.
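As a sketch of that defense-in-depth idea, the skip rule can be tightened in Cloudflare's expression editor by pairing the user-agent match with Cloudflare's known-bot signal (the cf.client.bot field); availability of this field on your plan is an assumption to verify:

```
(http.user_agent contains "Googlebot" and cf.client.bot)
```

With this expression, the WAF is skipped only when Cloudflare itself has already classified the client as a known good bot, so a spoofer sending a Googlebot user agent from an arbitrary IP still faces your normal rules.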
Option 3: Disable Bot Fight Mode for verified bots
In Cloudflare → Security → Bots, enable "Allow Verified Bots." This tells Cloudflare to pass traffic from bots that have verified themselves (Googlebot, Bingbot, etc.) rather than challenging them.
After fixing: verify crawling has resumed
After applying your fix, use GSC's URL Inspection to re-test a blocked URL. You should see your actual page content in the screenshot. Wait 24–48 hours, then check Crawl Stats again — crawl volume should return to baseline within a few days.
A full re-crawl of affected pages may take 1–4 weeks depending on your site size. If you have a sitemap, submit it again in GSC to accelerate discovery.