Severity: critical
Code: blocked by robots.txt

Landing page blocked by robots.txt

What this means

Your robots.txt is blocking Googlebot from crawling your product landing pages. Without crawl access, Google can't verify the product information on the page, so it disapproves the offer.
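
To confirm whether this is the problem, you can test a landing page URL against your robots.txt with Python's built-in robot parser. This is a minimal sketch using a hypothetical domain and product path (substitute your own); Python's parser is not an exact replica of Google's, but it catches straightforward blocks:

    import urllib.robotparser

    # Hypothetical URLs for illustration; replace with your own domain
    # and a real product landing page from your feed.
    ROBOTS_URL = "https://www.example.com/robots.txt"
    PRODUCT_URL = "https://www.example.com/products/blue-widget"

    parser = urllib.robotparser.RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # downloads and parses robots.txt

    # Mirrors the check Googlebot performs before crawling the page.
    if parser.can_fetch("Googlebot", PRODUCT_URL):
        print("Googlebot may crawl this landing page")
    else:
        print("Blocked by robots.txt -- expect disapprovals for this URL")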

Why this happens

robots.txt rules sometimes block /products/ or /shop/ paths, especially after staging-to-prod migrations or when SEO teams try to control crawl budget.
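
For illustration, either of the following hypothetical rule sets would trigger this disapproval; the exact paths on your site may differ:

    # Blanket block left over from a staging environment
    User-agent: *
    Disallow: /

    # Crawl-budget rules that sweep up product landing pages
    User-agent: *
    Disallow: /products/
    Disallow: /shop/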

How to fix it

  1. Check robots.txt at yoursite.com/robots.txt and look for Disallow rules that cover product paths.
  2. Allow Googlebot specifically: add a 'User-agent: Googlebot' group containing 'Allow: /' (see the snippet after this list).
  3. Resubmit the feed; Google retries the crawl within 24 hours.
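
A minimal change matching step 2 could look like the following. This assumes you want Googlebot to crawl the whole site; scope the Allow line to your product paths if you don't:

    # Hypothetical existing rules for other crawlers; they can stay as they are.
    User-agent: *
    Disallow: /checkout/

    # Googlebot obeys the most specific matching group, so this group
    # overrides the wildcard rules for Google's crawler.
    User-agent: Googlebot
    Allow: /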

Common pitfalls