Severity: critical · Code: 14526475

How to fix: Robots.txt error

What this means

Google can't fetch or parse the site's robots.txt file, which blocks crawling of product pages and 3D assets.
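For context, a minimal robots.txt that explicitly permits Googlebot to crawl product pages might look like the following (the paths and sitemap URL are illustrative, not a prescription for any particular site):

```
User-agent: Googlebot
Allow: /products/

Sitemap: https://www.example.com/sitemap.xml
```

Note that a file Google cannot parse at all (e.g. one returning a 5xx error or malformed content) can cause crawling to be deferred entirely, so serving a valid file, even an empty one, is safer than serving an error.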

Why this happens

Landing pages must be crawlable, available, fast-loading, and consistent with the feed data. Common causes of this error include 404s, server errors, redirect loops, robots.txt blocks, missing schema markup, country-specific availability gaps, and content or price mismatches with the feed.

How to fix it

  1. Test the landing page in a private browser session — confirm it loads quickly, displays the product, and matches the feed (price, title, availability).
  2. Check robots.txt to confirm Googlebot is allowed to crawl `/products/` paths.
  3. Eliminate redirect chains: redirect directly to the final URL, since long chains slow crawling and can cause fetch failures.
  4. Add structured data (JSON-LD Product schema) to confirm price and availability programmatically.
  5. If the issue persists after fixing, request a website check in Merchant Center to expedite re-review.
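For step 4, a minimal JSON-LD Product snippet looks like the following; the name, price, and currency are placeholder values and must match what the feed declares for the same item:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

Embedding this in a `<script type="application/ld+json">` tag on the landing page lets Google confirm price and availability programmatically instead of relying on page-text extraction.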
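One way to sanity-check step 2 locally is Python's standard-library `urllib.robotparser`. The rules below are a hypothetical example; in practice you would point the parser at your own site's live `/robots.txt` with `set_url(...)` and `read()` instead of `parse(...)`:

```python
import urllib.robotparser

# Build a parser from an example robots.txt body (illustrative rules only).
rp = urllib.robotparser.RobotFileParser()
rp.parse("""
User-agent: Googlebot
Allow: /products/

User-agent: *
Disallow: /checkout/
""".splitlines())

# Googlebot is explicitly allowed on /products/ paths.
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))

# Other crawlers are blocked from /checkout/ by the wildcard group.
print(rp.can_fetch("*", "https://example.com/checkout/cart"))
```

The same parser can be reused against every feed landing-page URL to confirm none of them are accidentally disallowed before requesting a re-review.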

Common pitfalls