Your robots.txt is blocking Googlebot from crawling product landing pages. Without crawl access, Google can't verify the product and disapproves it.
robots.txt rules sometimes block /products/ or /shop/ paths, most often after staging-to-production migrations (where a blanket "Disallow: /" from staging is carried over) or when SEO teams restrict crawling to conserve crawl budget.
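You can reproduce this check locally with Python's standard-library urllib.robotparser, the same logic a crawler applies. A minimal sketch; the robots.txt content, domain, and paths below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks product pages for all user agents,
# e.g. a rule carried over from a staging environment.
robots_txt = """\
User-agent: *
Disallow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches the wildcard group, so product landing pages are blocked...
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # False
# ...while paths outside the Disallow rule remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/about"))            # True
```

Running this against your live robots.txt (via RobotFileParser's set_url and read methods) is a quick way to confirm whether Googlebot can reach the disapproved landing pages before digging into Merchant Center diagnostics.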