Robots-Sitemaps-google-webmaster-tools-solution

A robots.txt file consists of one or more rules. Each rule blocks (or allows) access for a given crawler to a specified file path on that website.

Here is a simple robots.txt file with two rules, explained below:
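
The following is a minimal sketch along the lines of Google's own documentation example; the crawler name, the /nogooglebot/ path, and yourdomain.com are placeholders:

    User-agent: Googlebot
    Disallow: /nogooglebot/

    User-agent: *
    Allow: /

    Sitemap: https://yourdomain.com/sitemap.xml

The first rule tells the Googlebot crawler not to crawl any URL that starts with /nogooglebot/. The second rule tells all other crawlers that they may crawl the entire site. The optional Sitemap line points crawlers to the sitemap location.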

Follow the technique below to solve the “Sitemap contains URLs which are blocked by robots.txt” warning.

Blocked sitemap URL warnings are typically generated when web developers misconfigure their robots.txt file. Any time you disallow something, make sure you know exactly what you are blocking; otherwise this warning will appear in Google Search Console and crawlers may no longer be able to crawl and index your site.
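
For example, an overly broad rule like the following (a hypothetical sketch; yourdomain.com is a placeholder) blocks every URL on the site, including everything listed in the sitemap, and will trigger the warning:

    User-agent: *
    Disallow: /

    Sitemap: https://yourdomain.com/sitemap.xml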

Now follow these steps to resolve the “Sitemap contains URLs which are blocked by robots.txt” error:

  • Check for any Disallow rules within your robots.txt file. Locate the robots.txt file, or create a new file called robots.txt if one does not exist. It should sit in your root directory, i.e. https://yourdomain.com/robots.txt or http://yourdomain.com/robots.txt. If a Disallow rule covers URLs listed in your sitemap, remove or narrow it (see the corrected example after this list).
  • Check the http/https host very carefully. If you have recently migrated from HTTP to HTTPS, make sure you have created a new Search Console property for the HTTPS version and that the robots.txt file is available via HTTPS (a quick check is sketched after this list).
  • Use the robots.txt Tester available within Google Search Console to check which warnings/errors are being generated.
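
A corrected version of the broad robots.txt shown earlier, assuming only a private /admin/ area (a placeholder path) actually needs to be blocked while all sitemap URLs stay crawlable, might look like this:

    User-agent: *
    # block only the private area; everything else stays crawlable
    Disallow: /admin/
    Allow: /

    Sitemap: https://yourdomain.com/sitemap.xml

To confirm which robots.txt crawlers actually see after an HTTP-to-HTTPS migration, a quick check from the command line (curl is assumed to be installed; yourdomain.com is a placeholder) is:

    curl -I https://yourdomain.com/robots.txt    # should return HTTP 200 over HTTPS
    curl -sL https://yourdomain.com/robots.txt   # prints the rules crawlers will read

After adjusting the file, resubmit your sitemap in Search Console and re-run the robots.txt Tester to confirm the warning clears.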