Google: we won’t index your site if we cannot access your robots.txt file

In an online discussion, Google's Eric Kuan said that Google will stop crawling a website if it cannot access the site's robots.txt file:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file.

If this isn’t happening frequently, then it’s probably a one-off issue that you won’t need to worry about. If it’s happening frequently, or if you’re worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
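The behavior described above depends on the HTTP status code Google receives when it requests your robots.txt file. As a rough sketch (simplified from Google's published guidelines; the function name and return values are our own illustration, not an official API), the reaction can be summarized like this:

```python
def robots_txt_crawl_impact(status_code):
    """Rough sketch of how Google reacts to the HTTP status returned
    for a robots.txt request (simplified; consult Google's own
    documentation for the authoritative rules)."""
    if 200 <= status_code < 300:
        return "parsed"       # robots.txt is read and its rules are obeyed
    if 400 <= status_code < 500:
        return "allow-all"    # treated as if no robots.txt exists
    if 500 <= status_code < 600:
        return "crawl-paused" # Google may stop crawling the whole site
    return "unknown"          # e.g. redirects or network errors need their own handling
```

The key point for the error discussed in this post is the 5xx case: a server error on robots.txt is not treated as "no restrictions" but as "restrictions unknown", so Google pauses crawling rather than risk fetching blocked pages.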

If you’re unsure about the technical details of your website, check your web pages with the website audit tool in SEOprofiler. It checks all pages of your website and informs you about errors that can cause problems with Google.

You can get a website audit tool for a very low price by ordering our special offer below:

Get a website audit for the special offer price

Please tell your friends and colleagues about SEOprofiler.


Johannes Selbach

Johannes Selbach is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.
