Google: your robots.txt file should be simple

In an online discussion, Google’s John Mueller said that it’s better to have simple robots.txt files:

“What is it that you’re trying to achieve with the robots.txt file in your case?

First off, one thing that is likely wrong with your robots.txt file is that crawlers obey the most specific user-agent line, not all of them. So for Googlebot, that would be only the section for Googlebot, not the section for *.

The * section is very explicit in your case, so you’d probably want to duplicate that. Past that, why is there a section for Googlebot? Are these URL patterns that you want to disallow for all search engines perhaps?

The “*” section is likely much more complex than you really need. When possible, I’d really recommend keeping the robots.txt file as simple as possible, so that you don’t have trouble with maintenance and that it’s really only disallowing resources that are problematic when crawled (or when its content is indexed).”
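To illustrate Mueller's point, here is a minimal sketch of a robots.txt file (the paths are made up). A crawler selects the single group whose User-agent line matches it most specifically and ignores all other groups; the groups are not merged:

    User-agent: *
    Disallow: /internal-search/
    Disallow: /checkout/

    User-agent: Googlebot
    Disallow: /staging/

Because Googlebot matches its own group, it only obeys "Disallow: /staging/" and will still crawl /internal-search/ and /checkout/. To block those URLs for Googlebot as well, the lines from the * section have to be duplicated into the Googlebot section, which is exactly why Mueller recommends keeping the file simple.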

How to check your robots.txt file

Among other things, the website audit tool in SEOprofiler checks the robots.txt file of your website:

[Screenshot: robots.txt analysis in the SEOprofiler website audit tool]
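If you also want a quick programmatic spot check, Python's standard library includes a robots.txt parser. The following sketch (the domain and paths are placeholders, so substitute your own) tests whether a given crawler may fetch a URL:

    # Check which URLs a robots.txt file allows for a given user-agent.
    # The site URL below is a placeholder; use your own domain.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # downloads and parses the robots.txt file

    # True if the named crawler is allowed to fetch the URL
    print(rp.can_fetch("Googlebot", "https://www.example.com/internal-search/"))
    print(rp.can_fetch("*", "https://www.example.com/"))

Note that this parser answers allow/disallow questions; it won't warn you about overly complex or redundant sections the way a dedicated audit tool can.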

SEOprofiler also offers a free robots.txt creator:

[Screenshot: the SEOprofiler robots.txt creator]

The robots.txt creator can be accessed in the free demo version of SEOprofiler. If you haven't done so yet, try SEOprofiler now. You can get the full version for just $1 (this offer is available once per customer):

Try all SEOprofiler tools for $1!

Please tell your friends and colleagues about SEOprofiler.


Johannes Selbach

Johannes Selbach is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.
