URL settings and crawl delay in the website audit tool

The website audit tool in SEOprofiler enables you to specify which URLs should be audited.

Session IDs and ad tracking parameters

The website audit tool crawler automatically strips session IDs and ad tracking parameters from URLs. If a landing page is linked with several different URLs that only differ in their utm_* tracking parameters, the website audit tool will index the page only once.
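To illustrate the idea, here is a minimal Python sketch of this kind of URL normalization. The parameter list and the normalize_url function are just an example, not the exact logic used by sp_auditbot:

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative list of session/tracking parameters to strip (an assumption,
# not the exact set used by the website audit tool crawler).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "sessionid", "sid", "gclid"}

def normalize_url(url):
    """Return the URL with known session/tracking parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# Both variants reduce to https://example.com/page?id=7, so the page
# is indexed only once.
print(normalize_url("https://example.com/page?utm_source=newsletter&id=7"))
print(normalize_url("https://example.com/page?utm_campaign=spring&id=7"))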

URL parameter exclusion

The website audit tool crawler understands wildcards in your robots.txt file. For example, if you want to exclude all URLs that contain /print-version, just add the following to your website's robots.txt file:

User-agent: sp_auditbot
Disallow: /print-version*
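For reference, here is a short Python sketch of how such a wildcard rule can be checked against a URL path. It assumes the common Google-style interpretation, where * stands for any sequence of characters and rules are anchored at the start of the path; sp_auditbot's exact matching rules may differ in detail:

import re

def disallow_matches(pattern, path):
    """Check a URL path against a robots.txt Disallow value,
    treating '*' as 'any sequence of characters'."""
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.match(regex, path) is not None

print(disallow_matches("/print-version*", "/print-version/page-2.html"))  # True
print(disallow_matches("/print-version*", "/products/page-2.html"))       # False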

Crawl-delay

Of course, the website audit tool crawler also follows the crawl-delay directive in your robots.txt file. If you want to limit how frequently the crawler requests pages from your site, use crawl-delay; the value is the number of seconds the crawler waits between requests. For example, add the following to your robots.txt file to allow at most one page per minute:

User-agent: sp_auditbot
Crawl-delay: 60

Important: with a crawl delay of 60 seconds, at most 1440 pages can be analyzed per day. If you want quick results, avoid the crawl-delay directive if possible.
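As an illustration, the sketch below shows how a crawler can honor such a delay and how the 1440-pages-per-day limit follows from it. The fetch loop is only an example, not the audit tool's actual implementation:

import time
import urllib.request

CRAWL_DELAY = 60  # seconds, as declared for sp_auditbot above

# A 60-second delay caps the crawl at 86400 / 60 = 1440 pages per day.
print(86400 // CRAWL_DELAY)  # 1440

def polite_fetch(urls):
    """Download pages one at a time, waiting CRAWL_DELAY seconds
    between requests to respect the crawl-delay directive."""
    pages = []
    for url in urls:
        with urllib.request.urlopen(url) as response:
            pages.append((url, response.read()))
        time.sleep(CRAWL_DELAY)
    return pages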

Errors, warnings and notices

The website audit tool in SEOprofiler analyzes many different factors that influence the position of your web pages in Google and other search engines. If you haven’t done so yet, try the website audit tool now:

Try the website audit tool

Johannes Selbach

Johannes Selbach is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.
