The website audit tool in SEOprofiler enables you to specify which URLs should be audited.
Session IDs and ad tracking parameters
The website audit tool crawler automatically strips session IDs and ad tracking parameters from URLs. If a landing page is linked with several different utm_* parameter variants, the website audit tool will index the page only once.
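This normalization step can be sketched in Python. The parameter list below is illustrative, not SEOprofiler's actual internal list:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative set of session-ID and ad-tracking parameters
# (an assumption, not SEOprofiler's internal list).
TRACKING_PARAMS = {
    "utm_source", "utm_medium", "utm_campaign", "utm_term",
    "utm_content", "gclid", "fbclid", "sessionid", "sid", "phpsessid",
}

def normalize_url(url: str) -> str:
    """Drop tracking parameters so that variants of the same
    landing page map to one canonical URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

# Two utm_* variants of the same page collapse to one URL:
a = normalize_url("https://example.com/page?utm_source=news&id=7")
b = normalize_url("https://example.com/page?utm_source=mail&id=7")
print(a)        # https://example.com/page?id=7
print(a == b)   # True
```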
URL parameter exclusion
The website audit tool crawler understands wildcards in your robots.txt file. For example, if you want to exclude all the URLs that contain /print-version, just add the following to the robots.txt file of your website:
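```
User-agent: *
# A sketch; adjust the pattern to match your own URLs.
Disallow: /*print-version
```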
Of course, the website audit tool crawler also follows the crawl-delay command in your robots.txt file. If you want to limit the number of pages that the website audit tool can access per minute, use the crawl-delay command. For example, add the following to your robots.txt file to allow one page per minute:
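```
User-agent: *
# 60 seconds between requests = one page per minute.
Crawl-delay: 60
```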
Important: If you have a crawl delay of 60 seconds, a maximum of 1,440 pages (60 pages per hour × 24 hours) can be analyzed per day. If possible, avoid the crawl-delay command if you want quick results.
The website audit tool in SEOprofiler analyzes many different things that influence the position of your web pages in Google and other search engines. If you haven’t done it yet, try the website audit tool now: