Crawling a URL and parsing the HTML response works well for classic websites and server-side rendered pages, where the HTML in the HTTP response contains all of the content.
Google says that it can take a few seconds or longer before a page is rendered:
“Googlebot queues all pages for rendering, unless a robots meta tag or header tells Googlebot not to index the page. The page may stay on this queue for a few seconds, but it can take longer than that.”
Google recommends pre-rendering
Web page speed is a ranking factor. If your web server pre-renders your pages, you can be sure that search engine crawlers see the right content, and your pages will load faster.
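One common way to apply this advice is dynamic rendering: the server detects crawler user-agents and serves them pre-rendered static HTML, while regular visitors get the JavaScript app shell. The sketch below illustrates the idea; the bot signature list, the page store, and the function name are illustrative assumptions, not a specific product's API.

```python
# Sketch of dynamic rendering: crawlers get pre-rendered HTML,
# browsers get the client-side rendered shell.
# BOT_SIGNATURES and PRERENDERED are hypothetical examples.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

# Pre-rendered HTML snapshots, e.g. produced at build time.
PRERENDERED = {
    "/products": "<html><body><h1>Products</h1><p>Full content here.</p></body></html>",
}

def choose_response(path: str, user_agent: str) -> str:
    """Return pre-rendered HTML for known crawlers, the JS shell otherwise."""
    ua = user_agent.lower()
    if any(bot in ua for bot in BOT_SIGNATURES) and path in PRERENDERED:
        return PRERENDERED[path]
    # Regular visitors receive the empty shell that JavaScript fills in.
    return ('<html><body><div id="app"></div>'
            '<script src="/app.js"></script></body></html>')
```

Serving crawlers the fully rendered HTML up front means they never have to wait in Google's rendering queue to see the page's content.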
You should use meaningful HTTP status codes
Although website visitors cannot see the HTTP status code that a web page returns, it is very important that your pages use the correct HTTP status codes:
“Googlebot uses HTTP status codes to find out if something went wrong when crawling the page.
“You should use a meaningful status code to tell Googlebot if a page should not be crawled or indexed, like a 404 for a page that could not be found or a 401 code for pages behind a login.”
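The guidance above boils down to a simple decision: a missing page should return 404, a page behind a login should return 401, and a normal page should return 200. A minimal sketch of that decision logic, using Python's standard-library status constants (the function and its parameters are illustrative assumptions):

```python
from http import HTTPStatus

def status_for(page_exists: bool, requires_login: bool, logged_in: bool) -> int:
    """Pick a meaningful HTTP status code, following the guidance above."""
    if not page_exists:
        return HTTPStatus.NOT_FOUND      # 404: page could not be found
    if requires_login and not logged_in:
        return HTTPStatus.UNAUTHORIZED   # 401: page behind a login
    return HTTPStatus.OK                 # 200: crawl and index normally
```

Returning 200 with an error message in the body ("soft 404") hides the problem from Googlebot; an explicit 404 or 401 tells it not to index the page.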
The website audit tool in SEOprofiler checks the HTTP status codes of your web pages. It also shows errors that have a negative influence on the rankings of your web pages.
The website audit tool in SEOprofiler shows you how search engines see your web pages. You can create your SEOprofiler account here: