JavaScript and search engine optimization (SEO)

Over the past few years, Google has dramatically improved its ability to index web pages that require JavaScript to display their content. Can you get high rankings with JavaScript-only web pages? If Google can index your JavaScript pages, will it also rank them?


How do web pages get into Google’s index?

Step 1: Google crawls web pages with Googlebot

Before a web page can be listed on Google’s search result pages, it has to be discovered by Google. Google uses a web crawler named ‘Googlebot’ to discover new pages.

Googlebot fetches web pages, follows the links on these pages, fetches those pages, follows their links, and so on. Web crawlers such as Googlebot are simply programs that analyze links and HTML code. Web crawlers do not render web pages, so JavaScript is not executed at this stage.
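The effect can be sketched in a few lines of JavaScript. A crawler that does not render pages only sees links that are present in the raw HTML; the regex-based extractor below is a deliberately simplified illustration (real crawlers use full HTML parsers):

```javascript
// Minimal sketch of what a non-rendering crawler "sees": it scans the
// raw HTML for anchor tags and never executes the scripts on the page.
function extractLinks(html) {
  const links = [];
  const re = /<a\s[^>]*href=["']([^"']+)["']/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

const html = `
  <a href="/about">About us</a>
  <script>
    // This link only exists after the script runs, so a crawler
    // that does not render JavaScript never finds it.
    const a = document.createElement('a');
    a.href = '/hidden';
    a.textContent = 'Hidden page';
    document.body.appendChild(a);
  </script>
`;

console.log(extractLinks(html)); // only finds "/about"
```

The JavaScript-generated link to /hidden is invisible at the crawl stage; it can only be discovered later, if and when the page is rendered.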

Step 2: Google indexes the pages

Googlebot’s main role is to discover new web pages. Google’s Caffeine system then indexes the pages. In that phase, Google tries to render the pages; JavaScript on the crawled pages is executed with a web rendering service (WRS).

Unfortunately, it is not clear to what extent Google executes the JavaScript on the discovered pages. If rendering a page reveals new links that are only available through JavaScript, these URLs are sent back to the crawler.

Step 3: Google ranks the pages

After rendering a page, Google tries to understand its content. Depending on the content, the quality of the content, and other factors (such as the external links that point to the page), Google will rank the page.


Googlebot does not render the content of web pages; it just discovers new pages and parses their content. Caffeine renders the content and processes the JavaScript on the pages.

Google can index and rank JavaScript-based pages to some extent. However, experience shows that there are still many problems with JavaScript pages. For that reason, you should make it as easy as possible for Google to index your pages. Making your web pages and your JavaScript search-engine friendly is important if you want to get high rankings.

Google recommends ‘progressive enhancement’. That means that you should deliver your web page content in plain HTML and then use JavaScript techniques such as AJAX (Asynchronous JavaScript and XML) to improve the appearance and behavior of your pages. That is the best way to optimize your pages: Google can see the full content of your web pages in the HTML code, and users still get a good-looking website.
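As a rough sketch of why this matters, compare a page whose content ships in the initial HTML with a JavaScript-only ‘app shell’ (the article data below is made up for illustration):

```javascript
// The same article delivered two ways. With progressive enhancement,
// the content is in the initial HTML; with a JavaScript-only page,
// the initial HTML is an empty shell.
function renderStatic(article) {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

function renderShell() {
  // The content would be fetched and inserted by client-side
  // JavaScript later, after the page loads.
  return '<div id="app"></div>';
}

const article = {
  title: 'JavaScript and SEO',
  body: 'Deliver your content in plain HTML.',
};

// A crawler that does not execute JavaScript sees the content
// only in the first case:
console.log(renderStatic(article).includes('Deliver your content')); // true
console.log(renderShell().includes('Deliver your content'));         // false
```

In the second case, everything a search engine learns about the page depends on whether (and how well) its rendering service executes your scripts.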

If you use a framework such as AngularJS, use tools that prerender the pages to make sure that Google can index the full content of your web pages. If you rely on Caffeine, you cannot be sure whether the full content of your pages will be parsed. Some content of your pages might be invisible to Google if it relies on JavaScript that is not supported by Caffeine.
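One common prerendering setup is ‘dynamic rendering’: the server detects known crawler user agents and serves them a prerendered HTML snapshot, while normal visitors get the JavaScript app. A minimal sketch, with an illustrative (not exhaustive) user-agent list:

```javascript
// Dynamic rendering sketch: serve a prerendered HTML snapshot to
// known crawlers and the normal JavaScript app shell to everyone else.
// The user-agent patterns and snapshots are illustrative assumptions,
// not a complete implementation.
const CRAWLER_PATTERNS = [/Googlebot/i, /Bingbot/i, /facebookexternalhit/i];

function isCrawler(userAgent) {
  return CRAWLER_PATTERNS.some((pattern) => pattern.test(userAgent));
}

function chooseResponse(userAgent, prerenderedHtml, appShellHtml) {
  return isCrawler(userAgent) ? prerenderedHtml : appShellHtml;
}

const snapshot = '<html><body><h1>Full prerendered content</h1></body></html>';
const shell = '<html><body><div id="app"></div></body></html>';

console.log(chooseResponse('Mozilla/5.0 (compatible; Googlebot/2.1)', snapshot, shell) === snapshot); // true
console.log(chooseResponse('Mozilla/5.0 (Windows NT 10.0)', snapshot, shell) === shell);              // true
```

The snapshot and the JavaScript page should contain the same content; the point is only to remove the rendering step from the crawler’s path.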

What you should do now

Google says that it can index and rank JavaScript-based pages. Unfortunately, many webmasters cannot confirm this. In addition, most other search engines and social networks do not render JavaScript at all.

For that reason, it is important that you deliver your web page content in plain HTML to search engines. The easier it is for search engines to parse your web pages, the more likely it is that your content will be indexed correctly.

To make sure that your web pages can be parsed by all crawlers, check your web pages with the website audit tool in SEOprofiler.

Johannes Selbach

Johannes Selbach is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.