
How to optimize JS for search engines

What is JavaScript SEO?

JavaScript SEO is part of Technical SEO and focuses on making it easier for search engines to crawl, render, and index websites built with JavaScript.

Common tasks include:

  • Optimizing content inserted via JavaScript
  • Implementing lazy loading correctly
  • Following internal linking best practices
  • Preventing, finding, and fixing JavaScript issues


Note: If you need to refresh your knowledge of basic JS, read our guide: What is JavaScript used for?

How does Google crawl and index JavaScript?

Google processes JS in three phases:

  1. Crawling
  2. Rendering
  3. Indexing

How Google handles JS
Image source: Google

Google’s web crawler (called Googlebot) queues pages for both crawling and rendering, and crawls the URLs in the crawl queue first.

Googlebot makes a request, and the server returns the HTML document.

Googlebot then parses the HTML to determine which resources it needs to render the page’s content.

At this stage it crawls only the HTML, not the JS or CSS files, because rendering JavaScript at scale requires enormous resources. Think of all the computing power Googlebot would need to download, parse, and execute the JavaScript of 2 billion websites.

So Google defers rendering JavaScript. Anything unrendered is queued for later processing, when resources become available.

Once resources allow, a headless Chromium instance (a Chrome browser without a user interface) renders the page and executes the JavaScript.

Googlebot then parses the rendered HTML for links and queues the URLs it finds for crawling.

In the final step, Google uses the rendered HTML to index your page.

Server-Side Rendering, Client-Side Rendering, and Dynamic Rendering

Whether Google has trouble indexing your JavaScript depends largely on how your site renders it: server-side, client-side, or with dynamic rendering.

Server-side rendering

Server-side rendering (SSR) is when JavaScript is rendered on the server. The rendered HTML page is then served to the client (browser, Googlebot, etc.).

For example, when you visit a website, your browser sends a request to the server that hosts the website’s content.

After processing the request, the server returns the fully rendered HTML, which your browser displays on the screen.

SSR tends to improve a page’s SEO performance for the following reasons:

  • It reduces the time it takes for the page’s main content to load
  • It reduces layout shifts that hurt the user experience

However, SSR can increase the time it takes for a page to respond to user input.

This is why some JS-heavy websites choose to use SSR on some pages and not on others.

In such a hybrid model, SSR is usually reserved for pages that matter for SEO, while client-side rendering (CSR) is reserved for pages that require a lot of user interaction and input.

However, implementing SSR is often complex and difficult for developers.

Still, there are tools that can help implement SSR.

  • Gatsby and Next.js for the React framework
  • Angular Universal for the Angular framework
  • Nuxt.js for the Vue.js framework

For more information on setting up server-side rendering, read this guide.

Client-side rendering

CSR is the opposite of SSR. In this case, the JavaScript is rendered on the client side (the browser, or Googlebot in this case) using the Document Object Model (DOM).

Instead of receiving the full content in an HTML document as with server-side rendering, the client gets bare-bones HTML along with JavaScript files that render the rest of the site in the browser.

Most websites that use CSR have complex user interfaces or many interactions.
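As a minimal sketch of the CSR pattern (buildContent is a hypothetical helper, kept pure so the idea stays clear): the server ships almost no markup, and this script, running in the browser, builds the content into the DOM after the page loads.

```javascript
// Build the markup on the client. Until this runs, the HTML document
// contains only an empty container, which is what a non-rendering
// crawler would see.
function buildContent(items) {
  return items.map((item) => `<li>${item}</li>`).join('');
}

// In a browser, this would run after the bare-bones HTML loads:
// document.querySelector('#app').innerHTML =
//   `<ul>${buildContent(['Home', 'Blog', 'Contact'])}</ul>`;
```

This is why CSR depends on Google’s deferred rendering phase: the content simply does not exist until the JavaScript executes.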

For more information on how to set up client-side rendering, check out this guide.

Dynamic rendering

Dynamic rendering is an alternative to server-side rendering.

Infographic of how dynamic rendering works

It detects bots that may struggle with JS-generated content and serves them a server-rendered version of the page without JavaScript, while showing the client-side rendered version to regular users.

Dynamic rendering is a workaround, not a solution Google recommends. It creates extra, unnecessary complexity and resource demands.

Still, you might consider dynamic rendering if:

  • You have a large site with rapidly changing content that needs to be indexed quickly
  • Your site relies on social media or chat apps that need access to page content
  • A crawler that matters to your site can’t support some JavaScript features

But in practice, dynamic rendering is rarely a long-term solution. For more information on setting up dynamic rendering, and some alternative approaches, see Google’s guidelines.
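At its core, dynamic rendering hinges on a user-agent check. Here is a minimal sketch: the bot list is illustrative rather than exhaustive, and the Express-style route in the comments (with hypothetical prerenderedHtmlFor and clientSideShell helpers) only shows where the check would sit.

```javascript
// Patterns for crawlers that should receive the pre-rendered,
// JavaScript-free version of the page. Illustrative, not exhaustive.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /twitterbot/i, /facebookexternalhit/i];

function isBot(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

// Hypothetical Express-style route showing where the check fits:
// app.get('*', (req, res) => {
//   if (isBot(req.headers['user-agent'])) {
//     res.send(prerenderedHtmlFor(req.path)); // server-rendered snapshot
//   } else {
//     res.send(clientSideShell);              // normal CSR bundle
//   }
// });
```

In production, most sites delegate this to a prerendering service rather than maintaining the bot list and snapshot cache themselves.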

Note: Google does not usually treat dynamic rendering as “cloaking” (the act of presenting different content to search engines than to users), provided both versions contain substantially the same content. Cloaking itself is documented in Google’s spam policies.

How to make your website’s JavaScript content SEO friendly

There are a few steps you can take to help search engines properly crawl, render, and index your JS content.

Find errors using Google Search Console

Googlebot is based on the latest version of Chromium. However, it doesn’t behave exactly like a regular browser.

In other words, the fact that your site loads in a browser does not guarantee that Google can render your content.

The URL Inspection tool in Google Search Console (GSC) lets you check whether Google can render your page.

Enter the URL of the page you want to test in the search bar at the top and press Enter.

Entering a URL in the URL Inspection tool

Next, click the “Test Live URL” button on the far right.

Live URL test button

After a minute or two, the tool will display the results in the “Live Test” tab. Click “View Tested Page” to see the page’s code and a screenshot.

View of the tested page

Finally, check the “More Info” tab for any errors.

Details tab

A common reason Google is unable to render JS pages is that the site’s robots.txt file blocks rendering resources, often accidentally.

Add the following code to your robots.txt file to avoid blocking crawling of critical resources:

User-agent: Googlebot
Allow: /*.js
Allow: /*.css

Note: Google does not index .js or .css files in search results. They are used to render web pages.

There is no reason to block these critical resources. Doing so may result in content not being rendered and, as a result, not being indexed.

Make sure Google is indexing your JavaScript content

After making sure your page is rendering properly, make sure your page is indexed.

You can check this with GSC or the search engine itself.

To check with Google, use the “site:” search operator, replacing the example below with the URL of the page you want to test:

site:yourdomain.com/page-url

If the page is indexed, it will appear in the results, like this:

Indexed pages appear in SERPs

If nothing shows up, the page is not indexed by Google.

If the page is indexed, check whether the section of content generated by JavaScript is indexed.

Again, use the “site:” command, this time adding a snippet of your page’s JS content in quotes. For example:

site:yourdomain.com/page-url "snippet of JS content"

This checks whether that particular section of JS content is indexed. If it is, it will appear within the result’s snippet, like this:

Indexed JS content displayed in SERPs

You can also use GSC to check whether your JavaScript content is indexed. Again, open the URL Inspection tool.

This time, instead of testing the live URL, click the “View Crawled Page” button and check the HTML source code of the page.

View crawled pages button

Scan the HTML code for snippets of your JavaScript content.

If the JS content isn’t there, there are several possible reasons:

  • Google could not render the content
  • Google could not discover the URL, because the internal link pointing to it is only generated by JavaScript on click
  • The page timed out before Google could index the content
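The second issue above, links that only exist inside a click handler, can be sketched as follows. The helper functions and the /pricing URL are purely illustrative; the point is the markup each one produces.

```javascript
// Not crawlable: the URL only exists inside an event handler, and
// Googlebot does not click, so it never discovers /pricing this way.
function badLink() {
  return `<span onclick="window.location='/pricing'">Pricing</span>`;
}

// Crawlable: a plain anchor with an href that Googlebot can read
// straight out of the HTML and queue for crawling.
function goodLink() {
  return `<a href="/pricing">Pricing</a>`;
}
```

If a link matters for discovery, it should exist as a real <a href> in the rendered HTML, not only as a navigation side effect of a script.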

Run a site audit

It is a technical SEO best practice to perform regular audits on your site.

Semrush’s Site Audit tool can crawl JS the same way Google does, even when it’s rendered on the client side.

To get started, enter your domain and click “Create project.”

Site audit tool

Next, enable “JS rendering” in the crawler settings.

Enable JS rendering in Site Audit

Once the crawl is complete, open the “Issues” tab.

Issues tab overview

Common JavaScript SEO issues and how to avoid them

Here are some of the most common issues and best practices for JavaScript SEO.

  • Blocking .js files in your robots.txt file prevents Googlebot from crawling these resources, which means they can’t be rendered or indexed. Allow crawling of these files to avoid this.
  • Google doesn’t wait long for JavaScript content to render. Content may fail to be indexed because of a timeout error.
  • Search engines don’t click buttons. Use internal links with standard <a href> tags to help Googlebot discover pages on your site.
  • If you use JavaScript to lazy-load content, don’t delay loading content that needs to be indexed. Focus lazy loading on images rather than on text content.
  • Google often ignores hash fragments, so make sure your site generates static URLs. Like this: yourdomain.com/web-page. Not this: yourdomain.com/#/web-page or yourdomain.com#web-page.
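The lazy-loading advice can be sketched with the browser’s native loading attribute. The renderHero and renderGallery helpers and their image paths are hypothetical; the pattern is what matters: eager-load what must be seen and indexed immediately, lazy-load only what sits below the fold.

```javascript
// Above-the-fold content: load eagerly so the main content (and anything
// that needs indexing) renders immediately.
function renderHero() {
  return `<img src="/hero.jpg" alt="Product hero" loading="eager">`;
}

// Below-the-fold images: these can safely wait until the user
// scrolls near them.
function renderGallery(urls) {
  return urls
    .map((url, i) => `<img src="${url}" alt="Gallery image ${i + 1}" loading="lazy">`)
    .join('\n');
}
```

Note that primary text content should never be lazy-loaded at all, since a timeout during rendering can leave it out of the index entirely.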

Stay one step ahead

With what you’ve learned about JavaScript SEO, you’re well on your way to creating efficient websites that rank well and that users love.

Ready to dive deeper?

For more information, keep exploring guides on JS and technical SEO.