JavaScript Rendering: What It Is and How to Handle It

How well your website renders JavaScript can influence its performance, search engine visibility, and overall user experience.

In this article, we'll explain:

  • What JavaScript rendering is, and how different browsers and search engines handle it
  • The impact of JavaScript rendering on search engine optimization (SEO)
  • The three main methods for handling JavaScript rendering
  • How to check for JS rendering issues on your site

What Is JavaScript Rendering?

JS rendering is the process by which a browser interprets and executes the JavaScript code on a webpage, then transforms it into the resulting content the user sees displayed in the browser.

Similarly, in an SEO context, search engines crawl, render, and index a site's JS content so it's discoverable in users' online search results.

So what’s JavaScript?

JavaScript (JS) is a programming language for creating interactive websites (which contain more engaging user elements than traditional static sites) and web apps.

Further reading: Understand what JavaScript does and its different applications in our Basic Guide to JS

JS is one of the most common languages for the web, along with HTML and CSS.

Web developers use JS to add functionality to a site, including:

  • Dynamic content: Any content generated by user input or changes in the website's backend database. For example, personalized product recommendations, social media newsfeeds, and weather forecast information.
  • Animations: Examples include image carousels, animated icons, loading animations, and interactive infographics
  • Form validation: Websites validate a user's form input before they submit a form, to check whether the user entered a valid password format, for instance. This can improve the user experience (UX) and reduce frustration caused by form errors.
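To make the form validation example concrete, here's a minimal sketch. The password rule (at least eight characters and one digit) and the function name are illustrative assumptions, not a standard:

```javascript
// A minimal sketch of client-side form validation.
// The password rule below (8+ characters, at least one digit) is
// an illustrative example, not a security recommendation.
function isValidPassword(password) {
  return typeof password === "string" &&
    password.length >= 8 &&
    /\d/.test(password);
}

// In the browser, you would attach this to a form's submit event, e.g.:
// form.addEventListener("submit", (event) => {
//   if (!isValidPassword(passwordInput.value)) {
//     event.preventDefault(); // block submission and show an error instead
//   }
// });

console.log(isValidPassword("hunter2"));     // too short → false
console.log(isValidPassword("longenough1")); // → true
```

Validating in the browser like this gives the user instant feedback, though the server should always re-validate the same input.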

How Browsers Deal with JavaScript

When you visit a page, a web browser (the software used to access websites) will typically handle JavaScript in three steps:

  • Parsing: The browser's JavaScript engine analyzes the script to understand its structure and syntax
  • Compiling: The engine converts the script into code that the browser can execute
  • Execution: The browser runs the code, which results in a fully rendered webpage
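You can glimpse the separation between these steps from JavaScript itself: a syntax error surfaces when the engine parses the code, before anything runs, while other errors only appear at execution. A small sketch:

```javascript
// Parsing happens before execution: constructing a function from
// syntactically invalid source fails immediately with a SyntaxError,
// even though the function body is never run.
let parseError;
try {
  new Function("this is not ( valid JS");
} catch (e) {
  parseError = e.name; // "SyntaxError", caught at the parsing step
}

// Valid source parses (and compiles) fine; errors inside it only
// surface at the execution step, when the function is actually called.
const compiled = new Function("return undefinedVariable;");
let runtimeError;
try {
  compiled();
} catch (e) {
  runtimeError = e.name; // "ReferenceError", only thrown at execution
}

console.log(parseError, runtimeError); // SyntaxError ReferenceError
```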

How Search Engines Render JavaScript

A search engine is software designed to help users search the web. Different search engines handle JavaScript differently. Let's see how the most popular ones do it.


Google

Google's JavaScript rendering process takes place in three phases:

  1. Crawling: Discover a new page and download the text and media it contains
  2. Rendering: Execute the JavaScript code to understand the content of the page
  3. Indexing: Understand what the page is about and include it in the Google index

Here's a visual summary from Google Search Central of what takes place when Google renders a site's JS:

Google crawls, renders, then indexes.

Google's web crawler, Googlebot, queues up webpages for crawling and rendering.

First, it assesses whether each page can be crawled. Googlebot then checks the rendered page for links and queues any URLs it finds there for further crawling.


Bing

Bing doesn't support all the latest JavaScript frameworks (collections of code libraries) for rendering.

The search engine's official recommendation is for websites to use dynamic rendering instead of JavaScript.

What's dynamic rendering?

A website detects a visitor's user agent (user-identifying software) to determine whether it's a human or a search engine crawler. It then renders the content differently for humans and search engines.

When it detects Bing's crawler, the website should pre-render the content via server-side rendering (SSR) and then serve the static HTML page to the crawler.

Dynamic rendering: initial HTML and JavaScript are served differently to browsers and crawlers

The advantage is that this minimizes HTTP requests (requests a user's browser makes to the server), since the fully rendered page is delivered immediately.
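The user-agent check at the heart of dynamic rendering can be sketched as follows. The bot patterns below are a small illustrative sample, not an exhaustive or official list:

```javascript
// A minimal sketch of the user-agent check behind dynamic rendering.
// The patterns below are illustrative; real crawler lists are longer.
const BOT_PATTERNS = [/bingbot/i, /googlebot/i, /yandex/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

// On the server, the site would branch on this result, e.g.:
// if (isCrawler(request.headers["user-agent"])) {
//   // serve the pre-rendered static HTML
// } else {
//   // serve the normal client-side JavaScript app
// }

console.log(isCrawler("Mozilla/5.0 (compatible; bingbot/2.0)"));      // true
console.log(isCrawler("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // false
```

Matching on user-agent strings is inherently fragile (crawlers change their strings, and anyone can spoof one), which is part of why dynamic rendering is treated as a workaround rather than a long-term architecture.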


Yahoo

Yahoo hasn't disclosed official information on how its search engine renders JavaScript. But it does offer some recommendations on how websites should handle JavaScript:

  • Defer JavaScript loading: Execute JS code after the main content of the page loads. This can improve page loading speed.
  • Minify JavaScript files: Reduce the size of JS files by removing unnecessary data (like comments and white space) to help the code load faster
  • Remove duplicate scripts: Websites can unintentionally end up with duplicate JavaScript files. Check for and remove duplicate scripts to avoid code conflicts and increased page load times.
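To show what minification removes, here's a toy sketch. Real minifiers (Terser, for example) also rename variables and rewrite code, and they handle edge cases, such as comment-like text inside strings, that this naive regex approach would break on:

```javascript
// A toy illustration of minification: strip comments and collapse
// whitespace. Do NOT use naive regexes like this in production;
// they can mangle code (e.g., comment markers inside string literals).
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, "") // remove /* block comments */
    .replace(/\/\/[^\n]*/g, "")       // remove // line comments
    .replace(/\s+/g, " ")             // collapse runs of whitespace
    .trim();
}

const original = `
  // add two numbers
  function add(a, b) {
    return a + b; /* simple */
  }
`;
console.log(naiveMinify(original));
// "function add(a, b) { return a + b; }"
```

The payoff is purely about transfer size: the same program arrives at the browser in fewer bytes.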


Yandex

Yandex allows webmasters to choose whether to let the search engine render the JavaScript code on their pages.

The search engine recommends that websites use server-side JavaScript rendering.

Check If Googlebot Has Rendered Your Website Properly

In this section, we'll focus on the most widely used search engine, and the one that plays a dominant role in SEO.

You can use the URL inspection feature in Google Search Console (GSC) to check whether Google's web crawler has rendered your website as it should.

Click the search box at the top of the GSC dashboard, type in your website's URL, and hit "Enter."

example URL typed into the inspect URL field in google search console

The tool will check whether your site has been crawled and whether it's eligible to appear in Google's search results.

Click "View Crawled Page."

view crawled page button highlighted

Next, click the "Screenshot" tab followed by "Test Live URL."

Crawled page box open with Screenshot tab and test live url link highlighted

The tool will show you a screenshot of how Googlebot renders your website.

Screenshot appears of how Google renders the webpage

Compare the screenshot with how your website renders in the browser to check for any issues.

How Does JavaScript Affect SEO?

JavaScript rendering can have both positive and negative effects on your website's SEO performance.

For instance, interactive site elements and dynamic content updates can provide a better page experience, which Google tends to reward with better search engine results page (SERP) visibility.

Websites also use JavaScript to dynamically generate structured data (schema markup) for webpages.

google serp listing for best snickerdoodle cookies with rich snippet data like an image of cookies, star rating, and time to bake.

While not a confirmed Google ranking factor, structured data can play a part in improving your search engine visibility.

However, without proper implementation of the JavaScript code, JS rendering can have its SEO downsides.

The results?

  • Reduced page load speed: Websites can be slower to load, and page speed has been a Google ranking factor for years
  • Crawling issues: Search engines sometimes have problems crawling and indexing pages that contain JavaScript, which can lead to decreased site visibility in search results
  • Duplicate content: Issues occur when the same webpage exists in different formats, e.g., as a JavaScript-rendered page and as a static HTML page. Search engines won't know which version to show in search results and may end up displaying both.

3 Methods for Handling JavaScript Rendering

If your website uses JavaScript (and it most likely does), one important question you need to work out is how it should render JS code. This decision can ultimately affect your website's performance and the overall user experience.

There are three JavaScript rendering methods you can choose from: client-side rendering, server-side rendering, and static site generation.

All three methods have their pros and cons, which we'll get into next.

Client-Side Rendering (CSR)

In CSR, JavaScript code is executed in the user's browser instead of on the server side.

When a user visits a website, the site's server responds with an HTML file containing only the basic structure of the page, along with links to JS and CSS files.

The user's browser then downloads the files and executes them locally, rendering the website.
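As a sketch of that client-side flow: the server sends a near-empty shell (say, a `<div id="app"></div>`), and the browser fetches data and builds the markup itself. The endpoint name and data shape below are hypothetical:

```javascript
// A sketch of client-side rendering: the browser, not the server,
// turns raw data into markup. The data shape is hypothetical.
function renderProductList(products) {
  const items = products
    .map((p) => `<li>${p.name}: $${p.price}</li>`)
    .join("");
  return `<ul>${items}</ul>`;
}

// In the browser, this would run after fetching data from the server, e.g.:
// fetch("/api/products")                       // hypothetical endpoint
//   .then((res) => res.json())
//   .then((products) => {
//     document.getElementById("app").innerHTML = renderProductList(products);
//   });

console.log(renderProductList([{ name: "Mug", price: 9 }]));
// "<ul><li>Mug: $9</li></ul>"
```

Until that fetch resolves and the script runs, the page is essentially blank, which is exactly the trade-off discussed below.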

Pros and Cons of Client-Side Rendering (CSR)

The advantages of CSR include:

  • Reduced server load: Rendering JavaScript in the user's browser reduces the overall load on the server, which can avoid issues with server performance
  • Improved site performance: After the initial page load, subsequent page updates are typically faster, because CSR allows page content updates without having to reload the page
  • Offline functionality: Once a page's initial resources have downloaded, CSR lets users access the page even without an internet connection

But CSR also comes with its set of disadvantages:

  • Increased initial load time: With CSR, JavaScript code must be downloaded and executed on the user's device before a page can render. This can prolong the initial page load time, likely resulting in worse UX.
  • Crawling and indexing issues: CSR can make it harder for search engines to crawl and index pages containing JavaScript, potentially hurting a site's search engine visibility
  • Device-dependent performance: Because it relies on code execution on the user's device, CSR can perform poorly on older devices

Server-Side Rendering (SSR)

SSR works by having JavaScript executed on the website's server.

First, a user visits a URL using their browser. The browser requests the webpage from the server, and the server renders the page completely before sending it back to the browser.

Pros and Cons of Server-Side Rendering (SSR)

JavaScript server-side rendering offers:

  • Improved crawlability and indexing: Search engines can render the HTML content in its entirety when crawling pages
  • Faster initial load time: Rendering the page in full on the server, then sending it to the user's browser, results in faster initial load times, since the user's browser doesn't have to download and execute the JavaScript on its own
  • Increased security: SSR reduces input on the client side, which makes websites less susceptible to cross-site scripting (malicious parties injecting harmful code into an otherwise trusted website) and other types of cyber attacks

Meanwhile, the main drawback of server-side rendering is that it can place a heavier load on a website's server, especially during periods of high traffic. This can slow down your website or even make it inaccessible.

Static Site Generation (SSG)

With SSG, websites are served as static files, meaning exactly as stored on the website's server and without any server-side processing.

Rendering happens on the website's server. But unlike in SSR, it occurs before a user even makes a request.

Pros and Cons of Static Site Generation (SSG)

Static site generation allows for:

  • Improved page load speed: Static site generation can significantly improve load speeds compared to other types of rendering
  • SEO-friendliness: Search engines can crawl static sites more easily, which allows faster indexing and ranking
  • Cost-effectiveness: Compared to dynamic websites, static sites require fewer resources to operate

Static site generation also has a few downsides:

  • Lack of dynamic functionality: It can be difficult to implement dynamic functionality (such as personalized or user-generated content) on static websites
  • Content management issues: It's more complex to manage content on static sites, since every change requires coding. This is unlike dynamic sites, which can easily enable updates through a content management system (CMS).

Analyze Your JavaScript Website

Use Semrush's Site Audit tool to uncover various issues that could prevent your JS-powered website from achieving higher search engine rankings.

Enter your website domain and click "Start Audit" to check your site.

Site Audit tool

You'll be able to customize settings for the audit, including:

  • The user agent you want the tool to use when crawling your website
  • How many pages to check per audit
  • Specific pages you'd like to analyze (or exclude)

Make sure to enable JavaScript rendering in the "Crawler settings" section.

JS rendering section of site audit settings highlighted

Then, click "Start Site Audit." The tool will generate a report detailing the overall health of your website.

Click the "Issues" tab. Type "JavaScript" in the search box to filter for JS-related problems.

site audit shows javascript issues such as slow load speed, broken internal JavaScript and CSS files, and unminified files.

From here, select any issue to review the list of pages that have it. Some common problems you might encounter are:

  • Broken JavaScript files: JS files that have stopped running for some reason. Review these files and fix any issues (or remove them from your website altogether).
  • Unminified JavaScript files: Files containing unnecessary lines, white space, and comments that increase file size. Use a tool like Minifier to reduce their size and have them load faster.
  • Uncached JavaScript files: This issue occurs when a website doesn't have browser caching (a mechanism for downloading and storing files on the user's device for repeated use) enabled for JS files. Enable browser caching on your site so files won't need to be downloaded multiple times by the user's browser, slowing down page load times.
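Enabling browser caching usually comes down to sending a `Cache-Control` header with each static file. A minimal sketch of that decision; the max-age values and function name are illustrative assumptions:

```javascript
// A sketch of the idea behind browser caching for JS files: the server
// attaches a Cache-Control header so the browser can reuse a downloaded
// file instead of re-fetching it. The max-age values are illustrative.
function cacheControlFor(filePath) {
  if (filePath.endsWith(".js") || filePath.endsWith(".css")) {
    return "public, max-age=31536000"; // cache static assets for ~1 year
  }
  return "no-cache"; // always revalidate HTML so users get fresh content
}

// A server would set this on each response, e.g. with Node's http module:
// res.setHeader("Cache-Control", cacheControlFor(req.url));

console.log(cacheControlFor("/static/app.js")); // "public, max-age=31536000"
console.log(cacheControlFor("/index.html"));    // "no-cache"
```

Long max-age values are usually paired with versioned file names (e.g., `app.3f2a1b.js`), so a new deploy naturally bypasses the old cached copy.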

Alternatively, click "Why and how to fix it" next to each issue type to learn how to address it.

450 issues with unminified JavaScript and CSS files highlighted with pop up describing the issue and how to fix it.

Sign up for a free Semrush account today to start improving your JS website's technical and organic performance.
