SEO for React Websites

Updated on April 15th, 2024 at 10:47 pm

React websites aren’t inherently bad for SEO, but a poorly handled migration from a simpler HTML-based website to a JavaScript platform such as React or Vue.js can be incredibly damaging to your SEO performance.

React platforms are fundamentally different from websites built with PHP on a CMS like WordPress because they rely on JavaScript to build webpages.

Here is what you need to know from a technical SEO perspective, with specific examples of the challenges of running efficient SEO campaigns and performance optimisations on a website built with React or Angular, and guidance on whether you should choose a JavaScript framework like React for your established SEO website.

Why React sites are a challenge for search engines

The difficulty with React is that parsing JavaScript is a required step before the HTML and content of a webpage can be rendered.

Because React websites render code on the client side by default, you will notice a steep decline in rankings after moving from an HTML website if you don’t take steps to mitigate the platform’s default rendering process.

The problem with rendering JavaScript client side is that, while Google can, and normally eventually does, render all content, it takes additional time and resources to do so.

HTML pages, on the other hand, are quick and simple to crawl and render.

Sites that rely on JavaScript to render the content of web pages take longer for search engine spiders to crawl. If you’re serving a large website with many pages, moving to a React platform can be incredibly damaging to your SEO performance.


What is crawl budget?

Website crawl budget is best thought of as a “time spent crawling” allowance that search engine spiders monitor to ensure efficiency.

One of the most serious and potentially costly SEO issues with migrating your website’s content to a React or other JavaScript-powered platform is the impact on your website’s crawl budget.

Hypothetical Explanation of Crawl Budget

If your WordPress website takes an average of 0.89 seconds to crawl per request and you have 1,500 pages, your website’s total required crawl time (excluding resource requests) would be 1,335 seconds.

If Googlebot allocates a crawl budget of 46.28 seconds (46,280 ms), then you could expect 52 of your website’s pages to get crawled.

If your website slows and it takes Googlebot 1.35 seconds to crawl each page, it will take 70.2 seconds (70,200 ms) to crawl the same 52 pages.

Total crawl time required for every page on the WordPress site @ 0.89 s/page: 1,335 seconds (1,335,000 ms)

Total crawl time required for every page on the React site @ 1.35 s/page: 2,025 seconds (2,025,000 ms)
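
To make the arithmetic concrete, here is a minimal TypeScript sketch that reproduces the hypothetical figures above (all numbers are illustrative, not real Googlebot values):

```typescript
// Hypothetical crawl budget arithmetic; all figures are illustrative.
const pages = 1500;
const wordpressSecsPerPage = 0.89; // average crawl time per request (WordPress)
const reactSecsPerPage = 1.35;     // average crawl time per request (React)
const crawlBudgetSecs = 46.28;     // assumed per-visit crawl budget

// Total time needed to crawl the whole site at each speed
console.log(`WordPress total: ${(pages * wordpressSecsPerPage).toFixed(0)} s`); // 1335 s
console.log(`React total:     ${(pages * reactSecsPerPage).toFixed(0)} s`);     // 2025 s

// Pages crawled within one budget window at each speed
console.log(`WordPress pages per window: ${Math.floor(crawlBudgetSecs / wordpressSecsPerPage)}`); // ~52
console.log(`React pages per window:     ${Math.floor(crawlBudgetSecs / reactSecsPerPage)}`);     // ~34
```

The same crawl budget covers roughly a third fewer pages per visit once each page takes 1.35 seconds instead of 0.89.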

This kind of change could have a hugely detrimental effect on the crawl budget of your website.

Other search engines, like Bing and DuckDuckGo, aren’t as advanced as Google, so rendering JavaScript can be an even bigger challenge for them.

The numbers above are all hypothetical: search engines don’t confirm crawl budgets or allocate a published amount of time to each website (and if they do, they don’t specify the allocation, although you can see recent crawl averages in Google Search Console).

How to detect JavaScript issues

You can identify websites running on JavaScript-based platforms by loading them in a browser with JavaScript disabled.

The webpage shown below is built on React and shows no content at all with JavaScript disabled. You can see how this would be problematic for a search engine, increasing the crawl time required for each page and reducing the overall crawl budget available to the website.

Where there was once a clean, easy-to-process HTML page, there is now a JavaScript-based page that must be rendered separately, slowing the progress of search engine spiders considerably.

By building a website on a JavaScript platform, all of your pages must pass through Google’s render queue before they can be indexed and ranked. How quickly pages move through that queue depends in part on how well your JavaScript is optimised to deliver code. On a React site, the rendering process can take weeks or even months for single pages to get through.

[Image: Google’s crawl, render, processing and indexing queue.]

[Image: a React webpage showing an empty content area with JavaScript disabled.]

With JavaScript enabled, the page above offers a very different content density and user experience from what is available to search engines.

There’s no page content shown at all, and the <body> tag contains a single empty <div>. For a search engine crawling this page, the only content available on the first indexing wave is the title tag and meta description in the <head>, plus the URL.

The source code for this particular React page (shown above) with JavaScript disabled contains only two elements accessible to search engines: the title tag and the meta description of the page (the description is empty here because the page is noindexed).

Critical webpage elements that help with ranking, like the H1, body content and images, are not available until the page has been rendered.

Having your website ranked only by what’s included in the title tag and meta description is an ineffective SEO strategy in 2022 and will not produce the results you want.

Checking Sites with JavaScript Enabled

[Image: empty <body> tag in the source code of a React website.]

Even in a browser with JavaScript enabled, the page shows no HTML body content at all when checking the source code (above).

The source code of a page is generally an effective way to identify what search engines will see when they first crawl a webpage, before any rendering takes place.
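
If you want to check this at the command line rather than in a browser, a small script can fetch the raw HTML, which is roughly what a crawler receives on its first pass, and measure how much body text is present before any JavaScript runs. A minimal TypeScript sketch for Node 18+ (the URL is a placeholder):

```typescript
// Fetch the raw, pre-render HTML of a page, the same payload a crawler
// receives before any JavaScript runs, and report how much body text it has.
async function checkRawContent(url: string): Promise<void> {
  const res = await fetch(url);
  const html = await res.text();

  // Crude extraction of the <body> contents with scripts and tags stripped out.
  const body = html.match(/<body[^>]*>([\s\S]*)<\/body>/i)?.[1] ?? "";
  const text = body
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();

  console.log(`${url}: ${text.length} characters of body text before rendering`);
}

checkRawContent("https://example.com/"); // placeholder URL
```

A client-side-rendered React shell will typically report close to zero characters, while a server-rendered or prerendered page reports the full content.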

What the problems are

Rather than waiting hours or days for page changes or updates to be detected by search engines, with a React website you could be waiting weeks or months, during which the only available ranking signals are the meta content (not enough data for any website to compete effectively in modern search engines).

The impacts of migrating to a React-based site are:

– issues with crawl budget for SEO

– issues with content discovery for search engines

– issues with content coverage by search engines

– removal and obstruction of on-page ranking factors (H1s, content, internal links)

– general page speed issues

How to detect your issues

Diagnosing JavaScript issues caused by your React CMS isn’t too difficult.

Use Google tools like:

– the Mobile-Friendly Test tool

– the URL Inspection tool in Google Search Console

These tools will help you visualise webpage content as Google sees it. If your pages show a low proportion of the content you normally see, you’ll need to identify how you can expose more of it.

What are the Solutions?

Fortunately, there are solutions to the rendering issues that React websites can create, and many of them come in the form of server-side rendering, static site generation and static content rehydration. These methods allow you to successfully show content-rich pages to search engines and users alike while still enjoying all of the benefits of React.

You’ll need the help of your technical team and engineers to ensure the below steps are correctly implemented.

Use Prerender.io – this tool prerenders and caches a fully rendered version of your page, which is served to search engines like Google when they request a URL.

This works because your server can tell real users apart from bots and serve each the appropriate version.

Prerender.io is effective, cost-efficient and essential for any site looking to launch a React or Vue.js based website. You can even set a recaching frequency so that Prerender.io detects changes to your pages and updates the static cached version when you wish.
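
As an illustration of how this slots in, sites running a Node/Express server often use the prerender-node middleware, which detects crawler user-agents and proxies their requests to Prerender.io. A minimal sketch (the token and port are placeholders):

```typescript
import express from "express";
import prerender from "prerender-node";

const app = express();

// Crawler user-agents are detected and proxied to Prerender.io,
// which returns the cached, fully rendered HTML for the URL.
app.use(prerender.set("prerenderToken", "YOUR_PRERENDER_TOKEN")); // placeholder token

// Real users fall through to the normal client-side React bundle.
app.use(express.static("build"));

app.listen(3000);
```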

In addition to prerendering, you should also consider the following methods, depending upon the nature of your React website:

Pre-rendering with rehydration: you can pre-render content on servers around the world using CDNs, so it’s pre-rendered even before the user requests it. Once a user requests the page, code splitting can take place on the server.

Server-side rendering with rehydration: you can create web applications that render on the server side, while subsequent user requests are handled client side. Your webpage is interactive without you having to render anything too hefty client side for search engines. You’re delivering a built framework, and any additional user requests for interactive elements are rendered client side (see the sketch after this list).

Server-side rendering to static content: this works for simple sites where any user interactivity can easily be served within the DOM without requiring a round trip to the server.

Isomorphic React: isomorphic React allows code to run both server side and client side, which benefits search engines, by rendering code on the server, and clients, who can render code client side (or a mix of both) provided your app supports this. This means you can handle user interactions in the app without losing out on search performance.
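
To make the server-side rendering with rehydration option more tangible, here is a minimal sketch using React’s own renderToString on the server and hydrateRoot on the client (the App component, file names and port are assumptions):

```tsx
// server.tsx – render the React tree to HTML so crawlers receive real content
import express from "express";
import React from "react";
import { renderToString } from "react-dom/server";
import App from "./App"; // assumed application root component

const app = express();
app.use(express.static("public")); // serves the compiled client.js bundle

app.get("*", (_req, res) => {
  const markup = renderToString(<App />); // full HTML, visible without JavaScript
  res.send(`<!doctype html>
<html>
  <head><title>Example</title></head>
  <body>
    <div id="root">${markup}</div>
    <script src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```

On the client, hydrateRoot(document.getElementById("root")!, <App />) from react-dom/client then attaches event handlers to the server-rendered markup instead of rebuilding the page from scratch.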

Should you choose React for your website?

Ultimately, however efficient React can make website production and development, the impacts on SEO must be considered by anyone thinking about migrating a website to React.

If you and your engineering team truly believe that a React-based platform is the best route forward in terms of efficiency and development, then make sure you take the required steps and follow the processes above to ensure your website remains fully visible and indexable to search engines.

Published on: 15/11/2022

About Ben

Ben is the founder and SEO director of Search Natural. He spent 8 years working in SEO at some of the biggest comparison sites in the UK before setting up his own business to work as an SEO specialist with clients around the world.