To understand why React’s default client-side rendering approach causes SEO issues, you first need to know how Google crawls, processes, and indexes pages.
We can summarize the basics of how this works in the following steps:
Crawling – Googlebot sends GET requests to a server for the URLs in the crawl queue and saves the response contents. Googlebot does this for HTML, JS, CSS, image files, and more.
Processing – This includes adding URLs found within <a href> links in the HTML to the crawl queue. It also includes queuing resource URLs (CSS/JS) found within <link> tags and images within <img src> tags. If Googlebot finds a noindex tag at this stage, the process stops: Googlebot won’t render the content, and Caffeine (Google’s indexer) won’t index it. (The snippet below this list sketches this step.)
Rendering – Googlebot executes JavaScript code with a headless Chromium browser to find additional content that exists in the rendered DOM but not in the HTML source. It does this for all HTML URLs.
Indexing – Caffeine takes the information from Googlebot, normalizes it (fixes broken HTML), and then tries to make sense of it all, precomputing some ranking signals ready for serving within a search result.
[Flowchart: how Google crawls, processes, and indexes pages]
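To make the processing step more concrete, here’s a minimal sketch of the two checks described in that list item: queuing URLs found in links and resource tags, and bailing out on a noindex tag. This is an illustration, not Google’s actual pipeline; it assumes Node.js with the cheerio HTML parser installed (npm i cheerio), and the function name is made up for the example.

```javascript
// A simplified model of the "processing" step (illustrative only).
const cheerio = require('cheerio');

function processHtml(html, baseUrl) {
  const $ = cheerio.load(html);

  // A noindex robots meta tag stops the process here:
  // the page won't be rendered or indexed.
  const robots = ($('meta[name="robots"]').attr('content') || '').toLowerCase();
  if (robots.includes('noindex')) {
    return { indexable: false, crawlQueue: [], resourceQueue: [] };
  }

  // Page URLs found within <a href> links go into the crawl queue.
  const crawlQueue = $('a[href]')
    .map((_, el) => new URL($(el).attr('href'), baseUrl).href)
    .get();

  // Resource URLs (CSS/JS in <link>/<script>, images in <img src>)
  // get queued as well.
  const resourceQueue = $('link[href], script[src], img[src]')
    .map((_, el) => new URL($(el).attr('href') || $(el).attr('src'), baseUrl).href)
    .get();

  return { indexable: true, crawlQueue, resourceQueue };
}
```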
SIDENOTE. If you want to learn more about this process, Google has a great explanation. Patrick Stox also has an article explaining the critical need-to-know JavaScript SEO facts.
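The rendering step is exactly where default client-side-rendered React pages are exposed: the raw HTML source is typically just an empty shell (a <div id="root"></div> plus script tags), and the actual content only appears in the DOM after JavaScript runs. You can see the gap for yourself by comparing the raw HTML response with what headless Chromium renders. Here’s a rough sketch using Puppeteer; the URL is a placeholder for your own app:

```javascript
// Compare what a crawler sees in the raw HTML source vs. the rendered DOM.
// Assumes Node.js 18+ (built-in fetch) and Puppeteer (npm i puppeteer).
const puppeteer = require('puppeteer');

const url = 'https://example.com/'; // placeholder - point at your React app

(async () => {
  // 1. The raw HTML source - what's available before the rendering step.
  //    For a client-side-rendered React app this is usually just an empty
  //    root div plus script tags.
  const rawHtml = await (await fetch(url)).text();

  // 2. The rendered DOM - what headless Chromium sees after executing JS.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedHtml = await page.content();
  await browser.close();

  console.log('Raw HTML length:     ', rawHtml.length);
  console.log('Rendered DOM length: ', renderedHtml.length);
  // A large gap between the two means most of your content depends on
  // the rendering step to be discovered at all.
})();
```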
Historically, issues with React and other JS libraries have been due to Google not handling the rendering step well.
Some examples include:
Not rendering JavaScript – It’s an older issue: Google only started rendering JavaScript in a limited way in 2008. Even then, it was still reliant on a crawling scheme for JavaScript sites, created in 2009. (Google has since deprecated the scheme.)
The rendering engine (Chromium) being out of date – This resulted in a lack of support for the latest browser and JavaScript features. If you used a JavaScript feature that Googlebot didn’t support, your page might not render correctly, which could negatively impact your content’s indexing.
Google had a rendering delay – In some cases, this could mean a delay of up to a few weeks, slowing down how quickly content changes reached the indexing stage. This would have ruled out relying on Google to render content for most sites.
Thankfully, Google has now resolved most of these issues. Googlebot is now evergreen, meaning it always supports the latest features of Chromium.
In addition, the median rendering delay is now just five seconds, as announced by Martin Splitt at the Chrome Developer Summit in November 2019:
“Last year Tom Greenaway and I were on this stage and telling you, ‘Well, you know, it can take up to a week, we are very sorry for this.’ Forget this, okay? Because the new numbers look a lot better. So we actually went over the numbers and found that, it turns out that at median, the time we spent between crawling and actually having rendered these results is – on median – it’s five seconds!”