16 min read
May 21, 2021

Historically, web developers used HTML for content, CSS for styling, and JavaScript for interactivity. It's JS that makes pop-up dialog boxes and expandable content possible on web pages. Today, over 97% of all sites use JavaScript because it allows web content to change in response to user actions.

A relatively new trend in incorporating JS into websites is single-page applications. While traditional websites request their resources (HTML, CSS, JS) from the server each time they're needed, SPAs require just one initial load and don't bother the server after that, leaving all the processing to the browser. This makes for faster websites but can be a disaster for SEO.

In this post, we’ll discuss how SPAs are made, why they are so hard to optimize, and how to make sure search engines can understand them and rank them well.

What is an SPA

A single-page application, or SPA, is a JavaScript-based approach to website development that requires no further page loads after the initial one. React, Angular, and Vue are the most popular JavaScript frameworks for building SPAs. They differ mostly in their libraries and APIs but follow the same logic of fast client-side rendering. Many high-profile websites (Twitter, Pinterest, Airbnb) are built as single-page applications.
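The core idea can be sketched without any framework: one HTML shell plus a client-side router that swaps content in place instead of triggering full page loads. Below is a minimal, framework-free sketch; the route table and view functions are illustrative, not taken from React, Angular, or Vue:

```javascript
// A minimal client-side "router": maps a URL hash to an HTML fragment.
// In a real SPA, a framework does this with components and a virtual DOM;
// here, plain functions stand in for views.
const routes = {
  "#/": () => "<h1>Home</h1><p>Welcome!</p>",
  "#/about": () => "<h1>About</h1><p>We build SPAs.</p>",
};

function render(hash) {
  const view = routes[hash] || (() => "<h1>404</h1>");
  // In the browser you would assign this to a mount point instead:
  // document.getElementById("app").innerHTML = view();
  return view();
}

// In the browser, navigation re-renders without reloading the page:
// window.addEventListener("hashchange", () => render(location.hash));

console.log(render("#/about")); // prints the "About" fragment
```

Notice that the server is never contacted after the initial load: every "page" the user sees is produced by JavaScript in the browser.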

An SPA eliminates most requests between the browser and the server, making the site much faster. But search engines are not so thrilled about this JavaScript trick. The problem is that search engines don't get enough content: they don't click around like real users do, and they don't see that content is being added dynamically. What they're left with is a blank page yet to be filled.
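To see why, compare what a user's browser ends up with against what a crawler that doesn't execute JavaScript receives. The sketch below simulates this; both the sample server response and the naive text extractor are illustrative, not a real crawler implementation:

```javascript
// Typical initial server response of an SPA: an empty mount point
// plus a script that will build the page in the browser.
const serverResponse = `
<!DOCTYPE html>
<html>
  <head><title>My SPA</title></head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// A naive "crawler" that does not execute JavaScript: it drops <script>
// elements and strips markup, keeping only the visible text.
function extractText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/g, "")
    .replace(/<[^>]+>/g, "")
    .trim();
}

// Only the static <title> text survives; the page body yields nothing,
// because all of its content would have been injected by bundle.js.
console.log(extractText(serverResponse)); // prints "My SPA"
```

A real user's browser runs `bundle.js` and sees a full page; a script-blind bot indexes an almost empty one.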

The mechanics behind SPAs

The technology behind SPAs favors end users: they can navigate between views without the discomfort of extra page loads and layout shifts. Because an SPA caches all its resources in local storage (after they are loaded on the initial request), users can continue browsing even on an unstable connection. Given these benefits, the technology is here to stay, even though it demands extra SEO effort.

Why is it hard to optimize SPAs

Before JS came to dominate web development, search engines crawled only the text-based content in HTML. As JS grew more popular, Google began working on interpreting JS resources and understanding the pages that rely on them. It has made significant improvements over the years, but there are still plenty of problems with how search crawlers see and access content on single-page applications.

There's little information on how other search engines handle single-page applications, but it's safe to say that none of them are crazy about JavaScript-reliant websites. If you're targeting search platforms beyond Google, you're in quite a pickle. A 2017 Moz experiment showed that only Google and, surprisingly, Ask were able to crawl JavaScript content, while all other search engines remained totally blind to JS. To date, no search engine besides Google has announced a breakthrough in understanding JS and single-page application websites. There are at least some official recommendations: Bing, for example, makes the same suggestion as Google: it encourages server-side prerendering, which lets Bingbot (and other crawlers) receive static HTML as the most complete and comprehensible version of the page.

Search bots failing to understand JavaScript

Crawling issues

HTML, which is easily crawlable by search engines, doesn’t contain much information on an SPA. It includes an external JavaScript file with the help of the