Relying heavily on JavaScript to render page content and provide navigation brings well-known technical SEO risks, particularly around indexing and link discovery.
While Googlebot is adept at understanding the structure of HTML links, it can have difficulty finding its way around sites that use JavaScript for navigation or that render content with JavaScript frameworks. Google has mentioned that it is working on a system for indexing AJAX content, but none of the major search engines currently index dynamic content as a rule.
Common issues
- Indexable URLs: Pages still need unique, indexable URLs. There must be a real page, returning a 200 OK server response, for each individual “page” you want indexed. A single-page app needs server-side URLs for each category, article, or product.
- Getting pushState right: Use pushState to represent a URL change, but make sure it pushes the canonical URL that has server-side support. pushState mistakes and a loose server-side implementation can create duplicate content.
- Missing content tags: Pages still need titles, meta descriptions, Open Graph tags, robots meta tags, clean URLs, alt attributes, and so on. Audit the rendered page using Inspect Element; the same standards that apply to HTML pages apply to JavaScript-rendered content.
- A HREF and IMG SRC: Pages still need links pointing to them. Google’s crawl and discovery processes are broadly the same as for static sites, so put links in href attributes and images in src attributes. Google struggles with alternative approaches, such as putting URLs in data attributes instead of the standard HTML attributes.
- Multiple versions: JavaScript rendering creates two versions of a page (the pre-DOM HTML source and the post-DOM rendered page), so minimise contradictions between them.
- Bot limitations: Many bots other than Googlebot struggle with JavaScript crawling. To mitigate this, we recommend putting titles, meta descriptions, social tags, and other technical SEO tags in the server-side HTML source.
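To illustrate the pushState and href points above, a minimal sketch (the route name, slug parameter, and render helper are illustrative assumptions, not from any specific framework):

```javascript
// Build the canonical, server-backed path for a product "page".
// The same path must return 200 OK with full HTML when requested directly.
function canonicalProductPath(slug) {
  return '/products/' + encodeURIComponent(slug);
}

// Client-side navigation: push the canonical URL, then render.
// Pushing anything other than the server-supported URL (stray
// parameters, fragments) is how duplicate content creeps in.
function navigateToProduct(slug) {
  history.pushState({ slug }, '', canonicalProductPath(slug));
  renderProduct(slug); // hypothetical client-side render step
}

// Crawlable markup uses real href/src attributes:
//   <a href="/products/blue-widget">Blue widget</a>      — crawlable
//   <a data-url="/products/blue-widget">Blue widget</a>  — Google may miss this
```

The key design point is that the URL pushed client-side and the URL the server responds to are the same string, built by the same function.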
How does pre-rendering work?
1. Renders your site’s JavaScript content in a browser
2. Saves the resulting static HTML
3. Serves the rendered HTML version when a search engine crawls the page
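Step 3 can be sketched as a simple user-agent check (the bot pattern and snapshot store are illustrative assumptions; hosted pre-rendering services handle this for you):

```javascript
// Illustrative sketch of the "serve the snapshot to crawlers" step.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// snapshots maps paths to pre-rendered HTML saved in step 2,
// e.g. { '/products/blue-widget': '<html>…rendered…</html>' }
function chooseResponse(userAgent, path, snapshots, appShellHtml) {
  if (BOT_PATTERN.test(userAgent) && snapshots[path]) {
    return snapshots[path]; // static HTML saved at render time
  }
  return appShellHtml; // normal client-rendered app shell
}
```

The snapshot served to bots must match what users see; serving different content to crawlers risks being treated as cloaking.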
Google’s Stance
Initially Google supported pre-rendering; however, as of October 2015, Google prefers that you let it crawl the site naturally, with all JavaScript files accessible.
Despite the update, Google will still accept pre-rendering as long as the user experience and content of the pre-rendered and JavaScript versions are the same.
Best Practice principles when optimising JavaScript content (for Google):
- Content present by the load event (or within a 5-second timeout) is indexable.
- Content dependent on user events (such as clicks) is not indexable.
- Pages require an indexable URL, with server-side support.
- Audit rendered HTML (Inspect Element) using the same SEO best practices used on traditional pages.
- Avoid contradictions between the pre-DOM and post-DOM versions.
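As a rough illustration of the audit step, a small helper that checks a rendered HTML string for a few of the tags discussed above (the tag list is a minimal assumption; a real audit would use a proper parser and a fuller checklist):

```javascript
// Check a rendered HTML string for a few essential SEO tags.
// Regex-based for brevity; a real audit should parse the DOM.
function auditRenderedHtml(html) {
  const checks = {
    title: /<title>[^<]+<\/title>/i.test(html),
    metaDescription: /<meta\s+name=["']description["']/i.test(html),
    canonical: /<link\s+rel=["']canonical["']/i.test(html),
  };
  const missing = Object.keys(checks).filter((k) => !checks[k]);
  return { passed: missing.length === 0, missing };
}
```

Run this against the post-DOM HTML (what Inspect Element shows), not just the raw server response, so both versions get held to the same standard.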