SEO benefits of Next.js in 2024
By Max Ikaheimo
October 11th, 2022
*This post was written over two years ago, and while it reflects the best information available at that time, some details may have changed.*
*Updated in August 2024*
A widespread opinion holds that React is poorly suited for search engine optimization. The assumption is that a React app doesn't ship meaningful HTML to the browser, so there is nothing for crawlers to index.
If it’s such a lousy fit for SEO, why do websites use React? Well, popular search engines like Google have executed JavaScript since the late 2000s. But it doesn’t solve all React’s SEO issues.
Luckily, a framework called Next.js makes these problems non-existent.
Here you will find a concise yet complete description of the traits making Next.js perfect for SEO. But first, let’s ensure you understand how search engines and Next.js SEO work.
What’s important for a search engine?
When it comes to content, search engines care about the experience you provide. They aim to favor websites that offer something genuinely useful to users and are carefully designed for a good user experience. This is especially true after Google's Helpful Content update, which advocates for content that is genuine and useful.
To do so, Google stepped into the users' shoes and formally defined the metrics that influence user experience. These are called Core Web Vitals. As of 2024, Google continues to weigh Core Web Vitals heavily in its rankings; the current metrics are Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP), which replaced First Input Delay (FID) in March 2024. Since the Page Experience update folded these metrics into Google's core ranking systems, continuously optimizing them is critical for SEO.
They measure waiting time, the quality of the visual experience, security, and other parameters that Google considers important. Other search engines may calculate their metrics differently, but they all measure similar things.
The Core Web Vitals are measured when the page is rendered. Next, the crawler goes through the page, looks at the metadata, and indexes the page accordingly. Then it finds links on the page and crawls them.
To make a crawler happy, a website should expose relevant metadata, and its metrics scores should be high.
But how to do that? It depends on the rendering strategy you choose.
What about Google's Helpful Content update?
Since mid-2022, Google has slowly but surely rolled out the Helpful Content update. It's an algorithm Google keeps developing with a simple intention: to favor websites that are genuinely helpful and bring value to the user's online experience, rather than to credit websites with irrelevant content created purely to rank in the search engine.
While this blog post discusses the technicalities you should employ to maximize your SEO, it's important to stress that, in addition to technical optimization, you should always prioritize a genuine user experience. That simply means providing useful content for users. And if you are hoping to outsmart Google with dishonest SEO tricks, those days are over.
Client-side: HTML or JavaScript
The times of HTML websites crafted by hand are long gone. Most pages are generated by running JS scripts on the server or the client. Which strategy of the two is better for SEO, and how does it influence the metrics?
Client side rendering (CSR) happens when your browser receives JavaScript from the server, executes it, and displays the resulting HTML.
On the other hand, in server-side rendering (SSR), you don’t run scripts in the browser. They are executed on the server each time a user requests a page. Only the resulting HTML is sent to the client.
In the past, search engines could not execute JavaScript. Neither page descriptions nor URLs for further crawling were available to the bot, because bots could only parse HTML.
As early as 2008, Google introduced JavaScript execution, making CSR pages indexable. Rendering wasn't perfect at the beginning but has improved over time. Some search engines still can't crawl JavaScript, but their market share is small.
You may ask, why does it still matter where you are rendering your website if there’s the same HTML in your browser anyway? Read further to find the answer.
Longer initial load-time in CSR
With SSR, your website's initial load time is shorter, so you score better on Core Web Vitals. Since browsing often happens on devices with meager computing capacity, such as smartphones, executing JS on the client side takes more time than on the server.
With a single-page application (SPA), the load time gets even longer because you're loading and executing all of the website's content at once.
Moreover, the rendering request may timeout in the Web Rendering Service queue, or the code may not be executed due to errors.
Additionally, Google crawls client-rendered websites in two waves. First, it indexes whatever HTML is already present, adds the discovered links to the crawl queue, and records the response codes. In the second wave, it executes the JS scripts and indexes the page fully.
But the default processing capacity may not be enough, so Googlebot may have to request extra resources. When these become available, the crawler comes back to finish the job. That second pass may happen hours or even a week later!
Poor link discovery in CSR
If your SPA is client-rendered, Google's bots can't tell that your website has multiple paths, because an SPA looks like a single page to the crawler.
What looks like switching between pages to a user is, in fact, switching between fragments of the document object model (DOM). Unfortunately, search engines don't crawl fragments as individual URLs, so you'll need a workaround. There is one: you can use the History API to give each view a real URL. But it comes at a cost in development time.
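As a rough illustration of that workaround, here's a minimal plain-TypeScript sketch (the `renderView` helper is hypothetical) that uses the History API so each view of a hand-rolled SPA gets its own real URL:

```ts
// Minimal SPA routing sketch: give each view its own URL via the History API.
// `renderView` is a hypothetical helper that swaps DOM fragments for the sketch.
function renderView(path: string): void {
  const app = document.getElementById("app");
  if (app) app.textContent = `Current view: ${path}`;
}

export function navigate(path: string): void {
  // Push a real URL into the address bar instead of silently swapping fragments.
  window.history.pushState({}, "", path);
  renderView(path);
}

// Re-render the matching view when the user presses Back or Forward.
window.addEventListener("popstate", () => renderView(window.location.pathname));
```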
Okay, let’s say you got your paths crawled. Will the crawler obtain the relevant titles and descriptions to index the pages correctly?
In a plain React app this is a problem, because all the content lives on a single page. How do you assign a distinct head with its own meta tags to each fragment? There is a workaround here too, but the extra JS code takes time to execute, so the load time increases.
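For instance, a common version of that workaround in a plain React SPA is a head-management library such as react-helmet; a minimal sketch (the component and its props are illustrative):

```tsx
import { Helmet } from "react-helmet";

// Hypothetical product view inside a plain React SPA: react-helmet swaps the
// document head when this fragment is shown, at the cost of extra client-side JS.
export function ProductView({ name, summary }: { name: string; summary: string }) {
  return (
    <>
      <Helmet>
        <title>{`${name} | Example Shop`}</title>
        <meta name="description" content={summary} />
      </Helmet>
      <h1>{name}</h1>
      <p>{summary}</p>
    </>
  );
}
```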
Client-rendered SPAs are still advantageous for inner business projects with no SEO concerns, like back offices or dashboards. They have a higher load speed when switching between pages because there’s no need to send more requests to the server and wait for the answers: all the content is already preloaded.
Comparing SEO capabilities: Next.js, Gatsby, and Nuxt.js
| Feature | Next.js | Gatsby | Nuxt.js |
| --- | --- | --- | --- |
| Underlying Library | React | React | Vue.js |
| Server-Side Rendering | Yes (SSR) | No | Yes (SSR) |
| Static Site Generation | Yes (SSG) | Yes (SSG) | Yes (SSG) |
| Incremental Regeneration | Yes (ISR) | No | No |
| Dynamic Routing | Yes | Yes | Yes |
| Dynamic Imports | Yes | Yes (via plugins) | Yes |
| HTTP/2 Support | Yes | No | No |
| PWA Support | Yes (via plugins) | Yes | Yes (via plugins) |
| Performance Optimizations | Yes | Yes | Yes |
| Head Management | Yes (via plugins) | Yes (via plugins) | Yes (via vue-meta) |
Why Next.js is highly suitable for SEO
Next.js comes with several SEO-enhancing features out of the box, and many libraries add further value to its SEO capabilities. Let's explore how internal and external tools help solve the page discoverability and initial load time problems.
Discoverable links and meaningful descriptions
For crawlers to find your website's pages, the paths should be clear, and the URL structure should be simple and follow a consistent pattern.
The good news is that Next.js handles the URL structure for you and does so consistently. It also allows for dynamic routes, so you don't have to rely on URL parameters (which Google doesn't like). Moreover, search engines penalize you for duplicate links leading to the same page. To fix that, you can declare a canonical URL with a `<link rel="canonical">` tag in the page's head.
Remember that a client-rendered single page application may have problems providing a unique title and description for each page?
Well, server-side rendering solves this problem. Thanks to it, pages are rendered and returned upon request, so unique head and meta tags can be inserted each time. Additionally, the `next/head` component allows for dynamic head rendering, meaning your meta tags stay relevant even for dynamically generated pages, like the ones representing products in a shop.
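For example, here's a minimal sketch of a dynamic product route (the route, data, and domain are placeholders) that sets its own title, description, and canonical URL with `next/head`:

```tsx
// pages/products/[slug].tsx — illustrative dynamic route with per-page metadata
import Head from "next/head";
import type { GetServerSideProps } from "next";

type Props = { slug: string; name: string; summary: string };

export const getServerSideProps: GetServerSideProps<Props> = async ({ params }) => {
  const slug = String(params?.slug);
  // A real app would fetch product data here; hard-coded for the sketch.
  return { props: { slug, name: `Product ${slug}`, summary: `Details about ${slug}.` } };
};

export default function ProductPage({ slug, name, summary }: Props) {
  return (
    <>
      <Head>
        <title>{`${name} | Example Shop`}</title>
        <meta name="description" content={summary} />
        <link rel="canonical" href={`https://example.com/products/${slug}`} />
      </Head>
      <h1>{name}</h1>
      <p>{summary}</p>
    </>
  );
}
```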
Besides, many Next.js libraries provide support for SEO-improving protocols. Here are some examples.
Sitemap protocol
If your website is big, many pages may not be linked from anywhere a crawler can reach. To get them crawled, you should upload a sitemap to Google. A sitemap is an XML file describing all paths inside your website. Its purpose is to tell a crawler about the pages it wouldn't find otherwise.
By and large, if you add extra pages to your site, there's no need to update your sitemap manually. Next.js allows for dynamically generated sitemaps: all updates are reflected in the XML file as soon as you change your routes. You can build your own sitemap component or use next-sitemap.
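With next-sitemap, for example, the setup is usually just a small config file plus a postbuild step; a minimal sketch (the domain is a placeholder):

```js
// next-sitemap.config.js — minimal sketch
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: "https://example.com", // your production domain
  generateRobotsTxt: true, // also emit a robots.txt that points to the sitemap
};
```

Adding `"postbuild": "next-sitemap"` to your package.json scripts regenerates the sitemap after every build, so route changes are picked up automatically.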
Structured metadata
There's another way to help Google understand the content of your web page better: incorporate a small script into your HTML that describes the page content in the JSON-LD format, giving machines readable information about the website.
In return, Google rewards you with better treatment in search, such as eligibility for rich results. In Next.js, you can easily make use of the JSON-LD structured data format.
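A minimal sketch of embedding JSON-LD on a Next.js page (the product data is made up):

```tsx
import Head from "next/head";

// Illustrative JSON-LD object describing a product for search engines.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Chair",
  description: "A sturdy oak chair.",
};

export default function ProductPage() {
  return (
    <>
      <Head>
        {/* JSON-LD must be embedded as a raw string inside a script tag. */}
        <script
          type="application/ld+json"
          dangerouslySetInnerHTML={{ __html: JSON.stringify(productJsonLd) }}
        />
      </Head>
      <h1>Example Chair</h1>
    </>
  );
}
```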
Open Graph protocol
You can also add Open Graph protocol support. Open Graph tags have a lot in common with SEO tags, but they don't directly improve your position in search results.
Instead, by providing images and descriptions, they make your pages more visible on social networks, which may indirectly influence your ranking. Here is how you can include dynamic content and OG images in your Next.js project.
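For instance, a minimal sketch with `next/head` (the URLs and copy are placeholders; the next-seo library covered later in this post offers a more declarative `openGraph` prop for the same job):

```tsx
import Head from "next/head";

// Illustrative Open Graph tags so shared links get a title, description, and image preview.
export default function ArticlePage() {
  return (
    <>
      <Head>
        <meta property="og:type" content="article" />
        <meta property="og:title" content="SEO benefits of Next.js" />
        <meta property="og:description" content="Why Next.js is a strong fit for SEO." />
        <meta property="og:image" content="https://example.com/og/nextjs-seo.png" />
        <meta property="og:url" content="https://example.com/blog/nextjs-seo" />
      </Head>
      <article>
        <h1>SEO benefits of Next.js</h1>
      </article>
    </>
  );
}
```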
Even shorter load time
As mentioned above, rendering HTML on the server is faster.
However, with SSR there's still JavaScript on the client that lets the website operate as an SPA, but far fewer scripts run, so execution time is shorter. Also, no two-step crawling is needed for Next.js, because extra resources for code execution are not requested, which allows search engines to crawl your website more often.
What’s more, Next.js 10 offers automatic prefetching: the browser downloads the links’ content when they appear in the page’s viewport, which makes switching between pages faster.
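For reference, prefetching kicks in automatically for `next/link` links that enter the viewport in production builds; a minimal sketch using the Next.js 13+ `Link` syntax:

```tsx
import Link from "next/link";

// Links rendered with next/link are prefetched automatically when they scroll
// into the viewport (in production), so the next navigation feels instant.
export default function Nav() {
  return (
    <nav>
      <Link href="/pricing">Pricing</Link>
      {/* Prefetching can also be turned off per link if a page is rarely visited. */}
      <Link href="/changelog" prefetch={false}>
        Changelog
      </Link>
    </nav>
  );
}
```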
Probably the best thing about Next.js is its ability to combine rendering strategies. In case you need dynamically generated content, you can use SSR. In that way, your HTML will be rendered each time upon request.
Suppose your website's content stays unchanged during the user's web journey. In that case, you can use Static Site Generation (SSG), which renders all the pages at build time and then serves the ready-made HTML for each request. If you have lots of individual pages and still want the benefits of static pages, you can go for Incremental Static Regeneration (ISR), which regenerates individual static pages after deployment instead of rebuilding the whole site.
Each strategy can adjust the metrics for a particular use case!
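To make the ISR option concrete, here's a minimal sketch (the data source, paths, and revalidation interval are illustrative):

```tsx
// pages/posts/[id].tsx — ISR sketch: statically generated, re-rendered in the background
import type { GetStaticPaths, GetStaticProps } from "next";

type Props = { id: string; title: string };

export const getStaticPaths: GetStaticPaths = async () => {
  // Pre-build a few known pages; generate the rest on their first request.
  return { paths: [{ params: { id: "1" } }], fallback: "blocking" };
};

export const getStaticProps: GetStaticProps<Props> = async ({ params }) => {
  const id = String(params?.id);
  // A real app would fetch CMS data here; hard-coded for the sketch.
  return {
    props: { id, title: `Post ${id}` },
    revalidate: 60, // regenerate this page in the background at most once per minute
  };
};

export default function Post({ title }: Props) {
  return <h1>{title}</h1>;
}
```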
Additionally, Next.js comes with HTTP/2 support. HTTP/2 is a major revision of the HTTP protocol that aims to speed up internet exchanges between clients and servers. Using it positively affects the load speed.
Loading things when you need them
Next.js lets you load resources only when you need them. If you have heavy images on your site, you can use lazy loading, which defers loading an image until it appears in the viewport.
That's what Pinterest does to save you time. Likewise, you can import specific components only when they are needed; this is called dynamic imports. Your code is split into small chunks that are loaded on demand.
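Here's a minimal sketch of both techniques (the component path and image dimensions are made up):

```tsx
import Image from "next/image";
import dynamic from "next/dynamic";

// Dynamic import: this hypothetical heavy chart component is split into its own
// chunk and only downloaded when the page actually renders it.
const HeavyChart = dynamic(() => import("../components/HeavyChart"), { ssr: false });

export default function Dashboard() {
  return (
    <main>
      {/* next/image lazy-loads by default: the file is fetched as it nears the viewport. */}
      <Image src="/hero.jpg" alt="Hero illustration" width={1200} height={600} />
      <HeavyChart />
    </main>
  );
}
```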
Chances are you will use third-party scripts on your website for marketing purposes. But they can be slow. In Next.js, the `next/script` component performs script optimization: you can decide whether to fetch and execute a script immediately or only after all the page content has finished loading.
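For example, a third-party analytics script can be deferred until the page is idle (the script URL is a placeholder):

```tsx
import Script from "next/script";
import type { ReactNode } from "react";

export default function Layout({ children }: { children: ReactNode }) {
  return (
    <>
      {children}
      {/* "lazyOnload" fetches and runs the script only after the page has finished loading;
          the default "afterInteractive" strategy would run it right after hydration. */}
      <Script src="https://example.com/analytics.js" strategy="lazyOnload" />
    </>
  );
}
```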
Best practices for Next.js SEO
Server-side rendering (SSR): Using Next.js' built-in SSR capabilities can ensure that search engine crawlers can easily crawl and index your content. This can be achieved by using the `getServerSideProps` or `getStaticProps` methods, which allow you to pre-render a page on the server and return the resulting HTML to the browser.
Dynamic rendering: In some cases, it may be more beneficial to use dynamic rendering, a technique that serves a pre-rendered version of a page to search engine crawlers and a client-rendered version to users. This can be achieved by using the `getServerSideProps` method and checking the user-agent of the request to determine whether it is coming from a search engine crawler.
Meta tags: It's important to ensure that your Next.js site includes relevant and accurate meta tags, such as a title and description, that help search engines understand the content of your pages. This can be achieved by using a library such as `react-helmet` to manage your meta tags.
Sitemaps: Next.js provides a way to create a sitemap.xml that you can submit to search engines to help them find all the pages on your site. This can be achieved using the `next-sitemap` library to generate a sitemap for your site.
Pre-rendering: Next.js provides a way to pre-render the pages that are less likely to change over time. This way, the pages will be pre-built and ready for search engines to crawl. This can be achieved using the `getStaticProps` method to pre-render pages at build time.
Structured data: If you're using Next.js to build an e-commerce site, it's important to use structured data to help search engines understand the products and services on your site. This can be achieved using a library such as `next-seo` to generate structured data for your pages.
Optimize images: Optimizing images can help improve your site's performance and reduce the time it takes for pages to load. This can be achieved using a tool such as `imagemin` to compress and optimize images.
Lighthouse: Lighthouse is a tool that helps identify opportunities to improve the performance, accessibility, and SEO of your site. Use Lighthouse regularly to check your site for issues impacting your SEO.
Handle redirects: Set up proper redirects (301 or 302) for your Next.js application to avoid broken links and maintain the flow of link equity. This practice is crucial when you change the URL structure or move pages around within your website (see the config sketch after this list).
Create a custom 404 page: Design a custom 404 page for your Next.js application to help users navigate your website when they encounter broken links or unavailable pages. A well-designed 404 page can reduce bounce rates and improve user experience, contributing to better SEO.
Code-splitting: Next.js supports automatic code-splitting, which helps you split your code into smaller chunks that are loaded on-demand. By utilizing code-splitting, you can improve the performance of your website, leading to faster load times and better user experience, which can positively impact your SEO.
Use a Content Delivery Network (CDN): A CDN helps deliver your website's content faster by caching and serving it from servers that are geographically closer to your users. Implementing a CDN for your Next.js application can significantly reduce load times, improving both user experience and SEO performance.
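As promised in the redirects item above, here's a minimal sketch of permanent redirects declared in next.config.js (the paths are placeholders):

```js
// next.config.js — redirect an old URL structure to the new one
/** @type {import('next').NextConfig} */
module.exports = {
  async redirects() {
    return [
      {
        source: "/blog/old-post",
        destination: "/articles/old-post",
        permanent: true, // responds with a 308, which search engines treat like a 301
      },
    ];
  },
};
```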
SEO plugins and libraries for Next.js
To enhance your Next.js project's SEO performance, you can take advantage of various plugins and libraries designed specifically for this purpose. Here's a list of some popular Next.js plugins and libraries, along with their key features and benefits:
next-seo: This plugin provides a set of SEO-related components and utilities to simplify the management of SEO metadata. With next-seo, you can easily add and configure meta tags, Open Graph tags, Twitter cards, and JSON-LD structured data to improve your website's search engine ranking.
Key Features:
Easy-to-use components for adding and managing SEO metadata
Support for Open Graph, Twitter cards, and JSON-LD
Customizable default settings for different pages
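A minimal usage sketch (all values are placeholders):

```tsx
import { NextSeo } from "next-seo";

export default function AboutPage() {
  return (
    <>
      <NextSeo
        title="About us | Example"
        description="Who we are and what we do."
        canonical="https://example.com/about"
        openGraph={{
          url: "https://example.com/about",
          title: "About us | Example",
          description: "Who we are and what we do.",
          images: [{ url: "https://example.com/og/about.png" }],
        }}
      />
      <h1>About us</h1>
    </>
  );
}
```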
next-optimized-images: This plugin helps you automatically optimize images in your Next.js project to improve website performance and SEO. By optimizing images, you can reduce the size of your web pages, which leads to faster loading times and better user experience.
Key Features:
Supports various image formats like JPEG, PNG, SVG, and WebP
Automatic optimization of images during build time
Customizable optimization settings and quality levels
next-i18next: This library provides an internationalization (i18n) solution for Next.js projects, allowing you to create multilingual websites easily. A well-implemented i18n strategy can improve your website's SEO by making it accessible to users from different countries and regions.
Key Features:
Seamless integration with Next.js
Server-side rendering (SSR) support for translations
Simple configuration and usage in your Next.js project
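A minimal configuration sketch, assuming the usual next-i18next setup of a shared config file that next.config.js reuses (the locales are placeholders):

```js
// next-i18next.config.js — shared locale settings
module.exports = {
  i18n: {
    defaultLocale: "en",
    locales: ["en", "fi"],
  },
};
```

next.config.js then imports the same `i18n` object so Next.js generates locale-prefixed routes, and the custom App component is wrapped with `appWithTranslation` from next-i18next.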
Potential downsides and limitations for Next.js SEO
Server-side rendering (SSR) can increase the complexity of your application: SSR can add an additional layer of complexity to your application, as it requires you to handle both server-side and client-side rendering. This can make your application more difficult to debug and maintain.
Dynamic rendering can increase the complexity of your code: Dynamic rendering can also add complexity to your code, as it requires you to handle different scenarios depending on whether the request is coming from a search engine crawler or a user.
SEO issues with client-side routing: Next.js uses client-side routing for navigation between pages, which can cause SEO issues if not handled properly. Search engines cannot crawl and index content that is only reachable through client-side transitions. To mitigate this, make sure every route is backed by a server-rendered or statically generated page so crawlers receive full HTML.
Limited control over the rendered HTML: Although Next.js provides a way to customize the rendered HTML of your pages, you may have limited control over the final output, as the framework handles many aspects of the rendering process.
SEO optimization can be time-consuming: SEO optimization can be a time-consuming process, as it requires you to carefully consider various factors such as meta tags, structured data, and page performance.
Limited support for older browsers: Next.js is built on top of modern web technologies, which may not be fully compatible with older browsers. This can limit your site's reach, as users on older browsers may not be able to access your site's content.
Closing thoughts
It's now evident that client-side rendering is still detrimental to SEO. While executing JavaScript, errors can arise, preventing the crawler from reaching your meta tags and internal URLs. Additionally, heavy client-side JS hurts your ranking because it slows down page loading.
Next.js offers various combinations of server-side rendering strategies that fix most of the issues with SPAs. Powered by it, your website will have a clear URL structure and load large components on demand. Additionally, you can find libraries that help you enhance your SEO strategy.
All in all, if you are planning to build an SEO-oriented website, consider using Next.js.