SEO benefits of Next.js in 2024

By Max Ikaheimo

11th October, 2022

The opinion that React is poorly suited for search engine optimization (SEO) is widespread. It's assumed that a React app serves no meaningful HTML up front, so there's nothing for crawlers to index.

If it's such a lousy fit for SEO, why do more than 11 million websites use React? Well, popular search engines like Google have executed JavaScript since the late 2000s. But that alone doesn't solve all of React's SEO issues.

Luckily, a framework called Next.js eliminates most of these problems.

Here you will find a concise yet complete description of the traits that make Next.js a great fit for SEO. But first, let's make sure you understand how search engines evaluate pages and where Next.js fits in.


What’s important for a search engine?

Search engines care about user experience. They aim to put websites that are relevant and pleasant to navigate on the first results pages.

To do so, Google stepped into the users' shoes and formally defined the metrics that influence user experience. These are called Core Web Vitals.

They measure waiting time, the quality of the visual experience, security, and other parameters Google deems important. Other search engines may calculate their metrics differently, but they all measure similar things.

The Core Web Vitals are measured when the page is rendered. Next, the crawler goes through the page, looks at the metadata, and places the page in the index accordingly. Then it finds links on the page and crawls them.

To make a crawler happy, a website should expose relevant metadata, and its metric scores should be high.

But how do you do that? It depends on the rendering strategy you choose.

Client-side or server-side: HTML or JavaScript?

The times of HTML websites crafted by hand are long gone. Most pages are generated by running JS scripts on the server or on the client. Which of the two strategies is better for SEO, and how does each influence the metrics?

Client side rendering (CSR) happens when your browser receives JavaScript from the server, executes it, and displays the resulting HTML. 

On the other hand, in server-side rendering (SSR), you don’t run scripts in the browser. They are executed on the server each time a user requests a page. Only the resulting HTML is sent to the client.

In the past, search engines could not execute JavaScript; bots could only parse HTML. A client-rendered page therefore gave the bot neither a page description nor URLs for further crawling.

As early as 2008, Google introduced JavaScript execution, making CSR pages indexable. The rendering was not perfect at the beginning but got better with time. Currently, some search engines still can’t crawl JavaScript, but their market share is small. 

You may ask: if the browser ends up with the same HTML anyway, why does it still matter where the rendering happens? Read on to find the answer.

Longer initial load time in CSR

Browsing is often done from devices with meager computing power, such as smartphones, so executing JS on the client takes more time than on the server. With SSR, your website's initial load time is therefore shorter, and you score better on Core Web Vitals.

When it comes to a single-page application (SPA), the load time gets even longer because you're loading and executing all the website content at once.

Moreover, the rendering request may time out in Google's Web Rendering Service queue, or the code may fail to execute due to errors.

Additionally, Google crawls client-rendered websites in two waves. First, it indexes whatever HTML is served, adds the discovered links to the crawl queue, and records the response codes. In the second wave, it executes the JS scripts and indexes the fully rendered page.

But the default rendering capacity may not be enough, so Googlebot may have to wait for extra resources. When these become available, the crawler comes back to finish its job, and that second wave may happen hours or even a week later!

Poor link discovery in CSR

If your SPA is client-rendered, Googlebot can't tell that your website has multiple paths. That's because an SPA looks like a single page to the crawler.

What seems like switching between pages to a user is, in fact, switching between fragments of the document object model (DOM). Unfortunately, search engines don't crawl fragments as individual URLs, so you'll need a workaround. And there is one: you can use the History API to give each view a real, crawlable URL. But it comes at the cost of extra development and execution time.
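
Here's a minimal sketch of that workaround; `renderView` stands in for a hypothetical app-specific function that swaps the visible DOM fragment.

```ts
// Give each SPA "view" a real, crawlable URL instead of a DOM fragment.
// renderView is a hypothetical app-specific renderer, declared for the sketch.
declare function renderView(path: string): void;

function navigate(path: string): void {
  // Update the address bar without a full page reload
  window.history.pushState({ path }, "", path);
  renderView(path);
}

// Restore the right view when the user presses back/forward
window.addEventListener("popstate", (event) => {
  const state = event.state as { path?: string } | null;
  renderView(state?.path ?? window.location.pathname);
});
```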

Okay, let’s say you got your paths crawled. Will the crawler obtain the relevant titles and descriptions to index the pages correctly? 

In a plain React app, this is a problem because all the content lives in a single document. How do you give each fragment its own head and meta tags? There's a workaround for this too, but the extra JS code takes time to execute, so the load time increases.
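
A common form of that workaround in plain React is a head-management library such as react-helmet; a minimal sketch with placeholder props, not a drop-in solution:

```tsx
// react-helmet lets each component set its own document head,
// so different "pages" of an SPA get distinct titles and descriptions.
import { Helmet } from "react-helmet";

export default function ProductView({ name, summary }: { name: string; summary: string }) {
  return (
    <>
      <Helmet>
        <title>{name} | Example Shop</title>
        <meta name="description" content={summary} />
      </Helmet>
      <h1>{name}</h1>
      <p>{summary}</p>
    </>
  );
}
```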

Client-rendered SPAs are still a good fit for internal business projects with no SEO concerns, like back offices or dashboards. Switching between pages is fast because there's no need to send more requests to the server and wait for responses: all the content is already preloaded.

Comparing SEO capabilities: Next.js, Gatsby, and Nuxt.js

| Feature | Next.js | Gatsby | Nuxt.js |
| --- | --- | --- | --- |
| Underlying Library | React | React | Vue.js |
| Server-Side Rendering | Yes (SSR) | No | Yes (SSR) |
| Static Site Generation | Yes (SSG) | Yes (SSG) | Yes (SSG) |
| Incremental Regeneration | Yes (ISR) | No | No |
| Dynamic Routing | Yes | Yes | Yes |
| Dynamic Imports | Yes | Yes (via plugins) | Yes |
| HTTP/2 Support | Yes | No | No |
| PWA Support | Yes (via plugins) | Yes | Yes (via plugins) |
| Performance Optimizations | Yes | Yes | Yes |
| Head Management | Yes (via plugins) | Yes (via plugins) | Yes (via vue-meta) |

Why Next.js is amazing for SEO

Next.js comes with certain SEO-enhancing features out of the box, and many libraries add further value on top. Let's explore how these internal and external tools help solve the page discoverability and initial load time problems.

Discoverable links and meaningful descriptions

For crawlers to find your website's pages, the paths should be clear and the URL structure simple and consistent.

The good news is that Next.js handles the URL structure for you and does so consistently. It also allows for dynamic routes, so you don't have to rely on URL parameters (which Google doesn't like). Moreover, search engines penalize you for duplicate links leading to the same page. To fix that, you can mark the preferred version of a page with a `<link rel="canonical">` tag in its head.
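
As an illustration of dynamic routing, here's a hedged sketch in the pages router; the file path and product naming are placeholder assumptions.

```tsx
// pages/products/[slug].tsx
// The bracketed file name gives every product a clean, crawlable URL,
// e.g. /products/red-mug instead of /products?id=123.
import { useRouter } from "next/router";

export default function ProductPage() {
  const { query } = useRouter();
  return <h1>Product: {query.slug}</h1>;
}
```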

Remember how a client-rendered single-page application struggles to provide a unique title and description for each page?

Well, server-side rendering solves this problem. Pages are rendered and returned upon request, so unique head and meta tags can be inserted each time. Additionally, the `next/head` component allows for dynamic head rendering, meaning your meta tags stay relevant even for dynamically generated pages, like the ones representing products in a shop.
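
A minimal sketch of per-page head tags with `next/head`, including the canonical link mentioned above; the `product` prop and domain are placeholders.

```tsx
import Head from "next/head";

type Product = { slug: string; name: string; blurb: string };

export default function ProductPage({ product }: { product: Product }) {
  return (
    <>
      <Head>
        {/* Unique title and description for every rendered page */}
        <title>{product.name}</title>
        <meta name="description" content={product.blurb} />
        {/* Canonical URL guards against duplicate-content penalties */}
        <link rel="canonical" href={`https://example.com/products/${product.slug}`} />
      </Head>
      <h1>{product.name}</h1>
    </>
  );
}
```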

Besides, many Next.js libraries provide support for SEO-improving protocols. Here are some examples.

Sitemap protocol 

If your website is big, many pages may have no links pointing to them. To get them crawled, you should submit a sitemap to Google. A sitemap is an XML file describing all the paths inside your website. Its purpose is to tell a crawler about pages it wouldn't find otherwise.

Better yet, when you add pages to your site, there's no need to update the sitemap manually. Next.js allows for dynamically generated sitemaps: updates are reflected in the XML file whenever your routes change. You can build your own sitemap component or use next-sitemap.
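
With next-sitemap, a small config file is usually all it takes; a sketch, with a placeholder site URL:

```js
// next-sitemap.config.js
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: "https://example.com", // placeholder; use your production domain
  generateRobotsTxt: true, // also emit a robots.txt that points to the sitemap
};
```

Running the `next-sitemap` command after `next build` then generates the XML from your routes.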

Structured metadata

There's another way to help Google understand the content of your web page: embed a script in your HTML that describes the page in the JSON-LD format, giving search engines machine-readable information about the page.

For doing so, Google rewards you with better treatment in search results. In Next.js, you can easily make use of the JSON-LD structured data format.
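
One common pattern is to serialize the data into a script tag inside `next/head`; the product fields below are illustrative placeholders.

```tsx
import Head from "next/head";

export default function ProductPage() {
  // Machine-readable description of the page content (schema.org vocabulary)
  const jsonLd = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: "Executive Anvil",
    description: "Sleek and heavy, for the discerning customer.",
  };

  return (
    <Head>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
    </Head>
  );
}
```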

Open Graph protocol

You can also add Open Graph protocol support. Open Graph tags look much like SEO meta tags, but they don't directly improve your position in search results.

Instead, by providing titles, descriptions, and images for link previews, they make your pages more visible on social networks, which may influence your place in the ranking.
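
Here's one way to add OG tags to a Next.js page with `next/head`; all values are placeholders.

```tsx
import Head from "next/head";

export default function ArticlePage() {
  return (
    <Head>
      {/* Open Graph tags control how the page appears when shared */}
      <meta property="og:title" content="SEO benefits of Next.js" />
      <meta property="og:description" content="How Next.js helps pages rank." />
      <meta property="og:image" content="https://example.com/og/cover.png" />
      <meta property="og:url" content="https://example.com/blog/nextjs-seo" />
    </Head>
  );
}
```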

Even shorter load time

As mentioned above, rendering HTML on the server is faster. 

With SSR, there's still some JavaScript on the client that lets the website behave like an SPA, but it's far less code with a much shorter execution time. Also, Next.js pages don't need two-wave crawling, because no extra rendering resources are requested. That lets search engines crawl your website more often.

What's more, Next.js 10 offers automatic prefetching: the browser downloads linked pages when their links appear in the viewport, which makes switching between pages faster.
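
No special code is needed; any link rendered with `next/link` is prefetched once it scrolls into view (in production builds). A minimal sketch with a placeholder route:

```tsx
import Link from "next/link";

export default function Nav() {
  // Prefetching happens automatically when this link enters the viewport
  return <Link href="/pricing">Pricing</Link>;
}
```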

Probably the best thing about Next.js is its ability to combine rendering strategies. If you need dynamically generated content, you can use SSR; that way, your HTML is rendered on each request.

If your content stays unchanged between visits, you can use Static Site Generation (SSG), which pre-renders all the pages at build time and then serves the ready-made HTML. And if you have lots of pages whose content changes occasionally, you can go for Incremental Static Regeneration (ISR), which rebuilds individual static pages in the background after deployment.
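
A sketch of ISR in the pages router; `fetchProducts` is a hypothetical data loader, and the 60-second window is an arbitrary example.

```tsx
// The page is built statically, then regenerated in the background
// at most once every 60 seconds as requests come in.
type Product = { id: string; name: string };
declare function fetchProducts(): Promise<Product[]>; // assumed data loader

export async function getStaticProps() {
  const products = await fetchProducts();
  return {
    props: { products },
    revalidate: 60, // seconds between background regenerations
  };
}

export default function Shop({ products }: { products: Product[] }) {
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```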

Each strategy can adjust the metrics for a particular use case!

Additionally, Next.js comes with HTTP/2 support. HTTP/2 is a major revision of the HTTP protocol that aims to speed up internet exchanges between clients and servers. Using it positively affects the load speed.

Loading things when you need them

Next.js lets you load resources only when you need them. If you have heavy images on your site, you can use lazy loading, which defers image loading until the images appear in the viewport.

That's what Pinterest does to save you time. Likewise, you can import specific components only when they're needed; this is called dynamic imports. Your code is split into small chunks that are loaded on demand.
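
A sketch with `next/dynamic`; `./Chart` is a hypothetical heavy component.

```tsx
import dynamic from "next/dynamic";

// The chart's code lands in its own chunk, downloaded only when rendered
const Chart = dynamic(() => import("./Chart"), {
  loading: () => <p>Loading chart…</p>, // shown while the chunk downloads
});

export default function Dashboard() {
  return <Chart />;
}
```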

Chances are you'll use third-party scripts on your website for marketing purposes, but they can be slow. In Next.js, the `next/script` component performs script optimization: you decide whether to fetch and execute a script immediately or only after the rest of the page has finished loading.
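
A sketch with a placeholder script URL; the `lazyOnload` strategy defers the script until the browser is idle.

```tsx
import Script from "next/script";

export default function Layout() {
  // Other strategies include "beforeInteractive" and "afterInteractive"
  return <Script src="https://example.com/analytics.js" strategy="lazyOnload" />;
}
```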

Next.js SEO best practices

  • Server-side rendering (SSR): Using Next.js' built-in SSR capabilities ensures that search engine crawlers can easily crawl and index your content. This can be achieved with the getServerSideProps or getStaticProps methods, which pre-render a page on the server and return the resulting HTML to the browser (see the sketch after this list).

  • Dynamic rendering: In some cases, it may be more beneficial to use dynamic rendering, a technique that serves a pre-rendered version of a page to search engine crawlers and a client-rendered version to users. This can be achieved by using the getServerSideProps method and checking the user-agent of the request to determine if the request is coming from a search engine crawler.

  • Meta tags: It's important to ensure that your Next.js site includes relevant and accurate meta tags, such as a title and description, that help search engines understand the content of your pages. This can be achieved with the built-in `next/head` component or a library such as react-helmet.

  • Sitemaps: Next.js provides a way to create a sitemap.xml that you can submit to search engines to help them find all the pages on your site. This can be achieved using the next-sitemap library to generate a sitemap for your site.

  • Pre-rendering: Next.js provides a way to pre-render the pages that are less likely to change over time. This way, the pages will be pre-built and ready for the search engines to crawl. This can be achieved using the getStaticProps method to pre-render pages at build time.

  • Structured data: If you're using Next.js to build an e-commerce site, it's important to use structured data to help search engines understand the products and services on your site. This can be achieved using a library such as next-seo to generate structured data for your pages.

  • Optimize images: Optimizing images can help improve your site's performance and reduce the time it takes for pages to load. This can be achieved with Next.js' built-in `next/image` component or a tool such as imagemin to compress and optimize images.

  • Lighthouse: Lighthouse is a tool that helps identify opportunities to improve the performance, accessibility, and SEO of your site. Run Lighthouse regularly to check your site for issues that impact SEO.

  • Handle redirects: Set up proper redirects (301 or 302) for your Next.js application to avoid broken links and maintain the flow of link equity. This practice is crucial when you are changing the URL structure or moving pages around within your website.

  • Create a custom 404 page: Design a custom 404 page for your Next.js application to help users navigate your website when they encounter broken links or unavailable pages. A well-designed 404 page can reduce bounce rates and improve user experience, contributing to better SEO.

  • Code-splitting: Next.js supports automatic code-splitting, which helps you split your code into smaller chunks that are loaded on-demand. By utilizing code-splitting, you can improve the performance of your website, leading to faster load times and better user experience, which can positively impact your SEO.

  • Use a Content Delivery Network (CDN): A CDN helps deliver your website's content faster by caching and serving it from servers that are geographically closer to your users. Implementing a CDN for your Next.js application can significantly reduce load times, improving both user experience and SEO performance.
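
To ground the SSR bullet above, here's a hedged sketch of a page pre-rendered on every request with `getServerSideProps`; `fetchPost` is a hypothetical data loader.

```tsx
// pages/posts/[slug].tsx — rendered on the server for every request,
// so crawlers always receive complete HTML.
import type { GetServerSideProps } from "next";

type Post = { title: string; body: string };
declare function fetchPost(slug: string): Promise<Post>; // assumed

export const getServerSideProps: GetServerSideProps<{ post: Post }> = async (ctx) => {
  const post = await fetchPost(String(ctx.params?.slug));
  return { props: { post } };
};

export default function PostPage({ post }: { post: Post }) {
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
    </article>
  );
}
```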

SEO plugins and libraries for Next.js

To enhance your Next.js project's SEO performance, you can take advantage of various plugins and libraries designed specifically for this purpose. Here's a list of some popular Next.js plugins and libraries, along with their key features and benefits:

  1. next-seo: This plugin provides a set of SEO-related components and utilities to simplify the management of SEO metadata. With next-seo, you can easily add and configure meta tags, Open Graph tags, Twitter cards, and JSON-LD structured data to improve your website's search engine ranking (see the usage sketch after this list).

    Key Features:

    • Easy-to-use components for adding and managing SEO metadata

    • Support for Open Graph, Twitter cards, and JSON-LD

    • Customizable default settings for different pages

  2. next-optimized-images: This plugin helps you automatically optimize images in your Next.js project to improve website performance and SEO. By optimizing images, you can reduce the size of your web pages, which leads to faster loading times and better user experience.

    Key Features:

    • Supports various image formats like JPEG, PNG, SVG, and WebP

    • Automatic optimization of images during build time

    • Customizable optimization settings and quality levels

  3. next-i18next: This library provides an internationalization (i18n) solution for Next.js projects, allowing you to create multilingual websites easily. A well-implemented i18n strategy can improve your website's SEO by making it accessible to users from different countries and regions.

    Key Features:

    • Seamless integration with Next.js

    • Server-side rendering (SSR) support for translations

    • Simple configuration and usage in your Next.js project
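
And here's the promised next-seo usage sketch; all values are placeholders.

```tsx
import { NextSeo } from "next-seo";

export default function AboutPage() {
  return (
    <>
      {/* next-seo renders the corresponding meta and OG tags into the head */}
      <NextSeo
        title="About us"
        description="Who we are and what we do."
        openGraph={{
          title: "About us",
          images: [{ url: "https://example.com/og/about.png" }],
        }}
      />
      <h1>About us</h1>
    </>
  );
}
```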

Potential downsides and limitations for Next.js SEO

  1. Server-side rendering (SSR) can increase the complexity of your application: SSR can add an additional layer of complexity to your application, as it requires you to handle both server-side and client-side rendering. This can make your application more difficult to debug and maintain.

  2. Dynamic rendering can increase the complexity of your code: Dynamic rendering can also add complexity to your code, as it requires you to handle different scenarios depending on whether the request is coming from a search engine crawler or a user.

  3. SEO issues with client-side routing: Next.js uses client-side routing for page transitions, which can cause SEO issues if not handled properly: content reachable only through client-side navigation may never be crawled or indexed. To mitigate this, ensure every route is pre-rendered (via SSR or SSG) and that internal links are real anchor tags rendered with `next/link`, so crawlers can discover them.

  4. Limited control over the rendered HTML: Although Next.js provides a way to customize the rendered HTML of your pages, you may have limited control over the final output, as the framework handles many aspects of the rendering process.

  5. SEO optimization can be time-consuming: SEO optimization can be a time-consuming process, as it requires you to carefully consider various factors such as meta tags, structured data, and page performance.

  6. Limited support for older browsers: Next.js is built on top of modern web technologies, which may not be fully compatible with older browsers. This can limit your site's reach, as users on older browsers may not be able to access your site's content.

Closing thoughts

It's evident now that client-side rendering is still detrimental to SEO. While executing JavaScript, errors can arise, preventing the crawler from reaching your meta tags and inner URLs. Additionally, running lots of JS slows down page loading, which hurts your ranking.

Next.js offers various combinations of server-side rendering strategies that fix most of the issues SPAs have. Powered by it, your website will have a clear URL structure and load large components on demand. Additionally, you can find libraries that help you enhance your SEO strategy.

All in all, if you're planning to build an SEO-oriented website, consider using Next.js.
