How to get your Next.js site indexed in 2024

Amos Bastian

Indexing your Next.js site is super important. If it isn’t indexed, it won’t show up in search results. This means that you won’t get any visitors from Google, Bing, Yandex, Seznam.cz and Naver, and that obviously sucks. Especially since you’re probably using Next.js for its SEO benefits. In this article, we’ll show you how to get your Next.js site indexed on all these search engines so you can actually get some organic traffic to your site for once. Just a warning, if you’re hosting it on Vercel, getting all your pages indexed might inflate your monthly costs even more than they already are.

What is indexing?

Indexing is the process through which search engines crawl, analyse, and store information about your site. A well-indexed site is more likely to appear in relevant search results, increasing its visibility to potential visitors and thus your organic traffic. Sorry for the repetition, I just wanted to make sure you understand how important getting your site indexed is.

How do I get my Next.js site indexed?

Now that you have a basic understanding of indexing and why you should care, let’s take a look at how you can get your Next.js site indexed as soon as possible, so you can grow your site organically (aka for free).

If you’re only interested in getting your Next.js site indexed on a specific search engine, you can jump straight to the relevant guide instead.

1. Create an XML sitemap

The first step is to create an XML sitemap. An XML sitemap is a file that contains a list of all the pages on your site in a specific format that search engines understand. It’s used by search engines to find and index your pages automatically.
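
For context, the "specific format" is plain XML. A minimal sitemap with a single page looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/hello-world</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```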

There are a couple of ways you can generate a sitemap for your Next.js site. If you are using the app directory, you can simply add a `sitemap.(js|ts)` file:

// app/sitemap.ts
import type { MetadataRoute } from "next";
import { BASE_URL } from "~/constants";
import { allPosts } from "~/contentlayer";

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  // One entry per blog post, using its last modification date when available.
  const blogPosts = allPosts.map((post) => ({
    url: `${BASE_URL}/blog/${post.slug}`,
    lastModified: post.dateModified ?? post.datePublished,
  }));

  // Static routes, stamped with the build date.
  const routes = ["", "/about", "/blog"].map((route) => ({
    url: `${BASE_URL}${route}`,
    lastModified: new Date(),
  }));

  return [...routes, ...blogPosts];
}

As you can see, this means you’ll need to do some manual work to set this up, which can be annoying.

Because of this, we recommend using the next-sitemap package instead, which generates your sitemap for you and also supports dynamic/server-side sitemaps. It even has options to configure sitemap size, alternate refs, your robots.txt and much more.
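
To make "server-side sitemap" concrete: instead of writing a file at build time, you generate the XML per request from live data. Here's a minimal, framework-free sketch of the idea (the data source and function names are made up for illustration; next-sitemap ships its own helpers for this):

```typescript
// Illustrative sketch: a server-side sitemap builds its XML per request
// from live data instead of from a file written at build time.
type Field = { loc: string; lastmod: string };

// Stand-in for a CMS or database query done at request time (hypothetical).
function fetchPostSlugs(): string[] {
  return ["hello-world", "nextjs-seo"];
}

function serverSideSitemap(baseUrl: string): string {
  const fields: Field[] = fetchPostSlugs().map((slug) => ({
    loc: `${baseUrl}/blog/${slug}`,
    lastmod: new Date().toISOString(),
  }));
  const body = fields
    .map((f) => `<url><loc>${f.loc}</loc><lastmod>${f.lastmod}</lastmod></url>`)
    .join("");
  return (
    '<?xml version="1.0" encoding="UTF-8"?>' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' +
    body +
    "</urlset>"
  );
}
```

In a real app you would return this string (or next-sitemap's equivalent) as the response body of a route, so the sitemap always reflects your current content.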

First, you’ll need to install the package:

# Using NPM
npm install next-sitemap
# Using Yarn
yarn add next-sitemap
# Using PNPM
pnpm add next-sitemap

Then, configure your next-sitemap.config.js file:

// next-sitemap.config.js
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: process.env.SITE_URL || "https://indexplease.com",
  generateRobotsTxt: true, // (optional)
  // ...other options
};

and set up your scripts:

// package.json
{
  "scripts": {
    "build": "next build",
    "postbuild": "next-sitemap"
  }
}

That’s it! Now, every time you run npm run build, your sitemap will be generated automatically. This includes best practices such as splitting your sitemap into multiple files, which is important if you have a large number of pages.

2. Submit your sitemap

Once you’ve created your XML sitemap, you need to submit it to each search engine. Depending on the search engine, this is done in different ways. Luckily for you, we’ve written a guide on how to do this for each relevant search engine, so check them out.

If next-sitemap has split your sitemap into multiple files, you only need to submit the sitemap-index.xml file to each search engine; they will automatically discover the individual sitemaps it references, such as sitemap-0.xml, and crawl them.
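
For reference, a sitemap index is itself a small XML file that just points at the individual sitemaps (the filenames and domain here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-0.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-1.xml</loc>
  </sitemap>
</sitemapindex>
```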

3. Submit your pages

Once you’ve submitted your sitemap, you can either wait for search engines to crawl your pages automatically (this can take weeks or not happen at all) or manually submit them. This is done by going to each of the aforementioned webmaster tools and going through the process of submitting them one by one. As you can imagine, this is a very time-consuming process, especially if you have a large number of pages. And it doesn’t always guarantee that your pages will actually get indexed. If this doesn’t put you off, we’ve also written guides on how to manually submit your pages for indexing for each search engine.

Automatically index your pages

Most of us have better things to do than sitting around all day going through the tedious process of manually submitting pages for indexing, or waiting ages for search engines to get around to it. This is one of the reasons we built IndexPlease, which does this for you automatically. It costs just $9/month, and we will automatically submit up to 400 pages per day (yes, really) for up to 5 of your sites, so they get indexed on Google, Bing, Yandex, Seznam.cz and Naver within 48 hours.

Getting your pages automatically indexed is only one of the many things you can do with IndexPlease, so if you’re interested in what else we have to offer, check out our features page.