BLOG // May 1, 2021

Gatsby robots.txt and sitemap

Are you looking for a quick and easy way to add a sitemap and generate a robots.txt file for your Gatsby site? Follow this short guide to get up and running with both.

Install plugins

Install the sitemap and robots.txt plugins for Gatsby by running the following:

npm install gatsby-plugin-sitemap
npm install gatsby-plugin-robots-txt
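
If you prefer, both plugins can be installed with a single command (since npm 5, --save is the default behavior and can be omitted):

npm install gatsby-plugin-sitemap gatsby-plugin-robots-txt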

Gatsby Config

Edit your gatsby-config.js file to enable the two plugins. You will most likely need to merge the configuration below with your existing one. The important pieces are:

  • Ensure siteUrl is set
  • Load the two plugins
  • Customize the robots.txt options so the sitemap entry points to the path the sitemap plugin generates

module.exports = {
  siteMetadata: {
    siteUrl: 'https://www.example.com'
  },
  plugins: [
    'gatsby-plugin-sitemap',
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap/sitemap-index.xml',
        policy: [{ userAgent: '*', allow: '/' }]
      }
    },
  ]
};
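
With the options above, the generated robots.txt should look roughly like this (a sketch of the expected output, assuming the plugin's default formatting):

User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap/sitemap-index.xml
Host: https://www.example.com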

Note: the sitemap is only generated for production builds, so gatsby develop will not create it. To test this, you can run:

gatsby build && gatsby serve

Then browse to http://localhost:9000/robots.txt to verify (gatsby serve runs the production build on port 9000, whereas gatsby develop uses port 8000).
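
You can also fetch both generated files from the command line to confirm they exist (assuming the default gatsby serve port and the sitemap path configured above):

curl http://localhost:9000/robots.txt
curl http://localhost:9000/sitemap/sitemap-index.xml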

Final words

At this stage you'll want to consider submitting your sitemap to Google Search Console and other search engine tools. It will be regenerated and updated automatically on every build, so you can just relax :)
