
How to implement a robots.txt file in a Nuxt.js project


Having a robots.txt file is important because it helps control how Google and other search engines, such as Bing, index your website's content. The first thing a crawler checks when visiting a site is whether robots.txt exists, and that file determines which content may be crawled and which may not.

There are several ways to add a robots.txt file, and the simplest is to write one manually in the "static" folder, as shown below. For this tutorial, however, we'll use the nuxtjs/robots module, as it's more flexible and lets you manipulate the contents of robots.txt programmatically.
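
For reference, the manual approach is just a plain-text file saved as static/robots.txt (Nuxt serves everything in "static" from the site root), for example:

# static/robots.txt, served as-is at /robots.txt
User-agent: *
Disallow: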

1 - Install nuxtjs/robots Package

First things first: install the robots package and register it in the modules array of nuxt.config.js.

yarn add @nuxtjs/robots
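
Or, if you prefer npm:

npm install @nuxtjs/robots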

To configure the module, we can pass an object, an array, or a function as the robots value; each form has its own use cases.

export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

2 - Simple configuration

In this case, define the config as follows and it will allow all user agents (bots) to crawl the site. In contrast, if the value of Disallow is "/", no pages are allowed to be crawled at all.

export default {
  robots: {
    UserAgent: '*',
    Disallow: ''
  }
}
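
With this configuration, the generated /robots.txt should look like this (an empty Disallow means nothing is blocked):

User-agent: *
Disallow: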

Multiple user agents configuration

If you want a separate configuration per user agent, pass an array of objects as the robots value and each entry will be rendered as its own group.

export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/user',
    },
    {
      UserAgent: '*',
      Disallow: '/admin',
    },
  ]
}

Function configuration

You can also pass a function as the robots value and conditionally compute the configuration with whatever logic you need.

export default {
  robots: () => {
    // someLogicHere is a placeholder for your own condition
    if (someLogicHere) {
      return {
        UserAgent: '*',
        Disallow: '/'
      }
    }
  }
}
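
As a concrete sketch (the environment check below is an assumption for illustration, not part of the original tutorial), you could block every crawler on non-production deployments and allow everything in production:

export default {
  robots: () => {
    // Illustrative assumption: hide staging/dev deployments from crawlers
    if (process.env.NODE_ENV !== 'production') {
      return {
        UserAgent: '*',
        Disallow: '/'
      }
    }

    // In production, allow all user agents to crawl everything
    return {
      UserAgent: '*',
      Disallow: ''
    }
  }
}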

3 - yarn dev / npm run dev

Finally, run "yarn dev" (or "npm run dev") and visit /robots.txt to see the rules you have defined. For the multiple user agents configuration above, the output looks like this:

User-agent: Googlebot
Disallow: /user
User-agent: *
Disallow: /admin

This post was originally published at PostSrc 🔥🔥🔥. If you like this kind of tutorial, I'd really appreciate it if you gave it a visit.
