JSPM

Awesome robots.txt generator.

Package Exports

  • generate-robotstxt

This package does not declare an "exports" field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to file an issue with the original package (generate-robotstxt) asking it to add support for the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
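
For illustration, an "exports" field covering the package's main entry point might look like the sketch below in its package.json; the ./dist/index.js path is an assumption and would need to match the package's actual build output.

{
  "exports": {
    ".": "./dist/index.js"
  }
}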

Readme

generate-robotstxt

Awesome robots.txt generator.

Installation

npm install --save-dev generate-robotstxt

Usage

import robotstxt from "generate-robotstxt";

robotstxt({
  policy: [
    {
      userAgent: "Googlebot",
      allow: "/",
      disallow: "/search",
      crawlDelay: 2,
    },
    {
      userAgent: "OtherBot",
      allow: ["/allow-for-all-bots", "/allow-only-for-other-bot"],
      disallow: ["/admin", "/login"],
      crawlDelay: 2,
    },
    {
      userAgent: "*",
      allow: "/",
      disallow: "/search",
      crawlDelay: 10,
      cleanParam: "ref /articles/",
    },
  ],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
})
  .then((content) => {
    console.log(content);

    return content;
  })
  .catch((error) => {
    throw error;
  });
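
The returned promise resolves with the generated robots.txt content as a string; writing it to disk is up to the caller. A minimal sketch of that step in a Node.js ESM context (the output path and the single policy entry here are only examples):

import { writeFile } from "node:fs/promises";
import robotstxt from "generate-robotstxt";

robotstxt({
  policy: [{ userAgent: "*", allow: "/" }],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
})
  // Write the resolved robots.txt string to disk.
  .then((content) => writeFile("robots.txt", content))
  .catch((error) => {
    console.error(error);
    process.exitCode = 1;
  });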

File-based configuration

robots-txt.config.js

module.exports = {
  policy: [
    {
      userAgent: "Googlebot",
      allow: "/",
      disallow: ["/search"],
      crawlDelay: 2,
    },
    {
      userAgent: "OtherBot",
      allow: ["/allow-for-all-bots", "/allow-only-for-other-bot"],
      disallow: ["/admin", "/login"],
      crawlDelay: 2,
    },
    {
      userAgent: "*",
      allow: "/",
      disallow: "/search",
      crawlDelay: 10,
      cleanParam: "ref /articles/",
    },
  ],
  sitemap: "http://example.com/sitemap.xml",
  host: "http://example.com",
};

CLI

Awesome robots.txt generator.

  Usage: generate-robotstxt [options] <dest>

  Options:
     --config  Path to a specific configuration file.
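
For example, a typical invocation that writes robots.txt to the project root using the file-based configuration above might look like this (the destination path is up to you):

  npx generate-robotstxt robots.txt --config robots-txt.config.js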

Contribution

Feel free to contribute your code if you agree to publish it under the MIT license.

Changelog

License

MIT