robots-dot-txt 1.0.0 (MIT)

A really lightweight and simple robots-txt parser.

Package Exports

  • robots-dot-txt
  • robots-dot-txt/index.js

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to file an issue with the original package (robots-dot-txt) asking for "exports" field support. If that is not possible, create a JSPM override to customize the exports field for this package.
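For package maintainers, declaring the field directly in package.json avoids the automatic detection. A minimal sketch, with subpaths taken from the detected exports listed above (the field shape follows the Node.js "exports" convention):

```json
{
  "name": "robots-dot-txt",
  "version": "1.0.0",
  "exports": {
    ".": "./index.js",
    "./index.js": "./index.js"
  }
}
```

With this in place, consumers and CDNs resolve `robots-dot-txt` and `robots-dot-txt/index.js` from the declared map rather than from heuristics.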

Readme

robots-dot-txt

A simple robots.txt parser written in JavaScript for Node.

Installation

$ npm install robots-dot-txt

This adds the package to your project's dependencies.

Usage

const { RobotsParser } = require("robots-dot-txt");
const robots = `
# this is a comment
User-agent: Googlebot
Disallow: /nogoogle/*
Allow: /nobing/*

User-agent: Bingbot
Disallow: /nobing/*
    
User-agent: *
Disallow: /tmp/*`;

const robotsParser = new RobotsParser(robots);
console.log(robotsParser.parse());
// Output:
// {
//   Googlebot: { disallow: [ '/nogoogle/*' ], allow: [ '/nobing/*' ] },
//   Bingbot: { disallow: [ '/nobing/*' ], allow: [] },
//   '*': { disallow: [ '/tmp/*' ], allow: [] }
// }

console.log(robotsParser.canAccess("Googlebot", "/nogoogle/test")); // false
console.log(robotsParser.canAccess("Googlebot", "/some/path")); // true
console.log(robotsParser.canAccess("NotARealUA", "/some/path")); // true
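To illustrate the semantics the examples above rely on, here is a minimal, self-contained sketch of how wildcard rule matching and user-agent fallback could work. This is not the package's internal code; `ruleMatches` and this standalone `canAccess` are hypothetical names written against the parsed-object shape shown in the output above:

```javascript
// Sketch only: match a rule like "/tmp/*" against a path by converting
// "*" wildcards into a regular expression anchored at the path start.
function ruleMatches(rule, path) {
  const pattern = rule
    .split("*")
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&")) // escape regex metachars
    .join(".*");
  return new RegExp("^" + pattern).test(path);
}

// Sketch only: look up the user agent's rule group, falling back to "*"
// for unknown agents; allow rules win, then disallow rules deny.
function canAccess(rules, userAgent, path) {
  const group = rules[userAgent] || rules["*"] || { disallow: [], allow: [] };
  if (group.allow.some((rule) => ruleMatches(rule, path))) return true;
  return !group.disallow.some((rule) => ruleMatches(rule, path));
}
```

Under this sketch, an unknown user agent inherits the `*` group, so a path like `/tmp/anything` would be denied for it while unrelated paths remain accessible.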