JSPM

  • Downloads 2915185
  • License MIT

🤖 Detect bots/crawlers/spiders via the user agent string.

Package Exports

  • isbot

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (isbot) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

isbot 🤖/👨‍🦰

Detect bots/crawlers/spiders using the user agent string.

Install
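The package is published on npm, so a standard installation would be:

```shell
npm install isbot
```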

Usage

const isbot = require('isbot');

Simple detection

// Node.js HTTP server (IncomingMessage exposes headers as an object)
isbot(request.headers['user-agent'])

// ExpressJS
isbot(req.get('user-agent'))

// User Agent string
isbot('Googlebot/2.1 (+http://www.google.com/bot.html)') // true
isbot('Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36') // false

Add crawler user agents

Add rules to the user agent matching RegExp

isbot('Mozilla/5.0') // false
isbot.extend([
    'istat',
    'httpclient',
    '^mozilla/\\d\\.\\d$'
])
isbot('Mozilla/5.0') // true
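Internally, this kind of extension amounts to compiling all rules into a single case-insensitive RegExp. A minimal sketch of that idea (illustrative only, not isbot's actual implementation; `compile` is an invented helper name):

```javascript
// Sketch: join a list of pattern strings into one case-insensitive
// RegExp so a single test() call covers every rule.
const patterns = ['istat', 'httpclient', '^mozilla/\\d\\.\\d$'];

function compile(list) {
  // Alternation: the UA matches if any individual rule matches
  return new RegExp(list.join('|'), 'i');
}

const matcher = compile(patterns);
matcher.test('Mozilla/5.0');  // true  (matches ^mozilla/\d\.\d$)
matcher.test('curl/7.64.1');  // false (no rule applies)
```

Adding a rule is then just appending a string to the list and recompiling.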

Remove matches of known crawlers

Remove rules from the user agent matching RegExp (see the existing rules in the list.json file)

isbot('Google Page Speed Insights') // true
isbot.exclude([
  'Google Page Speed Insights',
  'Chrome-Lighthouse'
])
isbot('Google Page Speed Insights') // false

Verbose result

Return the matching rule for a bot user agent

isbot.find('Googlebot/2.1 (+http://www.google.com/bot.html)') // 'bot'
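The verbose lookup can be pictured as scanning the rule list for the first pattern that matches the user agent. A sketch of that idea (not isbot's actual implementation; the `rules` array and `findRule` helper are invented for illustration):

```javascript
// Sketch: return the first rule string whose pattern matches the UA,
// which is the idea behind a verbose find() result.
// `rules` is a made-up subset, not isbot's real list.
const rules = ['bot', 'crawler', 'spider'];

function findRule(ua) {
  return rules.find((rule) => new RegExp(rule, 'i').test(ua)) || null;
}

findRule('Googlebot/2.1 (+http://www.google.com/bot.html)'); // 'bot'
findRule('Mozilla/5.0 (Windows NT 6.1)');                    // null
```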

Data sources

Crawler user agents:

Non-bot user agents: