JSPM

profanity-analysis

0.0.5
License: MIT

Not all bad words are equal. This filter rates a blob of text based on the bad words it contains to determine whether the blob is too vulgar to use.

Package Exports

  • profanity-analysis

This package does not declare an "exports" field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (profanity-analysis) asking it to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

profanity-analysis

Not all bad words are equal. This filter rates a blob of text based on the bad words it contains to determine whether the blob is too vulgar to use.

How to use

var filter = require('profanity-analysis');
var results = filter.analyzeBlob(yourString);

The returned results will contain the following object:

{
  indexOfProfanity: [], // Array of the indexes of bad words and the weight each bad word is given in the config.
  score: 0,             // The total profanity weight values added together.
  precentage: 0         // The total number of words over the weight value of the profanity - totalWords/score.
};
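To make the shape of that result concrete, here is a minimal, self-contained sketch of how such an analyzer could compute those three fields. The `weights` table and the sample words in it are illustrative assumptions, not the package's actual config, and this `analyzeBlob` is a stand-in, not the library's implementation:

```javascript
// Hypothetical per-word profanity weights (illustrative only).
var weights = { darn: 1, heck: 2 };

function analyzeBlob(blob) {
  var words = blob.toLowerCase().split(/\s+/).filter(Boolean);
  var indexOfProfanity = [];
  var score = 0;

  words.forEach(function (word, index) {
    if (Object.prototype.hasOwnProperty.call(weights, word)) {
      // Record where the bad word sits and the weight it was given.
      indexOfProfanity.push({ index: index, weight: weights[word] });
      score += weights[word]; // Total profanity weight values added together.
    }
  });

  return {
    indexOfProfanity: indexOfProfanity,
    score: score,
    // totalWords/score, as described above; 0 when no bad words were found.
    precentage: score === 0 ? 0 : words.length / score
  };
}

var results = analyzeBlob('well darn that heck of a thing');
console.log(results.score); // 3 (darn: 1, heck: 2)
```

With this kind of result, a caller can pick a threshold on `score` or `precentage` to decide whether the blob is too vulgar to use.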