
Huggingface Chat API

A simple API for Hugging Face chat with login caching.

Installation

Current stable release (2.x)

npm i huggingface-chat

Example usage

With the non-streaming API

import { Login, ChatBot } from "huggingface-chat";

const EMAIL = "email"
const PASSWD = "password"
const cachePath = "./login_cache/"

const signin = new Login(EMAIL, PASSWD)
const res = await signin.login(cachePath) // default path is ./login_cache/
const chat = new ChatBot(res) // res holds the cookies required for subsequent API calls
const data = await chat.chat("who am i"); // default model is "meta-llama/Llama-2-70b-chat-hf"
const response = await data.completeResponsePromise()
console.log(response)

With the streaming API

import { Login, ChatBot } from "huggingface-chat";

const EMAIL = "email"
const PASSWD = "password"
const cachePath = "./login_cache/"

const signin = new Login(EMAIL, PASSWD)
const res = await signin.login(cachePath) // default path is ./login_cache/
const chat = new ChatBot(res) // res holds the cookies required for subsequent API calls
const data = await chat.chat("who am i");
const reader = data.stream.getReader();
while (true) {
    const { done, value } = await reader.read();
    if (done) break; // the stream has ended
    console.log(value)
}
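The reader loop above is the standard Web Streams pattern, so it can be tried without a Hugging Face login. Below is a minimal self-contained sketch that substitutes a locally constructed `ReadableStream` for `data.stream` (an assumption for illustration only; it also assumes the real stream yields string chunks, as the example prints them directly) and accumulates the chunks into the complete response:

```javascript
// Stand-in for data.stream: a local stream that emits three string chunks.
// (Hypothetical data; the real chunks come from the chat model.)
const stream = new ReadableStream({
  start(controller) {
    for (const chunk of ["Hello", ", ", "world"]) controller.enqueue(chunk);
    controller.close();
  },
});

const reader = stream.getReader();
let full = "";
while (true) {
  const { done, value } = await reader.read();
  if (done) break; // the stream has ended
  full += value;   // accumulate chunks into the complete response
}
console.log(full); // "Hello, world"
```

Accumulating into a string this way is how you recover the same final text the non-streaming `completeResponsePromise()` call would return, while still being able to render partial output as it arrives.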

Switching Models

/*
Available models are:

'meta-llama/Llama-2-70b-chat-hf'
'codellama/CodeLlama-34b-Instruct-hf'
'OpenAssistant/oasst-sft-6-llama-30b-xor'
*/
chat.switchModel('OpenAssistant/oasst-sft-6-llama-30b-xor')

Documentation

Full API documentation can be found here: docs

Contributions

  • If you happen to see a missing feature or a bug, feel free to open an issue.
  • Pull requests are welcome too!

License

MIT