Package Exports
- @lenml/tokenizer-gemma2
- @lenml/tokenizer-gemma2/dist/main.global.js
- @lenml/tokenizer-gemma2/models/tokenizer
- @lenml/tokenizer-gemma2/models/tokenizer.json
- @lenml/tokenizer-gemma2/models/tokenizer_config
- @lenml/tokenizer-gemma2/models/tokenizer_config.json
- @lenml/tokenizer-gemma2/src/data
- @lenml/tokenizer-gemma2/src/data.ts
- @lenml/tokenizer-gemma2/src/main
- @lenml/tokenizer-gemma2/src/main.ts
Readme
@lenml/tokenizer-gemma2
A tokenizer for the Gemma 2 model, based on @lenml/tokenizers.
Usage
import { fromPreTrained } from "@lenml/tokenizer-gemma2";
const tokenizer = fromPreTrained();
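// encode() returns an array of token ids; add_special_tokens: true also adds
// the model's special tokens (e.g. the beginning-of-sequence token).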
console.log(
"encode()",
tokenizer.encode("Hello, my dog is cute", null, {
add_special_tokens: true,
})
);
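// _encode_text() returns the string tokens produced by the tokenizer,
// without converting them to ids or adding special tokens.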
console.log(
"_encode_text",
tokenizer._encode_text("Hello, my dog is cute")
);
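The token ids returned by encode() can be mapped back to text with decode(); a minimal round-trip sketch, assuming this package mirrors the decode() method of the transformers.js tokenizer API it is based on:
import { fromPreTrained } from "@lenml/tokenizer-gemma2";
const tokenizer = fromPreTrained();
// encode to token ids, then decode back to text
const ids = tokenizer.encode("Hello, my dog is cute", null, {
  add_special_tokens: true,
});
// skip_special_tokens drops special tokens such as <bos> from the output
console.log("decode()", tokenizer.decode(ids, { skip_special_tokens: true }));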
Full Tokenizer API
The complete API parameters and usage are documented in the transformers.js tokenizers documentation.
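For example, other calls from that interface should work the same way here; a sketch, assuming parity with the transformers.js PreTrainedTokenizer API (batch_decode() and the callable form are assumptions not verified against this specific package):
import { fromPreTrained } from "@lenml/tokenizer-gemma2";
const tokenizer = fromPreTrained();
const ids = tokenizer.encode("Hello, my dog is cute", null, {
  add_special_tokens: true,
});
// batch_decode() decodes several id sequences at once
console.log("batch_decode()", tokenizer.batch_decode([ids]));
// Calling the tokenizer directly should return model-ready inputs
// (input_ids and attention_mask), as it does in transformers.js.
console.log(tokenizer("Hello, my dog is cute"));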
License
Apache-2.0