JSPM

Found 25 results for tiktoken

tokenx

Fast and lightweight token estimation for any LLM without requiring a full tokenizer

  • v1.2.0
  • 84.08
  • Published
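Packages like tokenx (and elkyn-tokenx below) estimate token counts without shipping a full tokenizer. As a rough illustration of how such estimators can work — this is a generic heuristic sketch, not tokenx's actual algorithm, and the function name is hypothetical:

```javascript
// Illustrative heuristic only — not tokenx's actual implementation.
// Lightweight estimators trade accuracy for speed by approximating the
// statistical averages of GPT-style BPE tokenizers instead of running one.
function estimateTokens(text) {
  if (!text) return 0;
  const chars = text.length;
  const words = text.trim().split(/\s+/).length;
  // Blend a character-based and a word-based estimate:
  // English text averages roughly 4 chars/token and 0.75 words/token.
  return Math.ceil((chars / 4 + words / 0.75) / 2);
}
```

Estimates like this are typically within a few percent for English prose, which is enough for budgeting prompts against a context window without paying the cost of a real tokenizer.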

gpt-tokens

Calculate the token consumption and cost of OpenAI GPT messages

  • v1.3.14
  • 72.24
  • Published

@hakkisagdic/context-manager

Universal LLM context manager supporting 14+ languages with method-level filtering, token optimization, and GitIngest format support

  • v2.2.0
  • 41.84
  • Published

openai-tokens

A service for calculating, managing, and truncating OpenAI prompt tokens

  • v2.3.6
  • 36.85
  • Published

anthropic-bedrock

Anthropic/AWS Bedrock TypeScript SDK.

  • v0.0.5
  • 35.34
  • Published

@bbreukelen/textsplitter

I needed the SentenceSplitter from llamaindex but had to import the entire llamaindex package, which is 1 GB. I pulled it out and had GPT make a standalone version. It's not exactly the same, but close.

  • v1.2.0
  • 31.27
  • Published

vg-coder-cli

🚀 CLI tool to analyze projects, concatenate source files, count tokens, and export HTML with syntax highlighting and copy functionality

  • v1.0.9
  • 24.36
  • Published

codemass

Weigh your code in tokens - calculate AI API costs for your codebase

  • v0.1.5
  • 22.81
  • Published

ultratoken

UltraToken Utility - CLI tool for token cost analysis

  • v1.0.0
  • 19.86
  • Published

count-tokens

A simple CLI tool to count tokens in files or clipboard content using tiktoken

  • v1.0.2
  • 17.45
  • Published

vecpdf

CLI tool to process PDFs and create local vector databases using ChromaDB

  • v0.0.1
  • 17.16
  • Published

elkyn-tokenx

GPT token estimation and context size utilities without a full tokenizer

  • v0.5.2
  • 16.57
  • Published

cpai

Use web-only models (e.g., GPT-5 Pro) with your local codebase: scan, pack to token limits, and copy a paste-ready bundle.

  • v0.0.3
  • 16.51
  • Published

ai-meter

A JavaScript library for estimating AI model token counts and their API costs.

  • v0.0.1
  • 15.18
  • Published

tiktokend-freebsd-x64

High-level binding to tiktoken-rs for Node.js and browsers

  • v1.0.2
  • 13.90
  • Published

@mcpflow.io/mcp-obsidian-mcp-server

Model Context Protocol (MCP) server designed for LLMs to interact with Obsidian vaults. Provides secure, token-aware tools for seamless knowledge base management through a standardized interface.

  • v1.0.1
  • 11.74
  • Published

tiktokend

High-level binding to tiktoken-rs for Node.js and browsers

  • v1.0.2
  • 11.31
  • Published

tokin

Fast tokenizer.

  • v0.1.0
  • 9.75
  • Published

tiktoken-bundle

offline-capable ESM module for cl100k_base tokenization in the browser

  • v0.0.1
  • 8.81
  • Published

obsidian-mcp-server-virtuman

Model Context Protocol (MCP) server designed for LLMs to interact with Obsidian vaults. Provides secure, token-aware tools for seamless knowledge base management through a standardized interface.

  • v1.5.0
  • 7.20
  • Published

tiktoken.js

Pure JavaScript version of OpenAI tiktoken

  • v0.0.1
  • 2.97
  • Published