JSPM

@utiric/blackbox-ai

1.0.0
    • License Apache-2.0

    A reverse-engineered Node.js client for the Blackbox.ai API, supporting chat completions with streaming and aggregated responses.

    Package Exports

    • @utiric/blackbox-ai
    • @utiric/blackbox-ai/index.js

    This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (@utiric/blackbox-ai) requesting support for the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.
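
    For ESM consumers (the JSPM use case), the detected entrypoint can be imported directly. The sketch below assumes the specifier is resolved by an import map or by Node.js package resolution; because the module is CommonJS, the class is destructured from the default export.

    // ESM usage sketch: with CommonJS interop, the default export is the module.exports object.
    import pkg from '@utiric/blackbox-ai';
    const { BlackboxAIClient } = pkg;

    const client = new BlackboxAIClient({ validated: "YOUR_VALIDATED_TOKEN" });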

    Readme

    BlackboxAI Client

    A reverse-engineered Node.js client for interacting with the Blackbox.ai API. This module enables you to perform chat completions with both streaming and aggregated responses. It features router-style model naming for seamless IntelliSense auto-completion and throttles requests based on token usage.

    Important:
    This project is based on reverse engineering Blackbox.ai’s API. To make successful requests, you must supply a validated token. You can obtain this token by opening your browser’s Developer Tools while using Blackbox.ai and inspecting the network request payloads for the validated field.

    Features

    • Chat Completion: Create chat completions with support for both streaming and aggregated responses.
    • Token Management: Uses a GPT-3 tokenizer to count tokens accurately and throttles requests to a tokens-per-second (TPS) budget; a rough sketch of this idea follows the list.
    • IntelliSense Support: Model names use a router-style format (e.g., deepseek/deepseek-r1) to provide autocomplete suggestions in your editor.
    • Reverse Engineered: Designed to work with Blackbox.ai by reverse engineering their request flow.
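
    A rough sketch of the throttling idea, not the package's actual implementation: count the tokens in an outgoing prompt and delay the request once a tokens-per-second budget is exhausted. The gpt-3-encoder package is used here as a stand-in tokenizer; the client's internal tokenizer and scheduling may differ.

    const { encode } = require('gpt-3-encoder');

    // Rolling one-second window: remember how many tokens have been sent in the current
    // window and wait for the window to end before exceeding the budget.
    let windowStart = Date.now();
    let tokensInWindow = 0;

    async function throttle(text, maxTPS) {
      if (!Number.isFinite(maxTPS)) return; // maxTPS defaults to Infinity, i.e. no throttling
      const tokens = encode(text).length;   // token count via a GPT-3 style tokenizer
      if (Date.now() - windowStart >= 1000) {
        windowStart = Date.now();           // a full second has passed: reset the window
        tokensInWindow = 0;
      }
      if (tokensInWindow + tokens > maxTPS) {
        const waitMs = Math.max(0, 1000 - (Date.now() - windowStart));
        await new Promise((resolve) => setTimeout(resolve, waitMs));
        windowStart = Date.now();           // the old window has expired: start a new one
        tokensInWindow = 0;
      }
      tokensInWindow += tokens;
    }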

    Installation

    Install the module via npm:

    npm install @utiric/blackbox-ai

    Alternatively, clone the repository and run npm install in the project directory.

    Usage

    Below is an example of using the BlackboxAI Client to get an aggregated (non-streaming) response; a streaming example follows.

    const { BlackboxAIClient } = require('@utiric/blackbox-ai');
    
    (async () => {
      // Create an instance of the client.
      // Replace "YOUR_VALIDATED_TOKEN" with the token you obtained from Blackbox.ai.
      const client = new BlackboxAIClient({
        validated: "YOUR_VALIDATED_TOKEN",
        // Optionally, specify a model. Options:
        //   - deepseek/deepseek-r1 (designed for reasoning tasks)
        //   - deepseek/deepseek-v3 (versatile for various applications)
        //   - deepseek/deepseek-reasoner (alias for deepseek/deepseek-r1)
        model: "deepseek/deepseek-v3"
      });
    
      try {
        // Create a chat completion.
        const response = await client.createChatCompletion({
          messages: [
            { role: "user", content: "Hello, how do I use BlackboxAI?" }
          ],
          // Set stream to false to get the aggregated response.
          stream: false
        });
        console.log("Chat Completion Response:", response);
      } catch (error) {
        console.error("Error creating chat completion:", error);
      }
    })();
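
    For streaming, pass stream: true and the same call resolves to a ReadableStream. The sketch below shows one way to consume it, assuming a Web-style ReadableStream whose chunks are text or Uint8Array; the exact chunk format may differ.

    const { BlackboxAIClient } = require('@utiric/blackbox-ai');

    (async () => {
      const client = new BlackboxAIClient({ validated: "YOUR_VALIDATED_TOKEN" });

      // With stream: true the promise resolves to a ReadableStream instead of an aggregated object.
      const stream = await client.createChatCompletion({
        messages: [{ role: "user", content: "Stream a short greeting." }],
        stream: true
      });

      // Read the stream chunk by chunk using the Web ReadableStream reader API.
      const reader = stream.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        // Decode binary chunks before printing; string chunks are printed as-is.
        process.stdout.write(typeof value === "string" ? value : decoder.decode(value));
      }
    })();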

    Obtaining the Validated Token

    To retrieve the validated token:

    1. Open Blackbox.ai in your browser.
    2. Open the Developer Tools (usually by pressing F12 or right-clicking and selecting Inspect).
    3. Navigate to the Network tab and perform an action that sends a chat request.
    4. Find the request payload (look for the /api/chat endpoint) and locate the validated field.
    5. Copy the value of the validated token and use it in your client configuration.

    API Documentation

    BlackboxAIClient

    Constructor

    • Parameters:
      • options (object) – Configuration options.
        • baseUrl (string, optional) – The API endpoint (default: https://www.blackbox.ai/api/chat).
        • model (ModelName, optional) – The model to use for chat completions. See usage above.
        • validated (string, required) – The validation token obtained from Blackbox.ai.
        • proxy (string, optional) – Proxy URL if needed.
        • maxTPS (number, optional) – Maximum tokens per second allowed (default: Infinity).
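
    A configuration sketch showing the optional fields together; the proxy URL and maxTPS value below are placeholders, and baseUrl is the documented default.

    const { BlackboxAIClient } = require('@utiric/blackbox-ai');

    const client = new BlackboxAIClient({
      validated: "YOUR_VALIDATED_TOKEN",           // required: token obtained from Blackbox.ai
      baseUrl: "https://www.blackbox.ai/api/chat", // optional: the default endpoint
      model: "deepseek/deepseek-r1",               // optional: router-style model name
      proxy: "http://127.0.0.1:8080",              // optional: placeholder proxy URL
      maxTPS: 50                                   // optional: throttle to at most ~50 tokens per second
    });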

    createChatCompletion

    • Description:
      Creates a chat completion using the provided messages. Supports both streaming and aggregated responses.
    • Parameters:
      • params (object)
        • messages (Array) – An array of message objects with role and content.
        • model (ModelName, optional) – Override the default model.
        • trendingAgentMode (object, optional) – Additional mode configuration.
        • stream (boolean, optional) – If set to true, returns a ReadableStream.
    • Returns:
      A Promise that resolves to either an aggregated response object or a ReadableStream.

    Contributing

    Contributions are welcome! If you have ideas or improvements, please open an issue or submit a pull request.

    License

    This project is licensed under the Apache-2.0 License.

    Disclaimer

    This client is a reverse engineering project for educational and experimental purposes only. Use it responsibly and at your own risk.