

    node-llama-cpp

    Run AI models locally on your machine

    Pre-built bindings are provided with a fallback to building from source with cmake


    gpt-oss is here!

    Features

    Documentation

    Try It Without Installing

    Chat with a model in your terminal using a single command:

    npx -y node-llama-cpp chat
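If you already have a model file and want to skip the interactive model picker, the CLI can be pointed directly at it. The `--model` flag below is an assumption based on the CLI's documented options; run the command with `--help` to confirm:

```shell
# Chat with a specific local GGUF model file (the path is an example)
npx -y node-llama-cpp chat --model ./models/Meta-Llama-3.1-8B-Instruct.Q4_K_M.gguf
```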

    Installation

    npm install node-llama-cpp

    This package comes with pre-built binaries for macOS, Linux and Windows.

    If pre-built binaries are not available for your platform, it will fall back to downloading a release of llama.cpp and building it from source with cmake. To disable this behavior, set the environment variable NODE_LLAMA_CPP_SKIP_DOWNLOAD to true.
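For example, to guarantee that installation never triggers a source build (a sketch assuming a POSIX shell; on Windows, set the variable via `set` or `$env:` instead):

```shell
# Disable the source-build fallback: with this set, installation will not
# download llama.cpp and build it with cmake when no pre-built binary exists
export NODE_LLAMA_CPP_SKIP_DOWNLOAD=true
npm install node-llama-cpp
```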

    Usage

    import {fileURLToPath} from "url";
    import path from "path";
    import {getLlama, LlamaChatSession} from "node-llama-cpp";
    
    const __dirname = path.dirname(fileURLToPath(import.meta.url));
    
    // Load a local GGUF model file and create an inference context
    const llama = await getLlama();
    const model = await llama.loadModel({
        modelPath: path.join(__dirname, "models", "Meta-Llama-3.1-8B-Instruct.Q4_K_M.gguf")
    });
    const context = await model.createContext();
    
    // A chat session bound to one of the context's sequences keeps the
    // conversation history across prompts
    const session = new LlamaChatSession({
        contextSequence: context.getSequence()
    });
    
    const q1 = "Hi there, how are you?";
    console.log("User: " + q1);
    
    const a1 = await session.prompt(q1);
    console.log("AI: " + a1);
    
    // The follow-up prompt can refer back to the previous answer,
    // since the session retains the chat history
    const q2 = "Summarize what you said";
    console.log("User: " + q2);
    
    const a2 = await session.prompt(q2);
    console.log("AI: " + a2);

    For more examples, see the getting started guide.

    Contributing

    To contribute to node-llama-cpp read the contribution guide.

    Acknowledgements


    Star please

    If you like this repo, star it ✨