JSPM

  • Downloads 25705
  • License Apache-2.0

Package Exports

  • ollama-ai-provider-v2
  • ollama-ai-provider-v2/internal
  • ollama-ai-provider-v2/package.json

Readme

Ollama Provider V2 for the Vercel AI SDK

Ollama Provider V2 for the AI SDK was created because the original ollama-ai-provider is no longer actively maintained.

This provider supports tool calling and tool streaming for models that support tools.
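
As a minimal sketch, tool calling through the AI SDK typically looks like the following; the getWeather tool, its schema, and its result are illustrative, and the schema key is inputSchema on AI SDK v5 (use parameters on v4):

import { generateText, tool } from 'ai';
import { ollama } from 'ollama-ai-provider-v2';
import { z } from 'zod';

const result = await generateText({
  model: ollama('llama3.2:latest'),
  tools: {
    // Hypothetical tool; the model decides whether to call it.
    getWeather: tool({
      description: 'Get the current weather for a city',
      inputSchema: z.object({ city: z.string() }), // parameters: ... on AI SDK v4
      execute: async ({ city }) => ({ city, tempC: 21 }),
    }),
  },
  prompt: 'What is the weather in Berlin?',
});

console.log(result.toolCalls);   // tool calls the model made
console.log(result.toolResults); // values returned by execute()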

Setup

The Ollama provider is available in the ollama-ai-provider-v2 module. You can install it with:

npm i ollama-ai-provider-v2

Provider Instance

You can import the default provider instance ollama from ollama-ai-provider-v2:

import { ollama } from 'ollama-ai-provider-v2';
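
If this package follows the original ollama-ai-provider API, a custom instance (for example, pointing at a remote Ollama server) can presumably be created with createOllama; the baseURL option shown here is an assumption, not confirmed by this README:

import { createOllama } from 'ollama-ai-provider-v2';

// Assumed option: the URL of the Ollama API (defaults to the local server).
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});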

Example

import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});
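
Streaming works the same way through the AI SDK's streamText; this is a sketch reusing the model from the example above:

import { ollama } from 'ollama-ai-provider-v2';
import { streamText } from 'ai';

const { textStream } = streamText({
  model: ollama('llama3.2:latest'),
  prompt: 'Write a meaty lasagna recipe for 4 people.',
});

// Print the response as it is generated.
for await (const chunk of textStream) {
  process.stdout.write(chunk);
}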

Documentation

Please check out the Ollama provider documentation for more information.