@databricks/ai-sdk-provider

Databricks provider for the Vercel AI SDK.

Features

  • 🚀 Support for three Databricks endpoint types:
    • Chat Agent (agent/v2/chat) - Databricks chat agent API
    • Responses Agent (agent/v2/responses) - Databricks responses agent API
    • FM API (llm/v1/chat) - Foundation model chat completions API
  • 🔄 Stream and non-stream (generate) support for all endpoint types
  • 🛠️ Custom tool calling mechanism for Databricks agents
  • 🔐 Flexible authentication (bring your own tokens/headers)
  • 🎯 Full TypeScript support

Installation

npm install @databricks/ai-sdk-provider

Peer Dependencies

This package requires the following peer dependencies:

npm install @ai-sdk/provider @ai-sdk/provider-utils

To use the provider with AI SDK functions like generateText or streamText, also install:

npm install ai

Quick Start

import { createDatabricksProvider } from '@databricks/ai-sdk-provider'
import { generateText } from 'ai'

// Create provider with your workspace URL and authentication
const provider = createDatabricksProvider({
  baseURL: 'https://your-workspace.databricks.com/serving-endpoints',
  headers: {
    Authorization: `Bearer ${token}`,
  },
})

// Use the Responses Agent endpoint
const responsesAgent = provider.responsesAgent('your-agent-endpoint')

const result = await generateText({
  model: responsesAgent,
  prompt: 'Hello, how are you?',
})

console.log(result.text)

Authentication

The provider does not manage credentials for you; pass authentication headers (for example, a bearer token) when creating it:

const provider = createDatabricksProvider({
  baseURL: 'https://your-workspace.databricks.com/serving-endpoints',
  headers: {
    Authorization: `Bearer ${token}`,
  },
})

API Reference

Main Export

createDatabricksProvider(settings)

Creates a Databricks provider instance.

Parameters:

  • settings.baseURL (string, required): Base URL for the Databricks API calls
  • settings.headers (object, optional): Custom headers to include in requests
  • settings.provider (string, optional): Provider name (defaults to "databricks")
  • settings.fetch (function, optional): Custom fetch implementation
  • settings.formatUrl (function, optional): Function to customize how the request URL is constructed

Returns: DatabricksProvider with three model creation methods:

  • chatAgent(modelId: string): Create a Chat Agent model
  • responsesAgent(modelId: string): Create a Responses Agent model
  • fmapi(modelId: string): Create an FM API model

Tool Constants

import { DATABRICKS_TOOL_DEFINITION, DATABRICKS_TOOL_CALL_ID } from '@databricks/ai-sdk-provider'

Why are these needed?

The AI SDK requires tools to be defined ahead of time with known schemas. Databricks agents, however, can orchestrate tools dynamically at runtime: which tools will be called isn't known until the model executes.

To bridge this gap, this provider uses a special "catch-all" tool definition:

  • DATABRICKS_TOOL_DEFINITION: A universal tool definition that accepts any input/output schema. This allows the provider to handle any tool that Databricks agents orchestrate, regardless of its schema.

  • DATABRICKS_TOOL_CALL_ID: The constant ID ('databricks-tool-call') used to identify this special tool. The actual tool name from Databricks is preserved in the metadata so it can be displayed correctly in the UI and passed back to the model.

This pattern enables dynamic tool orchestration by Databricks while maintaining compatibility with the AI SDK's tool interface.

MCP Utilities

import {
  MCP_APPROVAL_STATUS_KEY,
  MCP_APPROVAL_REQUEST_TYPE,
  MCP_APPROVAL_RESPONSE_TYPE,
  isMcpApprovalRequest,
  isMcpApprovalResponse,
  createApprovalStatusOutput,
  getMcpApprovalState,
} from '@databricks/ai-sdk-provider'

MCP (Model Context Protocol) approval utilities for handling approval workflows.

Examples

Responses Agent Endpoint

const responsesAgent = provider.responsesAgent('my-responses-agent')

const result = await generateText({
  model: responsesAgent,
  prompt: 'Analyze this data...',
})

console.log(result.text)

With Tool Calling

import { DATABRICKS_TOOL_CALL_ID, DATABRICKS_TOOL_DEFINITION } from '@databricks/ai-sdk-provider'

const responsesAgent = provider.responsesAgent('my-agent-with-tools')

const result = await generateText({
  model: responsesAgent,
  prompt: 'Search for information about AI',
  tools: {
    [DATABRICKS_TOOL_CALL_ID]: DATABRICKS_TOOL_DEFINITION,
  },
})

Contributing

This package is part of the databricks-ai-bridge monorepo.