AI SDK - DeepInfra Provider
The DeepInfra provider for the AI SDK contains language model support for the DeepInfra API, giving you access to models like Llama 3, Mixtral, and other state-of-the-art LLMs.
Deploying to Vercel? With Vercel's AI Gateway you can access DeepInfra (and hundreds of models from other providers) — no additional packages, API keys, or extra cost. Get started with AI Gateway.
Setup
The DeepInfra provider is available in the @ai-sdk/deepinfra module. You can install it with:

```
npm i @ai-sdk/deepinfra
```

Skill for Coding Agents
If you use coding agents such as Claude Code or Cursor, we highly recommend adding the AI SDK skill to your repository:
```
npx skills add vercel/ai
```

Provider Instance
You can import the default provider instance deepinfra from @ai-sdk/deepinfra:

```ts
import { deepinfra } from '@ai-sdk/deepinfra';
```

Example
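If you need a customized setup (for example an explicit API key or a different base URL), AI SDK providers typically expose a create* factory alongside the default instance. A minimal sketch for DeepInfra, assuming a createDeepInfra export with apiKey and baseURL options (check the provider documentation for the exact option names):

```ts
import { createDeepInfra } from '@ai-sdk/deepinfra';

// Custom provider instance. The options shown follow the common AI SDK
// provider pattern; verify them against the DeepInfra provider docs.
const deepinfra = createDeepInfra({
  apiKey: process.env.DEEPINFRA_API_KEY ?? '',
  // baseURL: 'https://api.deepinfra.com/v1/openai', // override when proxying
});
```

Models created from this instance are used exactly like those from the default import.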
```ts
import { deepinfra } from '@ai-sdk/deepinfra';
import { generateText } from 'ai';

const { text } = await generateText({
  model: deepinfra('meta-llama/Llama-3.3-70B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```

Documentation
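The same model also works with the AI SDK's streaming API. A short sketch using streamText from the ai package, assuming DEEPINFRA_API_KEY is set in the environment:

```ts
import { deepinfra } from '@ai-sdk/deepinfra';
import { streamText } from 'ai';

// Stream the response token by token instead of waiting for the full text.
const result = streamText({
  model: deepinfra('meta-llama/Llama-3.3-70B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

Streaming is useful for chat-style UIs, where partial output should appear as soon as the model produces it.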
Please check out the DeepInfra provider documentation for more information.