vly-integrations

First-order integrations for AI, email, and payments with automatic usage billing through Vly deployment tokens. Built on the AI SDK for reliable, type-safe AI Gateway integration.

Installation

npm install vly-integrations
# or
yarn add vly-integrations
# or
pnpm add vly-integrations

No additional dependencies required: built on the native fetch API and the AI SDK.

Usage

import { createVlyIntegrations } from 'vly-integrations';

const vly = createVlyIntegrations({
  deploymentToken: process.env.VLY_INTEGRATION_KEY,  // Uses VLY_INTEGRATION_KEY env var
  debug: false // optional
});

// AI Completions via AI Gateway - supports any Vercel AI Gateway model
const completion = await vly.ai.completion({
  model: 'gpt-5', // Or any model supported by Vercel AI Gateway
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
  temperature: 0.7,
  maxTokens: 150
});

// Streaming completions
await vly.ai.streamCompletion({
  model: 'claude-opus-4-1',
  messages: [{ role: 'user', content: 'Tell me a story...' }]
}, (chunk) => {
  process.stdout.write(chunk); // Real-time streaming
});

// Send Email
const emailResult = await vly.email.send({
  to: 'user@example.com',
  subject: 'Welcome!',
  html: '<h1>Welcome to our service!</h1>',
  text: 'Welcome to our service!'
});

// Create Payment Intent
const paymentIntent = await vly.payments.createPaymentIntent({
  amount: 2000, // $20.00 in cents
  currency: 'usd',
  description: 'Premium subscription',
  customer: {
    email: 'customer@example.com'
  }
});

AI Gateway Integration

Powered by the AI SDK with an OpenAI-compatible provider for https://ai-gateway.vly.ai.

Supported Models

The vly-integrations package supports all models available through the Vercel AI Gateway. This includes but is not limited to:

  • OpenAI Models: GPT-5, GPT-5 Mini, GPT-5 Nano, GPT-4, GPT-3.5, etc.
  • Anthropic Claude: Claude 4 Opus, Claude 4 Sonnet, Claude 3.7, Claude 3.5, Haiku, etc.
  • Google Models: Gemini Pro, Gemini Flash, Gemini Ultra, etc.
  • Meta Models: Llama models and variants
  • Mistral Models: Mistral Large, Medium, Small, etc.
  • DeepSeek Models: DeepSeek R1, DeepSeek Thinking, DeepSeek Coder, etc.
  • Qwen Models: Qwen Coder and other variants
  • And many more...

Simply pass any model identifier supported by the Vercel AI Gateway to the model parameter.
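
For example, the same completion call works across providers by changing only the model identifier. The first two identifiers appear elsewhere in this README; the Gemini identifier is an assumption shown only to illustrate the pattern, so check your gateway's model list.

// Model identifiers documented in this README
const fromOpenAI = await vly.ai.completion({
  model: 'gpt-5',
  messages: [{ role: 'user', content: 'Summarize this release note.' }]
});

const fromAnthropic = await vly.ai.completion({
  model: 'claude-opus-4-1',
  messages: [{ role: 'user', content: 'Summarize this release note.' }]
});

// Hypothetical identifier, used only to illustrate the pattern
const fromGoogle = await vly.ai.completion({
  model: 'gemini-2.5-flash',
  messages: [{ role: 'user', content: 'Summarize this release note.' }]
});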

Direct AI SDK Access

For advanced usage, access the AI SDK provider directly:

import { generateText, streamText } from 'ai';

// Get the provider
const provider = vly.ai.getProvider();
const model = provider('gpt-5');

// Use with AI SDK directly
const result = await generateText({
  model,
  messages: [{ role: 'user', content: 'Hello!' }]
});

// Or streaming
const stream = await streamText({
  model,
  messages: [{ role: 'user', content: 'Tell me a story' }]
});

for await (const delta of stream.textStream) {
  process.stdout.write(delta);
}

API Reference

AI Integration

// Create completion
vly.ai.completion(request: AICompletionRequest): Promise<ApiResponse<AICompletionResponse>>

// Stream completion
vly.ai.streamCompletion(
  request: AICompletionRequest,
  onChunk: (chunk: string) => void
): Promise<ApiResponse<AICompletionResponse>>

// Get AI SDK provider for direct usage
vly.ai.getProvider(): OpenAICompatibleProvider

// Generate embeddings (limited support)
vly.ai.embeddings(input: string | string[]): Promise<ApiResponse<{embeddings: number[][]}>>
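
A minimal sketch of the embeddings call, using the response shape shown in the signature above (one vector per input string). Keep in mind that embeddings support is limited and may not be enabled for every deployment.

// Assumes embeddings are enabled for your deployment
const emb = await vly.ai.embeddings(['hello world', 'goodbye world']);

if (emb.success && emb.data) {
  console.log('Vectors returned:', emb.data.embeddings.length);         // one per input string
  console.log('First vector dimensions:', emb.data.embeddings[0].length);
} else {
  console.error('Embeddings error:', emb.error);
}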

Email Integration

// Send single email
vly.email.send(email: EmailRequest): Promise<ApiResponse<EmailResponse>>

// Send batch emails
vly.email.sendBatch(emails: EmailRequest[]): Promise<ApiResponse<EmailResponse[]>>

// Get email status
vly.email.getStatus(emailId: string): Promise<ApiResponse<EmailResponse>>

// Domain management
vly.email.verifyDomain(domain: string): Promise<ApiResponse>
vly.email.listDomains(): Promise<ApiResponse>
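
A short sketch of batch sending followed by a status check. The id field on EmailResponse is an assumption, since this README does not spell out the response shape.

// Hedged sketch: the `id` field on EmailResponse is assumed, not documented here
const batch = await vly.email.sendBatch([
  { to: 'a@example.com', subject: 'Hi A', text: 'Hello A' },
  { to: 'b@example.com', subject: 'Hi B', text: 'Hello B' }
]);

if (batch.success && batch.data) {
  const firstId = batch.data[0].id;                    // assumed field name
  const status = await vly.email.getStatus(firstId);
  console.log('First email status:', status.data);
}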

Payments Integration

// Payment Intents
vly.payments.createPaymentIntent(intent: PaymentIntent): Promise<ApiResponse<PaymentIntentResponse>>
vly.payments.confirmPaymentIntent(intentId: string, paymentMethodId: string): Promise<ApiResponse>
vly.payments.getPaymentIntent(intentId: string): Promise<ApiResponse>
vly.payments.cancelPaymentIntent(intentId: string): Promise<ApiResponse>

// Subscriptions
vly.payments.createSubscription(subscription: Subscription): Promise<ApiResponse<SubscriptionResponse>>
vly.payments.updateSubscription(id: string, updates: Partial<Subscription>): Promise<ApiResponse>
vly.payments.cancelSubscription(id: string, immediately?: boolean): Promise<ApiResponse>
vly.payments.getSubscription(id: string): Promise<ApiResponse>
vly.payments.listSubscriptions(customerId?: string): Promise<ApiResponse>

// Checkout & Portal
vly.payments.createCheckoutSession(session: CheckoutSession): Promise<ApiResponse>
vly.payments.createCustomerPortal(session: CustomerPortalSession): Promise<ApiResponse>

// Customer Management
vly.payments.createCustomer(customer: Customer): Promise<ApiResponse>
vly.payments.getCustomer(customerId: string): Promise<ApiResponse>
vly.payments.updateCustomer(customerId: string, updates: CustomerUpdate): Promise<ApiResponse>

// Payment Methods
vly.payments.listPaymentMethods(customerId: string): Promise<ApiResponse>
vly.payments.attachPaymentMethod(methodId: string, customerId: string): Promise<ApiResponse>
vly.payments.detachPaymentMethod(methodId: string): Promise<ApiResponse>
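
As a rough sketch, creating a customer and then a subscription could look like the following. The Customer and Subscription field names here (email, customerId, priceId) are assumptions; consult the TypeScript types shipped with the package for the exact shapes.

// Hedged sketch: field names below are assumed, not confirmed by this README
const customer = await vly.payments.createCustomer({ email: 'customer@example.com' });

if (customer.success && customer.data) {
  const subscription = await vly.payments.createSubscription({
    customerId: customer.data.id,      // assumed field
    priceId: 'price_premium_monthly'   // hypothetical price identifier
  });
  console.log('Subscription created:', subscription.success);
}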

Error Handling

All methods return an ApiResponse object with the following structure:

interface ApiResponse<T> {
  success: boolean;
  data?: T;
  error?: string;
  usage?: {
    credits: number;
    operation: string;
  };
}

Example error handling:

const result = await vly.ai.completion({ ... });

if (result.success) {
  console.log('Response:', result.data);
  console.log('Credits used:', result.usage?.credits);
} else {
  console.error('Error:', result.error);
}

Configuration

Environment Variables

VLY_INTEGRATION_KEY=your_integration_key_here  # The key for authenticating with VLY
VLY_DEBUG=true  # optional, enables debug logging

Debug Mode

Enable debug mode to see detailed logs:

const vly = createVlyIntegrations({
  deploymentToken: process.env.VLY_INTEGRATION_KEY,  // Note: parameter is 'deploymentToken' but env var is 'VLY_INTEGRATION_KEY'
  debug: true
});

What's New in v0.2.0

  • AI SDK Integration: Now powered by @ai-sdk/openai-compatible for better reliability
  • No more axios: replaced with the built-in fetch API for a lighter footprint
  • New AI models: Support for GPT-5, Claude 4, and Claude 3.7 models
  • Direct AI SDK access: Get the provider for advanced AI SDK usage
  • Better streaming: Improved streaming support with AI SDK
  • Type safety: Enhanced TypeScript support

Billing

All API calls are automatically billed to your deployment based on usage. The billing happens transparently through your deployment token, and usage information is included in the API responses.
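
Because every ApiResponse carries the usage block described under Error Handling, per-call credit consumption can be read straight off the response:

// Usage metadata is attached to each response; no separate billing call is needed
const reply = await vly.email.send({ to: 'user@example.com', subject: 'Hi', text: 'Hi!' });
console.log(`Operation "${reply.usage?.operation}" used ${reply.usage?.credits} credits`);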

License

MIT