# AGI-Proxy SDK
Universal SDK for integrating with the AeroCorp AGI Proxy Platform - your provider-agnostic AI gateway with intelligent routing and fallback.
## Features
- 🚀 Universal AI Access: Single SDK for multiple AI providers (Gemini, OpenAI, Anthropic, Mistral, Grok, Ollama)
- 🔄 Intelligent Routing: Automatic fallback between providers for maximum reliability
- 🎯 Default Model: Pre-configured with `google/gemini-2.5-flash` for optimal performance
- 🔌 OpenAI Compatible: Drop-in replacement for the OpenAI SDK
- 🛡️ Built-in Retry Logic: Automatic retries with exponential backoff (a conceptual sketch follows this list)
- 📊 Streaming Support: Real-time streaming responses
- 🔐 Secure: Optional API key authentication
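
Retries are handled internally by the SDK, but conceptually they follow the familiar exponential-backoff pattern. The sketch below is illustrative only and is not the SDK's actual implementation; the `withRetries` helper and its parameters are hypothetical.

```typescript
// Illustrative only: a generic exponential-backoff wrapper,
// NOT the SDK's internal code. Helper name and delays are hypothetical.
async function withRetries<T>(
  fn: () => Promise<T>,
  retries = 3,        // mirrors the SDK's `retries` option
  baseDelayMs = 500   // hypothetical starting delay
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === retries) break;
      // Wait 500 ms, 1 s, 2 s, ... before the next attempt
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```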
## Installation

### NPM

```bash
npm install @aerocorp/agi-proxy-sdk
```

### Yarn

```bash
yarn add @aerocorp/agi-proxy-sdk
```

### PNPM

```bash
pnpm add @aerocorp/agi-proxy-sdk
```

## Quick Start
### Basic Usage

```typescript
import { createAGIProxySDK } from '@aerocorp/agi-proxy-sdk';

// Initialize the SDK
const agi = createAGIProxySDK({
  baseUrl: 'https://agi.aerocorpindustries.org',
  apiKey: 'your-api-key', // Optional
  defaultModel: 'google/gemini-2.5-flash'
});

// Create a chat completion
const response = await agi.createChatCompletion({
  messages: [
    { role: 'user', content: 'Hello, how are you?' }
  ]
});

console.log(response.choices[0].message.content);
```

### Advanced Usage
```typescript
import { AGIProxySDK } from '@aerocorp/agi-proxy-sdk';

const agi = new AGIProxySDK({
  baseUrl: 'https://agi.aerocorpindustries.org',
  apiKey: 'your-api-key',
  defaultModel: 'google/gemini-2.5-flash',
  timeout: 120000, // 120 seconds
  retries: 3       // Number of retry attempts
});

// Chat completion with custom parameters
const response = await agi.createChatCompletion({
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain quantum computing in simple terms.' }
  ],
  model: 'google/gemini-2.5-pro', // Override the default model
  temperature: 0.7,
  max_tokens: 1024,
  top_p: 1.0
});
```

### Streaming Responses
```typescript
const stream = agi.streamChatCompletion({
  messages: [
    { role: 'user', content: 'Write a short story about AI.' }
  ],
  stream: true
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    process.stdout.write(content);
  }
}
```

### List Available Models
```typescript
const models = await agi.listModels();
console.log('Available models:', models.data);
```

### Health Check
```typescript
const health = await agi.getHealth();
console.log('Service status:', health);
```

## Available Models
The AGI Proxy supports multiple AI providers with intelligent fallback:
### Google Gemini (Default)

- `google/gemini-2.5-flash` - Fast and efficient (default)
- `google/gemini-2.5-flash-lite` - Lightweight version
- `google/gemini-2.5-pro` - Advanced reasoning
- `google/gemini-flash-1.5` - Previous generation
### OpenAI

- `openai/gpt-4` - Most capable model
- `openai/gpt-4-turbo` - Faster GPT-4
- `openai/gpt-3.5-turbo` - Fast and cost-effective
### Anthropic

- `anthropic/claude-3.7-sonnet:beta-4.0` - Latest Claude
- `anthropic/claude-3-opus` - Most capable
- `anthropic/claude-3-sonnet` - Balanced
### Mistral

- `mistral/mistral-large` - Flagship model
- `mistral/mistral-medium` - Balanced
- `mistral/mistral-small` - Fast and efficient
### Grok

- `grok/grok-2` - Latest Grok model
### Ollama (Local)

- `ollama/llama3` - Meta's Llama 3
- `ollama/deepseek-coder-v2:16b` - Code specialist
- `ollama/qwen2.5vl:7b` - Vision model
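
Because the set of providers behind the proxy can change, you can also discover models at runtime with `listModels()`. The sketch below assumes each entry in `models.data` exposes an `id` string in the `provider/model` form shown above; check the exported `Model` type for the exact shape.

```typescript
// List models and group them by provider prefix (e.g. 'google', 'openai').
// Assumes each Model entry has an `id` like 'google/gemini-2.5-flash'.
const models = await agi.listModels();

const byProvider = new Map<string, string[]>();
for (const model of models.data) {
  const provider = model.id.split('/')[0];
  const ids = byProvider.get(provider) ?? [];
  ids.push(model.id);
  byProvider.set(provider, ids);
}

for (const [provider, ids] of byProvider) {
  console.log(`${provider}: ${ids.join(', ')}`);
}
```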
## Configuration Options
```typescript
interface AGIProxyConfig {
  baseUrl?: string;      // Default: 'https://agi.aerocorpindustries.org'
  apiKey?: string;       // Optional API key
  defaultModel?: string; // Default: 'google/gemini-2.5-flash'
  timeout?: number;      // Default: 120000 (120 seconds)
  retries?: number;      // Default: 3
}
```
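
All options are optional and fall back to the defaults shown above. A common pattern is to read the API key from the environment rather than hard-coding it; the `AGI_PROXY_API_KEY` variable name below is just an example, not something the SDK reads automatically.

```typescript
import { createAGIProxySDK } from '@aerocorp/agi-proxy-sdk';

// Read the key from the environment (variable name is your choice;
// the SDK does not pick it up automatically).
const agi = createAGIProxySDK({
  apiKey: process.env.AGI_PROXY_API_KEY,
  retries: 5,      // more persistent than the default of 3
  timeout: 60000   // fail faster than the default 120 seconds
});
```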
## Error Handling

```typescript
try {
  const response = await agi.createChatCompletion({
    messages: [{ role: 'user', content: 'Hello!' }]
  });
} catch (error) {
  if (error instanceof Error) {
    console.error('Error:', error.message);
  }
}
```
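
The SDK already retries failed requests internally, so an error that reaches your code usually means the request could not be completed at all. If you want an extra layer of resilience, you can fall back to a different model at the application level; the models chosen below are just examples.

```typescript
// Application-level fallback: retry the same request on a different model
// if the first attempt ultimately fails. Model choices are examples.
async function completeWithFallback(content: string) {
  try {
    return await agi.createChatCompletion({
      messages: [{ role: 'user', content }],
      model: 'google/gemini-2.5-flash'
    });
  } catch (error) {
    console.warn('Primary model failed, retrying with fallback:', error);
    return agi.createChatCompletion({
      messages: [{ role: 'user', content }],
      model: 'openai/gpt-3.5-turbo'
    });
  }
}
```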
## TypeScript Support

The SDK is written in TypeScript and includes full type definitions:
```typescript
import type {
  ChatMessage,
  ChatCompletionRequest,
  ChatCompletionResponse,
  Model,
  ModelsResponse
} from '@aerocorp/agi-proxy-sdk';
```
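
These types let you keep your own helpers fully typed. The `ask` function below is a hypothetical example, not part of the SDK; it only uses the documented `createChatCompletion` method and the exported types.

```typescript
import { createAGIProxySDK } from '@aerocorp/agi-proxy-sdk';
import type { ChatMessage, ChatCompletionResponse } from '@aerocorp/agi-proxy-sdk';

const agi = createAGIProxySDK();

// A small typed wrapper around chat completions (not part of the SDK itself).
async function ask(messages: ChatMessage[]): Promise<string> {
  const response: ChatCompletionResponse = await agi.createChatCompletion({ messages });
  return response.choices[0].message.content;
}
```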
## Examples

### ChatGPT-like Interface
```typescript
const agi = createAGIProxySDK();

async function chat(userMessage: string) {
  const response = await agi.createChatCompletion({
    messages: [
      { role: 'system', content: 'You are a helpful AI assistant.' },
      { role: 'user', content: userMessage }
    ]
  });

  return response.choices[0].message.content;
}

const answer = await chat('What is the meaning of life?');
console.log(answer);
```

### Multi-turn Conversation
```typescript
const conversation: ChatMessage[] = [
  { role: 'system', content: 'You are a helpful assistant.' }
];

async function sendMessage(message: string) {
  conversation.push({ role: 'user', content: message });

  const response = await agi.createChatCompletion({
    messages: conversation
  });

  const assistantMessage = response.choices[0].message;
  conversation.push(assistantMessage);

  return assistantMessage.content;
}

await sendMessage('Hello!');
await sendMessage('Tell me a joke.');
```

## API Reference
### `createAGIProxySDK(config?: AGIProxyConfig): AGIProxySDK`

Creates a new SDK instance with the provided configuration.

### `createChatCompletion(request: ChatCompletionRequest): Promise<ChatCompletionResponse>`

Creates a chat completion with the specified messages and parameters.

### `streamChatCompletion(request: ChatCompletionRequest): AsyncGenerator`

Streams a chat completion response in real time.

### `listModels(): Promise<ModelsResponse>`

Lists all available models from all providers.

### `getHealth(): Promise<any>`

Checks the health status of the AGI Proxy service.
## Support
- Documentation: https://agi.aerocorpindustries.org/docs
- Issues: https://github.com/aerocorp/agi-proxy-sdk/issues
- Email: support@aerocorpindustries.org
## License
MIT License - Copyright (c) 2025 AeroCorp Industries
## Contributing
Contributions are welcome! Please read our contributing guidelines before submitting PRs.