# @switchy-ai/sdk

Official TypeScript/JavaScript SDK for the Switchy AI memory, knowledge-graph, and multi-model chat API.
## Install

```sh
npm install @switchy-ai/sdk
# or
pnpm add @switchy-ai/sdk
# or
yarn add @switchy-ai/sdk
```

## Quick start
```ts
import { Switchy } from '@switchy-ai/sdk';

const client = new Switchy({ apiKey: process.env.SWITCHY_API_KEY! });

const response = await client.chat.complete({
  model: 'anthropic/claude-sonnet-4',
  message: 'Summarise my recent project notes',
  memory: { enabled: true, extractMemories: true },
});

console.log(response.message.content);
```

## MCP server (v0.2.0)
Switchy is also an MCP server. Use `McpClient` to call its tools from your own code (the same surface Claude Desktop and Cursor talk to).
```ts
import { McpClient } from '@switchy-ai/sdk';

const mcp = new McpClient({ apiKey: process.env.SWITCHY_KEY! });

// Org-scoped reads + writes — no actor needed.
const { memories } = await mcp.searchMemory({ query: 'launch readiness' });
const { spaces } = await mcp.listSpaces();
await mcp.addMemory({
  content: 'We picked Ably for realtime in April 2026.',
  visibility: 'ORG',
});

// Acting on behalf of a specific human (required for PRIVATE memory access
// and post_message). The key must carry the act_on_behalf scope.
const asAlice = mcp.actAsUser('user_abc123');
await asAlice.postMessage({
  sessionId: 'cmo5...',
  content: 'Posting from a script — @claude please draft a reply.',
});
```

The `McpClient` is also exported from `@switchy-ai/sdk/mcp` if you want a tree-shakeable subpath import.
## Errors

Each MCP failure raises a typed exception you can catch by class:

| Class | When |
|---|---|
| `McpAuthError` | 401 / 403 — missing key, wrong scope, actor not in org |
| `McpNotFoundError` | 404 — resource doesn't exist OR isn't visible to the caller |
| `McpInvalidParamsError` | 400 — params didn't match the tool's schema |
| `McpRateLimitError` | 429 — `retryAfterMs`, `cap`, `kind` available on the error |
| `McpError` | catch-all base class |
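If you want automatic backoff on 429s, a small wrapper can key off `McpRateLimitError` and its `retryAfterMs` field. A minimal sketch, assuming only the class name and field documented above; the `McpRateLimitError` defined here is a local stand-in so the snippet runs standalone, and in real code you would import the SDK's class instead:

```ts
// Local stand-in for the SDK's McpRateLimitError (same name, same
// retryAfterMs field). Real code: import it from '@switchy-ai/sdk'.
class McpRateLimitError extends Error {
  constructor(public retryAfterMs: number) {
    super('rate limited');
  }
}

// Retry `op` when it throws a rate-limit error, sleeping for the
// server-suggested retryAfterMs between attempts. Any other error,
// or exhausting maxRetries, rethrows.
async function withRateLimitRetry<T>(
  op: () => Promise<T>,
  maxRetries = 3,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await op();
    } catch (err) {
      if (!(err instanceof McpRateLimitError) || attempt >= maxRetries) throw err;
      await new Promise((resolve) => setTimeout(resolve, err.retryAfterMs));
    }
  }
}
```

Usage would look like `withRateLimitRetry(() => mcp.searchMemory({ query: 'launch readiness' }))`.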
## Key minting + docs

Mint a key at https://switchy.build/settings#api-keys (click Mint MCP key). Full install snippets for Claude Desktop / Cursor / generic HTTP live at https://switchy.build/docs/mcp. The OpenAPI spec is at https://switchy.build/api/mcp/openapi.json.
## Streaming

```ts
for await (const chunk of client.chat.stream({
  model: 'openai/gpt-5',
  message: 'Write a haiku about memory',
})) {
  if (chunk.type === 'token') process.stdout.write(chunk.content ?? '');
}
```

## Memory
```ts
// Create a namespace
await client.namespaces.create({ name: 'my-project' });

// Store a memory frame
await client.memory.createFrame('my-project', {
  content: 'User prefers dark mode and TypeScript',
  metadata: { source: 'onboarding' },
});

// Contextual retrieval (semantic + recency)
const relevant = await client.memory.context('my-project', {
  query: 'What are the user preferences?',
  limit: 5,
});
```

## Knowledge graph
```ts
await client.knowledgeGraph.createEntity('my-project', {
  name: 'AuthService',
  type: 'service',
});

await client.knowledgeGraph.createRelation('my-project', {
  source: 'AuthService',
  target: 'User',
  type: 'authenticates',
});
```

## Error handling
```ts
import { Switchy, SwitchyError, RateLimitError } from '@switchy-ai/sdk';

try {
  await client.chat.complete({ model: 'openai/gpt-5', message: 'hi' });
} catch (err) {
  if (err instanceof RateLimitError) {
    console.log(`Rate limited, retry in ${err.retryAfter}s`);
  } else if (err instanceof SwitchyError) {
    console.log(`API error: ${err.code} — ${err.message}`);
  }
}
```

## API reference
- `chat.complete(opts)` — single-turn completion
- `chat.stream(opts)` — SSE streaming generator
- `models.list({ featured, category, q })` — list available models
- `namespaces.{create,list,get,update,delete}` — manage memory namespaces
- `memory.{createFrame,listFrames,context,semantic,search,bridge,consolidate}` — memory operations
- `knowledgeGraph.{createEntity,createRelation,query}` — graph operations
- `sessions.{create,list,get}` — session management
See full OpenAPI spec: https://switchy.build/api/v1/openapi.json
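The memory and chat methods above compose naturally: retrieve context, fold it into the prompt, complete. A minimal sketch of that pattern; the stub object below stands in for the real client so the snippet runs standalone, and the `{ frames }` response shape is an assumption for illustration (check the OpenAPI spec for the actual shapes):

```ts
// Hypothetical frame shape for illustration.
type Frame = { content: string };

// Local stub mirroring client.memory.context(...) and
// client.chat.complete(...); swap in the real Switchy client in practice.
const stub = {
  memory: {
    async context(
      _ns: string,
      _opts: { query: string; limit: number },
    ): Promise<{ frames: Frame[] }> {
      return { frames: [{ content: 'User prefers dark mode and TypeScript' }] };
    },
  },
  chat: {
    async complete(opts: {
      model: string;
      message: string;
    }): Promise<{ message: { content: string } }> {
      // The stub just echoes the prompt it was given.
      return { message: { content: `echo: ${opts.message}` } };
    },
  },
};

// Retrieve relevant frames, prepend them as context, then complete.
async function answerWithContext(question: string): Promise<string> {
  const { frames } = await stub.memory.context('my-project', {
    query: question,
    limit: 5,
  });
  const context = frames.map((f) => `- ${f.content}`).join('\n');
  const res = await stub.chat.complete({
    model: 'openai/gpt-5',
    message: `Context:\n${context}\n\nQuestion: ${question}`,
  });
  return res.message.content;
}
```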
## License

MIT