# mcp-server

Model Context Protocol (MCP) server for AI integration with the Echoes storytelling platform.
## Installation

The server is distributed as an npm package and can be used without cloning the repository.
### Using with MCP Clients

Add the server to your MCP client configuration (e.g., `~/.config/q/mcp.json` for Amazon Q):
```json
{
  "mcpServers": {
    "echoes": {
      "command": "npx",
      "args": ["-y", "@echoes-io/mcp-server"],
      "env": {
        "ECHOES_TIMELINE": "your-timeline-name"
      }
    }
  }
}
```

Or install globally:

```shell
npm install -g @echoes-io/mcp-server
```

Then configure:
```json
{
  "mcpServers": {
    "echoes": {
      "command": "echoes-mcp-server",
      "env": {
        "ECHOES_TIMELINE": "your-timeline-name",
        "ECHOES_RAG_PROVIDER": "e5-small",
        "ECHOES_RAG_DB_PATH": "./rag_data.db"
      }
    }
  }
}
```

**Important:** The `ECHOES_TIMELINE` environment variable must be set to specify which timeline to work with. All tools operate on this timeline.
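Since a missing `ECHOES_TIMELINE` makes every tool unusable, a server would typically fail fast at startup. A minimal sketch of that check (the function name is illustrative, not the package's actual API):

```typescript
// Sketch only: validate the required timeline variable before the
// MCP server starts serving tool calls.
function resolveTimeline(env: Record<string, string | undefined>): string {
  const timeline = env.ECHOES_TIMELINE;
  if (!timeline || timeline.trim() === "") {
    throw new Error("ECHOES_TIMELINE environment variable must be set");
  }
  return timeline;
}
```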
**Optional RAG Configuration:**

- `ECHOES_RAG_PROVIDER`: Embedding provider (`e5-small`, `e5-large`, or `gemini`). Default: `e5-small`
- `ECHOES_GEMINI_API_KEY`: Required if using the `gemini` provider
- `ECHOES_RAG_DB_PATH`: SQLite database path. Default: `./rag_data.db`
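The defaults and the gemini key requirement above can be captured in a small resolver. This is a hedged sketch, assuming the documented defaults; the type and function names are illustrative, not the package's real API:

```typescript
// Sketch only: resolve optional RAG settings with documented defaults.
type RagProvider = "e5-small" | "e5-large" | "gemini";

interface RagConfig {
  provider: RagProvider;
  dbPath: string;
  geminiApiKey?: string;
}

function resolveRagConfig(env: Record<string, string | undefined>): RagConfig {
  const provider = (env.ECHOES_RAG_PROVIDER ?? "e5-small") as RagProvider;
  if (provider === "gemini" && !env.ECHOES_GEMINI_API_KEY) {
    // The gemini provider calls a hosted API and needs a key.
    throw new Error("ECHOES_GEMINI_API_KEY is required for the gemini provider");
  }
  return {
    provider,
    dbPath: env.ECHOES_RAG_DB_PATH ?? "./rag_data.db",
    geminiApiKey: env.ECHOES_GEMINI_API_KEY,
  };
}
```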
## Available Tools

All tools operate on the timeline specified by the `ECHOES_TIMELINE` environment variable.

### Content Operations
- `words-count` - Count words and text statistics in markdown files
  - Input: `file` (path to markdown file)
- `chapter-info` - Extract chapter metadata from the database
  - Input: `arc`, `episode`, `chapter`
- `chapter-refresh` - Refresh chapter metadata and word counts from file
  - Input: `file` (path to chapter file)
- `chapter-insert` - Insert a new chapter with automatic renumbering
  - Input: `arc`, `episode`, `after`, `pov`, `title`; optional: `excerpt`, `location`, `outfit`, `kink`, `file`
- `chapter-delete` - Delete a chapter from the database and optionally from the filesystem
  - Input: `arc`, `episode`, `chapter`; optional: `file` (to delete from the filesystem)
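As background for `words-count`, here is one plausible way to count words in markdown, skipping punctuation-only tokens and heading markers. This is an illustration only; the package's actual counting rules may differ:

```typescript
// Illustrative word counter: a token counts as a word if it contains
// at least one letter or digit, so "#" and "---" are skipped.
function countWords(markdown: string): number {
  return markdown
    .split(/\s+/)
    .filter((token) => /[\p{L}\p{N}]/u.test(token))
    .length;
}
```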
### Episode Operations

- `episode-info` - Get episode information and list of chapters
  - Input: `arc`, `episode`
- `episode-update` - Update episode metadata (description, title, slug)
  - Input: `arc`, `episode`; optional: `description`, `title`, `slug`
### Timeline Operations

- `timeline-sync` - Synchronize filesystem content with the database
  - Input: `contentPath` (path to content directory)
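A sync like this has to map file paths back to arc/episode/chapter coordinates. A sketch of that mapping, under an assumed `arc/episode-NN/chapter-NN.md` layout (the real directory convention is not documented here):

```typescript
// Hypothetical: parse a chapter path relative to contentPath.
// The layout arc/episode-NN/chapter-NN.md is an assumption for
// illustration, not the platform's documented structure.
interface ChapterRef {
  arc: string;
  episode: number;
  chapter: number;
}

function parseChapterPath(relativePath: string): ChapterRef | null {
  const match = relativePath.match(/^([^/]+)\/episode-(\d+)\/chapter-(\d+)\.md$/);
  if (!match) return null;
  return { arc: match[1], episode: Number(match[2]), chapter: Number(match[3]) };
}
```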
### Statistics

- `stats` - Get aggregate statistics with optional filters
  - Input: optional: `arc`, `episode`, `pov`
  - Output: Total words/chapters, POV distribution, arc/episode breakdown, longest/shortest chapters
  - Examples:
    - No filters: overall timeline statistics
    - `arc: "arc1"`: statistics for a specific arc
    - `arc: "arc1", episode: 1`: statistics for a specific episode
    - `pov: "Alice"`: statistics for a specific POV across the timeline
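To make the "POV distribution" output concrete, here is an illustrative aggregation over chapter records, counting chapters per POV (the real computation lives inside the server and may also aggregate word counts):

```typescript
// Illustrative only: count how many chapters each POV character has.
interface ChapterRow {
  pov: string;
  words: number;
}

function povDistribution(chapters: ChapterRow[]): Record<string, number> {
  const dist: Record<string, number> = {};
  for (const ch of chapters) {
    dist[ch.pov] = (dist[ch.pov] ?? 0) + 1;
  }
  return dist;
}
```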
### RAG (Semantic Search)

- `rag-index` - Index chapters into the vector database for semantic search
  - Input: `contentPath` (path to content directory, required for full content indexing); optional: `arc`, `episode` (to index specific content)
  - Output: Number of chapters indexed
  - Note: Requires `contentPath` to read and index actual chapter content. Without it, only metadata is indexed.
- `rag-search` - Semantic search across timeline content
  - Input: `query`; optional: `arc`, `pov`, `maxResults`
  - Output: Relevant chapters with similarity scores and previews
- `rag-context` - Retrieve relevant context for AI interactions
  - Input: `query`; optional: `arc`, `pov`, `maxChapters`
  - Output: Full chapter content for AI context
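The "similarity scores" returned by `rag-search` come from comparing embedding vectors. Cosine similarity is the standard metric for this; whether this package uses exactly this formula is an assumption:

```typescript
// Background sketch: cosine similarity between two embedding vectors.
// Returns 1 for identical directions, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length || a.length === 0) {
    throw new Error("vectors must be non-empty and the same length");
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```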
### Book Generation

- `book-generate` - Generate a PDF book from timeline content using LaTeX
  - Input: `contentPath`, `outputPath`; optional: `episodes`, `format`
  - Output: PDF book with the Victoria Regia template
  - Formats: `a4` (default), `a5`
  - Requirements: pandoc and a LaTeX distribution (pdflatex/xelatex/lualatex)
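A pandoc-plus-LaTeX pipeline like this is typically driven by an argument list. A hedged sketch of assembling one: `--pdf-engine` and `-V` are standard pandoc options, but how this tool actually wires the template and paper size is an assumption:

```typescript
// Hypothetical: build pandoc arguments for a PDF book build.
function pandocArgs(
  chapterFiles: string[],
  outputPath: string,
  format: "a4" | "a5" = "a4",
): string[] {
  return [
    ...chapterFiles,
    "-o", outputPath,
    "--pdf-engine=xelatex",      // any of pdflatex/xelatex/lualatex would work
    "-V", `papersize=${format}`, // variable forwarded to the LaTeX template
  ];
}
```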
## Development

### Scripts

```shell
# Run tests
npm test

# Run tests with coverage
npm run test:coverage

# Build
npm run build

# Lint
npm run lint

# Fix linting issues
npm run lint:fix
```

### Tech Stack
- Language: TypeScript (strict mode)
- Testing: Vitest (97%+ coverage)
- Linting: Biome
- Build: TypeScript compiler
## Architecture

- MCP Protocol: Standard Model Context Protocol implementation
- Database: SQLite via `@echoes-io/tracker` (singleton pattern)
- Validation: Zod schemas for type-safe inputs
- Testing: Comprehensive unit and integration tests
- Environment: Uses the `ECHOES_TIMELINE` env var for timeline context
## License
MIT
Part of the Echoes project - a multi-POV digital storytelling platform.