Task Engine AI 
By @eyaltoledano & @RalphEcom
A task management system for AI-driven development with Claude, designed to work seamlessly with Cursor AI.
NEW: IDE Bridge - Zero External API Dependencies!
Task Engine AI can now run completely independently, using your IDE's built-in AI agent!
- No external API keys required
- Zero token costs
- Better performance (local communication)
- Enhanced security (no data leaves your machine)
- Works with Cursor, VS Code, and Windsurf
Quick IDE Setup (3 commands)
# 1. Check if you're ready
task-engine migrate-to-ide --check-prereqs
# 2. Migrate to IDE-based AI
task-engine migrate-to-ide
# 3. Test it works
task-engine generate-text "Hello from my IDE!"
Full IDE Bridge Guide | Advanced Configuration
Requirements (Traditional Setup)
Note: With the new IDE Bridge, you can skip external API keys entirely! See the IDE setup above.
Task Engine AI uses AI across several commands, and those commands require an API key. You can use models from a variety of AI providers, provided you add the corresponding API keys. For example, if you want to use Claude 3.7, you'll need an Anthropic API key.
You can define three models: the main model, the research model, and the fallback model (used if the main or research model fails). Whichever models you use, their provider API keys must be present in either mcp.json or .env.
At least one (1) of the following is required:
- Anthropic API key (Claude API)
- OpenAI API key
- Google Gemini API key
- Perplexity API key (for research model)
- xAI API Key (for research or main model)
- OpenRouter API Key (for research or main model)
Using the research model is optional but highly recommended. You will need at least ONE API key. Adding all API keys enables you to seamlessly switch between model providers at will.
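If you go the .env route, a minimal file could look like the sketch below. The variable names are the same ones used in the MCP configuration further down; keep only the providers you actually use.

```bash
# .env (placeholders only: replace with real keys and delete unused lines)
ANTHROPIC_API_KEY=YOUR_ANTHROPIC_API_KEY_HERE
PERPLEXITY_API_KEY=YOUR_PERPLEXITY_API_KEY_HERE
OPENAI_API_KEY=YOUR_OPENAI_KEY_HERE
GOOGLE_API_KEY=YOUR_GOOGLE_KEY_HERE
XAI_API_KEY=YOUR_XAI_KEY_HERE
OPENROUTER_API_KEY=YOUR_OPENROUTER_KEY_HERE
```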
Quick Start
Option 1: MCP (Recommended)
MCP (Model Context Protocol) lets you run Task Engine AI directly from your editor.
Real IDE Integration (New!)
Task Engine AI now supports real connections to your IDE's built-in AI agent instead of external APIs:
# Automatically set up MCP with real IDE integration
npm run setup:mcp-ide
This provides:
- Zero API costs when using your IDE's AI
- Consistent behavior matching your IDE settings
- Automatic fallback to external APIs if needed
- Cross-IDE compatibility (Cursor, VS Code, Windsurf)
Complete MCP IDE Integration Guide
1. Add your MCP config at the following path depending on your editor
| Editor | Scope | Linux/macOS Path | Windows Path | Key |
|---|---|---|---|---|
| Cursor | Global | ~/.cursor/mcp.json | %USERPROFILE%\.cursor\mcp.json | mcpServers |
| Cursor | Project | <project_folder>/.cursor/mcp.json | <project_folder>\.cursor\mcp.json | mcpServers |
| Windsurf | Global | ~/.codeium/windsurf/mcp_config.json | %USERPROFILE%\.codeium\windsurf\mcp_config.json | mcpServers |
| VS Code | Project | <project_folder>/.vscode/mcp.json | <project_folder>\.vscode\mcp.json | servers |
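For example, to create the global Cursor config on Linux/macOS, you could run the commands below (a sketch only; adjust the path from the table above for your editor and operating system):

```bash
# Create the global Cursor MCP config file (Linux/macOS)
mkdir -p ~/.cursor
touch ~/.cursor/mcp.json   # then paste one of the JSON snippets below into this file
```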
Cursor & Windsurf (mcpServers)
{
  "mcpServers": {
    "task-engine-ai": {
      "command": "npx",
      "args": ["-y", "--package=task-engine-ai-core", "task-engine-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_API_KEY_HERE",
        "PERPLEXITY_API_KEY": "YOUR_PERPLEXITY_API_KEY_HERE",
        "OPENAI_API_KEY": "YOUR_OPENAI_KEY_HERE",
        "GOOGLE_API_KEY": "YOUR_GOOGLE_KEY_HERE",
        "MISTRAL_API_KEY": "YOUR_MISTRAL_KEY_HERE",
        "OPENROUTER_API_KEY": "YOUR_OPENROUTER_KEY_HERE",
        "XAI_API_KEY": "YOUR_XAI_KEY_HERE",
        "AZURE_OPENAI_API_KEY": "YOUR_AZURE_KEY_HERE",
        "OLLAMA_API_KEY": "YOUR_OLLAMA_API_KEY_HERE"
      }
    }
  }
}
Replace YOUR_…_KEY_HERE with your real API keys. You can remove keys you don't use.
VS Code (servers + type)
{
  "servers": {
    "task-engine-ai": {
      "command": "npx",
      "args": ["-y", "--package=task-engine-ai-core", "task-engine-ai"],
      "env": {
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_API_KEY_HERE",
        "PERPLEXITY_API_KEY": "YOUR_PERPLEXITY_API_KEY_HERE",
        "OPENAI_API_KEY": "YOUR_OPENAI_KEY_HERE",
        "GOOGLE_API_KEY": "YOUR_GOOGLE_KEY_HERE",
        "MISTRAL_API_KEY": "YOUR_MISTRAL_KEY_HERE",
        "OPENROUTER_API_KEY": "YOUR_OPENROUTER_KEY_HERE",
        "XAI_API_KEY": "YOUR_XAI_KEY_HERE",
        "AZURE_OPENAI_API_KEY": "YOUR_AZURE_KEY_HERE"
      },
      "type": "stdio"
    }
  }
}
Replace YOUR_…_KEY_HERE with your real API keys. You can remove keys you don't use.
2. (Cursor-only) Enable Task Engine AI MCP
Open Cursor Settings (Ctrl+Shift+J) → click the MCP tab on the left → enable task-engine-ai with the toggle
3. (Optional) Configure the models you want to use
In your editor's AI chat pane, say:
Change the main, research and fallback models to <model_name>, <model_name> and <model_name> respectively.
4. Initialize Task Engine AI
In your editor's AI chat pane, say:
Initialize task-engine-ai in my project
5. Make sure you have a PRD (Recommended)
For new projects: Create your PRD at .task-engine/docs/prd.txt
For existing projects: You can use scripts/prd.txt or migrate with task-engine migrate
An example PRD template is available after initialization in .task-engine/templates/example_prd.txt.
[!NOTE] While a PRD is recommended for complex projects, you can always create individual tasks by asking "Can you help me implement [description of what you want to do]?" in chat.
Always start with a detailed PRD.
The more detailed your PRD, the better the generated tasks will be.
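For illustration only (this is a rough sketch of the level of detail that helps, not the bundled example_prd.txt template), a PRD might be organized like this:

```text
# Overview
One paragraph on the product, the problem it solves, and who it is for.

# Core Features
- Feature 1: what it does, why it matters, how it works at a high level
- Feature 2: ...

# Technical Considerations
Stack, major components, data models, external integrations.

# Roadmap / Milestones
What to build first, second, and later, with rough scope per phase.
```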
6. Common Commands
Use your AI assistant to:
- Parse requirements: Can you parse my PRD at scripts/prd.txt?
- Plan next step: What's the next task I should work on?
- Implement a task: Can you help me implement task 3?
- Expand a task: Can you help me expand task 4?
Dual-Mode Task Creation
Task Engine AI supports two modes for creating tasks via MCP:
AI-Powered Mode (requires external APIs):
Create a task for implementing user authentication with secure session management
Manual/Agentic Mode (works through an agentic instance):
Add a task with title "Implement user authentication", description "Add login/logout functionality", priority "high", and dependencies "1,2,3"
Benefits of Manual Mode:
- No External Dependencies: Works without API keys
- Immediate Response: No waiting for AI processing
- Perfect for AI Agents: Ideal for agentic workflows
- Cost-Free: No API usage costs
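As a purely hypothetical sketch (the on-disk task format is not documented here and may differ; the id and status fields are assumptions for illustration), the manual prompt above might produce a task entry along these lines:

```json
{
  "id": 4,
  "title": "Implement user authentication",
  "description": "Add login/logout functionality",
  "priority": "high",
  "dependencies": [1, 2, 3],
  "status": "pending"
}
```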
More examples on how to use Task Engine AI in chat
Option 2: Using Command Line
Installation
# Install globally
npm install -g task-engine-ai-core
# OR install locally within your project
npm install task-engine-ai-core
Initialize a new project
# If installed globally
task-engine init
# If installed locally
npx task-engine-ai-core task-engine init
This will prompt you for project details and set up a new project with the necessary files and structure.
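Based on the paths referenced elsewhere in this README, the initialized project should contain something like the following (exact layout may vary by version):

```text
your-project/
├── .task-engine/
│   ├── docs/
│   │   └── prd.txt              # recommended location for your PRD
│   └── templates/
│       └── example_prd.txt      # example PRD template created on init
└── ... (your existing project files)
```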
Common Commands
# Initialize a new project
task-engine init
# Parse a PRD and generate tasks
task-engine parse-prd your-prd.txt
# List all tasks
task-engine list
# Show the next task to work on
task-engine next
# Generate task files
task-engine generate
Documentation
For more detailed information, check out the documentation in the docs directory:
Real IDE Integration Documentation
- MCP IDE Integration Guide - Complete guide for real IDE connections via MCP
- Real IDE Integration - Technical details and architecture
- IDE Bridge Quick Start - Get started with IDE-based AI in 3 steps
- IDE Bridge Advanced Guide - Complete setup, configuration, and troubleshooting
General Documentation
- Configuration Guide - Set up environment variables and customize Task Engine AI
- Tutorial - Step-by-step guide to getting started with Task Engine AI
- Command Reference - Complete list of all available commands
- Task Structure - Understanding the task format and features
- Example Interactions - Common Cursor AI interaction examples
- Migration Guide - Guide to migrating to the new project structure
Troubleshooting
If task-engine init doesn't respond:
Try running it with Node directly:
node node_modules/task-engine-ai-core/scripts/init.js
Or clone the repository and run:
git clone https://github.com/eyaltoledano/claude-task-master.git
cd claude-task-master
node scripts/init.js
Contributors
Star History
Licensing
Task Engine AI is licensed under the MIT License with Commons Clause. This means you can:
Allowed:
- Use Task Engine AI for any purpose (personal, commercial, academic)
- Modify the code
- Distribute copies
- Create and sell products built using Task Engine AI
Not Allowed:
- Sell Task Engine AI itself
- Offer Task Engine AI as a hosted service
- Create competing products based on Task Engine AI
See the LICENSE file for the complete license text and further licensing details.