MindLoc CLI
AI-powered terminal application for chatting and coding using local models via Ollama.
Note: This package was previously published as `localmind`. I've renamed it to `mindloc` after becoming aware of another application with the same name.
Important: MindLoc is a CLI tool and must be installed globally with the `-g` flag.
Installation
```
npm install -g mindloc-cli
```
After installation, run from any terminal:
```
mindloc
```
Prerequisites
- Ollama installed and running locally
Features
Dual Mode System
- Chat Mode - Conversational AI with web search and deep research capabilities
- Code Mode - Secure project-scoped development environment with intelligent coding assistance
Intelligent Context Management
- Automatic context condensing at 80% capacity
- Preserves recent messages and system prompts
- Condenses older conversation into concise summaries
- Per-model conversation memory that persists across sessions
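The condensing behavior described above can be sketched roughly as follows. This is an illustration of the general technique, not MindLoc's actual implementation; the 80% threshold comes from this README, while the function names, message shape, and "keep the last four messages" policy are assumptions:

```javascript
// Sketch: once estimated token usage crosses 80% of the context window,
// replace older turns with a summary while keeping the system prompt
// and the most recent messages intact. `summarize` stands in for an
// LLM call that compresses the older turns.
function condenseContext(messages, contextLimit, estimateTokens, summarize) {
  const used = messages.reduce((sum, m) => sum + estimateTokens(m.content), 0);
  if (used < 0.8 * contextLimit) return messages; // under threshold: no-op

  const system = messages.filter((m) => m.role === 'system');
  const rest = messages.filter((m) => m.role !== 'system');
  const recent = rest.slice(-4);     // preserve recent messages verbatim
  const older = rest.slice(0, -4);   // condense everything earlier

  const summary = {
    role: 'system',
    content: `Summary of earlier conversation: ${summarize(older)}`,
  };
  return [...system, summary, ...recent];
}
```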
Auto Code Mode
- AI can automatically execute actions with intelligent approval system
- Automatic action detection from AI responses
- Smart approval workflow (immediate, pre-approval, or post-approval based on action type)
- Automatic rollback on rejection
Hybrid RAG (Retrieval-Augmented Generation)
- Combine local knowledge base with web search
- Store code snippets and documentation locally
- Search and retrieve relevant information automatically
- Available in both chat and code modes
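A hybrid retrieval step like the one described could look roughly like this. This is a minimal sketch of the idea only; the keyword scoring, the local-result bonus, and all names here are hypothetical, not MindLoc's code:

```javascript
// Sketch: merge local knowledge-base snippets with web search results
// for a query, giving matching local snippets a small ranking bonus.
function hybridRetrieve(query, localSnippets, webResults, limit = 5) {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const score = (text) =>
    terms.filter((t) => text.toLowerCase().includes(t)).length;

  const local = localSnippets
    .map((s) => ({ source: 'local', text: s, score: score(s) }))
    .filter((r) => r.score > 0)
    .map((r) => ({ ...r, score: r.score + 1 })); // small bonus for local hits
  const web = webResults
    .map((w) => ({ source: 'web', text: w, score: score(w) }))
    .filter((r) => r.score > 0);

  return [...local, ...web].sort((a, b) => b.score - a.score).slice(0, limit);
}
```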
Advanced Features
- Syntax Highlighting - Beautiful code blocks with language detection
- Live Diff Viewer - Side-by-side comparison of file changes with color-coded additions/deletions
- Session Dependency Graph - Track and visualize session provenance
- Thinking Mode - See model's reasoning process in real-time (for supported models)
- Web Search - Integrated DuckDuckGo search for up-to-date information
- Resource Monitoring - Real-time CPU, RAM, GPU, and VRAM usage display
- Interactive Settings - Configure all features without editing config files
- Theme Support - 5 built-in themes (Default, Ocean, Forest, Sunset, Monochrome)
Security
- Mandatory Project Directory - All file operations scoped to specified directory in code mode
- Path Validation - Prevents AI from accessing files outside project scope
- Command Context - Shell commands execute within project directory
Quick Start
Interactive Mode
```
mindloc
```
Select your mode (Chat or Code) and model, then start chatting!
Direct Mode Launch
```
# Start chat mode
mindloc chat

# Start code mode
mindloc code

# Use specific model
mindloc chat -m llama3.2
mindloc code -m codellama

# Quick question
mindloc ask "How do I implement a binary search in Python?"

# List available models
mindloc models

# Get help
mindloc help
mindloc help chat
```
Chat Mode Commands
- `/settings` - Configure MindLoc settings
- `/think [off]` - Toggle/configure thinking mode (for supported models)
- `/search <query>` - Web search
- `/research <topic>` - Deep research with multiple searches
- `/rag add` - Add snippet to local knowledge base
- `/rag list` - List all stored snippets
- `/rag search <query>` - Search local knowledge
- `/graph` - Show session dependency graph
- `/why` - Explain AI's reasoning for last answer
- `/save [title]` - Save conversation
- `/load` - Load previous conversation
- `/new` - Start new conversation
- `/ollama <model>` - Pull a model from Ollama
- `/unload` - Unload current model and switch
- `/exit` - Exit chat mode
Code Mode Commands
- `/settings` - Configure MindLoc settings
- `/think [off]` - Toggle/configure thinking mode (for supported models)
- `/file <path>` - Load file into context
- `/write <path>` - Write AI response to file
- `/exec <command>` - Execute shell command
- `/delete <path>` - Delete a file
- `/project [path]` - View/set project directory
- `/context` - Show context usage
- `/rag [add|list|search]` - Knowledge base operations
- `/graph` - Show session dependency graph
- `/save [title]` - Save session
- `/load` - Load session
- `/new` - New session
- `/exit` - Exit code mode
Configuration
MindLoc stores data in `~/.mindloc/`:
- `config.json` - Settings
- `memory/` - Conversation history
- `rag/` - Local knowledge base
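For illustration, a `config.json` covering the settings listed below might look something like this (the field names and values here are hypothetical; check the file MindLoc actually writes):

```json
{
  "theme": "Default",
  "autoCodeMode": false,
  "hybridRag": true,
  "liveDiffViewer": true,
  "sessionGraph": true,
  "globalContextLength": 8192
}
```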
Settings
Access settings with the `/settings` command in any mode:
- Auto Code Mode - Enable/disable automatic action execution
- Hybrid RAG - Enable/disable local knowledge base
- Live Diff Viewer - Show file changes before approval
- Session Graph - Track session dependencies
- Theme - Choose from 5 built-in themes
- Global Context Length - Set default context window for all models
Supported Models
MindLoc works with any Ollama model. Popular choices:
- Chat: llama3.2, mistral, gemma, qwen
- Code: codellama, deepseek-coder, starcoder
- Thinking: qwen3, deepseek-r1, glm-4.7-flash, gpt-oss
Install models with:
```
ollama pull llama3.2
ollama pull codellama
```
Thinking Mode
Supported models with thinking capabilities:
- qwen3, deepseek-r1, deepseek-v3.1
- gpt-oss, gpt-oss-safeguard
- glm-4.7-flash, qwen3-vl, qwen3-next
- nemotron-3-nano, magistral
Thinking mode shows the model's reasoning process in real-time before providing the final answer.
Uninstall
```
npm uninstall -g mindloc-cli
```
License
MIT