JSPM

webdocs-mcp-server

1.0.0
License: MIT

MCP server for fetching real-time documentation from Langchain, Llama-Index, OpenAI, UV, and Qubrid using web search

Package Exports

  • webdocs-mcp-server
  • webdocs-mcp-server/index.js

This package does not declare an exports field, so the exports above have been automatically detected and optimized by JSPM instead. If any package subpath is missing, it is recommended to post an issue to the original package (webdocs-mcp-server) to support the "exports" field. If that is not possible, create a JSPM override to customize the exports field for this package.

Readme

📚 WebDocs MCP Server

AI-Powered Documentation Search & Retrieval System
A Model Context Protocol (MCP) server that fetches real-time documentation from popular libraries and frameworks using web search.

🌟 Features

  • 🔍 Real-time Documentation Search - Fetches the latest docs from official sources
  • 🚀 Multiple Library Support - Langchain, Llama-Index, OpenAI, UV, and Qubrid
  • 🧹 Clean Content Extraction - Removes HTML noise and returns readable text
  • 🔗 Source Attribution - Every piece of content includes its source URL
  • ⚡ Fast & Async - Built with async/await for optimal performance
  • 🤖 MCP Compatible - Works seamlessly with Claude Desktop and other MCP clients

Claude Desktop Usage Demo Screenshots

(Screenshots: MCP Server Added to Claude · Tool Detected Successfully · Got the Response · MCP Tool Inspector · Remote Server with Streamable HTTP)


📦 Supported Libraries

Library          Documentation Site
🦜 Langchain     python.langchain.com/docs
🦙 Llama-Index   docs.llamaindex.ai/en/stable
🤖 OpenAI        platform.openai.com/docs
📦 UV            docs.astral.sh/uv
🎯 Qubrid        docs.qubrid.com

🛠️ Installation

The easiest way to use this MCP server is via npx:

npx -y webdocs-mcp-server

This requires Node.js (which provides npx).

Manual Installation (For Development)

Prerequisites

  • Python 3.10 or higher
  • UV package manager

Setup

  1. Clone the repository

    git clone <your-repo-url>
    cd Docu_MCP
  2. Install dependencies

    uv sync
  3. Set up environment variables

    Create a .env file in the project root:

    SERPER_API_KEY=your_serper_api_key_here
    GROQ_API_KEY=your_groq_api_key_here
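
    The server is expected to read these values at startup. A minimal sketch of that loading step, assuming python-dotenv is used (exporting the variables in your shell works just as well):

    import os

    from dotenv import load_dotenv

    load_dotenv()  # read .env from the project root

    SERPER_API_KEY = os.getenv("SERPER_API_KEY")  # required for web search
    GROQ_API_KEY = os.getenv("GROQ_API_KEY")      # optional, reserved for future LLM features

    if not SERPER_API_KEY:
        raise RuntimeError("SERPER_API_KEY is missing - create the .env file described above")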

🔑 API Keys Setup

1. Serper API Key (Required)

The Serper API is used for web search functionality.

  1. Visit serper.dev
  2. Sign up for a free account
  3. Navigate to your dashboard
  4. Copy your API key
  5. Add it to your .env file

2. Groq API Key (Optional)

Currently reserved for future LLM integration features.

  1. Visit console.groq.com
  2. Sign up and get your API key
  3. Add it to your .env file

🚀 Usage

Option 1: Using the MCP Client (Python)

Test the server directly with the included client:

uv run mcp_client.py

Example output:

Available Tools: ['get_docs']
[Documentation content with sources...]

You can customize the query in mcp_client.py:

query = "How to setup ComfyUI AI ML Template?"
library = "qubrid"
res = await session.call_tool(
    "get_docs",
    arguments={"user_query": query, "library": library}
)
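
The snippet above assumes an already-initialized ClientSession. For reference, a minimal end-to-end client using the MCP Python SDK's stdio transport looks roughly like this (the actual mcp_client.py may differ in details):

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server as a stdio subprocess, the same way "uv run" would
    server = StdioServerParameters(command="uv", args=["run", "mcp_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available Tools:", [tool.name for tool in tools.tools])

            res = await session.call_tool(
                "get_docs",
                arguments={"user_query": "How to setup ComfyUI AI ML Template?", "library": "qubrid"},
            )
            print(res.content)

asyncio.run(main())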

Option 2: Using Claude Desktop

Step 1: Configure Claude Desktop

Open your Claude Desktop config file:

  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

🚀 Recommended: NPM Installation (Remote)

Add this configuration to use the published npm package:

{
  "mcpServers": {
    "WebDocs": {
      "command": "npx",
      "args": ["-y", "webdocs-mcp-server"],
      "env": {
        "SERPER_API_KEY": "your_serper_api_key_here",
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}

💻 Alternative: Local Development

If you're developing locally, use this configuration:

{
  "mcpServers": {
    "WebDocs": {
      "command": "C:\\Users\\YOUR_USERNAME\\.local\\bin\\uv.EXE",
      "args": ["--directory", "D:\\Path\\To\\Docu_MCP", "run", "mcp_server.py"],
      "env": {
        "SERPER_API_KEY": "your_serper_api_key_here",
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}

โš ๏ธ Important:

  • For NPM method: Just add your API keys
  • For local method: Replace YOUR_USERNAME and D:\\Path\\To\\Docu_MCP with your actual paths!

Step 2: Restart Claude Desktop

  1. Completely close Claude Desktop
  2. Reopen Claude Desktop
  3. The WebDocs tool should now be available

Step 3: Use the Tool

In Claude Desktop, you can ask:

"Use the WebDocs tool to search for how to create a ReAct agent in Langchain"

Claude will automatically call the get_docs tool with the appropriate parameters.

Option 3: Using MCP Inspector (Debugging)

The MCP Inspector is a powerful tool for testing and debugging your MCP server.

Start the Inspector

npx @modelcontextprotocol/inspector uv --directory "D:\AbhiiiMan Codes\Docu_MCP" run mcp_server.py

This will:

  1. Start the MCP server
  2. Launch a web interface at http://localhost:5173
  3. Allow you to interactively test the server

Using the Inspector

  1. Open your browser to the provided URL
  2. Click "Connect" to establish connection
  3. Navigate to "Tools" tab
  4. Select get_docs tool
  5. Fill in the parameters:
    • user_query: Your search query
    • library: One of langchain, llama-index, openai, uv, or qubrid
  6. Click "Run Tool" to test

Troubleshooting

If port is already in use:

# Find and kill the process using port 6277
netstat -ano | findstr :6277
taskkill /PID <process_id> /F

🔧 Available Tools

get_docs

Searches and retrieves documentation content from supported libraries.

Parameters:

  • user_query (string, required): The search query
    • Example: "How to use Langchain with OpenAI"
  • library (string, required): The library to search
    • Options: langchain, llama-index, openai, uv, qubrid

Returns:

  • Text content from documentation with source URLs

Example:

await session.call_tool(
    "get_docs",
    arguments={
        "user_query": "vector store integration",
        "library": "langchain"
    }
)
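
On the server side, a tool like this is registered through FastMCP. A rough sketch of how a tool with this signature is typically declared (the body is a placeholder, not the project's real implementation):

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("WebDocs")

@mcp.tool()
async def get_docs(user_query: str, library: str) -> str:
    """Search the chosen library's documentation and return cleaned text with source URLs."""
    # Placeholder: the real server searches via Serper, fetches the result pages,
    # and cleans them - see "How It Works" below.
    ...

if __name__ == "__main__":
    mcp.run(transport="stdio")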

๐Ÿ“ Project Structure

Docu_MCP/
├── 📄 mcp_server.py          # Main MCP server implementation
├── 📄 mcp_client.py          # Test client for the server
├── 📄 constants.py           # Configuration and constants
├── 📄 utils.py               # Utility functions (HTML cleaning)
├── 📄 test_server.py         # Server launch test script
├── 📄 .env                   # Environment variables (create this)
├── 📄 pyproject.toml         # Project dependencies
└── 📄 README.md              # This file

๐Ÿ” How It Works

graph LR
    A[User Query] --> B[MCP Server]
    B --> C[Serper API]
    C --> D[Google Search]
    D --> E[Documentation URLs]
    E --> F[Fetch Content]
    F --> G[Clean HTML]
    G --> H[Return Text]
    H --> I[User/Claude]

  1. Query Construction: Combines library domain with user query
  2. Web Search: Uses Serper API to search Google
  3. Content Fetching: Retrieves raw HTML from documentation pages
  4. Content Cleaning: Extracts readable text using Trafilatura
  5. Response Formation: Formats content with source attribution
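
A condensed sketch of that pipeline, assuming httpx for fetching and Trafilatura for cleaning (function and variable names here are illustrative, not the project's actual internals):

import os

import httpx
import trafilatura

DOCS_SITES = {"langchain": "python.langchain.com/docs"}  # illustrative subset of the supported libraries

async def search_and_clean(user_query: str, library: str, max_results: int = 2) -> str:
    async with httpx.AsyncClient(timeout=30.0) as client:
        # Steps 1-2: combine the library's docs domain with the query and search via Serper
        search = await client.post(
            "https://google.serper.dev/search",
            headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
            json={"q": f"site:{DOCS_SITES[library]} {user_query}"},
        )
        results = search.json().get("organic", [])[:max_results]

        # Steps 3-5: fetch each result, strip the HTML down to readable text, keep the source URL
        chunks = []
        for hit in results:
            page = await client.get(hit["link"], follow_redirects=True)
            text = trafilatura.extract(page.text) or ""
            chunks.append(f"Source: {hit['link']}\n{text}")
    return "\n\n".join(chunks)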

๐Ÿ› Debugging & Testing

Test Server Launch

Run the test script to verify configuration:

uv run test_server.py

Expected output:

✅ Server process started successfully!
✅ Server is running and accepting connections!
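
The script's job, per the project structure above, is to verify that the server launches cleanly. A rough sketch of that idea (not the exact contents of test_server.py):

import subprocess
import time

# Start the MCP server the same way the client would
proc = subprocess.Popen(["uv", "run", "mcp_server.py"])
print("✅ Server process started successfully!")

time.sleep(3)  # give it a moment to fail if something is misconfigured (e.g. missing API keys)
if proc.poll() is None:
    print("✅ Server is running and accepting connections!")
    proc.terminate()
else:
    print("🔴 Server exited early - check your .env and dependencies")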

Common Issues

Issue                          Solution
🔴 PORT IS IN USE              Kill process on port 6277 or use different port
🔴 SERPER_API_KEY not found    Check .env file exists and contains valid key
🔴 program not found           Use full path to uv.EXE in Claude config

Enable Debug Logging

Add to your .env:

LOG_LEVEL=DEBUG
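
Assuming the server routes this through Python's standard logging module (an assumption, not confirmed by the project), the effect is equivalent to:

import logging
import os

logging.basicConfig(
    level=os.getenv("LOG_LEVEL", "INFO").upper(),
    format="%(asctime)s %(levelname)s %(name)s - %(message)s",
)

logging.getLogger("webdocs").debug("Debug logging enabled")  # "webdocs" is an illustrative logger name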

🎯 Example Use Cases

1. Learning New Framework

Query: "Getting started with vector stores"
Library: "langchain"
→ Returns: Setup guides, installation steps, basic examples

2. Troubleshooting

Query: "Error handling in async queries"
Library: "llama-index"
→ Returns: Error handling patterns, best practices

3. API Reference

Query: "Chat completion parameters"
Library: "openai"
→ Returns: Parameter documentation, examples, limits

4. Tool Setup

Query: "Installing UV on Windows"
Library: "uv"
→ Returns: Installation guide, configuration steps

๐Ÿค Contributing

Contributions are welcome! Here's how you can help:

  1. Add More Libraries: Update constants.py with new documentation sources (see the sketch after this list)
  2. Improve Content Cleaning: Enhance the HTML extraction in utils.py
  3. Add Features: Implement caching, rate limiting, or semantic search
  4. Fix Bugs: Report issues or submit pull requests
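
For the first item, the library-to-site mapping lives in constants.py. A hypothetical sketch of what adding a new source could look like (the dictionary name and exact structure are assumptions; match whatever the file actually uses):

# constants.py (illustrative shape only)
DOCS_URLS: dict[str, str] = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
    "uv": "docs.astral.sh/uv",
    "qubrid": "docs.qubrid.com",
    # New entry - the key must match the value passed as get_docs' "library" parameter
    "fastapi": "fastapi.tiangolo.com",
}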

๐Ÿ“ License

This project is licensed under the MIT License - see the LICENSE file for details.


๐Ÿ™ Acknowledgments


📧 Support

Having issues? Here's how to get help:

  1. 📖 Check this README thoroughly
  2. 🔍 Use MCP Inspector for debugging
  3. 💬 Open an issue on GitHub

Built with ❤️ by AbhiiiMan

โญ Star this repo if you find it helpful!