
RAG Documentation MCP Server


An MCP server implementation that provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context.

Features

Tools

  1. search_documentation

    • Search through the documentation using vector search
    • Returns relevant chunks of documentation with source information
  2. list_sources

    • List all available documentation sources
    • Provides metadata about each source
  3. extract_urls

    • Extract URLs from text and check if they're already in the documentation
    • Useful for preventing duplicate documentation
  4. remove_documentation

    • Remove documentation from a specific source
    • Cleans up outdated or irrelevant documentation
  5. list_queue

    • List all items in the processing queue
    • Shows status of pending documentation processing
  6. run_queue

    • Process all items in the queue
    • Automatically adds new documentation to the vector store
  7. clear_queue

    • Clear all items from the processing queue
    • Useful for resetting the system
  8. add_documentation

    • Add new documentation to the processing queue
    • Supports various formats and sources
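A client invokes these tools through the standard MCP `tools/call` JSON-RPC request. As a sketch, a `search_documentation` call might look like the following (the argument names `query` and `limit` are assumptions for illustration, not the server's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documentation",
    "arguments": {
      "query": "how to configure embeddings",
      "limit": 5
    }
  }
}
```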

Quick Start

The RAG Documentation tool is designed for:

  • Enhancing AI responses with relevant documentation
  • Building documentation-aware AI assistants
  • Creating context-aware tooling for developers
  • Implementing semantic documentation search
  • Augmenting existing knowledge bases

Docker Compose Setup

The project includes a docker-compose.yml file for easy containerized deployment. To start the services:

docker-compose up -d

To stop the services:

docker-compose down
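The repository ships its own docker-compose.yml; as a rough sketch of the shape such a file can take (service names, image tag, and build context here are assumptions based on the ports and defaults mentioned elsewhere in this README):

```yaml
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"
  mcp-ragdocs:
    build: .
    ports:
      - "3030:3030"
    environment:
      - EMBEDDING_PROVIDER=ollama
      - QDRANT_URL=http://qdrant:6333
    depends_on:
      - qdrant
```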

Web Interface

The system includes a web interface that can be accessed after starting the Docker Compose services:

  1. Open your browser and navigate to: http://localhost:3030
  2. The interface provides:
    • Real-time queue monitoring
    • Documentation source management
    • Search interface for testing queries
    • System status and health checks

Configuration

Embeddings Configuration

The system uses Ollama as the default embedding provider for local embeddings generation, with OpenAI available as a fallback option. This setup prioritizes local processing while maintaining reliability through cloud-based fallback.

Environment Variables

  • EMBEDDING_PROVIDER: Choose the primary embedding provider ('ollama' or 'openai', default: 'ollama')
  • EMBEDDING_MODEL: Specify the model to use (optional)
    • For OpenAI: defaults to 'text-embedding-3-small'
    • For Ollama: defaults to 'nomic-embed-text'
  • OPENAI_API_KEY: Required when using OpenAI as provider
  • FALLBACK_PROVIDER: Optional backup provider ('ollama' or 'openai')
  • FALLBACK_MODEL: Optional model for fallback provider
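The defaulting rules above can be sketched as follows. This is a minimal illustration of the described behavior, not the server's actual code; `resolveEmbeddingConfig` is a hypothetical helper:

```typescript
// Per-provider default models, as documented above.
const DEFAULT_MODELS: Record<string, string> = {
  ollama: "nomic-embed-text",
  openai: "text-embedding-3-small",
};

interface EmbeddingConfig {
  provider: string;
  model: string;
  fallback?: { provider: string; model: string };
}

// Resolve provider/model settings from environment variables,
// applying the defaults described in this section.
function resolveEmbeddingConfig(
  env: Record<string, string | undefined>
): EmbeddingConfig {
  const provider = env.EMBEDDING_PROVIDER ?? "ollama";
  if (provider === "openai" && !env.OPENAI_API_KEY) {
    throw new Error("OPENAI_API_KEY is required when using OpenAI");
  }
  const config: EmbeddingConfig = {
    provider,
    model: env.EMBEDDING_MODEL ?? DEFAULT_MODELS[provider],
  };
  if (env.FALLBACK_PROVIDER) {
    config.fallback = {
      provider: env.FALLBACK_PROVIDER,
      model: env.FALLBACK_MODEL ?? DEFAULT_MODELS[env.FALLBACK_PROVIDER],
    };
  }
  return config;
}
```

With no variables set, this resolves to Ollama with `nomic-embed-text`; setting only `FALLBACK_PROVIDER=openai` picks up `text-embedding-3-small` as the fallback model.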

Cline Configuration

Add this to your cline_mcp_settings.json:

{
  "mcpServers": {
    "rag-docs": {
      "command": "node",
      "args": ["/path/to/your/mcp-ragdocs/build/index.js"],
      "env": {
        "EMBEDDING_PROVIDER": "ollama", // default
        "EMBEDDING_MODEL": "nomic-embed-text", // optional
        "OPENAI_API_KEY": "your-api-key-here", // required for fallback
        "FALLBACK_PROVIDER": "openai", // recommended for reliability
        "FALLBACK_MODEL": "nomic-embed-text", // optional
        "QDRANT_URL": "http://localhost:6333"
      },
      "disabled": false,
      "autoApprove": [
        "search_documentation",
        "list_sources",
        "extract_urls",
        "remove_documentation",
        "list_queue",
        "run_queue",
        "clear_queue",
        "add_documentation"
      ]
    }
  }
}

Claude Desktop Configuration

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "rag-docs": {
      "command": "node",
      "args": ["/path/to/your/mcp-ragdocs/build/index.js"],
      "env": {
        "EMBEDDING_PROVIDER": "ollama", // default
        "EMBEDDING_MODEL": "nomic-embed-text", // optional
        "OPENAI_API_KEY": "your-api-key-here", // required for fallback
        "FALLBACK_PROVIDER": "openai", // recommended for reliability
        "FALLBACK_MODEL": "nomic-embed-text", // optional
        "QDRANT_URL": "http://localhost:6333"
      }
    }
  }
}

Default Configuration

The system uses Ollama by default for efficient local embedding generation. For optimal reliability:

  1. Install and run Ollama locally
  2. Configure OpenAI as fallback (recommended):
    {
      // Ollama is used by default, no need to specify EMBEDDING_PROVIDER
      "EMBEDDING_MODEL": "nomic-embed-text", // optional
      "FALLBACK_PROVIDER": "openai",
      "FALLBACK_MODEL": "text-embedding-3-small",
      "OPENAI_API_KEY": "your-api-key-here"
    }
    

This configuration ensures:

  • Fast, local embedding generation with Ollama
  • Automatic fallback to OpenAI if Ollama fails
  • No external API calls unless necessary

Note: The system will automatically use the appropriate vector dimensions based on the provider:

  • Ollama (nomic-embed-text): 768 dimensions
  • OpenAI (text-embedding-3-small): 1536 dimensions
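If you ever provision the Qdrant collection manually instead of letting the server create it, the vector size must match the chosen provider. A request body for Qdrant's create-collection endpoint (`PUT /collections/<name>`; the Ollama size of 768 is shown, and the collection name is yours to choose) would look like:

```json
{
  "vectors": {
    "size": 768,
    "distance": "Cosine"
  }
}
```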

Acknowledgments

This project is a fork of qpd-v/mcp-ragdocs, originally developed by qpd-v. The original project provided the foundation for this implementation.

Special thanks to the original creator, qpd-v, for their innovative work on the initial version of this MCP server. This fork has been enhanced with additional features and improvements by Rahul Retnan.

Troubleshooting

Server Not Starting (Port Conflict)

If the MCP server fails to start due to a port conflict, follow these steps:

  1. Identify and kill the process using port 3030:

docker-compose down

npx kill-port 3030

  2. Restart the MCP server

  3. If the issue persists, check for other processes using the port:

lsof -i :3030

  4. You can also change the default port in the configuration if needed
