2025-03-20

A Model Context Protocol (MCP) server implementation that connects Claude Desktop with DeepSeek's language models (R1/V3).

3 years · 2 GitHub Watches · 8 GitHub Forks · 50 GitHub Stars

Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.

Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.


Quick Start

Installing manually

# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git deepseek-r1-mcp
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build and run
npm run build
node build/index.js

Prerequisites

  • Node.js (v18 or higher)
  • npm
  • Claude Desktop
  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, change the model name in src/index.ts:

// For DeepSeek-R1 (default)
model: "deepseek-reasoner"

// For DeepSeek-V3
model: "deepseek-chat"

Project Structure

deepseek-r1-mcp/
├── src/
│   ├── index.ts             # Main server implementation
├── build/                   # Compiled files
│   ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json

Configuration

  1. Create a .env file:
DEEPSEEK_API_KEY=your-api-key-here
  2. Update Claude Desktop configuration:
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
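On startup, the server needs DEEPSEEK_API_KEY to be present, whether it comes from the .env file or from the env block above. A minimal fail-fast check might look like the following sketch (assuming the dotenv package; the exact message is illustrative):

import dotenv from "dotenv";

dotenv.config();

// Exit early with a readable message instead of failing later
// with an opaque authentication error from the API.
if (!process.env.DEEPSEEK_API_KEY) {
  console.error("DEEPSEEK_API_KEY is not set. Add it to .env or to the Claude Desktop config.");
  process.exit(1);
}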

Development

npm run dev     # Watch mode
npm run build   # Build for production

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)
  • Configurable parameters (max_tokens, temperature)
  • Robust error handling with detailed error messages
  • Full MCP protocol support
  • Claude Desktop integration
  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,    // Maximum tokens to generate
    "temperature": 0.2     // Controls randomness
  }
}
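On the server side, these arguments correspond to the tool's input schema. The following is a rough sketch of how such a tool can be registered with the MCP TypeScript SDK; the schema wording and the generate helper are illustrative, not the repository's exact code.

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "deepseek_r1", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tool and its parameters to Claude Desktop.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "deepseek_r1",
      description: "Generate text with DeepSeek R1",
      inputSchema: {
        type: "object",
        properties: {
          prompt: { type: "string" },
          max_tokens: { type: "number" },
          temperature: { type: "number" },
        },
        required: ["prompt"],
      },
    },
  ],
}));

// Forward each tool call to the DeepSeek API and return the completion as text.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { prompt, max_tokens = 8192, temperature = 0.2 } =
    request.params.arguments as {
      prompt: string;
      max_tokens?: number;
      temperature?: number;
    };
  const text = await generate(prompt, max_tokens, temperature); // hypothetical helper from the earlier sketch
  return { content: [{ type: "text", text }] };
});

await server.connect(new StdioServerTransport());

The server communicates with Claude Desktop over stdio, which is why the Claude Desktop configuration above simply runs the built file with node.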

The Temperature Parameter

The default value of temperature is 0.2.

Deepseek recommends setting the temperature according to your specific use case:

USE CASE                        TEMPERATURE   EXAMPLE
Coding / Math                   0.0           Code generation, mathematical calculations
Data Cleaning / Data Analysis   1.0           Data processing tasks
General Conversation            1.3           Chat and dialogue
Translation                     1.3           Language translation
Creative Writing / Poetry       1.5           Story writing, poetry generation
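If a client script wants to pick these values automatically, a trivial lookup mirroring the table is enough (purely illustrative):

// Recommended temperatures by use case, taken from the table above.
const RECOMMENDED_TEMPERATURE = {
  codingMath: 0.0,
  dataCleaningAnalysis: 1.0,
  generalConversation: 1.3,
  translation: 1.3,
  creativeWritingPoetry: 1.5,
} as const;

// Example: a translation request would pass temperature: 1.3 in the tool arguments.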

Error Handling

The server provides detailed error messages for common issues (a sketch of how these can be surfaced follows the list):

  • API authentication errors
  • Invalid parameters
  • Rate limiting
  • Network issues
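A rough sketch of how these failures can be turned into readable tool errors, assuming the OpenAI-compatible client from the earlier sketch (the classification shown here is illustrative):

import OpenAI from "openai";

// Map DeepSeek API failures to messages that are meaningful inside Claude Desktop.
function describeError(error: unknown): string {
  if (error instanceof OpenAI.APIError) {
    if (error.status === 401) return "API authentication error: check DEEPSEEK_API_KEY.";
    if (error.status === 429) return "Rate limit exceeded: wait and retry.";
    if (error.status === 400) return `Invalid parameters: ${error.message}`;
    return `DeepSeek API error (${error.status}): ${error.message}`;
  }
  return `Network or unexpected error: ${String(error)}`;
}

// Inside the tool handler, the error is returned as tool output instead of crashing the server:
// try { ... } catch (error) {
//   return { content: [{ type: "text", text: describeError(error) }], isError: true };
// }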

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT


Reviews

3 (1)

user_1ilJW5l7 · 2025-04-17

As a dedicated user of MCP-server-Deepseek_R1, I am thoroughly impressed with its performance and capabilities. Developed by 66julienmartin, this tool has significantly enhanced my data analysis workflow. The integration and ease of use are top-notch, making complex tasks more manageable. Highly recommended for anyone looking to streamline their server operations.