
NestJS MCP Server - Model Context Protocol Example

By: @LiusDev

This repository demonstrates a NestJS implementation of the Model Context Protocol (MCP) with a microservice architecture. It consists of two main services:

  1. mcp-server: Provides tools that supply the current time as context for LLMs
  2. mcp-backend: A client that uses LangChain.js and integrates with the MCP client SDK to connect to the MCP server

Getting Started

Prerequisites

  • Node.js (v20 or higher)
  • npm

Installation

  1. Clone the repository
  2. Install dependencies:
npm install
  3. Copy .env.example to .env and add your OpenAI API key:
cp .env.example .env

Then edit the .env file to include your OpenAI API key:

OPENAI_API_KEY=your_openai_api_key_here
OPENAI_API_URL=https://api.openai.com/v1
PORT=3001
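
A minimal sketch of how these variables might be consumed on the backend, assuming @nestjs/config and @langchain/openai are used; the module name, provider wiring, and model choice below are illustrative assumptions rather than the repository's exact code:

// Hypothetical wiring of the .env values into a LangChain chat model
import { Module } from '@nestjs/common';
import { ConfigModule, ConfigService } from '@nestjs/config';
import { ChatOpenAI } from '@langchain/openai';

@Module({
  imports: [ConfigModule.forRoot({ isGlobal: true })], // loads .env
  providers: [
    {
      provide: ChatOpenAI,
      useFactory: (config: ConfigService) =>
        new ChatOpenAI({
          model: 'gpt-4o-mini', // assumed model name
          apiKey: config.get<string>('OPENAI_API_KEY'),
          configuration: { baseURL: config.get<string>('OPENAI_API_URL') },
        }),
      inject: [ConfigService],
    },
  ],
})
export class LlmModule {}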

Running the Services

You need to run both services for the complete functionality:

Start the MCP Server

npm run start:dev mcp-server

This will start the MCP server on port 3000 (default).

Start the MCP Backend

npm run start:dev mcp-backend

This will start the MCP backend on port 3001 (default).

Usage Example

Once both services are running, you can test the functionality by sending a POST request to the MCP backend:

Sample Request

Send a POST request to http://localhost:3001 with the following JSON body:

{
  "message": "What time is it in Viet Nam?"
}

Using cURL

curl -X POST http://localhost:3001 -H "Content-Type: application/json" -d "{\"message\": \"What time is it in Viet Nam?\"}"

Using Postman

  1. Create a new POST request to http://localhost:3001
  2. Set the Content-Type header to application/json
  3. In the request body, select "raw" and "JSON", then enter:
    {
      "message": "What time is it in Viet Nam?"
    }
    
  4. Send the request

The response will contain the current time in Vietnam, retrieved through the MCP server's time context function.
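
On the backend, a controller along these lines could serve this endpoint; the controller, service name, and response shape are assumptions for illustration, not the repository's exact code:

// Hypothetical NestJS controller handling POST / on port 3001
import { Body, Controller, Post } from '@nestjs/common';
import { McpBackendService } from './mcp-backend.service';

@Controller()
export class McpBackendController {
  constructor(private readonly mcpBackendService: McpBackendService) {}

  @Post()
  async chat(@Body('message') message: string): Promise<{ response: string }> {
    // Delegate to the service, which runs the LangChain.js agent with the MCP tools
    const response = await this.mcpBackendService.chat(message);
    return { response };
  }
}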

Connecting to Multiple MCP Servers

The backend can connect to multiple MCP servers simultaneously. To add additional servers, modify the McpClientModule.register configuration in apps/mcp-backend/src/mcp-backend.module.ts:

McpClientModule.register({
  throwOnLoadError: true,
  prefixToolNameWithServerName: false,  // Set to true to prefix tool names with server names
  additionalToolNamePrefix: '',
  mcpServers: {
    myServer: {
      transport: 'sse',
      url: 'http://localhost:3000/sse',
      useNodeEventSource: true,
      reconnect: {
        enabled: true,
        maxAttempts: 5,
        delayMs: 2000,
      },
    },
    // Add additional servers here
    anotherServer: {
      transport: 'sse',
      url: 'http://localhost:4000/sse',  // Different port for another server
      useNodeEventSource: true,
      reconnect: {
        enabled: true,
        maxAttempts: 5,
        delayMs: 2000,
      },
    },
  },
}),

When connecting to multiple servers:

  • Consider setting prefixToolNameWithServerName: true to avoid tool name conflicts
  • Ensure each server has a unique key in the mcpServers object
  • Make sure each server is running on a different port

The MCP client will automatically fetch tools from all configured servers and make them available to the LLM.
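
As a rough sketch of what that looks like with @langchain/mcp-adapters (the agent style, LangGraph's prebuilt ReAct agent, and the model are assumptions; the repository's actual wiring may differ):

// Hypothetical: load tools from the configured MCP servers and hand them to an agent
import { MultiServerMCPClient } from '@langchain/mcp-adapters';
import { ChatOpenAI } from '@langchain/openai';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

async function answer(message: string): Promise<string> {
  const client = new MultiServerMCPClient({
    throwOnLoadError: true,
    prefixToolNameWithServerName: false,
    additionalToolNamePrefix: '',
    mcpServers: {
      myServer: { transport: 'sse', url: 'http://localhost:3000/sse', useNodeEventSource: true },
    },
  });

  const tools = await client.getTools();                 // tools from all configured servers
  const llm = new ChatOpenAI({ model: 'gpt-4o-mini' });  // assumed model
  const agent = createReactAgent({ llm, tools });

  const result = await agent.invoke({
    messages: [{ role: 'user', content: message }],
  });
  await client.close();
  return String(result.messages[result.messages.length - 1].content);
}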

Architecture

  • mcp-server: Exposes tools via the Model Context Protocol, including a function to get the current time
  • mcp-backend: Connects to the MCP server, retrieves available tools, and uses them with LangChain.js to process user queries
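
A minimal sketch of how the mcp-server side might expose such a time tool with @rekog/mcp-nest; the tool name, parameter schema, and timezone handling are illustrative assumptions:

// Hypothetical time tool exposed over MCP via @rekog/mcp-nest
import { Injectable, Module } from '@nestjs/common';
import { McpModule, Tool } from '@rekog/mcp-nest';
import { z } from 'zod';

@Injectable()
export class TimeTool {
  @Tool({
    name: 'get-current-time',
    description: 'Returns the current time for a given IANA timezone',
    parameters: z.object({ timezone: z.string().describe('e.g. Asia/Ho_Chi_Minh') }),
  })
  async getCurrentTime({ timezone }: { timezone: string }) {
    const now = new Date().toLocaleString('en-US', { timeZone: timezone });
    return { content: [{ type: 'text', text: `Current time in ${timezone}: ${now}` }] };
  }
}

@Module({
  imports: [McpModule.forRoot({ name: 'mcp-server', version: '1.0.0' })],
  providers: [TimeTool],
})
export class McpServerModule {}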

Technologies Used

  • NestJS
  • LangChain.js
  • Model Context Protocol (MCP)
  • @langchain/mcp-adapters: MCP client adapters for LangChain.js
  • @rekog/mcp-nest: MCP server implementation for NestJS

Project Structure

mcp-server/
├── apps/
│   ├── mcp-server/       # MCP server implementation
│   └── mcp-backend/      # MCP client implementation
├── dist/                 # Compiled output
├── node_modules/
├── .env                  # Environment variables
└── package.json

License

This project is marked UNLICENSED - see the LICENSE file for details.
