nestjs-mcp-server-langchainjs-demo (Public)

2025-04-10 · 3 years · Works with Finder

GitHub Watches: 1 · GitHub Forks: 0 · GitHub Stars: 1

NestJS MCP Server - Model Context Protocol Example

By: @LiusDev

This repository demonstrates a NestJS implementation of the Model Context Protocol (MCP) with a microservice architecture. It consists of two main services:

  1. mcp-server: Exposes a tool that returns the current time as context for LLMs (a rough sketch of such a tool follows below)
  2. mcp-backend: A client that uses LangChain.js together with the MCP client SDK to connect to the MCP server
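
For orientation, a time tool on the server side might look roughly like the sketch below. This assumes the @rekog/mcp-nest Tool decorator and a zod parameter schema; the actual tool name, parameters, and return shape in this repository may differ.

// Hypothetical sketch of an mcp-server time tool (names and exact decorator options are assumptions)
import { Injectable } from '@nestjs/common';
import { Tool } from '@rekog/mcp-nest';
import { z } from 'zod';

@Injectable()
export class TimeTool {
  @Tool({
    name: 'get-current-time', // illustrative name, not necessarily the one used in this repo
    description: 'Returns the current time for a given IANA timezone',
    parameters: z.object({
      timezone: z.string().describe('IANA timezone, e.g. Asia/Ho_Chi_Minh'),
    }),
  })
  async getCurrentTime({ timezone }: { timezone: string }) {
    // Format the current wall-clock time in the requested timezone
    const now = new Date().toLocaleString('en-US', { timeZone: timezone });
    return {
      content: [{ type: 'text', text: `Current time in ${timezone}: ${now}` }],
    };
  }
}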

Getting Started

Prerequisites

  • Node.js (v20 or higher)
  • npm

Installation

  1. Clone the repository
  2. Install dependencies:
npm install
  3. Copy .env.example to .env and add your OpenAI API key:
cp .env.example .env

Then edit the .env file to include your OpenAI API key:

OPENAI_API_KEY=your_openai_api_key_here
OPENAI_API_URL=https://api.openai.com/v1
PORT=3001
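
These values are typically fed straight into the LangChain.js OpenAI chat model on the backend; a minimal sketch of that wiring (assuming @langchain/openai, with an illustrative model name that is not necessarily what this demo uses):

// Sketch: passing the .env values to a LangChain.js chat model (model name is illustrative)
import { ChatOpenAI } from '@langchain/openai';

const llm = new ChatOpenAI({
  model: 'gpt-4o-mini',                                   // assumption; the demo may pick a different model
  apiKey: process.env.OPENAI_API_KEY,                     // OPENAI_API_KEY from .env
  configuration: { baseURL: process.env.OPENAI_API_URL }, // OPENAI_API_URL from .env
});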

Running the Services

You need to run both services for full functionality:

Start the MCP Server

npm run start:dev mcp-server

This will start the MCP server on port 3000 (default).

Start the MCP Backend

npm run start:dev mcp-backend

This will start the MCP backend on port 3001 (default).

Usage Example

Once both services are running, you can test the functionality by sending a POST request to the MCP backend:

Sample Request

Send a POST request to http://localhost:3001 with the following JSON body:

{
  "message": "What time is it in Viet Nam?"
}

Using cURL

curl -X POST http://localhost:3001 -H "Content-Type: application/json" -d "{\"message\": \"What time is it in Viet Nam?\"}"

Using Postman

  1. Create a new POST request to http://localhost:3001
  2. Set the Content-Type header to application/json
  3. In the request body, select "raw" and "JSON", then enter:
    {
      "message": "What time is it in Viet Nam?"
    }
    
  4. Send the request

The response will contain the current time in Vietnam, retrieved through the MCP server's time context function.
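
The same request can also be made from code; here is a small sketch using the fetch API built into Node.js 18+ (the response format is not specified in this README, so it is simply printed as text):

// Sketch: calling the mcp-backend endpoint programmatically
async function askBackend(message: string): Promise<string> {
  const res = await fetch('http://localhost:3001', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message }),
  });
  return res.text(); // the LLM's answer; the exact response shape isn't documented here
}

askBackend('What time is it in Viet Nam?').then(console.log);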

Connecting to Multiple MCP Servers

The backend can connect to multiple MCP servers simultaneously. To add additional servers, modify the McpClientModule.register configuration in apps/mcp-backend/src/mcp-backend.module.ts:

McpClientModule.register({
  throwOnLoadError: true,
  prefixToolNameWithServerName: false,  // Set to true to prefix tool names with server names
  additionalToolNamePrefix: '',
  mcpServers: {
    myServer: {
      transport: 'sse',
      url: 'http://localhost:3000/sse',
      useNodeEventSource: true,
      reconnect: {
        enabled: true,
        maxAttempts: 5,
        delayMs: 2000,
      },
    },
    // Add additional servers here
    anotherServer: {
      transport: 'sse',
      url: 'http://localhost:4000/sse',  // Different port for another server
      useNodeEventSource: true,
      reconnect: {
        enabled: true,
        maxAttempts: 5,
        delayMs: 2000,
      },
    },
  },
}),

When connecting to multiple servers:

  • Consider setting prefixToolNameWithServerName: true to avoid tool name conflicts
  • Ensure each server has a unique key in the mcpServers object
  • Make sure each server is running on a different port

The MCP client will automatically fetch tools from all configured servers and make them available to the LLM.
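
For reference, this behaviour mirrors what @langchain/mcp-adapters exposes directly. The sketch below shows the rough flow, assuming MultiServerMCPClient, a LangGraph ReAct agent, and an illustrative model name; the exact wiring inside McpClientModule may differ.

// Sketch: turning the configured MCP servers into LangChain tools for the LLM
import { MultiServerMCPClient } from '@langchain/mcp-adapters';
import { ChatOpenAI } from '@langchain/openai';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

async function run() {
  const client = new MultiServerMCPClient({
    throwOnLoadError: true,
    prefixToolNameWithServerName: true, // avoids tool name clashes across servers
    mcpServers: {
      myServer: { transport: 'sse', url: 'http://localhost:3000/sse' },
    },
  });

  const tools = await client.getTools(); // tools gathered from every configured server
  const llm = new ChatOpenAI({ model: 'gpt-4o-mini' }); // illustrative model name
  const agent = createReactAgent({ llm, tools });

  const result = await agent.invoke({
    messages: [{ role: 'user', content: 'What time is it in Viet Nam?' }],
  });
  console.log(result.messages.at(-1)?.content);
}

run();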

Architecture

  • mcp-server: Exposes tools via the Model Context Protocol, including a function to get the current time
  • mcp-backend: Connects to the MCP server, retrieves available tools, and uses them with LangChain.js to process user queries

Technologies Used

  • NestJS
  • LangChain.js
  • Model Context Protocol (MCP)
  • @langchain/mcp-adapters: MCP client adapters for LangChain.js
  • @rekog/mcp-nest: MCP server implementation for NestJS

Project Structure

mcp-server/
├── apps/
│   ├── mcp-server/       # MCP server implementation
│   └── mcp-backend/      # MCP client implementation
├── dist/                 # Compiled output
├── node_modules/
├── .env                  # Environment variables
└── package.json

License

This project is marked as UNLICENSED - see the LICENSE file for details.

Related Recommendations

  • Joshua Armstrong - Confidential guide on numerology and astrology, based on GG33 public information
  • https://suefel.com - Latest advice and best practices for custom GPT development.
  • Emmet Halm - Converts Figma frames into front-end code for various mobile frameworks.
  • Elijah Ng Shi Yi - Advanced software engineer GPT that excels through nailing the basics.
  • https://maiplestudio.com - Find exhibitors, speakers and more
  • lumpenspace - Take an adjectivised noun, and create images making it progressively more adjective!
  • https://appia.in - Siri Shortcut Finder - your go-to place for discovering amazing Siri Shortcuts with ease
  • Carlos Ferrin - Find movies and series on streaming platforms.
  • Yusuf Emre Yeşilyurt - I find academic articles and books for research and literature reviews.
  • tomoyoshi hirata - Sony α7III manual assistant
  • apappascs - Discover the most complete and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.
  • ShrimpingIt - MicroPython-based I2C control of the MCP-series GPIO expander, derived from AdaFruit_MCP230xx
  • jae-jae - MCP server for fetching web page content with the Playwright headless browser.
  • ravitemer - A powerful Neovim plugin for managing MCP (Model Context Protocol) servers
  • patruff - Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools
  • pontusab - The Cursor and Windsurf community: find rules and MCPs
  • av - Effortlessly run LLM backends, APIs, frontends, and services with a single command.
  • WangRongsheng - 🧑‍🚀 A summary of the world's best LLM resources (data processing, model training, model deployment, O1 models, MCP, small language models, vision language models)
  • Mintplex-Labs - The all-in-one AI desktop and Docker application with built-in RAG, AI agents, a no-code agent builder, MCP compatibility, and more.
  • modelcontextprotocol - Model Context Protocol servers

Reviews

3 (1)

user_MF3ZysEL
2025-04-17

As a loyal user of MCP apps, I must say that the nestjs-mcp-server-langchainjs-demo by LiusDev is truly impressive. It seamlessly integrates NestJS with langchainjs, providing robust language processing capabilities. The documentation is well-written and the repository is easy to navigate. Kudos to LiusDev for creating such a valuable resource!