LangChain Agent with MCP Servers
A LangChain agent using MCP Adapters for tool integration with Model Context Protocol (MCP) servers.
Overview
This project demonstrates how to build a LangChain agent that uses the Model Context Protocol (MCP) to interact with various services:
- Tavily Search: Web search and news search capabilities
- Weather: Mock weather information retrieval
- Math: Mathematical expression evaluation
The agent uses LangGraph's ReAct agent pattern to dynamically select and use these tools based on user queries.
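The wiring looks roughly like the following sketch. It is illustrative only (not the project's actual src/agent.py): it assumes a recent version of langchain-mcp-adapters, an OpenAI chat model, and hypothetical server file names under src/mcpserver/.
```python
# Illustrative sketch; file names and model choice are assumptions, see src/agent.py.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Each entry launches one MCP server as a stdio subprocess.
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["src/mcpserver/math_server.py"],  # hypothetical path
                "transport": "stdio",
            },
            "weather": {
                "command": "python",
                "args": ["src/mcpserver/weather_server.py"],  # hypothetical path
                "transport": "stdio",
            },
        }
    )
    tools = await client.get_tools()  # MCP tools exposed as LangChain tools

    # ReAct-style agent that selects a tool based on the user query.
    agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)
    result = await agent.ainvoke({"messages": [("user", "What is 3 * 12?")]})
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```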
Features
- Graceful Shutdown: All MCP servers implement proper signal handling for clean termination
- Subprocess Management: The main agent tracks and manages all MCP server subprocesses
- Error Handling: Robust error handling throughout the application
- Modular Design: Easy to extend with additional MCP servers
Graceful Shutdown Mechanism
This project implements a comprehensive graceful shutdown system:
- Signal Handling: Captures SIGINT and SIGTERM signals to initiate graceful shutdown
- Process Tracking: The main agent maintains a registry of all child processes
- Cleanup Process: Ensures all subprocesses are properly terminated on exit
- Shutdown Flags: Each MCP server has a shutdown flag to prevent new operations when shutdown is initiated
- Async Cooperation: Uses asyncio to allow operations in progress to complete when possible
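A minimal sketch of this pattern is shown below. The names are illustrative and the code is not the project's actual implementation; it only demonstrates the combination of signal handlers, a shutdown flag, a child-process registry, and cooperative asyncio cleanup described above.
```python
# Minimal shutdown-pattern sketch (illustrative names, not the project's code).
import asyncio
import signal
import subprocess

shutdown_requested = False                      # per-process shutdown flag
child_processes: list[subprocess.Popen] = []    # registry of spawned MCP servers


def _request_shutdown(signum, frame):
    """SIGINT/SIGTERM handler: flag shutdown instead of exiting immediately."""
    global shutdown_requested
    shutdown_requested = True


signal.signal(signal.SIGINT, _request_shutdown)
signal.signal(signal.SIGTERM, _request_shutdown)


def cleanup_children() -> None:
    """Terminate every tracked subprocess, escalating to kill if needed."""
    for proc in child_processes:
        if proc.poll() is None:
            proc.terminate()
            try:
                proc.wait(timeout=5)
            except subprocess.TimeoutExpired:
                proc.kill()


async def main_loop() -> None:
    try:
        while not shutdown_requested:
            # New work is only started while no shutdown is pending;
            # awaiting here lets in-flight operations complete cooperatively.
            await asyncio.sleep(0.1)
    finally:
        cleanup_children()
```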
Installation
# Clone the repository
git clone https://github.com/yourusername/langchain-mcp.git
cd langchain-mcp
# Create a virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -e .
Configuration
Create a .env file in the project root with the following variables:
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
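These keys would typically be loaded into the environment at startup; a small sketch is shown below, assuming python-dotenv is available (the project's actual loading code may differ).
```python
# Sketch of loading the .env keys at startup (assumes python-dotenv is installed).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root into the environment
openai_key = os.environ["OPENAI_API_KEY"]
tavily_key = os.environ["TAVILY_API_KEY"]
```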
Usage
Run the agent from the command line:
python src/agent.py
The agent will prompt for your query and then process it using the appropriate tools.
Development
To add a new MCP server (a minimal skeleton follows this list):
- Create a new file in src/mcpserver/
- Implement the server with proper signal handling
- Update src/mcpserver/__init__.py to expose the new server
- Add the server configuration to src/agent.py
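The skeleton below is a hypothetical example server, assuming the official MCP Python SDK's FastMCP helper; the file name, tool, and signal handling are placeholders to adapt to the conventions of the existing servers.
```python
# src/mcpserver/echo_server.py — hypothetical skeleton, assumes the official MCP Python SDK.
import signal

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("echo")


@mcp.tool()
def echo(text: str) -> str:
    """Return the input text unchanged (placeholder tool)."""
    return text


def _handle_signal(signum, frame):
    # Signal-handling hook; the project's servers set a shutdown flag here instead.
    raise SystemExit(0)


signal.signal(signal.SIGINT, _handle_signal)
signal.signal(signal.SIGTERM, _handle_signal)

if __name__ == "__main__":
    mcp.run(transport="stdio")
```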
License
MIT