
MCP-Connect
Enables cloud-based AI services to access local Stdio-based MCP servers via HTTP requests
MCP Connect
The Model Context Protocol (MCP) introduced by Anthropic is cool. However, most MCP servers are built on Stdio transport, which, while excellent for accessing local resources, limits their use in cloud-based applications.
MCP Connect is a tiny tool created to solve this problem:
- Cloud Integration: Enables cloud-based AI services to interact with local Stdio-based MCP servers
- Protocol Translation: Converts HTTP/HTTPS requests to Stdio communication
- Security: Provides secure access to local resources while maintaining control
- Flexibility: Supports various MCP servers without modifying their implementation
- Easy to use: Just run MCP Connect locally; no modification to the MCP server is needed
- Tunnel: Built-in support for Ngrok tunnels
By bridging this gap, we can leverage the full potential of local MCP tools in cloud-based AI applications without compromising on security.
How it works
+-----------------+     HTTPS/SSE     +------------------+      stdio     +------------------+
|                 |                   |                  |                |                  |
| Cloud AI tools  | <---------------> |  Node.js Bridge  | <------------> |    MCP Server    |
|    (Remote)     |      Tunnels      |     (Local)      |                |     (Local)      |
|                 |                   |                  |                |                  |
+-----------------+                   +------------------+                +------------------+
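To make the protocol translation concrete, here is a minimal, illustrative sketch of the bridge idea in TypeScript. It is not the actual MCP Connect implementation: it skips the MCP initialize handshake, authentication, and robust stream framing, and simply spawns the Stdio-based server, forwards one JSON-RPC request from the HTTP body, and returns the first response.

// Illustrative bridge sketch (not the MCP Connect source): spawn a Stdio-based
// MCP server per request, forward the HTTP body as a JSON-RPC message over
// stdin, and return the first newline-delimited JSON-RPC response.
import express from "express";
import { spawn } from "node:child_process";

const app = express();
app.use(express.json());

app.post("/bridge", (req, res) => {
  const { serverPath, args = [], env = {}, method, params = {} } = req.body;
  const child = spawn(serverPath, args, { env: { ...process.env, ...env } });

  // Write a single JSON-RPC 2.0 request to the MCP server's stdin.
  child.stdin.write(JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }) + "\n");

  let buffer = "";
  let answered = false;
  child.stdout.on("data", (chunk) => {
    buffer += chunk.toString();
    const newline = buffer.indexOf("\n");
    if (newline !== -1 && !answered) {
      answered = true;
      res.json(JSON.parse(buffer.slice(0, newline))); // first complete response
      child.kill();
    }
  });

  child.on("error", (err) => {
    if (!answered) res.status(500).json({ error: err.message });
  });
});

app.listen(3000, () => console.log("Bridge listening on http://localhost:3000/bridge"));

A production bridge would additionally handle the MCP initialization handshake, authentication, and error cases; the sketch only conveys the HTTP-to-Stdio direction of the translation.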
Prerequisites
- Node.js
Quick Start
- Clone the repository and enter the directory:
  git clone https://github.com/EvalsOne/MCP-connect.git
  cd MCP-connect
- Copy .env.example to .env and configure the port and auth_token:
  cp .env.example .env
- Install dependencies:
  npm install
- Run MCP Connect:
  # build MCP Connect
  npm run build
  # run MCP Connect
  npm run start
  # or, run in dev mode (supports hot reloading by nodemon)
  npm run dev
Now MCP Connect should be running on http://localhost:3000/bridge.
Note:
- The bridge is designed to run on a local machine, so you still need to set up a tunnel so that it is reachable from the cloud.
- Ngrok, Cloudflare Zero Trust, and LocalTunnel are recommended for building the tunnel.
Running with Ngrok Tunnel
MCP Connect has built-in support for Ngrok tunnel. To run the bridge with a public URL using Ngrok:
- Get your Ngrok auth token from https://dashboard.ngrok.com/authtokens
- Add to your .env file:
NGROK_AUTH_TOKEN=your_ngrok_auth_token
- Run with tunnel:
  # Production mode with tunnel
  npm run start:tunnel
  # Development mode with tunnel
  npm run dev:tunnel
After MCP Connect is running, you can see the MCP bridge URL in the console.
API Endpoints
After MCP Connect is running, there are two endpoints exposed:
- GET /health: Health check endpoint
- POST /bridge: Main bridge endpoint for receiving requests from the cloud
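For a quick connectivity check, you can hit the health endpoint; a minimal TypeScript check (assuming the bridge runs locally on the default port) looks like this:

// Health check against a locally running bridge (Node.js 18+ provides fetch).
const res = await fetch("http://localhost:3000/health");
console.log(res.status); // expect 200 when the bridge is up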
As an example, the following is the configuration of the official GitHub MCP server:
{
  "command": "npx",
  "args": [
    "-y",
    "@modelcontextprotocol/server-github"
  ],
  "env": {
    "GITHUB_PERSONAL_ACCESS_TOKEN": "<your_github_personal_access_token>"
  }
}
You can send requests to the bridge as follows to list the tools of the MCP server or to call a specific tool.
Listing tools:
curl -X POST http://localhost:3000/bridge \
  -d '{
    "method": "tools/list",
    "serverPath": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-github"
    ],
    "params": {},
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "<your_github_personal_access_token>"
    }
  }'
Calling a tool:
Using the search_repositories tool to search for repositories related to modelcontextprotocol:
curl -X POST http://localhost:3000/bridge \
  -d '{
    "method": "tools/call",
    "serverPath": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-github"
    ],
    "params": {
      "name": "search_repositories",
      "arguments": {
        "query": "modelcontextprotocol"
      }
    },
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "<your_github_personal_access_token>"
    }
  }'
Authentication
MCP Connect uses a simple token-based authentication system. The token is stored in the .env file. If the token is set, MCP Connect will use it to authenticate incoming requests.
Sample request with token:
curl -X POST http://localhost:3000/bridge \
  -H "Authorization: Bearer <your_auth_token>" \
  -d '{
    "method": "tools/list",
    "serverPath": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-github"
    ],
    "params": {},
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "<your_github_personal_access_token>"
    }
  }'
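Equivalently, a cloud-side service can call the bridge programmatically. The following TypeScript sketch mirrors the curl request above; the bridge URL and tokens are placeholders that you would replace with your tunnel URL and real values.

// Call the bridge from code (Node.js 18+ or any runtime with fetch).
const BRIDGE_URL = "http://localhost:3000/bridge"; // or your public tunnel URL
const AUTH_TOKEN = "<your_auth_token>";

const response = await fetch(BRIDGE_URL, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${AUTH_TOKEN}`,
  },
  body: JSON.stringify({
    method: "tools/list",
    serverPath: "npx",
    args: ["-y", "@modelcontextprotocol/server-github"],
    params: {},
    env: { GITHUB_PERSONAL_ACCESS_TOKEN: "<your_github_personal_access_token>" },
  }),
});

console.log(await response.json());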
Configuration
Environment variables:
- AUTH_TOKEN: Authentication token for the bridge API (optional)
- PORT: HTTP server port (default: 3000)
- LOG_LEVEL: Logging level (default: info)
- NGROK_AUTH_TOKEN: Ngrok auth token (optional)
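For reference, a .env that combines these settings might look like the following (all values are placeholders):

AUTH_TOKEN=your_auth_token
PORT=3000
LOG_LEVEL=info
NGROK_AUTH_TOKEN=your_ngrok_auth_token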
Using MCP Connect with ConsoleX AI to access local MCP Server
The following is a demo of using MCP Connect to access a local MCP Server on ConsoleX AI:
License
MIT License