
Run an MCP Server on Vercel
Usage
Update api/server.ts with your tools, prompts, and resources, following the MCP TypeScript SDK documentation.
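For orientation, here is a minimal sketch of what api/server.ts can look like. It assumes the template wraps an MCP TypeScript SDK server in an initializeMcpApiHandler helper and that zod is installed; the helper name and import path are illustrative, so check what the template actually exports before copying this.

```ts
// api/server.ts -- a minimal sketch, not the template's exact code.
import { z } from "zod";
// Assumed helper shipped by the template; verify the real import path.
import { initializeMcpApiHandler } from "../lib/mcp-api-handler";

const handler = initializeMcpApiHandler((server) => {
  // Register a simple "echo" tool; add your own tools, prompts, and
  // resources here following the MCP TypeScript SDK documentation.
  server.tool("echo", { message: z.string() }, async ({ message }) => ({
    content: [{ type: "text", text: `Tool echo: ${message}` }],
  }));
});

export default handler;
```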
There is also a Next.js version of this template.
Notes for running on Vercel
- Requires a Redis instance attached to the project and exposed as process.env.REDIS_URL
- Make sure you have Fluid compute enabled for efficient execution
- After enabling Fluid compute, open vercel.json and adjust maxDuration to 800 if you are using a Vercel Pro or Enterprise account (see the sketch after this list)
- Deploy the MCP template
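As an illustration, the maxDuration change might look like the following vercel.json; the template's actual file may include other fields, so treat this as a sketch rather than the exact configuration.

```json
{
  "functions": {
    "api/server.ts": {
      "maxDuration": 800
    }
  }
}
```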
Sample Client
scripts/test-client.mjs contains a sample client to try invocations.
node scripts/test-client.mjs https://mcp-on-vercel.vercel.app
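For reference, a rough sketch of what such a test client can do with the MCP TypeScript SDK over SSE is shown below; the real scripts/test-client.mjs may be structured differently, and the "/sse" path and "echo" tool are assumptions for illustration.

```ts
// A sketch of an MCP test client using the TypeScript SDK's SSE transport.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  const origin = process.argv[2] ?? "https://mcp-on-vercel.vercel.app";

  // "/sse" is the conventional MCP SSE endpoint path and is assumed here.
  const transport = new SSEClientTransport(new URL("/sse", origin));
  const client = new Client(
    { name: "test-client", version: "1.0.0" },
    { capabilities: {} }
  );

  await client.connect(transport);

  // List the tools the server exposes, then try calling one of them.
  const { tools } = await client.listTools();
  console.log("server tools:", tools.map((t) => t.name));

  const result = await client.callTool({
    name: "echo",
    arguments: { message: "hello from the test client" },
  });
  console.log("echo result:", result);

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```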