# Vercel AI SDK MCP Server Project
This repository contains a Model Context Protocol (MCP) server designed to expose capabilities of the Vercel AI SDK Core to AI development environments like Cursor. It allows leveraging features such as `generateObject`, `generateText`, `streamText`, and UI generation alongside other MCP servers (like `mcp-figma` and `magic-mcp` via Smithery).
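For orientation, here is a minimal sketch of the kind of Vercel AI SDK Core call the `generate_object` tool wraps. The model name, schema, and prompt are illustrative assumptions, not taken from this repository's source.

```ts
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// Hypothetical example: the actual schema and model used by the
// generate_object tool live in this repo's source, not here.
const { object } = await generateObject({
  model: openai("gpt-4o"),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: "Summarize the design brief as a title and tags.",
});

console.log(object);
```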
## Core Features

- **Vercel AI SDK Integration:** Provides MCP tools wrapping core Vercel AI SDK functions (`generate_object`, `generate_ui_component`, etc.).
- **Tool Categorization:** Implements a `ToolManager` with a `set_tool_category` meta-tool to manage the number of active tools exposed to Cursor, keeping it within reasonable limits (see the sketch after this list).
- **Figma/Magic MCP Placeholders:** Includes placeholder connectors and tool registrations for `mcp-figma` and `magic-mcp`, intended for orchestration via Cursor AI (Pathway 2).
- **Smithery Deployment Ready:** Configured with a `Dockerfile` and `smithery.yaml` for easy deployment on Smithery.ai.
- **Cursor Integration:** Designed to be used within Cursor via the `.cursor/mcp.json` configuration.
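The `ToolManager` itself is defined in this repository; the sketch below is only a hypothetical illustration of the category-switching idea behind `set_tool_category`, not the actual implementation.

```ts
// Hypothetical sketch of the category-switching idea; the names and
// structure are assumptions, not this repo's actual ToolManager.
type ToolCategory = "vercel" | "figma" | "magic";

class ToolManager {
  private activeCategory: ToolCategory = "vercel";
  private tools = new Map<ToolCategory, string[]>([
    ["vercel", ["generate_object", "generate_ui_component"]],
    ["figma", ["extract_figma_design"]],
    ["magic", ["get_inspiration"]],
  ]);

  // The set_tool_category meta-tool would call something like this,
  // so Cursor only sees one category's tools at a time.
  setCategory(category: ToolCategory): void {
    this.activeCategory = category;
  }

  listActiveTools(): string[] {
    return this.tools.get(this.activeCategory) ?? [];
  }
}
```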
## Architectural Approach (Pathway 2 Orchestration)

This server is primarily designed to be one component in a multi-MCP workflow orchestrated by the AI within Cursor (Pathway 2).

The intended workflow involves:

- Using prompts and Cursor Rules (`.cursor/rules/`) to guide the AI.
- Making sequential calls to different MCP servers:
  - `mcp-figma` (via Smithery) for design extraction.
  - `magic-mcp` (via Smithery) for inspiration/component building.
  - This `vercel-ai-sdk-mcp` server for Vercel AI SDK specific tasks (like structured generation).
- The AI combines context from each step to achieve the final goal.
While a composite tool (`generate_enhanced_component_from_figma`) demonstrating direct server-to-server interaction (Pathway 1) exists in the code (`src/integrations/crossIntegration.ts`), it requires implementing functional MCP clients within the connectors and is not the primary intended usage pattern for this setup.
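For Pathway 1 to work, each connector would need to act as an MCP client toward the other servers. A rough sketch of that idea using the official `@modelcontextprotocol/sdk` package is below; the spawn command, tool name, and arguments are illustrative assumptions, not this repo's planned wiring.

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Rough sketch: spawn mcp-figma over stdio and call one of its tools.
// The command, tool name, and arguments below are assumptions.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "mcp-figma"],
});

const client = new Client({ name: "vercel-ai-sdk-mcp", version: "0.1.0" });
await client.connect(transport);

const result = await client.callTool({
  name: "extract_figma_design",
  arguments: { fileKey: "YOUR_FIGMA_FILE_KEY" },
});

console.log(result);
```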
## Prerequisites

- Node.js (v20 or later recommended)
- npm
- Git
- Cursor
- Smithery account (for deployment)
- API keys:
  - OpenAI API key (required)
  - Figma API key (required for implementing the Figma integration)
  - 21st Dev API key (required for implementing the Magic MCP integration)
## Local Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/chiziuwaga/vercel-ai-sdk-mcp-project.git
   cd vercel-ai-sdk-mcp-project
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Create a `.env` file: copy `.env.example` to `.env` and fill in your API keys:

   ```dotenv
   OPENAI_API_KEY=sk-your-openai-key
   ANTHROPIC_API_KEY=sk-ant-your-anthropic-key  # Optional
   FIGMA_API_KEY=your-figma-key                 # For future implementation
   TWENTY_FIRST_API_KEY=your-21st-key           # For future implementation
   TRANSPORT_TYPE=stdio                         # Keep as stdio for local runs
   PORT=3000                                    # Only used if TRANSPORT_TYPE=sse
   ```

4. Build the code:

   ```bash
   npm run build
   ```

5. Run locally:

   ```bash
   npm run start
   ```

   The server runs over stdio and waits for a client connection.
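If you want to confirm the built server responds over stdio before wiring it into Cursor, a rough smoke test like the one below can help. It assumes the stdio transport speaks newline-delimited JSON-RPC, that the script is run from the project root so the server can read your `.env`, and that the protocol version string matches your MCP SDK; all of these are assumptions to adjust as needed.

```ts
// smoke-test.mjs — rough sketch, not part of this repository.
import { spawn } from "node:child_process";

const server = spawn("node", ["dist/index.js"], {
  stdio: ["pipe", "pipe", "inherit"],
});

// Minimal MCP initialize request; protocolVersion is an assumption.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "smoke-test", version: "0.0.1" },
  },
};

server.stdout.on("data", (chunk) => {
  console.log("server replied:", chunk.toString());
  server.kill();
});

server.stdin.write(JSON.stringify(initialize) + "\n");
```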
## Cursor Integration (Local)

To use the local server in Cursor:

1. Ensure `mcp-figma` and `magic-mcp` are runnable via `npx` locally.
2. Modify your workspace `.cursor/mcp.json` to run this server directly with Node:

   ```json
   {
     "mcpServers": {
       "magic-mcp": { ... },   // Keep existing Smithery config
       "mcp-figma": { ... },   // Keep existing Smithery config
       "vercel-ai-sdk-mcp": {
         "command": "node",
         "args": ["dist/index.js"],   // Path relative to workspace root
         "env": {
           // Pass keys directly for local runs
           "OPENAI_API_KEY": "${OPENAI_API_KEY}",
           "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
           "FIGMA_API_KEY": "${FIGMA_API_KEY}",
           "TWENTY_FIRST_API_KEY": "${TWENTY_FIRST_API_KEY}",
           "TRANSPORT_TYPE": "stdio"
         }
       }
     }
   }
   ```

3. Make sure the `${API_KEY}` variables are accessible in the environment where Cursor can read them.
## Usage Example (Pathway 2)

1. Ensure the MCP servers are running (locally, or configured via Smithery in `.cursor/mcp.json`).
2. Create Cursor Rules: add rule files in `.cursor/rules/` to guide the AI (see the section below).
3. Prompt Cursor AI: give a multi-step prompt like the User Story described previously, instructing the AI to call tools sequentially across `mcp-figma`, `magic-mcp`, and `vercel-ai-sdk-mcp`.

   Example snippet:

   > "First, use mcp-figma's extract_figma_design... Then use magic-mcp's inspiration tool... Finally, use vercel-ai-sdk-mcp's generate_ui_component with the combined context..."
## Cursor Rules (`.cursor/rules/`)

Effective use of Pathway 2 orchestration relies on creating guidance rules for the Cursor AI. You must create a `.cursor/rules/` directory in your project root and add rule files (e.g., `figma.cursorule`, `magic.cursorule`, `vercel.cursorule`).

These files should contain natural language instructions on:

- Which tools to use from each MCP server for specific tasks.
- How to structure prompts for those tools.
- How to pass context (data) between sequential tool calls.
- Standard workflows (e.g., Figma -> Magic -> Vercel).

Refer to the Cursor Rules documentation for syntax and examples.
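As a hypothetical illustration only, a rule file might read something like the following; the tool names come from this README, while the wording and layout are assumptions rather than anything prescribed by Cursor or this repository.

```text
# vercel.cursorule (illustrative)
When the user asks to build a component from a Figma design:
1. Call mcp-figma's extract_figma_design with the Figma file/frame the user provides.
2. Pass the extracted design summary to magic-mcp's inspiration tool.
3. Call vercel-ai-sdk-mcp's generate_ui_component, including both the Figma
   summary and the inspiration output in the prompt context.
Always report which tool produced each piece of context.
```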
## Deployment (Smithery)

1. Push to GitHub: ensure your latest code, including the `Dockerfile` and `smithery.yaml`, is pushed to the `main` branch on GitHub.
2. Go to Smithery.ai: log in and find/add your `chiziuwaga/vercel-ai-sdk-mcp-project` server.
3. Deploy: go to the "Deployments" tab and click "Create Deployment".
4. Configure: provide the required API keys (`openaiApiKey`, etc.) when prompted by Smithery. These are stored securely.
5. Launch: start the deployment process. Smithery builds the Docker image and runs the container.
## Cursor Integration (Deployed)

Once deployed on Smithery:

1. Update your `.cursor/mcp.json` to use the Smithery CLI runner for your server (this should match the current content):

   ```json
   {
     "mcpServers": {
       "magic-mcp": { ... },   // Keep existing Smithery config
       "mcp-figma": { ... },   // Keep existing Smithery config
       "vercel-ai-sdk-mcp": {
         "command": "npx",
         "args": [
           "-y",
           "@smithery/cli@latest",
           "run",
           "chiziuwaga/vercel-ai-sdk-mcp-project",
           "--config",
           // Ensure these env vars are available to Cursor
           "{\"openaiApiKey\":\"${OPENAI_API_KEY}\",\"anthropicApiKey\":\"${ANTHROPIC_API_KEY}\",\"figmaApiKey\":\"${FIGMA_API_KEY}\",\"twentyFirstApiKey\":\"${TWENTY_FIRST_API_KEY}\",\"transportType\":\"stdio\"}"
         ]
       }
     }
   }
   ```

2. Ensure the `${API_KEY}` variables referenced in the `--config` JSON string are accessible to Cursor from your environment.
## Configuration

API keys are required for full functionality:

- `OPENAI_API_KEY`: Required for the Vercel AI SDK tools. Provide it during Smithery deployment configuration, or in `.env` for local runs.
- `ANTHROPIC_API_KEY`: Optional, for Anthropic models.
- `FIGMA_API_KEY`: Required only when the `FigmaConnector` is implemented.
- `TWENTY_FIRST_API_KEY`: Required only when the `MagicMcpConnector` is implemented.
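Since handling of missing keys is a noted gap (see Placeholders & Future Work below), a small startup check along these lines can fail fast. The helper is an illustrative assumption, not code from this repository.

```ts
// Illustrative startup check; names and structure are assumptions.
const REQUIRED = ["OPENAI_API_KEY"] as const;
const OPTIONAL = ["ANTHROPIC_API_KEY", "FIGMA_API_KEY", "TWENTY_FIRST_API_KEY"] as const;

export function checkEnv(): void {
  const missing = REQUIRED.filter((key) => !process.env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  for (const key of OPTIONAL) {
    if (!process.env[key]) {
      console.warn(`${key} is not set; the corresponding tools will be unavailable.`);
    }
  }
}
```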
Placeholders & Future Work
-
Implement Connectors: The
src/integrations/figma/connector.ts
andsrc/integrations/magicMcp/connector.ts
contain placeholders. They need to be implemented with actual API calls (for Figma) and MCP client logic (potentially for Magic MCP, depending on its interface) to enable the integration tools. - Add More Tools: Implement the remaining Vercel AI SDK tools (text generation, streaming, chat, code gen, etc.) as outlined in the specs.
- Error Handling: Enhance error handling, especially around missing API keys.
- Testing: Add automated tests.
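For the Figma side, one option is to call Figma's public REST files endpoint directly, as sketched below. The endpoint and `X-Figma-Token` header are Figma's documented API; the class shape and method name are assumptions, not this repo's planned interface.

```ts
// Illustrative FigmaConnector sketch; the class shape is an assumption.
// Figma's REST API authenticates via the X-Figma-Token header.
export class FigmaConnector {
  constructor(private readonly apiKey: string) {}

  async getFile(fileKey: string): Promise<unknown> {
    const res = await fetch(`https://api.figma.com/v1/files/${fileKey}`, {
      headers: { "X-Figma-Token": this.apiKey },
    });
    if (!res.ok) {
      throw new Error(`Figma API request failed: ${res.status} ${res.statusText}`);
    }
    return res.json();
  }
}
```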
## License

ISC License (as per `package.json`).