
Waldzell-MCP
Monorepo of Waldzell AI's MCP servers. Use them in Claude Desktop, Cline, Roo Code, and more!
Waldzell MCP Servers
This is a Turborepo-powered monorepo containing MCP (Model Context Protocol) servers for various AI assistant integrations.
What's inside?
Packages
- server-yelp-fusionai - MCP server for Yelp Fusion API
- server-typestyle - Google TypeScript Style Guide MCP server
- server-stochasticthinking - Stochastic thinking MCP server
- server-clear-thought - A fork of the Sequential Thinking server, inspired by James Clear
- common - Shared utilities and types
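Each of these packages is a standalone MCP server. For orientation only, a minimal stdio server built with the official @modelcontextprotocol/sdk might look like the sketch below; the server name and the echo tool are illustrative assumptions, not code taken from these packages.

```ts
// Minimal MCP server sketch (ESM). Illustrative only.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical name/version for illustration.
const server = new McpServer({ name: "example-server", version: "0.1.0" });

// Register a trivial tool; the real packages expose their own domain tools.
server.tool("echo", { message: z.string() }, async ({ message }) => ({
  content: [{ type: "text", text: message }],
}));

// MCP clients such as Claude Desktop launch the server over stdio.
const transport = new StdioServerTransport();
await server.connect(transport);
```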
Utilities
This monorepo uses Turborepo with Yarn 4 Workspaces.
- Turborepo — High-performance build system for monorepos
- Yarn 4 — Modern package management with PnP support
- Changesets — Managing versioning and changelogs
- GitHub Actions — Automated workflows
- Smithery — Deployment platform for MCP servers
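As a rough sketch, the root manifest of a Turborepo + Yarn 4 workspaces setup is typically wired like the example below; the names, versions, and scripts are assumptions for illustration, not this repo's actual package.json.

```json
{
  "name": "waldzell-mcp",
  "private": true,
  "packageManager": "yarn@4.1.1",
  "workspaces": ["packages/*"],
  "scripts": {
    "build": "turbo run build",
    "dev": "turbo run dev",
    "test": "turbo run test",
    "lint": "turbo run lint"
  },
  "devDependencies": {
    "turbo": "^1.13.0"
  }
}
```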
Getting Started
Prerequisites
- Node.js 18 or higher
- Corepack enabled (corepack enable)
Installation
Clone the repository and install dependencies:
git clone https://github.com/waldzellai/mcp-servers.git
cd mcp-servers
yarn install
Development
To develop all packages:
yarn dev
Building
To build all packages:
yarn build
The build output will be in each package's dist/ directory.
Testing
yarn test
Linting
yarn lint
Deploying to Smithery
This repo is set up to easily deploy packages to Smithery:
# Deploy all packages
yarn deploy
# Deploy specific packages
yarn smithery:yelp-fusion
yarn smithery:typestyle
yarn smithery:stochastic
yarn smithery:clear-thought
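Each deployable package carries a smithery.yaml telling Smithery how to start the server. The sketch below only approximates that file, based on Smithery's stdio deployment format; consult Smithery's documentation for the authoritative schema.

```yaml
# Illustrative sketch only - field names follow Smithery's documented stdio format.
startCommand:
  type: stdio
  configSchema:
    # JSON Schema describing any configuration the server accepts.
    type: object
    properties: {}
  commandFunction: |
    (config) => ({ command: "node", args: ["dist/index.js"] })
```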
Workflow
Adding a new feature
- Create a new branch
- Make your changes
- Add a changeset to document what changed for version bumping (see the example after this list):
yarn changeset
- Push your changes
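The yarn changeset step generates a markdown file under .changeset/ recording which packages are affected and what kind of version bump they need. A sketch of such a file is shown below; the package name and bump type are illustrative assumptions.

```md
---
"server-clear-thought": patch
---

Short, human-readable summary of the change for the changelog.
```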
Releasing new versions
We use Changesets to manage versions. Create a PR with your changes and Changesets will create a release PR that you can merge to release new versions.
For manual releases:
yarn publish-packages
Adding a New Package
- Create a new directory in the packages directory
- Initialize the package with yarn init
- Add your source code
- Update the turbo.json pipeline if needed (see the sketch after this list)
- Add a smithery.yaml file if you want to deploy to Smithery
- Run yarn install at the root to update the workspaces
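A typical turbo.json pipeline for this kind of setup might look like the following sketch; the task names mirror the scripts used above, but the outputs and dependencies are assumptions rather than the repo's actual configuration.

```json
{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "build": { "dependsOn": ["^build"], "outputs": ["dist/**"] },
    "test": { "dependsOn": ["build"] },
    "lint": {},
    "dev": { "cache": false, "persistent": true }
  }
}
```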
Turborepo
Remote Caching
Turborepo can use a remote cache to share build artifacts across machines. To enable Remote Caching:
yarn dlx turbo login
yarn dlx turbo link
MCP Server Documentation
Each MCP server package in this monorepo has its own README with detailed documentation.
License
All packages in this monorepo are licensed under the MIT License - see each package's LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a pull request.