
Wren Engine
🤖 The semantic engine for Model Context Protocol (MCP) clients and AI agents 🔥
14 GitHub Watchers · 69 GitHub Forks · 282 GitHub Stars
Wren Engine is the Semantic Engine for MCP Clients and AI Agents. Wren AI GenBI AI Agent is based on Wren Engine.
😫 Challenge Today
At the enterprise level, the stakes - and the complexity - are much higher. Businesses run on structured data stored in cloud warehouses, relational databases, and secure filesystems. From BI dashboards to CRM updates and compliance workflows, AI must not only execute commands but also understand and retrieve the right data, with precision and in context.
While many community and official MCP servers already support connections to major databases like PostgreSQL, MySQL, SQL Server, and more, there's a problem: raw access to data isn't enough.
Enterprises need:
- Accurate semantic understanding of their data models
- Trusted calculations and aggregations in reporting
- Clarity on business terms, like "active customer," "net revenue," or "churn rate"
- User-based permissions and access control
Natural language alone isn't enough to drive complex workflows across enterprise data systems. You need a layer that interprets intent, maps it to the correct data, applies calculations accurately, and ensures security.
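To make this concrete, here is a minimal conceptual sketch in Python of what such a layer does: it pins down business terms like "net revenue" once, so an agent reuses the governed definition instead of improvising its own SQL. This is purely illustrative and is not Wren Engine's actual format.

```python
# Conceptual sketch only: a toy "semantic layer" that maps business terms
# to governed SQL expressions. The structure is illustrative and is not
# Wren Engine's MDL format.
SEMANTIC_DEFINITIONS = {
    "active_customer": (
        "COUNT(DISTINCT customer_id) "
        "FILTER (WHERE last_order_date >= CURRENT_DATE - INTERVAL '90 days')"
    ),
    "net_revenue": "SUM(gross_amount) - SUM(discounts) - SUM(refunds)",
    "churn_rate": (
        "1 - COUNT(DISTINCT retained_customer_id) * 1.0 "
        "/ NULLIF(COUNT(DISTINCT customer_id), 0)"
    ),
}


def resolve_metric(term: str) -> str:
    """Return the single governed SQL expression for a business term."""
    return SEMANTIC_DEFINITIONS[term.lower().replace(" ", "_")]


# An agent asking for "net revenue" always gets the same agreed-upon calculation.
print(resolve_metric("net revenue"))
```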
🎯 Our Mission

Wren Engine is on a mission to power the future of MCP clients and AI agents through the Model Context Protocol (MCP) — a new open standard that connects LLMs with tools, databases, and enterprise systems.
As part of the MCP ecosystem, Wren Engine provides a semantic engine powered by a next-generation semantic layer that enables AI agents to access business data with accuracy, context, and governance.
By building the semantic layer directly into MCP clients such as Claude, Cline, and Cursor, Wren Engine empowers AI agents with precise business context and ensures accurate data interactions across diverse enterprise environments.
We believe the future of enterprise AI lies in context-aware, composable systems. That’s why Wren Engine is designed to be:
- 🔌 Embeddable into any MCP client or AI agentic workflow
- 🔄 Interoperable with modern data stacks (PostgreSQL, MySQL, Snowflake, etc.)
- 🧠 Semantic-first, enabling AI to “understand” your data model and business logic
- 🔐 Governance-ready, respecting roles, access controls, and definitions
With Wren Engine, you can scale AI adoption across teams — not just with better automation, but with better understanding.
Check out our full article.
🚀 Get Started with MCP
Demo video: https://github.com/user-attachments/assets/dab9b50f-70d7-4eb3-8fc8-2ab55dc7d2ec
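As a minimal sketch of how an MCP client could connect to an MCP server such as Wren Engine's mcp-server, the example below uses the MCP Python SDK's stdio client. The launch command, module name, and environment variable are placeholders rather than the project's documented invocation; check the mcp-server guide for the real setup.

```python
# Minimal MCP client sketch using the MCP Python SDK (pip install mcp).
# The server command, module name, and environment variable below are
# placeholders; consult the Wren Engine mcp-server guide for the real setup.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="python",                       # placeholder launch command
    args=["-m", "wren_mcp_server"],         # hypothetical module name
    env={"WREN_MDL_PATH": "etc/mdl.json"},  # hypothetical configuration
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes (e.g. a query tool).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```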
🤔 Concepts
- Powering Semantic SQL for AI Agents with Apache DataFusion
- Quick start with Wren Engine
- What is semantics?
- What is Modeling Definition Language (MDL)? (see the sketch after this list)
- Benefits of Wren Engine with LLMs
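As a rough illustration of the MDL concept referenced above, the sketch below expresses a small manifest as a Python dict: models map to physical tables, columns can carry calculated business expressions, and relationships describe joins. The field names are approximations, not the authoritative MDL schema; see the MDL documentation for the exact structure.

```python
# Illustrative MDL-style manifest expressed as a Python dict. The field names
# approximate the idea (models, columns, calculated expressions, relationships)
# and are not guaranteed to match the official MDL schema.
manifest = {
    "catalog": "wren",
    "schema": "public",
    "models": [
        {
            "name": "orders",
            "tableReference": {"table": "orders"},  # the physical table
            "columns": [
                {"name": "order_id", "type": "varchar"},
                {"name": "customer_id", "type": "varchar"},
                {"name": "gross_amount", "type": "double"},
                {
                    "name": "net_revenue",  # governed business term
                    "type": "double",
                    "isCalculated": True,
                    "expression": "gross_amount - discounts - refunds",
                },
            ],
        },
        # A "customers" model would be defined here as well.
    ],
    "relationships": [
        {
            "name": "orders_customer",
            "models": ["orders", "customers"],
            "joinType": "MANY_TO_ONE",
            "condition": "orders.customer_id = customers.customer_id",
        },
    ],
}
```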
🚧 Project Status
Wren Engine is currently in beta. The team is actively developing the project and aims to release new versions at least biweekly.
🛠️ Developer Guides
The project consists of 4 main modules:
- ibis-server: the web server of Wren Engine, powered by FastAPI and Ibis (see the query sketch after this list)
- wren-core: the semantic core written in Rust powered by Apache DataFusion
- wren-core-py: the Python binding for wren-core
- mcp-server: the MCP server of Wren Engine powered by MCP Python SDK
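As a hypothetical sketch of how these modules fit together, the example below posts a SQL query and a base64-encoded MDL manifest to a running ibis-server instance. The endpoint path, payload field names, and connection details are assumptions for illustration; verify them against the ibis-server API documentation.

```python
# Hypothetical sketch of querying Wren Engine's ibis-server over HTTP.
# The endpoint path, payload fields, and connection-info shape are assumptions
# for illustration; verify them against the ibis-server API documentation.
import base64
import json

import requests

manifest = {"catalog": "wren", "schema": "public", "models": []}  # see the MDL sketch above
manifest_str = base64.b64encode(json.dumps(manifest).encode()).decode()

payload = {
    "sql": "SELECT customer_id, net_revenue FROM orders LIMIT 10",
    "manifestStr": manifest_str,  # the semantic model the SQL is planned against
    "connectionInfo": {           # shape depends on the target data source
        "host": "localhost",
        "port": 5432,
        "database": "analytics",
        "user": "wren",
        "password": "secret",
    },
}

resp = requests.post(
    "http://localhost:8000/v2/connector/postgres/query",  # assumed endpoint
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```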
⭐️ Community
- Welcome to our Discord server to give us feedback!
- If you run into any issues, please visit GitHub Issues.
Reviews

user_fUhdoQZC
I have been using the Wren-engine developed by Canner, and it has significantly streamlined my workflow. Its robust features and seamless integration with other tools make it a standout. Check it out on GitHub: https://github.com/Canner/wren-engine. Highly recommend it for developers looking for efficiency and reliability.