MCP cover image

🤖 The semantic engine for Model Context Protocol (MCP) clients and AI agents 🔥

GitHub: 282 Stars · 69 Forks · 14 Watches

Wren Engine logo: ./misc/wrenai_logo.png

Wren Engine

Wren Engine is the Semantic Engine for MCP Clients and AI Agents. Wren AI, the GenBI AI Agent, is built on Wren Engine.

😫 Challenge Today

At the enterprise level, the stakes - and the complexity - are much higher. Businesses run on structured data stored in cloud warehouses, relational databases, and secure filesystems. From BI dashboards to CRM updates and compliance workflows, AI must not only execute commands but also understand and retrieve the right data, with precision and in context.

While many community and official MCP servers already support connections to major databases like PostgreSQL, MySQL, SQL Server, and more, there's a problem: raw access to data isn't enough.

Enterprises need:

  • Accurate semantic understanding of their data models
  • Trusted calculations and aggregations in reporting
  • Clarity on business terms, like "active customer," "net revenue," or "churn rate"
  • User-based permissions and access control

Natural language alone isn't enough to drive complex workflows across enterprise data systems. You need a layer that interprets intent, maps it to the correct data, applies calculations accurately, and ensures security.
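
For illustration only, here is a minimal sketch of what such a layer might hold: a single governed definition of a business term like "active customer" that every downstream query reuses. The field names and structure below are hypothetical and do not reflect Wren Engine's actual modeling format.

```python
# Hypothetical sketch of a governed business-term definition.
# Field names and structure are illustrative, not Wren Engine's actual modeling format.
ACTIVE_CUSTOMER = {
    "name": "active_customer",
    "description": "Customer with at least one completed order in the last 90 days",
    "base_model": "customers",
    "expression": (
        "EXISTS (SELECT 1 FROM orders o "
        "WHERE o.customer_id = customers.id "
        "AND o.status = 'completed' "
        "AND o.ordered_at >= CURRENT_DATE - INTERVAL '90' DAY)"
    ),
    "access": ["analyst", "finance"],  # roles allowed to query this definition
}
```

With the definition owned by the semantic layer, an AI agent asked for "active customers by region" resolves the term the same way every BI dashboard does, instead of improvising its own SQL.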

🎯 Our Mission

Wren Engine and MCP diagram: ./misc/mcp_wren_engine.webp

Wren Engine is on a mission to power the future of MCP clients and AI agents through the Model Context Protocol (MCP) — a new open standard that connects LLMs with tools, databases, and enterprise systems.

As part of the MCP ecosystem, Wren Engine provides a semantic engine, powered by a next-generation semantic layer, that enables AI agents to access business data with accuracy, context, and governance.

By building the semantic layer directly into MCP clients such as Claude, Cline, and Cursor, Wren Engine empowers AI agents with precise business context and ensures accurate data interactions across diverse enterprise environments.

We believe the future of enterprise AI lies in context-aware, composable systems. That’s why Wren Engine is designed to be:

  • 🔌 Embeddable into any MCP client or AI agentic workflow
  • 🔄 Interoperable with modern data stacks (PostgreSQL, MySQL, Snowflake, etc.)
  • 🧠 Semantic-first, enabling AI to “understand” your data model and business logic
  • 🔐 Governance-ready, respecting roles, access controls, and definitions

With Wren Engine, you can scale AI adoption across teams — not just with better automation, but with better understanding.

Check out our full article:

🤩 Our Mission - Fueling the Next Wave of AI Agents: Building the Foundation for Future MCP Clients and Enterprise Data Access

🚀 Get Started with MCP

MCP Server README

https://github.com/user-attachments/assets/dab9b50f-70d7-4eb3-8fc8-2ab55dc7d2ec
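
As a minimal sketch of what an MCP client does under the hood, the snippet below uses the official MCP Python SDK to launch an MCP server over stdio and list its tools. The launch command is a placeholder; see the MCP Server README above for the actual way to start Wren Engine's mcp-server and for the tools it exposes.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Placeholder launch command -- replace with the command from the MCP Server README.
    server = StdioServerParameters(command="uv", args=["run", "mcp-server"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover what the server offers; MCP clients like Claude do this automatically.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```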

🤔 Concepts

🚧 Project Status

Wren Engine is currently in beta. The team is actively developing it and aims to release new versions at least biweekly.

🛠️ Developer Guides

The project consists of 4 main modules:

  1. ibis-server: the web server of Wren Engine, powered by FastAPI and Ibis (see the sketch after this list)
  2. wren-core: the semantic core, written in Rust and powered by Apache DataFusion
  3. wren-core-py: the Python binding for wren-core
  4. mcp-server: the MCP server of Wren Engine, powered by the MCP Python SDK
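
For orientation, here is a heavily simplified sketch of the pattern ibis-server follows: a FastAPI endpoint accepts a query, and Ibis compiles and executes it against the configured backend. This is illustrative only and does not reflect ibis-server's actual routes, request schema, or connection handling.

```python
# Illustrative sketch only -- not the actual ibis-server code.
import ibis
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class QueryRequest(BaseModel):
    sql: str  # query to run against the configured backend


# Placeholder connection; the real server receives connection info per request.
con = ibis.postgres.connect(
    host="localhost", user="wren", password="secret", database="analytics"
)


@app.post("/query")
def run_query(req: QueryRequest) -> list[dict]:
    # Ibis compiles the query for the target backend and executes it there,
    # which is what lets one server front PostgreSQL, MySQL, Snowflake, and more.
    table = con.sql(req.sql)
    return table.to_pandas().to_dict(orient="records")
```

In the actual project, the semantic rewriting handled by wren-core (through the wren-core-py binding) is applied to incoming queries before execution, which this sketch omits.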

⭐️ Community

