
The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama.


Mastra


Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly. It gives you the set of primitives you need: workflows, agents, RAG, integrations and evals. You can run Mastra on your local machine, or deploy to a serverless cloud.

The main Mastra features are:

  • LLM Models: Mastra uses the Vercel AI SDK for model routing, providing a unified interface to interact with any LLM provider, including OpenAI, Anthropic, and Google Gemini. You can choose the specific model and provider, and decide whether to stream the response.
  • Agents: Agents are systems where the language model chooses a sequence of actions. In Mastra, agents provide LLM models with tools, workflows, and synced data. Agents can call your own functions or the APIs of third-party integrations and access knowledge bases you build. (A minimal agent-plus-tool sketch follows this list.)
  • Tools: Tools are typed functions that can be executed by agents or workflows, with built-in integration access and parameter validation. Each tool has a schema that defines its inputs, an executor function that implements its logic, and access to configured integrations.
  • Workflows: Workflows are durable, graph-based state machines. They support loops, branching, waiting for human input, embedding other workflows, error handling, retries, parsing, and more. They can be built in code or with a visual editor. Each step in a workflow has built-in OpenTelemetry tracing.
  • RAG: Retrieval-augmented generation (RAG) lets you construct a knowledge base for agents. RAG is an ETL pipeline with specific querying techniques, including chunking, embedding, and vector search.
  • Integrations: In Mastra, integrations are auto-generated, type-safe API clients for third-party services that can be used as tools for agents or as steps in workflows.
  • Evals: Evals are automated tests that evaluate LLM outputs using model-graded, rule-based, and statistical methods. Each eval returns a normalized score between 0 and 1 that can be logged and compared. Evals can be customized with your own prompts and scoring functions.
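To make the Agents and Tools entries above concrete, here is a minimal sketch in TypeScript. It is illustrative rather than definitive: it assumes a recent Mastra release (the @mastra/core package, the Vercel AI SDK's @ai-sdk/openai provider, and zod for schemas), and the tool, agent, and model names are hypothetical, so check the Mastra docs for the exact import paths and signatures in your version.

import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// A typed tool: the input schema validates parameters before execute() runs.
const weatherTool = createTool({
  id: "get-weather",
  description: "Return the current weather for a city",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ context }) => {
    // Call your own function or a configured integration here;
    // this stub just returns a canned forecast.
    return { city: context.city, forecast: "sunny, 22 °C" };
  },
});

// An agent: a model (routed through the Vercel AI SDK) plus the tools it may call.
const weatherAgent = new Agent({
  name: "weather-agent",
  instructions: "Answer weather questions using the get-weather tool.",
  model: openai("gpt-4o-mini"),
  tools: { weatherTool },
});

// The model decides whether and how to call the tool (ES module, top-level await).
const result = await weatherAgent.generate("What's the weather in Berlin?");
console.log(result.text);

In a create-mastra project you would typically also register the agent on your Mastra instance so it shows up in the playground; the snippet above only shows the core pieces.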

Quick Start

Prerequisites

  • Node.js (v20.0+)

Get an LLM provider API key

If you don't have an API key for an LLM provider, you can sign up with a provider such as OpenAI, Anthropic, or Google Gemini and create one. Note that Anthropic requires a credit card to get an API key, while some OpenAI models and Gemini do not and offer a generous free tier.

Create a new project

The easiest way to get started with Mastra is by using create-mastra. This CLI tool enables you to quickly start building a new Mastra application, with everything set up for you.

npx create-mastra@latest

Run the script

Finally, run the dev script to start mastra dev and open the Mastra playground.

npm run dev

Before your agent can call a model, set the API key for your chosen provider: if you're using Anthropic, set ANTHROPIC_API_KEY; if you're using Gemini, set GOOGLE_GENERATIVE_AI_API_KEY.
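For example, with the default create-mastra setup these keys are usually read from a .env file in the project root (or from however you export environment variables). The values below are placeholders, and OPENAI_API_KEY is the analogous variable if you use OpenAI instead; set only the one for your provider.

# .env (placeholders; set only the key for the provider you use)
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_GENERATIVE_AI_API_KEY=your-google-key
OPENAI_API_KEY=your-openai-key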

Contributing

Looking to contribute? All types of help are appreciated, from coding to testing and feature specification.

If you are a developer and would like to contribute with code, please open an issue to discuss before opening a Pull Request.

Information about the project setup can be found in the development documentation.

Support

We have an open community Discord. Come and say hello and let us know if you have any questions or need any help getting things running.

It's also super helpful if you leave the project a star on GitHub.

