Show HN: Mastra – Open-source JS agent framework, by the developers of Gatsby

Original link: https://github.com/mastra-ai/mastra

Mastra is a TypeScript framework designed to accelerate AI application development. It provides essential primitives such as workflows, agents, RAG, integrations, and evals, enabling developers to quickly build sophisticated AI features. Key features include: seamless integration with LLM providers (OpenAI, Anthropic, Google Gemini via the Vercel AI SDK), intelligent agents powered by tools and synced data, durable and visually editable workflows with built-in tracing, retrieval-augmented generation (RAG) for knowledge base construction, auto-generated and type-safe integrations with third-party services, and automated evaluations (evals) for assessing LLM output quality. Mastra supports local or serverless deployment, and the `create-mastra` CLI tool streamlines initial setup. After setting API keys, running `mastra dev` opens the Mastra playground to start building. Mastra encourages community contributions and offers support through an open Discord channel.

We are Sam, Shane, and Abhi, and we've launched Mastra (mastra.ai), an open-source JavaScript SDK built on top of Vercel's AI SDK to simplify agent development. Run `npm create mastra` to build workflow graphs with suspend/resume, RAG pipelines, agent memory management, and multi-agent workflows. Frustrated by repetitive AI development tasks, we created Mastra to streamline the process. It features a workflow graph primitive built on XState with OTel tracing and an explicit control-flow API (.step(), .then(), .after()). It also includes abstracted RAG verbs (.chunk(), .embed(), etc.) and agent memory management (inspired by MemGPT). The local development playground provides agent interaction, eval viewing, tracing, and prompt iteration. It uses a local storage layer powered by LibSQL and runs on localhost via `npm run dev`. Mastra can run inside a Next.js app or as a standalone service. We've adopted the Elastic v2 license. Check out the code on GitHub: [https://github.com/mastra-ai/mastra](https://github.com/mastra-ai/mastra). Demo: [https://www.youtube.com/watch?v=8o_ejbcw5s8](https://www.youtube.com/watch?v=8o_ejbcw5s8)

Original text

Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly. It gives you the set of primitives you need: workflows, agents, RAG, integrations, and evals. You can run Mastra on your local machine, or deploy to a serverless cloud.

The main Mastra features are:

**LLM Models**: Mastra uses the Vercel AI SDK for model routing, providing a unified interface to interact with any LLM provider, including OpenAI, Anthropic, and Google Gemini. You can choose the specific model and provider, and decide whether to stream the response.

**Agents**: Agents are systems where the language model chooses a sequence of actions. In Mastra, agents provide LLM models with tools, workflows, and synced data. Agents can call your own functions or APIs of third-party integrations and access knowledge bases you build.

**Tools**: Tools are typed functions that can be executed by agents or workflows, with built-in integration access and parameter validation. Each tool has a schema that defines its inputs, an executor function that implements its logic, and access to configured integrations.

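The tool shape described here (an input schema plus an executor) can be sketched in a few lines. All names below (`Tool`, `runTool`, `weatherTool`) are illustrative, not Mastra's actual API, and the hand-rolled schema stands in for a real validation library:

```typescript
// A minimal sketch of a typed tool: a schema that defines inputs and an
// executor that implements the logic. Illustrative only, not Mastra's API.
type Schema<T> = { parse: (input: unknown) => T };

interface Tool<In, Out> {
  id: string;
  description: string;
  inputSchema: Schema<In>;
  execute: (input: In) => Out;
}

// A hand-rolled schema requiring a string "city" field.
const citySchema: Schema<{ city: string }> = {
  parse: (input) => {
    const obj = input as { city?: unknown };
    if (typeof obj.city !== "string") throw new Error("city must be a string");
    return { city: obj.city };
  },
};

const weatherTool: Tool<{ city: string }, string> = {
  id: "get-weather",
  description: "Return a canned weather string for a city",
  inputSchema: citySchema,
  execute: ({ city }) => `Sunny in ${city}`,
};

// An agent or workflow step validates inputs before running the executor.
function runTool<In, Out>(tool: Tool<In, Out>, raw: unknown): Out {
  return tool.execute(tool.inputSchema.parse(raw));
}
```

Malformed input fails schema validation before the executor ever runs, which is the property that makes tools safe for an LLM to call.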
**Workflows**: Workflows are durable graph-based state machines. They support loops, branching, waiting for human input, embedding other workflows, error handling, retries, parsing, and so on. They can be built in code or with a visual editor. Each step in a workflow has built-in OpenTelemetry tracing.

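The explicit `.step()`/`.then()` chaining style can be illustrated with a toy builder. This is a sketch of the control-flow idea only, not Mastra's real workflow API (which adds branching, suspension, and per-step tracing):

```typescript
// Toy workflow builder: steps run in order, each merging its output into
// a shared context. Illustrative of .step()/.then() chaining only.
type Ctx = Record<string, unknown>;
type StepFn = (ctx: Ctx) => Ctx;

class MiniWorkflow {
  private steps: { name: string; fn: StepFn }[] = [];

  step(name: string, fn: StepFn): this {
    this.steps.push({ name, fn });
    return this;
  }

  // .then() chains another step after the previous one.
  then(name: string, fn: StepFn): this {
    return this.step(name, fn);
  }

  // Run all steps sequentially, accumulating context.
  run(initial: Ctx = {}): Ctx {
    let ctx = initial;
    for (const s of this.steps) ctx = { ...ctx, ...s.fn(ctx) };
    return ctx;
  }
}

const wf = new MiniWorkflow()
  .step("fetch", () => ({ doc: "hello world" }))
  .then("count", (ctx) => ({ words: String(ctx.doc).split(" ").length }));
```

Because each step only reads from and writes to the shared context, a durable implementation can checkpoint the context between steps and resume after a crash or a wait for human input.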
**RAG**: Retrieval-augmented generation (RAG) lets you construct a knowledge base for agents. RAG is an ETL pipeline with specific querying techniques, including chunking, embedding, and vector search.

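The chunk → embed → search stages can be sketched end to end. The letter-frequency "embedding" below is a toy stand-in for a real embedding model, and the function names are illustrative rather than Mastra's `.chunk()`/`.embed()` verbs:

```typescript
// Sliding-window chunking; overlap must be smaller than size.
function chunkText(text: string, size: number, overlap: number): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size - overlap) {
    chunks.push(text.slice(i, i + size));
    if (i + size >= text.length) break;
  }
  return chunks;
}

// 26-dim letter-frequency vector: a toy stand-in for a learned embedding.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase())
    if (ch >= "a" && ch <= "z") v[ch.charCodeAt(0) - 97]++;
  return v;
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Vector search: return the chunk most similar to the query.
function search(query: string, chunks: string[]): string {
  const q = embed(query);
  return chunks.reduce((best, c) =>
    cosine(embed(c), q) > cosine(embed(best), q) ? c : best);
}
```

A production pipeline swaps the toy embedding for a model call and the linear scan for a vector store, but the ETL shape (chunk, embed, index, query) is the same.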
**Integrations**: In Mastra, integrations are auto-generated, type-safe API clients for third-party services that can be used as tools for agents or steps in workflows.

**Evals**: Evals are automated tests that evaluate LLM outputs using model-graded, rule-based, and statistical methods. Each eval returns a normalized score between 0 and 1 that can be logged and compared. Evals can be customized with your own prompts and scoring functions.
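A rule-based eval in this spirit is just a function from an output to a normalized score. The sketch below is a hypothetical keyword-coverage scorer, not one of Mastra's built-in evals; a model-graded eval would instead prompt an LLM judge and normalize its verdict:

```typescript
// An eval result: a score in [0, 1] plus a human-readable reason.
interface EvalResult {
  score: number;
  reason: string;
}

// Rule-based eval: score an output by the fraction of required
// keywords it contains (case-insensitive).
function keywordCoverageEval(output: string, required: string[]): EvalResult {
  const text = output.toLowerCase();
  const hits = required.filter((k) => text.includes(k.toLowerCase()));
  return {
    score: required.length ? hits.length / required.length : 1,
    reason: `matched ${hits.length}/${required.length} keywords`,
  };
}
```

Because every eval reduces to a 0-1 score, different methods (rule-based, statistical, model-graded) can be logged side by side and compared across prompt or model changes.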

If you don't have an API key for an LLM provider, you can sign up with one of these providers and get one. Anthropic requires a credit card to get an API key. Some OpenAI models and Gemini do not, and Gemini has a generous free tier for its API.

The easiest way to get started with Mastra is by using create-mastra. This CLI tool enables you to quickly start building a new Mastra application, with everything set up for you.

Finally, run mastra dev to open the Mastra playground.

If you're using Anthropic, set the ANTHROPIC_API_KEY. If you're using Gemini, set the GOOGLE_GENERATIVE_AI_API_KEY.
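Putting the setup steps together, a minimal quickstart might look like the following. The key values are placeholders, and the exact scripts scaffolded by create-mastra may differ:

```shell
# Scaffold a new Mastra project with the CLI (interactive prompts follow)
npm create mastra

# Provide a provider key via the environment, e.g. for Anthropic:
export ANTHROPIC_API_KEY=sk-ant-...            # placeholder value
# or for Gemini:
export GOOGLE_GENERATIVE_AI_API_KEY=...        # placeholder value

# Start the local playground
mastra dev
```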

Looking to contribute? All types of help are appreciated, from coding to testing and feature specification.

If you are a developer and would like to contribute with code, please open an issue to discuss before opening a Pull Request.

Information about the project setup can be found in the development documentation.

We have an open community Discord. Come and say hello and let us know if you have any questions or need any help getting things running.

It's also super helpful if you leave the project a star here at the top of the page.
