Control LLM Spend and Access with any-LLM-gateway

原始链接: https://blog.mozilla.ai/control-llm-spend-and-access-with-any-llm-gateway/

## any-llm-gateway: Control and Monitor Your LLM Usage

any-llm-gateway is a new open-source proxy server that adds key management capabilities to any LLM deployment built on the any-llm library. It sits between your applications and LLM providers (such as OpenAI, Anthropic, and local models), providing budgeting, analytics, and access control.

Key features include **smart budget management** with shared tiers and automatic resets; a **flexible API key system** with virtual keys for stronger security and tracking; and **complete usage analytics** with per-user token counts and costs.

The gateway supports streaming and deploys easily with Docker and Kubernetes. It lets teams scale their LLM usage confidently while preventing runaway costs and enabling innovation through controlled access. It is well suited to SaaS applications, research teams, and organizations that need robust LLM cost controls.

Learn more and get started with the quickstart guide: [https://mozilla-ai.github.io/any-llm/gateway/quickstart/](https://mozilla-ai.github.io/any-llm/gateway/quickstart/)

## Hacker News Discussion of LLM Gateways

A recent Hacker News post highlighted Mozilla's new any-llm-gateway, a proxy designed to control spend and access across LLM providers. The discussion quickly turned to existing solutions, particularly **LiteLLM**, with many users voicing frustration over its growing complexity and its bugs. Several users reported problems with LiteLLM, including budget-update issues, UI instability, and bugs that remained unresolved even after fixes were submitted. Alternatives mentioned included **apisix**, **PydanticAI Gateway**, and **Bifrost** (though Bifrost gates some features behind a paid tier). One user planned to test the new Mozilla gateway because of its simplicity. Some commenters questioned Mozilla's involvement, asking whether its resources would be better spent on Firefox; others clarified that Mozilla has a dedicated AI subsidiary (Mozilla.ai). The conversation also touched on the competitive landscape, with some arguing that a free Mozilla offering could challenge paid solutions like LiteLLM. Ultimately, users are looking for a reliable, scalable LLM gateway with strong rate limiting and budgeting features.
## Original Article

Gain visibility and control over your LLM usage. any-llm-gateway adds budgeting, analytics, and access management to any-llm, giving teams reliable oversight for every provider.

Control LLM Spend and Access with any-llm-gateway

Track Usage, Set Limits, and Deploy Confidently Across Any LLM Provider

Managing LLM costs and access at scale is hard. Give users unrestricted access and you risk runaway costs. Lock things down too much and you slow innovation. That's why today we're happy to announce the open-source release of any-llm-gateway!

We recently released version 1.0 of any-llm: a Python library that provides a consistent interface across multiple LLM providers (OpenAI, Anthropic, Mistral, your local model deployments, and more). Today we're excited to announce any-llm-gateway: a FastAPI-based proxy server that adds production-grade budget enforcement, API key management, and usage analytics on top of any-llm's multi-provider foundation.

any-llm-gateway sits between your applications and LLM providers, exposing the OpenAI-compatible Completions API that works with any supported provider. Simply specify models using the provider:model format (e.g., openai:gpt-4o-mini, anthropic:claude-3-5-sonnet-20241022) and any-llm-gateway handles the rest, including streaming support with automatic token tracking.
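As a sketch, an OpenAI-style chat request to the gateway might look like the following. The local URL and key are placeholder assumptions for illustration; only the `provider:model` naming convention comes from the docs above:

```python
import json

# Placeholder values -- substitute your own gateway deployment and key.
GATEWAY_URL = "http://localhost:8000/v1/chat/completions"
API_KEY = "vk-example"

# The same OpenAI-compatible payload can target any supported provider
# just by switching the "provider:model" prefix.
payload = {
    "model": "anthropic:claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Summarize our Q3 LLM spend."}],
    "stream": True,  # streaming responses still get automatic token tracking
}

# The gateway routes on the part before the colon.
provider, _, model = payload["model"].partition(":")
print(provider)  # anthropic
body = json.dumps(payload)
```

Because the request shape is OpenAI-compatible, existing OpenAI client libraries can be pointed at the gateway's base URL without code changes beyond the model name.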

Key Features

Smart Budget Management

Create shared budget tiers with automatic daily, weekly, or monthly resets. Budgets can be shared across multiple users, enforced automatically, or set to tracking-only mode. No manual intervention required.
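The enforcement model can be pictured with a small conceptual sketch. This is not the gateway's internals; the class, field names, and reset logic here are illustrative assumptions about how a shared tier with an automatic reset window behaves:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Conceptual sketch of a shared budget tier -- names and logic are
# illustrative, not the gateway's actual implementation.
@dataclass
class BudgetTier:
    limit_usd: float
    period: timedelta                  # reset window, e.g. daily/weekly/monthly
    spent_usd: float = 0.0             # shared counter across all users on the tier
    enforce: bool = True               # False => tracking-only mode
    window_start: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def charge(self, cost_usd: float) -> bool:
        now = datetime.now(timezone.utc)
        if now - self.window_start >= self.period:  # automatic reset, no manual step
            self.spent_usd, self.window_start = 0.0, now
        if self.enforce and self.spent_usd + cost_usd > self.limit_usd:
            return False                            # request rejected
        self.spent_usd += cost_usd
        return True

tier = BudgetTier(limit_usd=1.00, period=timedelta(days=1))
print(tier.charge(0.60))  # True: within the shared $1.00/day budget
print(tier.charge(0.50))  # False: would exceed the limit, so it is rejected
```

Tracking-only mode (`enforce=False` above) records spend without rejecting requests, which matches the "enforced automatically, or set to tracking-only" behavior described.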

Flexible API Key System

Choose between master key authentication (ideal for trusted services) or virtual API keys. Virtual keys can have expiration dates, metadata, and can be activated, deactivated, or revoked on demand, all while automatically associating with users for spend tracking.

Complete Usage Analytics

Every request is logged with full token counts, costs (based on admin-configured per-token rates), and metadata. Track spending per user, view detailed usage history, and get the observability you need for cost attribution and chargebacks.
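Cost attribution from per-token rates is simple arithmetic. A sketch, with made-up placeholder prices (the real rates are whatever the admin configures):

```python
# Placeholder per-token rates in USD -- configure real values for your providers.
RATES = {
    "openai:gpt-4o-mini": {"prompt": 0.15e-6, "completion": 0.60e-6},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost of one request: tokens in each direction times their rate."""
    r = RATES[model]
    return prompt_tokens * r["prompt"] + completion_tokens * r["completion"]

# 1,000 prompt tokens + 500 completion tokens at the placeholder rates:
cost = request_cost("openai:gpt-4o-mini", 1000, 500)
print(f"${cost:.6f}")  # $0.000450
```

Summing these per-request costs per user is what makes chargebacks possible: each request already carries a user association via its API key.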

Production-Ready Deployment

Deploy with Docker in minutes, configure via YAML or environment variables, and scale on Kubernetes with built-in liveness and readiness probes.
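For example, a Kubernetes container spec can point its probes at the gateway's health endpoints. The image tag and probe paths below are illustrative assumptions; check the deployment docs for the actual values:

```yaml
# Sketch of a container spec wiring up the gateway's probes.
# Image and probe paths are placeholders -- consult the deployment docs.
containers:
  - name: any-llm-gateway
    image: ghcr.io/mozilla-ai/any-llm-gateway:latest
    ports:
      - containerPort: 8000
    livenessProbe:
      httpGet:
        path: /health        # hypothetical liveness endpoint
        port: 8000
    readinessProbe:
      httpGet:
        path: /ready         # hypothetical readiness endpoint
        port: 8000
```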

Getting Started

The fastest way to try any-llm-gateway is to head over to our quickstart, which guides you through configuration and deployment.

Check out our documentation for comprehensive guides on authentication, budget management, and configuration. We've also updated the any-llm SDK so you can easily connect to your gateway as a client.

Whether you're building a SaaS application with tiered pricing, managing LLM access for a research team, or implementing cost controls for your organization, any-llm-gateway provides the infrastructure you need to deploy, budget, monitor, and control LLM access with confidence.

Get started today: https://mozilla-ai.github.io/any-llm/gateway/quickstart/ 
