(Comments)

Original link: https://news.ycombinator.com/item?id=40710154

Hey HN! Tokencost is a utility library for estimating LLM costs. There are hundreds of different models now, and they all have their own pricing schemes. It’s difficult to keep up with the pricing changes, and it’s even more difficult to estimate how much your prompts and completions will cost until you see the bill.

Tokencost works by counting the number of tokens in prompt and completion messages and multiplying that number by the corresponding model cost. Under the hood, it’s really just a simple cost dictionary and some utility functions for getting the prices right. It also accounts for different tokenizers and float precision errors.

Surprisingly, most model providers don't actually report how much you spend until your bills arrive. We built Tokencost internally at AgentOps to help users track agent spend, and we decided to open source it to help developers avoid nasty bills.
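Below is a minimal sketch of the mechanism described above, not Tokencost's actual API: count tokens with the model's tokenizer (here via tiktoken) and multiply by a per-token price looked up in a cost dictionary. The COST_PER_TOKEN table, the prices in it, and the helper names are illustrative placeholders, so check current provider pricing before relying on them.

    # Sketch of "token count x per-model price" cost estimation.
    # Prices below are placeholders for illustration only; real per-token
    # rates change often, which is the problem Tokencost tracks for you.
    from decimal import Decimal  # Decimal avoids float drift on tiny per-token prices

    import tiktoken

    # Hypothetical cost table: USD per token for prompt (input) and completion (output).
    COST_PER_TOKEN = {
        "gpt-3.5-turbo": {
            "prompt": Decimal("0.0000005"),
            "completion": Decimal("0.0000015"),
        },
    }

    def count_tokens(text: str, model: str) -> int:
        """Count tokens using the tokenizer that matches the model."""
        encoding = tiktoken.encoding_for_model(model)
        return len(encoding.encode(text))

    def estimate_cost(prompt: str, completion: str, model: str) -> Decimal:
        """Estimate the USD cost of one request: tokens multiplied by per-token price."""
        rates = COST_PER_TOKEN[model]
        prompt_cost = count_tokens(prompt, model) * rates["prompt"]
        completion_cost = count_tokens(completion, model) * rates["completion"]
        return prompt_cost + completion_cost

    if __name__ == "__main__":
        print(estimate_cost("Hello, world!", "Hi! How can I help you today?", "gpt-3.5-turbo"))

Using Decimal rather than float keeps the arithmetic exact when very small per-token rates are multiplied by large token counts, which is the float precision issue mentioned above.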


