Show HN: LangSpend – Track LLM costs by feature and customer (OpenAI/Anthropic)

langspend.com

2 points by aihunter21 12 hours ago

We're two developers who got hit twice by LLM cost problems and built LangSpend to fix it.

First: We couldn't figure out which features in our SaaS were expensive to run or which customers were costing us the most. That made it impossible to price properly or spot runaway costs.

Second: We burned 80% of our $1,000 AWS credits on Claude 4 (AWS Bedrock) in just two months while building prototypes, but we had zero visibility into which experiments were eating the budget.

So we built LangSpend — a simple SDK that wraps your LLM calls and tracks costs per customer and per feature.

How it works:

- Wrap your LLM calls and tag them with customer/feature metadata (rough sketch below).
- The dashboard shows you who's costing what in real time.
- Node.js and Python SDKs are currently available.
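Roughly, the wrapping step looks something like this in Python. Treat it as an illustrative sketch: the langspend import, the LangSpend constructor, and the track() context manager here are simplified placeholders, so check the docs for the exact method names. The OpenAI call is the standard openai>=1.0 client.

    # Illustrative sketch only -- LangSpend, track(), and the package name
    # are placeholders; see https://langspend.com/docs for the real SDK API.
    from openai import OpenAI
    from langspend import LangSpend  # assumed package/import name

    openai_client = OpenAI()
    ls = LangSpend(api_key="<your-langspend-key>")  # assumed constructor

    # Tag the call with customer and feature metadata so the dashboard
    # can attribute the token cost to both dimensions.
    with ls.track(customer_id="acme-corp", feature="doc-summarizer"):
        response = openai_client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": "Summarize this contract."}],
        )

    print(response.choices[0].message.content)

The idea is that every call carries a customer ID and a feature name, so costs roll up by either dimension instead of landing in one undifferentiated provider bill.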

Still early days, but it's already solving our problem. Try it out and let us know if it helps you too.

- https://langspend.com
- Docs: https://langspend.com/docs
- Discord: https://discord.gg/Kh9RJ5td