How do you compare to say, Langfuse?
Hey there, answered similar question here https://news.ycombinator.com/item?id=41453674

We really like Langfuse, both the team and the product.

Compared to it:

* We send and ingest Otel traces with GenAI semconv

* Provide semantic-event-based analytics - you can actually understand what's happening in your LLM app instead of staring at logs all day.

* Laminar is built to be high-performance and reliable from day 0, easily ingesting and processing spikes of 500k+ tokens per second

* Much more flexible evals, because you execute everything locally and simply store the results on Laminar

* Go beyond simple prompt management and support Prompt Chain / LLM pipeline management. Extremely useful when you want to host something like Mixture of Agents as a scalable and trackable micro-service.

* Searchable trace / span data (not released yet)
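To make the first bullet concrete, here is a minimal sketch of the attributes an OTel span would carry under the GenAI semantic conventions. Attribute keys follow the OTel GenAI semconv; the provider, model, and token counts are placeholder values, and a real app would set them through the opentelemetry SDK rather than a plain dict:

```python
# Sketch of OTel GenAI semantic-convention attributes on a chat span.
# Keys follow the OTel GenAI semconv; values are placeholders.
span_attributes = {
    "gen_ai.system": "openai",          # LLM provider
    "gen_ai.request.model": "gpt-4o",   # requested model
    "gen_ai.usage.input_tokens": 42,    # prompt tokens
    "gen_ai.usage.output_tokens": 128,  # completion tokens
}

# Because the keys are standardized, any semconv-aware backend can
# aggregate across spans without custom parsing, e.g. total tokens:
total_tokens = (span_attributes["gen_ai.usage.input_tokens"]
                + span_attributes["gen_ai.usage.output_tokens"])
print(total_tokens)  # 170
```

This standardization is what lets a backend ingest traces from any semconv-compliant instrumentation, not just its own SDK.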

Hey, looks cool. Trying to understand the prompt management a bit better. Is it like a GUI that publishes to an API?
Thank you! Yes, our pipeline builder lets you build LLM pipelines and expose them as API endpoints, all hosted and managed by Laminar.
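Calling such a published pipeline would look roughly like any authenticated JSON POST. This is a hypothetical sketch: the URL, payload shape, and auth header are illustrative assumptions, not Laminar's actual API:

```python
import json
import urllib.request

# Hypothetical request to a pipeline published as an API endpoint.
# Endpoint URL, payload schema, and auth scheme are assumptions.
payload = {"inputs": {"question": "What is OpenTelemetry?"}}
req = urllib.request.Request(
    "https://api.example.com/v1/pipelines/my-pipeline/run",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer <token>",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send the call; not executed here.
print(req.get_method())  # POST
```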